[ 520.459961] env[67119]: Modules with known eventlet monkey patching issues were imported prior to eventlet monkey patching: urllib3. This warning can usually be ignored if the caller is only importing and not executing nova code.
[ 521.103259] env[67169]: Modules with known eventlet monkey patching issues were imported prior to eventlet monkey patching: urllib3. This warning can usually be ignored if the caller is only importing and not executing nova code.
[ 522.440017] env[67169]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'linux_bridge' {{(pid=67169) initialize /opt/stack/data/venv/lib/python3.10/site-packages/os_vif/__init__.py:44}}
[ 522.440458] env[67169]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'noop' {{(pid=67169) initialize /opt/stack/data/venv/lib/python3.10/site-packages/os_vif/__init__.py:44}}
[ 522.440458] env[67169]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'ovs' {{(pid=67169) initialize /opt/stack/data/venv/lib/python3.10/site-packages/os_vif/__init__.py:44}}
[ 522.440782] env[67169]: INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
[ 522.637985] env[67169]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm {{(pid=67169) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}}
[ 522.648422] env[67169]: DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 0 in 0.010s {{(pid=67169) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}}
[ 522.751086] env[67169]: INFO nova.virt.driver [None req-3649311e-3b67-4bbf-8f05-5dac89d576f2 None None] Loading compute driver 'vmwareapi.VMwareVCDriver'
[ 522.822292] env[67169]: DEBUG oslo_concurrency.lockutils [-] Acquiring lock "oslo_vmware_api_lock" by "oslo_vmware.api.VMwareAPISession._create_session" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 522.822467] env[67169]: DEBUG oslo_concurrency.lockutils [-] Lock "oslo_vmware_api_lock" acquired by "oslo_vmware.api.VMwareAPISession._create_session" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 522.822571] env[67169]: DEBUG oslo_vmware.service [-] Creating suds client with soap_url='https://vc1.osci.c.eu-de-1.cloud.sap:443/sdk' and wsdl_url='https://vc1.osci.c.eu-de-1.cloud.sap:443/sdk/vimService.wsdl' {{(pid=67169) __init__ /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:242}}
[ 525.663465] env[67169]: DEBUG oslo_vmware.service [-] Invoking ServiceInstance.RetrieveServiceContent with opID=oslo.vmware-4725a31c-50aa-4459-8246-0728013e4250 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 525.680162] env[67169]: DEBUG oslo_vmware.api [-] Logging into host: vc1.osci.c.eu-de-1.cloud.sap. {{(pid=67169) _create_session /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:242}}
[ 525.680355] env[67169]: DEBUG oslo_vmware.service [-] Invoking SessionManager.Login with opID=oslo.vmware-e192ce57-1e26-40ed-8447-fd3a1df07099 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 525.704936] env[67169]: INFO oslo_vmware.api [-] Successfully established new session; session ID is 8fca5.
[ 525.705113] env[67169]: DEBUG oslo_concurrency.lockutils [-] Lock "oslo_vmware_api_lock" "released" by "oslo_vmware.api.VMwareAPISession._create_session" :: held 2.883s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 525.705580] env[67169]: INFO nova.virt.vmwareapi.driver [None req-3649311e-3b67-4bbf-8f05-5dac89d576f2 None None] VMware vCenter version: 7.0.3
[ 525.709051] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c9204b35-b480-496f-a9f3-87833bd56134 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 525.726971] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e8773a0d-a3c1-44be-bf6c-308905bfa6b2 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 525.733241] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6eda2fe0-b6f0-4565-a951-e56203999da4 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 525.740122] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-da81a888-7747-4c21-9ee5-0481c8ec6e20 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 525.753302] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a4efd19a-68d7-4862-a696-11e69eb2eae7 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 525.759264] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e59562a5-b409-49e5-93ac-f830559d63b6 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 525.789857] env[67169]: DEBUG oslo_vmware.service [-] Invoking ExtensionManager.FindExtension with opID=oslo.vmware-cc460bf1-475e-4852-bc73-48f315c71bd8 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 525.795071] env[67169]: DEBUG nova.virt.vmwareapi.driver [None req-3649311e-3b67-4bbf-8f05-5dac89d576f2 None None] Extension org.openstack.compute already exists. {{(pid=67169) _register_openstack_extension /opt/stack/nova/nova/virt/vmwareapi/driver.py:224}}
[ 525.797748] env[67169]: INFO nova.compute.provider_config [None req-3649311e-3b67-4bbf-8f05-5dac89d576f2 None None] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
[ 525.816707] env[67169]: DEBUG nova.context [None req-3649311e-3b67-4bbf-8f05-5dac89d576f2 None None] Found 2 cells: 00000000-0000-0000-0000-000000000000(cell0),fb90fbcb-30a5-4d02-96ca-af8316800bdb(cell1) {{(pid=67169) load_cells /opt/stack/nova/nova/context.py:464}}
[ 525.818690] env[67169]: DEBUG oslo_concurrency.lockutils [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 525.818914] env[67169]: DEBUG oslo_concurrency.lockutils [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 525.819663] env[67169]: DEBUG oslo_concurrency.lockutils [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 525.820091] env[67169]: DEBUG oslo_concurrency.lockutils [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] Acquiring lock "fb90fbcb-30a5-4d02-96ca-af8316800bdb" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 525.820285] env[67169]: DEBUG oslo_concurrency.lockutils [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] Lock "fb90fbcb-30a5-4d02-96ca-af8316800bdb" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 525.821223] env[67169]: DEBUG oslo_concurrency.lockutils [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] Lock "fb90fbcb-30a5-4d02-96ca-af8316800bdb" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 525.845630] env[67169]: INFO dbcounter [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] Registered counter for database nova_cell0
[ 525.853848] env[67169]: INFO dbcounter [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] Registered counter for database nova_cell1
[ 525.856785] env[67169]: DEBUG oslo_db.sqlalchemy.engines [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=67169) _check_effective_sql_mode /opt/stack/data/venv/lib/python3.10/site-packages/oslo_db/sqlalchemy/engines.py:342}}
[ 525.857150] env[67169]: DEBUG oslo_db.sqlalchemy.engines [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=67169) _check_effective_sql_mode /opt/stack/data/venv/lib/python3.10/site-packages/oslo_db/sqlalchemy/engines.py:342}}
[ 525.861529] env[67169]: DEBUG dbcounter [-] [67169] Writer thread running {{(pid=67169) stat_writer /opt/stack/data/venv/lib/python3.10/site-packages/dbcounter.py:102}}
[ 525.862253] env[67169]: DEBUG dbcounter [-] [67169] Writer thread running {{(pid=67169) stat_writer /opt/stack/data/venv/lib/python3.10/site-packages/dbcounter.py:102}}
[ 525.864506] env[67169]: ERROR nova.db.main.api [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] No DB access allowed in nova-compute: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main
[ 525.864506] env[67169]:     result = function(*args, **kwargs)
[ 525.864506] env[67169]:   File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper
[ 525.864506] env[67169]:     return func(*args, **kwargs)
[ 525.864506] env[67169]:   File "/opt/stack/nova/nova/context.py", line 422, in gather_result
[ 525.864506] env[67169]:     result = fn(*args, **kwargs)
[ 525.864506] env[67169]:   File "/opt/stack/nova/nova/db/main/api.py", line 179, in wrapper
[ 525.864506] env[67169]:     return f(*args, **kwargs)
[ 525.864506] env[67169]:   File "/opt/stack/nova/nova/objects/service.py", line 548, in _db_service_get_minimum_version
[ 525.864506] env[67169]:     return db.service_get_minimum_version(context, binaries)
[ 525.864506] env[67169]:   File "/opt/stack/nova/nova/db/main/api.py", line 238, in wrapper
[ 525.864506] env[67169]:     _check_db_access()
[ 525.864506] env[67169]:   File "/opt/stack/nova/nova/db/main/api.py", line 188, in _check_db_access
[ 525.864506] env[67169]:     stacktrace = ''.join(traceback.format_stack())
[ 525.864506] env[67169]: 
[ 525.865270] env[67169]: ERROR nova.db.main.api [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] No DB access allowed in nova-compute: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main
[ 525.865270] env[67169]:     result = function(*args, **kwargs)
[ 525.865270] env[67169]:   File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper
[ 525.865270] env[67169]:     return func(*args, **kwargs)
[ 525.865270] env[67169]:   File "/opt/stack/nova/nova/context.py", line 422, in gather_result
[ 525.865270] env[67169]:     result = fn(*args, **kwargs)
[ 525.865270] env[67169]:   File "/opt/stack/nova/nova/db/main/api.py", line 179, in wrapper
[ 525.865270] env[67169]:     return f(*args, **kwargs)
[ 525.865270] env[67169]:   File "/opt/stack/nova/nova/objects/service.py", line 548, in _db_service_get_minimum_version
[ 525.865270] env[67169]:     return db.service_get_minimum_version(context, binaries)
[ 525.865270] env[67169]:   File "/opt/stack/nova/nova/db/main/api.py", line 238, in wrapper
[ 525.865270] env[67169]:     _check_db_access()
[ 525.865270] env[67169]:   File "/opt/stack/nova/nova/db/main/api.py", line 188, in _check_db_access
[ 525.865270] env[67169]:     stacktrace = ''.join(traceback.format_stack())
[ 525.865270] env[67169]: 
[ 525.865845] env[67169]: WARNING nova.objects.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] Failed to get minimum service version for cell 00000000-0000-0000-0000-000000000000
[ 525.865845] env[67169]: WARNING nova.objects.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] Failed to get minimum service version for cell fb90fbcb-30a5-4d02-96ca-af8316800bdb
[ 525.866217] env[67169]: DEBUG oslo_concurrency.lockutils [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] Acquiring lock "singleton_lock" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 525.866383] env[67169]: DEBUG oslo_concurrency.lockutils [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] Acquired lock "singleton_lock" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 525.866627] env[67169]: DEBUG oslo_concurrency.lockutils [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] Releasing lock "singleton_lock" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 525.866986] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] Full set of CONF: {{(pid=67169) _wait_for_exit_or_signal /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/service.py:362}}
[ 525.867151] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] ******************************************************************************** {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2589}}
[ 525.867282] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] Configuration options gathered from: {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2590}}
[ 525.867416] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] command line args: ['--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-cpu-common.conf', '--config-file', '/etc/nova/nova-cpu-1.conf'] {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2591}}
[ 525.867603] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-cpu-common.conf', '/etc/nova/nova-cpu-1.conf'] {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2592}}
[ 525.867732] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] ================================================================================ {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2594}}
[ 525.867940] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] allow_resize_to_same_host = True {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 525.868145] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] arq_binding_timeout = 300 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 525.868284] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] backdoor_port = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 525.868411] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] backdoor_socket = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 525.868577] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] block_device_allocate_retries = 60 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 525.868747] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] block_device_allocate_retries_interval = 3 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 525.868921] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] cert = self.pem {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 525.869105] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] compute_driver = vmwareapi.VMwareVCDriver {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 525.869279] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] compute_monitors = [] {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 525.869448] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] config_dir = [] {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 525.869617] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] config_drive_format = iso9660 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 525.869749] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] config_file = ['/etc/nova/nova.conf', '/etc/nova/nova-cpu-common.conf', '/etc/nova/nova-cpu-1.conf'] {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 525.869914] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] config_source = [] {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 525.870092] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] console_host = devstack {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 525.870258] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] control_exchange = nova {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 525.870415] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] cpu_allocation_ratio = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 525.870576] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] daemon = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 525.870738] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] debug = True {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 525.870895] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] default_access_ip_network_name = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 525.871069] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] default_availability_zone = nova {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 525.871227] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] default_ephemeral_format = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 525.871384] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] default_green_pool_size = 1000 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 525.871620] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] default_log_levels = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 525.871786] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] default_schedule_zone = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 525.871946] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] disk_allocation_ratio = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 525.872124] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] enable_new_services = True {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 525.872298] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] enabled_apis = ['osapi_compute'] {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 525.872462] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] enabled_ssl_apis = [] {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 525.872620] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] flat_injected = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 525.872779] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] force_config_drive = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 525.872937] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] force_raw_images = True {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 525.873118] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] graceful_shutdown_timeout = 5 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 525.873282] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] heal_instance_info_cache_interval = 60 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 525.873492] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] host = cpu-1 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 525.873662] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] initial_cpu_allocation_ratio = 4.0 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 525.873826] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] initial_disk_allocation_ratio = 1.0 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 525.873985] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] initial_ram_allocation_ratio = 1.0 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 525.874207] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] injected_network_template = /opt/stack/nova/nova/virt/interfaces.template {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 525.874370] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] instance_build_timeout = 0 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 525.874526] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] instance_delete_interval = 300 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 525.874691] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] instance_format = [instance: %(uuid)s] {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 525.874856] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] instance_name_template = instance-%08x {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 525.875027] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] instance_usage_audit = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 525.875200] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] instance_usage_audit_period = month {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 525.875367] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] instance_uuid_format = [instance: %(uuid)s] {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 525.875530] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] instances_path = /opt/stack/data/nova/instances {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 525.875709] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] internal_service_availability_zone = internal {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 525.875882] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] key = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 525.876053] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] live_migration_retry_count = 30 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 525.876220] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] log_config_append = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 525.876385] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] log_date_format = %Y-%m-%d %H:%M:%S {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 525.876542] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] log_dir = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 525.876717] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] log_file = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 525.876855] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] log_options = True {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 525.877028] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] log_rotate_interval = 1 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 525.877201] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] log_rotate_interval_type = days {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 525.877369] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] log_rotation_type = none {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 525.877498] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] logging_context_format_string = %(color)s%(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(project_name)s %(user_name)s%(color)s] %(instance)s%(color)s%(message)s {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 525.877623] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] logging_debug_format_suffix = {{(pid=%(process)d) %(funcName)s %(pathname)s:%(lineno)d}} {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 525.877789] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] logging_default_format_string = %(color)s%(levelname)s %(name)s [-%(color)s] %(instance)s%(color)s%(message)s {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 525.877956] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] logging_exception_prefix = ERROR %(name)s %(instance)s {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 525.878117] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] logging_user_identity_format = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 525.878289] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] long_rpc_timeout = 1800 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 525.878450] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] max_concurrent_builds = 10 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 525.878606] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] max_concurrent_live_migrations = 1 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 525.878762] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] max_concurrent_snapshots = 5 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 525.878919] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] max_local_block_devices = 3 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 525.879084] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] max_logfile_count = 30 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 525.879245] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] max_logfile_size_mb = 200 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 525.879403] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] maximum_instance_delete_attempts = 5 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 525.879568] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] metadata_listen = 0.0.0.0 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 525.879734] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] metadata_listen_port = 8775 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 525.879902] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] metadata_workers = 2 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 525.880070] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] migrate_max_retries = -1 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 525.880243] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] mkisofs_cmd = genisoimage {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 525.880447] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] my_block_storage_ip = 10.180.1.21 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 525.880578] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] my_ip = 10.180.1.21 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 525.880741] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] network_allocate_retries = 0 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 525.880923] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 525.881099] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] osapi_compute_listen = 0.0.0.0 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 525.881263] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] osapi_compute_listen_port = 8774 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 525.881427] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] osapi_compute_unique_server_name_scope = {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 525.881591] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] osapi_compute_workers = 2 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 525.881748] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 
None None] password_length = 12 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 525.881912] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] periodic_enable = True {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 525.882077] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] periodic_fuzzy_delay = 60 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 525.882244] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] pointer_model = usbtablet {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 525.882408] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] preallocate_images = none {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 525.882567] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] publish_errors = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 525.882696] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] pybasedir = /opt/stack/nova {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 525.882853] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] ram_allocation_ratio = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 525.883015] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] rate_limit_burst = 0 {{(pid=67169) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 525.883185] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] rate_limit_except_level = CRITICAL {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 525.883342] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] rate_limit_interval = 0 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 525.883504] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] reboot_timeout = 0 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 525.883656] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] reclaim_instance_interval = 0 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 525.883808] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] record = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 525.883975] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] reimage_timeout_per_gb = 60 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 525.884151] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] report_interval = 120 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 525.884311] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] rescue_timeout = 0 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 
525.884471] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] reserved_host_cpus = 0 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 525.884627] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] reserved_host_disk_mb = 0 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 525.884784] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] reserved_host_memory_mb = 512 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 525.884949] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] reserved_huge_pages = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 525.885127] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] resize_confirm_window = 0 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 525.885285] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] resize_fs_using_block_device = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 525.885443] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] resume_guests_state_on_host_boot = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 525.885613] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] rootwrap_config = /etc/nova/rootwrap.conf {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 525.885803] 
env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] rpc_response_timeout = 60 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 525.885967] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] run_external_periodic_tasks = True {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 525.886203] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] running_deleted_instance_action = reap {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 525.886384] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] running_deleted_instance_poll_interval = 1800 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 525.886549] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] running_deleted_instance_timeout = 0 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 525.886711] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] scheduler_instance_sync_interval = 120 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 525.886883] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] service_down_time = 720 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 525.887063] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] servicegroup_driver = db {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 525.887230] 
env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] shelved_offload_time = 0 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 525.887389] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] shelved_poll_interval = 3600 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 525.887554] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] shutdown_timeout = 0 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 525.887717] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] source_is_ipv6 = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 525.887875] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] ssl_only = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 525.888155] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] state_path = /opt/stack/data/n-cpu-1 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 525.888331] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] sync_power_state_interval = 600 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 525.888491] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] sync_power_state_pool_size = 1000 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 525.888660] env[67169]: DEBUG oslo_service.service [None 
req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] syslog_log_facility = LOG_USER {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 525.888819] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] tempdir = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 525.888979] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] timeout_nbd = 10 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 525.889166] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] transport_url = **** {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 525.889330] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] update_resources_interval = 0 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 525.889485] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] use_cow_images = True {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 525.889641] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] use_eventlog = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 525.889795] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] use_journal = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 525.889952] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] use_json = False {{(pid=67169) 
log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 525.890119] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] use_rootwrap_daemon = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 525.890275] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] use_stderr = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 525.890430] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] use_syslog = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 525.890580] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] vcpu_pin_set = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 525.890743] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] vif_plugging_is_fatal = True {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 525.890906] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] vif_plugging_timeout = 300 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 525.891102] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] virt_mkfs = [] {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 525.891270] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] volume_usage_poll_interval = 0 {{(pid=67169) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 525.891429] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] watch_log_file = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 525.891591] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] web = /usr/share/spice-html5 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 525.891773] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] oslo_concurrency.disable_process_locking = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.892075] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] oslo_concurrency.lock_path = /opt/stack/data/n-cpu-1 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.892261] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] oslo_messaging_metrics.metrics_buffer_size = 1000 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.892427] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] oslo_messaging_metrics.metrics_enabled = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.892595] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] oslo_messaging_metrics.metrics_process_name = {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.892765] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] 
oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.892930] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.893120] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] api.auth_strategy = keystone {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.893288] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] api.compute_link_prefix = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.893462] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.893633] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] api.dhcp_domain = novalocal {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.893799] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] api.enable_instance_password = True {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.893961] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] api.glance_link_prefix = None {{(pid=67169) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.894138] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] api.instance_list_cells_batch_fixed_size = 100 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.894314] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] api.instance_list_cells_batch_strategy = distributed {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.894476] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] api.instance_list_per_project_cells = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.894635] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] api.list_records_by_skipping_down_cells = True {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.894795] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] api.local_metadata_per_cell = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.894962] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] api.max_limit = 1000 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.895138] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] api.metadata_cache_expiration = 15 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.895310] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] 
api.neutron_default_tenant_id = default {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.895475] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] api.use_forwarded_for = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.895638] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] api.use_neutron_default_nets = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.895840] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] api.vendordata_dynamic_connect_timeout = 5 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.896017] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] api.vendordata_dynamic_failure_fatal = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.896185] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] api.vendordata_dynamic_read_timeout = 5 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.896358] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] api.vendordata_dynamic_ssl_certfile = {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.896533] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] api.vendordata_dynamic_targets = [] {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.896717] env[67169]: DEBUG oslo_service.service [None 
req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] api.vendordata_jsonfile_path = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.896919] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] api.vendordata_providers = ['StaticJSON'] {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.897128] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] cache.backend = dogpile.cache.memcached {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.897300] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] cache.backend_argument = **** {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.897474] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] cache.config_prefix = cache.oslo {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.897641] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] cache.dead_timeout = 60.0 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.897804] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] cache.debug_cache_backend = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.897966] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] cache.enable_retry_client = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.898169] env[67169]: DEBUG oslo_service.service 
[None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] cache.enable_socket_keepalive = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.898348] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] cache.enabled = True {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.898514] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] cache.expiration_time = 600 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.898678] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] cache.hashclient_retry_attempts = 2 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.898846] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] cache.hashclient_retry_delay = 1.0 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.899012] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] cache.memcache_dead_retry = 300 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.899188] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] cache.memcache_password = {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.899352] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] cache.memcache_pool_connection_get_timeout = 10 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.899512] env[67169]: DEBUG oslo_service.service [None 
req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] cache.memcache_pool_flush_on_reconnect = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.899673] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] cache.memcache_pool_maxsize = 10 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.899831] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] cache.memcache_pool_unused_timeout = 60 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.899992] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] cache.memcache_sasl_enabled = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.900181] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] cache.memcache_servers = ['localhost:11211'] {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.900348] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] cache.memcache_socket_timeout = 1.0 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.900516] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] cache.memcache_username = {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.900681] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] cache.proxies = [] {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.900845] env[67169]: DEBUG 
oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] cache.retry_attempts = 2 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.901024] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] cache.retry_delay = 0.0 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.901182] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] cache.socket_keepalive_count = 1 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.901344] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] cache.socket_keepalive_idle = 1 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.901503] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] cache.socket_keepalive_interval = 1 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.901659] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] cache.tls_allowed_ciphers = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.901814] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] cache.tls_cafile = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.901971] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] cache.tls_certfile = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.902143] env[67169]: DEBUG oslo_service.service [None 
req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] cache.tls_enabled = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.902299] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] cache.tls_keyfile = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.902465] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] cinder.auth_section = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.902637] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] cinder.auth_type = password {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.902799] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] cinder.cafile = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.902974] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] cinder.catalog_info = volumev3::publicURL {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.903145] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] cinder.certfile = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.903308] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] cinder.collect_timing = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.903469] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None 
None] cinder.cross_az_attach = True {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.903629] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] cinder.debug = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.903786] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] cinder.endpoint_template = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.903948] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] cinder.http_retries = 3 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.904124] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] cinder.insecure = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.904284] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] cinder.keyfile = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.904456] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] cinder.os_region_name = RegionOne {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.904621] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] cinder.split_loggers = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.904782] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] cinder.timeout = None {{(pid=67169) 
log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.904956] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] compute.consecutive_build_service_disable_threshold = 10 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.905129] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] compute.cpu_dedicated_set = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.905291] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] compute.cpu_shared_set = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.905461] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] compute.image_type_exclude_list = [] {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.905658] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] compute.live_migration_wait_for_vif_plug = True {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.905855] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] compute.max_concurrent_disk_ops = 0 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.906037] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] compute.max_disk_devices_to_attach = -1 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.906208] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] 
compute.packing_host_numa_cells_allocation_strategy = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.906381] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] compute.provider_config_location = /etc/nova/provider_config/ {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.906547] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] compute.resource_provider_association_refresh = 300 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.906713] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] compute.shutdown_retry_interval = 10 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.906899] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] compute.vmdk_allowed_types = ['streamOptimized', 'monolithicSparse'] {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.907085] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] conductor.workers = 2 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.907265] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] console.allowed_origins = [] {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.907427] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] console.ssl_ciphers = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.907595] 
env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] console.ssl_minimum_version = default {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.907768] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] consoleauth.token_ttl = 600 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.907942] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] cyborg.cafile = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.908135] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] cyborg.certfile = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.908314] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] cyborg.collect_timing = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.908480] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] cyborg.connect_retries = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.908641] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] cyborg.connect_retry_delay = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.908801] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] cyborg.endpoint_override = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.908967] env[67169]: DEBUG 
oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] cyborg.insecure = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.909143] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] cyborg.keyfile = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.909305] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] cyborg.max_version = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.909464] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] cyborg.min_version = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.909623] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] cyborg.region_name = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.909780] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] cyborg.service_name = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.909952] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] cyborg.service_type = accelerator {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.910129] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] cyborg.split_loggers = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.910288] env[67169]: DEBUG oslo_service.service [None 
req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] cyborg.status_code_retries = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.910445] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] cyborg.status_code_retry_delay = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.910602] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] cyborg.timeout = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.910784] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] cyborg.valid_interfaces = ['internal', 'public'] {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.910947] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] cyborg.version = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.911145] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] database.backend = sqlalchemy {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.911326] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] database.connection = **** {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.911496] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] database.connection_debug = 0 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.911669] env[67169]: DEBUG oslo_service.service [None 
req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] database.connection_parameters = {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.911835] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] database.connection_recycle_time = 3600 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.912013] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] database.connection_trace = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.912181] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] database.db_inc_retry_interval = True {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.912347] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] database.db_max_retries = 20 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.912509] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] database.db_max_retry_interval = 10 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.912673] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] database.db_retry_interval = 1 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.912875] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] database.max_overflow = 50 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.913111] env[67169]: DEBUG oslo_service.service [None 
req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] database.max_pool_size = 5 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.913277] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] database.max_retries = 10 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.914387] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] database.mysql_sql_mode = TRADITIONAL {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.914387] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] database.mysql_wsrep_sync_wait = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.914387] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] database.pool_timeout = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.914387] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] database.retry_interval = 10 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.914387] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] database.slave_connection = **** {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.914576] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] database.sqlite_synchronous = True {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.914707] env[67169]: DEBUG oslo_service.service [None 
req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] database.use_db_reconnect = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.914869] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] api_database.backend = sqlalchemy {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.915108] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] api_database.connection = **** {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.915308] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] api_database.connection_debug = 0 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.915490] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] api_database.connection_parameters = {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.915659] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] api_database.connection_recycle_time = 3600 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.915857] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] api_database.connection_trace = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.916042] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] api_database.db_inc_retry_interval = True {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.916216] env[67169]: DEBUG 
oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] api_database.db_max_retries = 20 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.916382] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] api_database.db_max_retry_interval = 10 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.916546] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] api_database.db_retry_interval = 1 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.916716] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] api_database.max_overflow = 50 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.916883] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] api_database.max_pool_size = 5 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.917062] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] api_database.max_retries = 10 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.917238] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] api_database.mysql_sql_mode = TRADITIONAL {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.917398] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] api_database.mysql_wsrep_sync_wait = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.917560] 
env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] api_database.pool_timeout = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.917729] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] api_database.retry_interval = 10 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.917889] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] api_database.slave_connection = **** {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.918070] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] api_database.sqlite_synchronous = True {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.918246] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] devices.enabled_mdev_types = [] {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.918425] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] ephemeral_storage_encryption.cipher = aes-xts-plain64 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.918591] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] ephemeral_storage_encryption.enabled = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.918762] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] ephemeral_storage_encryption.key_size = 512 {{(pid=67169) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.918935] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] glance.api_servers = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.919113] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] glance.cafile = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.919278] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] glance.certfile = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.919440] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] glance.collect_timing = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.919601] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] glance.connect_retries = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.919761] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] glance.connect_retry_delay = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.919923] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] glance.debug = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.920099] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] glance.default_trusted_certificate_ids = [] {{(pid=67169) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.920264] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] glance.enable_certificate_validation = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.920425] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] glance.enable_rbd_download = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.920583] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] glance.endpoint_override = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.920748] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] glance.insecure = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.920910] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] glance.keyfile = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.921080] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] glance.max_version = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.921244] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] glance.min_version = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.921408] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] glance.num_retries = 3 {{(pid=67169) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.921578] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] glance.rbd_ceph_conf = {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.921743] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] glance.rbd_connect_timeout = 5 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.921914] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] glance.rbd_pool = {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.922093] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] glance.rbd_user = {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.922257] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] glance.region_name = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.922416] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] glance.service_name = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.922582] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] glance.service_type = image {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.922744] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] glance.split_loggers = False {{(pid=67169) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.922905] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] glance.status_code_retries = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.923070] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] glance.status_code_retry_delay = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.923232] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] glance.timeout = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.923412] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] glance.valid_interfaces = ['internal', 'public'] {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.923578] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] glance.verify_glance_signatures = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.923737] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] glance.version = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.923904] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] guestfs.debug = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.924084] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] hyperv.config_drive_cdrom = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.924249] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] hyperv.config_drive_inject_password = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.924421] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] hyperv.dynamic_memory_ratio = 1.0 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.924582] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] hyperv.enable_instance_metrics_collection = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.924746] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] hyperv.enable_remotefx = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.924917] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] hyperv.instances_path_share = {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.925092] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] hyperv.iscsi_initiator_list = [] {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.925259] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] hyperv.limit_cpu_features = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.925422] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] hyperv.mounted_disk_query_retry_count = 10 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.925584] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] hyperv.mounted_disk_query_retry_interval = 5 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.925754] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] hyperv.power_state_check_timeframe = 60 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.925920] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] hyperv.power_state_event_polling_interval = 2 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.926100] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] hyperv.qemu_img_cmd = qemu-img.exe {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.926266] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] hyperv.use_multipath_io = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.926431] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] hyperv.volume_attach_retry_count = 10 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.926593] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] hyperv.volume_attach_retry_interval = 5 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.926773] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] hyperv.vswitch_name = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.926948] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] hyperv.wait_soft_reboot_seconds = 60 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.927132] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] mks.enabled = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.927486] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] mks.mksproxy_base_url = http://127.0.0.1:6090/ {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.927680] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] image_cache.manager_interval = 2400 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.927862] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] image_cache.precache_concurrency = 1 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.928047] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] image_cache.remove_unused_base_images = True {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.928281] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] image_cache.remove_unused_original_minimum_age_seconds = 86400 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.928386] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] image_cache.remove_unused_resized_minimum_age_seconds = 3600 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.928564] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] image_cache.subdirectory_name = _base {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.928742] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] ironic.api_max_retries = 60 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.928912] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] ironic.api_retry_interval = 2 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.929094] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] ironic.auth_section = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.929277] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] ironic.auth_type = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.929441] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] ironic.cafile = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.929603] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] ironic.certfile = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.929771] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] ironic.collect_timing = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.929939] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] ironic.conductor_group = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.930113] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] ironic.connect_retries = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.930276] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] ironic.connect_retry_delay = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.930438] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] ironic.endpoint_override = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.930625] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] ironic.insecure = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.930804] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] ironic.keyfile = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.930970] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] ironic.max_version = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.931142] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] ironic.min_version = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.931310] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] ironic.peer_list = [] {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.931470] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] ironic.region_name = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.931634] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] ironic.serial_console_state_timeout = 10 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.931794] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] ironic.service_name = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.931965] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] ironic.service_type = baremetal {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.932142] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] ironic.split_loggers = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.932305] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] ironic.status_code_retries = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.932462] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] ironic.status_code_retry_delay = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.932623] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] ironic.timeout = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.932806] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] ironic.valid_interfaces = ['internal', 'public'] {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.932971] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] ironic.version = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.933168] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] key_manager.backend = nova.keymgr.conf_key_mgr.ConfKeyManager {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.933344] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] key_manager.fixed_key = **** {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.933526] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] barbican.auth_endpoint = http://localhost/identity/v3 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.933691] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] barbican.barbican_api_version = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.933854] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] barbican.barbican_endpoint = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.934038] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] barbican.barbican_endpoint_type = public {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.934203] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] barbican.barbican_region_name = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.934363] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] barbican.cafile = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.934522] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] barbican.certfile = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.934687] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] barbican.collect_timing = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.934851] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] barbican.insecure = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.935017] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] barbican.keyfile = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.935182] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] barbican.number_of_retries = 60 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.935345] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] barbican.retry_delay = 1 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.935507] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] barbican.send_service_user_token = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.935672] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] barbican.split_loggers = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.935855] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] barbican.timeout = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.936035] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] barbican.verify_ssl = True {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.936200] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] barbican.verify_ssl_path = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.936368] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] barbican_service_user.auth_section = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.936532] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] barbican_service_user.auth_type = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.936708] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] barbican_service_user.cafile = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.936916] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] barbican_service_user.certfile = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.937109] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] barbican_service_user.collect_timing = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.937276] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] barbican_service_user.insecure = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.937437] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] barbican_service_user.keyfile = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.937602] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] barbican_service_user.split_loggers = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.937763] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] barbican_service_user.timeout = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.937936] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] vault.approle_role_id = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.938144] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] vault.approle_secret_id = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.938377] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] vault.cafile = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.938600] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] vault.certfile = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.945153] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] vault.collect_timing = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.945153] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] vault.insecure = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.945153] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] vault.keyfile = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.945153] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] vault.kv_mountpoint = secret {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.945153] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] vault.kv_path = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.945153] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] vault.kv_version = 2 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.945583] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] vault.namespace = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.945583] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] vault.root_token_id = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.945583] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] vault.split_loggers = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.945583] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] vault.ssl_ca_crt_file = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.945583] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] vault.timeout = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.945583] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] vault.use_ssl = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.945890] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] vault.vault_url = http://127.0.0.1:8200 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.945890] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] keystone.auth_section = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.945890] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] keystone.auth_type = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.945890] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] keystone.cafile = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.945890] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] keystone.certfile = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.945890] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] keystone.collect_timing = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.946187] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] keystone.connect_retries = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.946187] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] keystone.connect_retry_delay = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.946187] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] keystone.endpoint_override = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.946187] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] keystone.insecure = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.946187] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] keystone.keyfile = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.946187] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] keystone.max_version = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.946481] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] keystone.min_version = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.946481] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] keystone.region_name = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.947550] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] keystone.service_name = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.947550] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] keystone.service_type = identity {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.947550] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] keystone.split_loggers = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.947550] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] keystone.status_code_retries = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.947981] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] keystone.status_code_retry_delay = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.947981] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] keystone.timeout = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.948178] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] keystone.valid_interfaces = ['internal', 'public'] {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.948329] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] keystone.version = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.948549] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] libvirt.connection_uri = {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.948752] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] libvirt.cpu_mode = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.948949] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] libvirt.cpu_model_extra_flags = [] {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.949180] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] libvirt.cpu_models = [] {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.949369] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] libvirt.cpu_power_governor_high = performance {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.949519] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] libvirt.cpu_power_governor_low = powersave {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.949695] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] libvirt.cpu_power_management = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.949881] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] libvirt.cpu_power_management_strategy = cpu_state {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.950079] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] libvirt.device_detach_attempts = 8 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.950257] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] libvirt.device_detach_timeout = 20 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.950439] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] libvirt.disk_cachemodes = [] {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.950610] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] libvirt.disk_prefix = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.950802] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] libvirt.enabled_perf_events = [] {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.950980] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] libvirt.file_backed_memory = 0 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.951172] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] libvirt.gid_maps = [] {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.951345] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] libvirt.hw_disk_discard = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.951513] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] libvirt.hw_machine_type = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.951694] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] libvirt.images_rbd_ceph_conf = {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.951872] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] libvirt.images_rbd_glance_copy_poll_interval = 15 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.952072] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] libvirt.images_rbd_glance_copy_timeout = 600 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.952252] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] libvirt.images_rbd_glance_store_name = {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.952427] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] libvirt.images_rbd_pool = rbd {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.952602] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] libvirt.images_type = default {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.952767] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] libvirt.images_volume_group = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.952937] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] libvirt.inject_key = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.953118] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] libvirt.inject_partition = -2 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.953287] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] libvirt.inject_password = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.953456] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] libvirt.iscsi_iface = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.953646] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] libvirt.iser_use_multipath = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.953858] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] libvirt.live_migration_bandwidth = 0 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.954200] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] libvirt.live_migration_completion_timeout = 800 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.954267] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] libvirt.live_migration_downtime = 500 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.954410] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] libvirt.live_migration_downtime_delay = 75 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.954583] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] libvirt.live_migration_downtime_steps = 10 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.954775] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] libvirt.live_migration_inbound_addr = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.954961] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] libvirt.live_migration_permit_auto_converge = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.955168] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] libvirt.live_migration_permit_post_copy = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.955334] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] libvirt.live_migration_scheme = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.955508] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] libvirt.live_migration_timeout_action = abort {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.956082] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] libvirt.live_migration_tunnelled = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.956082] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] libvirt.live_migration_uri = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.956082] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] libvirt.live_migration_with_native_tls = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.956233] env[67169]: DEBUG
oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] libvirt.max_queues = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.956357] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] libvirt.mem_stats_period_seconds = 10 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.956520] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] libvirt.nfs_mount_options = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.956882] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] libvirt.nfs_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.957077] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] libvirt.num_aoe_discover_tries = 3 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.957250] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] libvirt.num_iser_scan_tries = 5 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.957415] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] libvirt.num_memory_encrypted_guests = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.957581] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] libvirt.num_nvme_discover_tries = 5 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 
525.957748] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] libvirt.num_pcie_ports = 0 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.957921] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] libvirt.num_volume_scan_tries = 5 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.958112] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] libvirt.pmem_namespaces = [] {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.958281] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] libvirt.quobyte_client_cfg = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.958585] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] libvirt.quobyte_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.958761] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] libvirt.rbd_connect_timeout = 5 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.958930] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] libvirt.rbd_destroy_volume_retries = 12 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.959112] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] libvirt.rbd_destroy_volume_retry_interval = 5 {{(pid=67169) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.959279] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] libvirt.rbd_secret_uuid = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.959440] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] libvirt.rbd_user = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.959606] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] libvirt.realtime_scheduler_priority = 1 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.959782] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] libvirt.remote_filesystem_transport = ssh {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.959946] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] libvirt.rescue_image_id = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.960119] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] libvirt.rescue_kernel_id = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.960283] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] libvirt.rescue_ramdisk_id = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.960451] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] libvirt.rng_dev_path = /dev/urandom {{(pid=67169) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.960612] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] libvirt.rx_queue_size = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.960780] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] libvirt.smbfs_mount_options = {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.961063] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] libvirt.smbfs_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.961240] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] libvirt.snapshot_compression = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.961403] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] libvirt.snapshot_image_format = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.961621] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] libvirt.snapshots_directory = /opt/stack/data/nova/instances/snapshots {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.961788] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] libvirt.sparse_logical_volumes = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.962302] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None 
None] libvirt.swtpm_enabled = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.962302] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] libvirt.swtpm_group = tss {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.962302] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] libvirt.swtpm_user = tss {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.962456] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] libvirt.sysinfo_serial = unique {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.962615] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] libvirt.tb_cache_size = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.962773] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] libvirt.tx_queue_size = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.962941] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] libvirt.uid_maps = [] {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.963115] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] libvirt.use_virtio_for_bridges = True {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.963296] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] libvirt.virt_type = kvm 
{{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.963509] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] libvirt.volume_clear = zero {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.963628] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] libvirt.volume_clear_size = 0 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.963824] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] libvirt.volume_use_multipath = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.963995] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] libvirt.vzstorage_cache_path = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.964183] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] libvirt.vzstorage_log_path = /var/log/vstorage/%(cluster_name)s/nova.log.gz {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.964353] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] libvirt.vzstorage_mount_group = qemu {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.964520] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] libvirt.vzstorage_mount_opts = [] {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.964689] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None 
None] libvirt.vzstorage_mount_perms = 0770 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.964968] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] libvirt.vzstorage_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.965160] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] libvirt.vzstorage_mount_user = stack {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.965331] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] libvirt.wait_soft_reboot_seconds = 120 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.965505] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] neutron.auth_section = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.965684] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] neutron.auth_type = password {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.965865] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] neutron.cafile = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.966041] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] neutron.certfile = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.966209] env[67169]: DEBUG oslo_service.service [None 
req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] neutron.collect_timing = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.966369] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] neutron.connect_retries = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.966526] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] neutron.connect_retry_delay = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.966697] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] neutron.default_floating_pool = public {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.966856] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] neutron.endpoint_override = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.967025] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] neutron.extension_sync_interval = 600 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.967191] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] neutron.http_retries = 3 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.967349] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] neutron.insecure = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.967506] env[67169]: DEBUG oslo_service.service [None 
req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] neutron.keyfile = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.967665] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] neutron.max_version = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.967835] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] neutron.metadata_proxy_shared_secret = **** {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.967998] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] neutron.min_version = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.968180] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] neutron.ovs_bridge = br-int {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.968347] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] neutron.physnets = [] {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.968516] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] neutron.region_name = RegionOne {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.968680] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] neutron.service_metadata_proxy = True {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.968839] env[67169]: DEBUG oslo_service.service [None 
req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] neutron.service_name = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.969025] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] neutron.service_type = network {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.969187] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] neutron.split_loggers = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.969347] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] neutron.status_code_retries = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.969504] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] neutron.status_code_retry_delay = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.969661] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] neutron.timeout = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.969838] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] neutron.valid_interfaces = ['internal', 'public'] {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.970007] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] neutron.version = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.970189] env[67169]: DEBUG oslo_service.service [None 
req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] notifications.bdms_in_notifications = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.970367] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] notifications.default_level = INFO {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.970543] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] notifications.notification_format = unversioned {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.970706] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] notifications.notify_on_state_change = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.970881] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] notifications.versioned_notifications_topics = ['versioned_notifications'] {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.971070] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] pci.alias = [] {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.971245] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] pci.device_spec = [] {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.971411] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] pci.report_in_placement = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.971584] 
env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] placement.auth_section = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.971757] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] placement.auth_type = password {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.971928] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] placement.auth_url = http://10.180.1.21/identity {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.972100] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] placement.cafile = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.972263] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] placement.certfile = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.972427] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] placement.collect_timing = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.972587] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] placement.connect_retries = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.972748] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] placement.connect_retry_delay = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.972909] 
env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] placement.default_domain_id = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.973079] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] placement.default_domain_name = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.973243] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] placement.domain_id = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.973404] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] placement.domain_name = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.973562] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] placement.endpoint_override = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.973748] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] placement.insecure = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.973917] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] placement.keyfile = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.974087] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] placement.max_version = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 525.974249] env[67169]: DEBUG 
oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] placement.min_version = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.974417] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] placement.password = **** {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.974576] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] placement.project_domain_id = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.974742] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] placement.project_domain_name = Default {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.974912] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] placement.project_id = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.975095] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] placement.project_name = service {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.975269] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] placement.region_name = RegionOne {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.975431] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] placement.service_name = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.975599] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] placement.service_type = placement {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.975796] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] placement.split_loggers = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.975963] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] placement.status_code_retries = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.976140] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] placement.status_code_retry_delay = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.976301] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] placement.system_scope = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.976462] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] placement.timeout = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.976624] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] placement.trust_id = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.976802] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] placement.user_domain_id = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.976987] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] placement.user_domain_name = Default {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.977165] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] placement.user_id = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.977339] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] placement.username = placement {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.977521] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] placement.valid_interfaces = ['internal', 'public'] {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.977683] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] placement.version = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.977862] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] quota.cores = 20 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.978036] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] quota.count_usage_from_placement = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.978212] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] quota.driver = nova.quota.DbQuotaDriver {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.978381] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] quota.injected_file_content_bytes = 10240 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.978546] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] quota.injected_file_path_length = 255 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.978712] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] quota.injected_files = 5 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.978882] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] quota.instances = 10 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.979058] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] quota.key_pairs = 100 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.979231] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] quota.metadata_items = 128 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.979396] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] quota.ram = 51200 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.979559] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] quota.recheck_quota = True {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.979726] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] quota.server_group_members = 10 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.979896] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] quota.server_groups = 10 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.980075] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] rdp.enabled = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.980400] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] rdp.html5_proxy_base_url = http://127.0.0.1:6083/ {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.980586] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] scheduler.discover_hosts_in_cells_interval = -1 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.980757] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] scheduler.enable_isolated_aggregate_filtering = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.980932] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] scheduler.image_metadata_prefilter = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.981108] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] scheduler.limit_tenants_to_placement_aggregate = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.981277] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] scheduler.max_attempts = 3 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.981442] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] scheduler.max_placement_results = 1000 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.981608] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] scheduler.placement_aggregate_required_for_tenants = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.981772] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] scheduler.query_placement_for_image_type_support = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.981937] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] scheduler.query_placement_for_routed_network_aggregates = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.982124] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] scheduler.workers = 2 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.982301] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] filter_scheduler.aggregate_image_properties_isolation_namespace = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.982480] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] filter_scheduler.aggregate_image_properties_isolation_separator = . {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.982664] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.982841] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] filter_scheduler.build_failure_weight_multiplier = 1000000.0 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.983016] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] filter_scheduler.cpu_weight_multiplier = 1.0 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.983188] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.983355] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] filter_scheduler.disk_weight_multiplier = 1.0 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.983546] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter', 'SameHostFilter', 'DifferentHostFilter'] {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.983738] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] filter_scheduler.host_subset_size = 1 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.983925] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] filter_scheduler.hypervisor_version_weight_multiplier = 1.0 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.984101] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] filter_scheduler.image_properties_default_architecture = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.984273] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] filter_scheduler.io_ops_weight_multiplier = -1.0 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.984441] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] filter_scheduler.isolated_hosts = [] {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.984607] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] filter_scheduler.isolated_images = [] {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.984772] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] filter_scheduler.max_instances_per_host = 50 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.984938] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] filter_scheduler.max_io_ops_per_host = 8 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.985117] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] filter_scheduler.num_instances_weight_multiplier = 0.0 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.985284] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] filter_scheduler.pci_in_placement = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.985449] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] filter_scheduler.pci_weight_multiplier = 1.0 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.985611] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] filter_scheduler.ram_weight_multiplier = 1.0 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.985809] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.985981] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] filter_scheduler.shuffle_best_same_weighed_hosts = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.986165] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] filter_scheduler.soft_affinity_weight_multiplier = 1.0 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.986332] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.986499] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] filter_scheduler.track_instance_changes = True {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.986679] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.986877] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] metrics.required = True {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.987062] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] metrics.weight_multiplier = 1.0 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.987234] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] metrics.weight_of_unavailable = -10000.0 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.987678] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] metrics.weight_setting = [] {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.987712] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] serial_console.base_url = ws://127.0.0.1:6083/ {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.987864] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] serial_console.enabled = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.988056] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] serial_console.port_range = 10000:20000 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.988233] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] serial_console.proxyclient_address = 127.0.0.1 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.988402] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] serial_console.serialproxy_host = 0.0.0.0 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.988573] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] serial_console.serialproxy_port = 6083 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.988742] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] service_user.auth_section = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.988917] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] service_user.auth_type = password {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.989091] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] service_user.cafile = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.989253] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] service_user.certfile = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.989418] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] service_user.collect_timing = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.989581] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] service_user.insecure = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.989740] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] service_user.keyfile = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.989924] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] service_user.send_service_user_token = True {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.990101] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] service_user.split_loggers = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.990264] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] service_user.timeout = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.990434] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] spice.agent_enabled = True {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.990596] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] spice.enabled = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.990894] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] spice.html5proxy_base_url = http://127.0.0.1:6082/spice_auto.html {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.991099] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] spice.html5proxy_host = 0.0.0.0 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.991274] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] spice.html5proxy_port = 6082 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.991437] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] spice.image_compression = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.991601] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] spice.jpeg_compression = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.991762] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] spice.playback_compression = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.991932] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] spice.server_listen = 127.0.0.1 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.992117] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] spice.server_proxyclient_address = 127.0.0.1 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.992281] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] spice.streaming_mode = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.992443] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] spice.zlib_compression = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.992612] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] upgrade_levels.baseapi = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.992775] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] upgrade_levels.cert = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.992948] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] upgrade_levels.compute = auto {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.993121] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] upgrade_levels.conductor = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.993284] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] upgrade_levels.scheduler = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.993450] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] vendordata_dynamic_auth.auth_section = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.993611] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] vendordata_dynamic_auth.auth_type = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.993802] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] vendordata_dynamic_auth.cafile = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.993971] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] vendordata_dynamic_auth.certfile = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.994152] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] vendordata_dynamic_auth.collect_timing = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.994318] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] vendordata_dynamic_auth.insecure = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.994480] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] vendordata_dynamic_auth.keyfile = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.994644] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] vendordata_dynamic_auth.split_loggers = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.994806] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] vendordata_dynamic_auth.timeout = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.994983] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] vmware.api_retry_count = 10 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.995159] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] vmware.ca_file = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.995332] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] vmware.cache_prefix = devstack-image-cache {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.995500] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] vmware.cluster_name = testcl1 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.995666] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] vmware.connection_pool_size = 10 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.995859] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] vmware.console_delay_seconds = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.996045] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] vmware.datastore_regex = ^datastore.* {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.996255] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] vmware.host_ip = vc1.osci.c.eu-de-1.cloud.sap {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.996431] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] vmware.host_password = **** {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.996601] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] vmware.host_port = 443 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.996773] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] vmware.host_username = administrator@vsphere.local {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.996944] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] vmware.insecure = True {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.997145] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] vmware.integration_bridge = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.997318] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] vmware.maximum_objects = 100 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.997480] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] vmware.pbm_default_policy = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.997646] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] vmware.pbm_enabled = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.997807] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] vmware.pbm_wsdl_location = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.997977] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] vmware.serial_log_dir = /opt/vmware/vspc {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.998150] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] vmware.serial_port_proxy_uri = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.998310] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] vmware.serial_port_service_uri = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.998479] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] vmware.task_poll_interval = 0.5 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.998652] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] vmware.use_linked_clone = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.998825] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] vmware.vnc_keymap = en-us {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.998992] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] vmware.vnc_port = 5900 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.999169] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] vmware.vnc_port_total = 10000 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.999355] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] vnc.auth_schemes = ['none'] {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.999528] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] vnc.enabled = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 525.999824] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] vnc.novncproxy_base_url = http://127.0.0.1:6080/vnc_auto.html {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 526.000018] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] vnc.novncproxy_host = 0.0.0.0 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 526.000195] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] vnc.novncproxy_port = 6080 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 526.000373] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] vnc.server_listen = 127.0.0.1 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 526.000546] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] vnc.server_proxyclient_address = 127.0.0.1 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 526.000708] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] vnc.vencrypt_ca_certs = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 526.000870] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] vnc.vencrypt_client_cert = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 526.001047] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] vnc.vencrypt_client_key = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 526.001226] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] workarounds.disable_compute_service_check_for_ffu = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 526.001391] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] workarounds.disable_deep_image_inspection = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 526.001554] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] workarounds.disable_fallback_pcpu_query = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 526.001717] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] workarounds.disable_group_policy_check_upcall = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 526.001882] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] workarounds.disable_libvirt_livesnapshot = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 526.002054] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] workarounds.disable_rootwrap = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 526.002222] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] workarounds.enable_numa_live_migration = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 526.002383] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] workarounds.enable_qemu_monitor_announce_self = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 526.002547] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 526.002709] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] workarounds.handle_virt_lifecycle_events = True {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 526.002872] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] workarounds.libvirt_disable_apic = False {{(pid=67169) log_opt_values
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.003038] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] workarounds.never_download_image_if_on_rbd = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.003204] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] workarounds.qemu_monitor_announce_self_count = 3 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.003365] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] workarounds.qemu_monitor_announce_self_interval = 1 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.003525] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] workarounds.reserve_disk_resource_for_image_cache = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.003725] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] workarounds.skip_cpu_compare_at_startup = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.003946] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] workarounds.skip_cpu_compare_on_dest = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.004195] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] workarounds.skip_hypervisor_version_check_on_lm = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.004436] env[67169]: DEBUG oslo_service.service 
[None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] workarounds.skip_reserve_in_use_ironic_nodes = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.004683] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] workarounds.unified_limits_count_pcpu_as_vcpu = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.004951] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.005241] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] wsgi.api_paste_config = /etc/nova/api-paste.ini {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.005478] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] wsgi.client_socket_timeout = 900 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.005740] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] wsgi.default_pool_size = 1000 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.005924] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] wsgi.keep_alive = True {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.006115] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] wsgi.max_header_line = 16384 {{(pid=67169) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.006288] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] wsgi.secure_proxy_ssl_header = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.006457] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] wsgi.ssl_ca_file = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.006622] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] wsgi.ssl_cert_file = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.006796] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] wsgi.ssl_key_file = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.006962] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] wsgi.tcp_keepidle = 600 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.007154] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] wsgi.wsgi_log_format = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.007326] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] zvm.ca_file = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.007490] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] 
zvm.cloud_connector_url = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.007775] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] zvm.image_tmp_path = /opt/stack/data/n-cpu-1/images {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.007952] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] zvm.reachable_timeout = 300 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.008150] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] oslo_policy.enforce_new_defaults = True {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.008325] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] oslo_policy.enforce_scope = True {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.008500] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] oslo_policy.policy_default_rule = default {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.008683] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] oslo_policy.policy_dirs = ['policy.d'] {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.008863] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] oslo_policy.policy_file = policy.yaml {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.009049] env[67169]: DEBUG oslo_service.service [None 
req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] oslo_policy.remote_content_type = application/x-www-form-urlencoded {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.009216] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] oslo_policy.remote_ssl_ca_crt_file = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.009378] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] oslo_policy.remote_ssl_client_crt_file = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.009539] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] oslo_policy.remote_ssl_client_key_file = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.009702] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] oslo_policy.remote_ssl_verify_server_crt = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.009873] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] oslo_versionedobjects.fatal_exception_format_errors = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.010058] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.010239] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] profiler.connection_string = messaging:// {{(pid=67169) 
log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.010406] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] profiler.enabled = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.010577] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] profiler.es_doc_type = notification {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.010743] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] profiler.es_scroll_size = 10000 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.010914] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] profiler.es_scroll_time = 2m {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.011089] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] profiler.filter_error_trace = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.011260] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] profiler.hmac_keys = **** {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.011430] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] profiler.sentinel_service_name = mymaster {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.011597] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] profiler.socket_timeout = 0.1 {{(pid=67169) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.011759] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] profiler.trace_requests = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.011972] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] profiler.trace_sqlalchemy = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.012600] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] profiler_jaeger.process_tags = {} {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.012600] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] profiler_jaeger.service_name_prefix = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.012600] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] profiler_otlp.service_name_prefix = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.012705] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] remote_debug.host = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.012820] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] remote_debug.port = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.012970] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] oslo_messaging_rabbit.amqp_auto_delete = False {{(pid=67169) 
log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.013153] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] oslo_messaging_rabbit.amqp_durable_queues = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.013325] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] oslo_messaging_rabbit.conn_pool_min_size = 2 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.013487] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] oslo_messaging_rabbit.conn_pool_ttl = 1200 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.013650] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] oslo_messaging_rabbit.direct_mandatory_flag = True {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.013836] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] oslo_messaging_rabbit.enable_cancel_on_failover = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.014015] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] oslo_messaging_rabbit.heartbeat_in_pthread = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.014187] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] oslo_messaging_rabbit.heartbeat_rate = 2 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.014355] env[67169]: DEBUG oslo_service.service [None 
req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.014515] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] oslo_messaging_rabbit.kombu_compression = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.014688] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] oslo_messaging_rabbit.kombu_failover_strategy = round-robin {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.014858] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.015040] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.015214] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] oslo_messaging_rabbit.rabbit_ha_queues = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.015380] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] oslo_messaging_rabbit.rabbit_interval_max = 30 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.015556] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN 
{{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.015744] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.015918] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.016095] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.016263] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.016426] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] oslo_messaging_rabbit.rabbit_quorum_queue = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.016595] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] oslo_messaging_rabbit.rabbit_retry_backoff = 2 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.016761] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] oslo_messaging_rabbit.rabbit_retry_interval = 1 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.016925] 
env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.017108] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] oslo_messaging_rabbit.rpc_conn_pool_size = 30 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.017279] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] oslo_messaging_rabbit.ssl = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.017453] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] oslo_messaging_rabbit.ssl_ca_file = {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.017625] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] oslo_messaging_rabbit.ssl_cert_file = {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.017786] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] oslo_messaging_rabbit.ssl_enforce_fips_mode = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.017956] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] oslo_messaging_rabbit.ssl_key_file = {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.018141] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] oslo_messaging_rabbit.ssl_version = {{(pid=67169) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.018332] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] oslo_messaging_notifications.driver = ['messagingv2'] {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.018499] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] oslo_messaging_notifications.retry = -1 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.018682] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] oslo_messaging_notifications.topics = ['notifications'] {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.018856] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] oslo_messaging_notifications.transport_url = **** {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.019037] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] oslo_limit.auth_section = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.019225] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] oslo_limit.auth_type = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.019401] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] oslo_limit.cafile = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.019573] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] 
oslo_limit.certfile = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.019740] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] oslo_limit.collect_timing = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.019904] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] oslo_limit.connect_retries = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.020078] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] oslo_limit.connect_retry_delay = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.020242] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] oslo_limit.endpoint_id = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.020403] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] oslo_limit.endpoint_override = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.020564] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] oslo_limit.insecure = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.020727] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] oslo_limit.keyfile = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.020883] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] 
oslo_limit.max_version = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.021051] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] oslo_limit.min_version = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.021213] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] oslo_limit.region_name = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.021371] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] oslo_limit.service_name = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.021527] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] oslo_limit.service_type = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.021688] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] oslo_limit.split_loggers = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.021848] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] oslo_limit.status_code_retries = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.022026] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] oslo_limit.status_code_retry_delay = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.022178] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] 
oslo_limit.timeout = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.022336] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] oslo_limit.valid_interfaces = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.022493] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] oslo_limit.version = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.022657] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] oslo_reports.file_event_handler = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.022825] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] oslo_reports.file_event_handler_interval = 1 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.022987] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] oslo_reports.log_dir = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.023173] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] vif_plug_linux_bridge_privileged.capabilities = [12] {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.023335] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] vif_plug_linux_bridge_privileged.group = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.023495] env[67169]: DEBUG oslo_service.service [None 
req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] vif_plug_linux_bridge_privileged.helper_command = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.023664] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.023859] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] vif_plug_linux_bridge_privileged.thread_pool_size = 8 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.024036] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] vif_plug_linux_bridge_privileged.user = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.024215] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] vif_plug_ovs_privileged.capabilities = [12, 1] {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.024379] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] vif_plug_ovs_privileged.group = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.024539] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] vif_plug_ovs_privileged.helper_command = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.024705] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon {{(pid=67169) 
log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.024871] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] vif_plug_ovs_privileged.thread_pool_size = 8 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.025041] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] vif_plug_ovs_privileged.user = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.025221] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] os_vif_linux_bridge.flat_interface = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.025409] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] os_vif_linux_bridge.forward_bridge_interface = ['all'] {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.025582] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] os_vif_linux_bridge.iptables_bottom_regex = {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.025789] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] os_vif_linux_bridge.iptables_drop_action = DROP {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.025971] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] os_vif_linux_bridge.iptables_top_regex = {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.026160] env[67169]: DEBUG oslo_service.service [None 
req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] os_vif_linux_bridge.network_device_mtu = 1500 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.026331] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] os_vif_linux_bridge.use_ipv6 = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.026497] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] os_vif_linux_bridge.vlan_interface = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.026689] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] os_vif_ovs.default_qos_type = linux-noop {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.026862] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] os_vif_ovs.isolate_vif = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.027045] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] os_vif_ovs.network_device_mtu = 1500 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.027218] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] os_vif_ovs.ovs_vsctl_timeout = 120 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.027392] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] os_vif_ovs.ovsdb_connection = tcp:127.0.0.1:6640 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 
526.027568] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] os_vif_ovs.ovsdb_interface = native {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.027734] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] os_vif_ovs.per_port_bridge = False {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.027904] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] os_brick.lock_path = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.028125] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] os_brick.wait_mpath_device_attempts = 4 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.028294] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] os_brick.wait_mpath_device_interval = 1 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.028467] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] privsep_osbrick.capabilities = [21] {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.028631] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] privsep_osbrick.group = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.028790] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] privsep_osbrick.helper_command = None {{(pid=67169) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.028961] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] privsep_osbrick.logger_name = os_brick.privileged {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.029148] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] privsep_osbrick.thread_pool_size = 8 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.029324] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] privsep_osbrick.user = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.029502] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] nova_sys_admin.capabilities = [0, 1, 2, 3, 12, 21] {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.029665] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] nova_sys_admin.group = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.029826] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] nova_sys_admin.helper_command = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.029996] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] nova_sys_admin.logger_name = oslo_privsep.daemon {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.030176] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] 
nova_sys_admin.thread_pool_size = 8 {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.030336] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] nova_sys_admin.user = None {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 526.030466] env[67169]: DEBUG oslo_service.service [None req-308c8132-3775-436f-b13d-9d36bfd29bd0 None None] ******************************************************************************** {{(pid=67169) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2613}} [ 526.030896] env[67169]: INFO nova.service [-] Starting compute node (version 0.0.1) [ 526.042016] env[67169]: WARNING nova.virt.vmwareapi.driver [None req-3e2af13d-e616-4e41-9a9b-89f8b8cacde0 None None] The vmwareapi driver is not tested by the OpenStack project nor does it have clear maintainer(s) and thus its quality can not be ensured. It should be considered experimental and may be removed in a future release. If you are using the driver in production please let us know via the openstack-discuss mailing list. [ 526.042502] env[67169]: INFO nova.virt.node [None req-3e2af13d-e616-4e41-9a9b-89f8b8cacde0 None None] Generated node identity 6570906a-ac37-4859-b1f2-4bbacc48d3f3 [ 526.042736] env[67169]: INFO nova.virt.node [None req-3e2af13d-e616-4e41-9a9b-89f8b8cacde0 None None] Wrote node identity 6570906a-ac37-4859-b1f2-4bbacc48d3f3 to /opt/stack/data/n-cpu-1/compute_id [ 526.056072] env[67169]: WARNING nova.compute.manager [None req-3e2af13d-e616-4e41-9a9b-89f8b8cacde0 None None] Compute nodes ['6570906a-ac37-4859-b1f2-4bbacc48d3f3'] for host cpu-1 were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning. 
[ 526.091289] env[67169]: INFO nova.compute.manager [None req-3e2af13d-e616-4e41-9a9b-89f8b8cacde0 None None] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host [ 526.112328] env[67169]: WARNING nova.compute.manager [None req-3e2af13d-e616-4e41-9a9b-89f8b8cacde0 None None] No compute node record found for host cpu-1. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host cpu-1 could not be found. [ 526.112572] env[67169]: DEBUG oslo_concurrency.lockutils [None req-3e2af13d-e616-4e41-9a9b-89f8b8cacde0 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 526.112774] env[67169]: DEBUG oslo_concurrency.lockutils [None req-3e2af13d-e616-4e41-9a9b-89f8b8cacde0 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 526.112925] env[67169]: DEBUG oslo_concurrency.lockutils [None req-3e2af13d-e616-4e41-9a9b-89f8b8cacde0 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 526.113095] env[67169]: DEBUG nova.compute.resource_tracker [None req-3e2af13d-e616-4e41-9a9b-89f8b8cacde0 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67169) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 526.114411] env[67169]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-45970804-cc3e-4b7e-ab9f-b472af7290f3 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 526.123493] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5912a4ba-853d-4d2c-9557-6351cbc58fc8 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 526.137676] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e59af3c8-0397-4744-9055-01c5755b3c0a {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 526.143962] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c6652f87-d02e-4a0b-93f9-50c1160259a3 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 526.173903] env[67169]: DEBUG nova.compute.resource_tracker [None req-3e2af13d-e616-4e41-9a9b-89f8b8cacde0 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181047MB free_disk=171GB free_vcpus=48 pci_devices=None {{(pid=67169) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 526.174021] env[67169]: DEBUG oslo_concurrency.lockutils [None req-3e2af13d-e616-4e41-9a9b-89f8b8cacde0 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 526.174217] env[67169]: DEBUG oslo_concurrency.lockutils [None req-3e2af13d-e616-4e41-9a9b-89f8b8cacde0 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67169) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 526.185513] env[67169]: WARNING nova.compute.resource_tracker [None req-3e2af13d-e616-4e41-9a9b-89f8b8cacde0 None None] No compute node record for cpu-1:6570906a-ac37-4859-b1f2-4bbacc48d3f3: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host 6570906a-ac37-4859-b1f2-4bbacc48d3f3 could not be found. [ 526.197697] env[67169]: INFO nova.compute.resource_tracker [None req-3e2af13d-e616-4e41-9a9b-89f8b8cacde0 None None] Compute node record created for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 with uuid: 6570906a-ac37-4859-b1f2-4bbacc48d3f3 [ 526.250352] env[67169]: DEBUG nova.compute.resource_tracker [None req-3e2af13d-e616-4e41-9a9b-89f8b8cacde0 None None] Total usable vcpus: 48, total allocated vcpus: 0 {{(pid=67169) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 526.250530] env[67169]: DEBUG nova.compute.resource_tracker [None req-3e2af13d-e616-4e41-9a9b-89f8b8cacde0 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=512MB phys_disk=200GB used_disk=0GB total_vcpus=48 used_vcpus=0 pci_stats=[] {{(pid=67169) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 526.357625] env[67169]: INFO nova.scheduler.client.report [None req-3e2af13d-e616-4e41-9a9b-89f8b8cacde0 None None] [req-0b9ff814-a33f-49eb-ac14-457e70dd012e] Created resource provider record via placement API for resource provider with UUID 6570906a-ac37-4859-b1f2-4bbacc48d3f3 and name domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28. 
[ 526.375487] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ac0c7e18-7e2f-4d36-bf83-66e6aae3ca33 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 526.383459] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7a1d7a52-d69c-4457-bd83-31823f7286fa {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 526.413647] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ce8dfba6-11d4-4c0b-ab27-eb97228049ee {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 526.420662] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-af35ae36-0228-4204-a67c-59b1c8801974 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 526.433177] env[67169]: DEBUG nova.compute.provider_tree [None req-3e2af13d-e616-4e41-9a9b-89f8b8cacde0 None None] Updating inventory in ProviderTree for provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 171, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67169) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 526.471057] env[67169]: DEBUG nova.scheduler.client.report [None req-3e2af13d-e616-4e41-9a9b-89f8b8cacde0 None None] Updated inventory for provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 with generation 0 in Placement from set_inventory_for_provider using 
data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 171, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67169) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:957}} [ 526.471308] env[67169]: DEBUG nova.compute.provider_tree [None req-3e2af13d-e616-4e41-9a9b-89f8b8cacde0 None None] Updating resource provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 generation from 0 to 1 during operation: update_inventory {{(pid=67169) _update_generation /opt/stack/nova/nova/compute/provider_tree.py:164}} [ 526.471452] env[67169]: DEBUG nova.compute.provider_tree [None req-3e2af13d-e616-4e41-9a9b-89f8b8cacde0 None None] Updating inventory in ProviderTree for provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 171, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67169) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 526.521719] env[67169]: DEBUG nova.compute.provider_tree [None req-3e2af13d-e616-4e41-9a9b-89f8b8cacde0 None None] Updating resource provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 generation from 1 to 2 during operation: update_traits {{(pid=67169) _update_generation /opt/stack/nova/nova/compute/provider_tree.py:164}} [ 526.539478] env[67169]: DEBUG nova.compute.resource_tracker [None req-3e2af13d-e616-4e41-9a9b-89f8b8cacde0 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67169) _update_available_resource 
/opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 526.539672] env[67169]: DEBUG oslo_concurrency.lockutils [None req-3e2af13d-e616-4e41-9a9b-89f8b8cacde0 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.365s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 526.539840] env[67169]: DEBUG nova.service [None req-3e2af13d-e616-4e41-9a9b-89f8b8cacde0 None None] Creating RPC server for service compute {{(pid=67169) start /opt/stack/nova/nova/service.py:182}} [ 526.552306] env[67169]: DEBUG nova.service [None req-3e2af13d-e616-4e41-9a9b-89f8b8cacde0 None None] Join ServiceGroup membership for this service compute {{(pid=67169) start /opt/stack/nova/nova/service.py:199}} [ 526.552496] env[67169]: DEBUG nova.servicegroup.drivers.db [None req-3e2af13d-e616-4e41-9a9b-89f8b8cacde0 None None] DB_Driver: join new ServiceGroup member cpu-1 to the compute group, service = {{(pid=67169) join /opt/stack/nova/nova/servicegroup/drivers/db.py:44}} [ 535.863421] env[67169]: DEBUG dbcounter [-] [67169] Writing DB stats nova_cell0:SELECT=1 {{(pid=67169) stat_writer /opt/stack/data/venv/lib/python3.10/site-packages/dbcounter.py:115}} [ 535.864217] env[67169]: DEBUG dbcounter [-] [67169] Writing DB stats nova_cell1:SELECT=1 {{(pid=67169) stat_writer /opt/stack/data/venv/lib/python3.10/site-packages/dbcounter.py:115}} [ 570.432297] env[67169]: DEBUG oslo_concurrency.lockutils [None req-1e04032d-0bdc-43df-a263-7859fab3c8e2 tempest-ServersAdminNegativeTestJSON-1443014733 tempest-ServersAdminNegativeTestJSON-1443014733-project-member] Acquiring lock "9e0a990e-d9ad-4dae-9e2d-6d1f7d999403" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 570.432642] env[67169]: DEBUG 
oslo_concurrency.lockutils [None req-1e04032d-0bdc-43df-a263-7859fab3c8e2 tempest-ServersAdminNegativeTestJSON-1443014733 tempest-ServersAdminNegativeTestJSON-1443014733-project-member] Lock "9e0a990e-d9ad-4dae-9e2d-6d1f7d999403" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 570.458437] env[67169]: DEBUG nova.compute.manager [None req-1e04032d-0bdc-43df-a263-7859fab3c8e2 tempest-ServersAdminNegativeTestJSON-1443014733 tempest-ServersAdminNegativeTestJSON-1443014733-project-member] [instance: 9e0a990e-d9ad-4dae-9e2d-6d1f7d999403] Starting instance... {{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 570.613961] env[67169]: DEBUG oslo_concurrency.lockutils [None req-1e04032d-0bdc-43df-a263-7859fab3c8e2 tempest-ServersAdminNegativeTestJSON-1443014733 tempest-ServersAdminNegativeTestJSON-1443014733-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 570.614286] env[67169]: DEBUG oslo_concurrency.lockutils [None req-1e04032d-0bdc-43df-a263-7859fab3c8e2 tempest-ServersAdminNegativeTestJSON-1443014733 tempest-ServersAdminNegativeTestJSON-1443014733-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 570.616647] env[67169]: INFO nova.compute.claims [None req-1e04032d-0bdc-43df-a263-7859fab3c8e2 tempest-ServersAdminNegativeTestJSON-1443014733 tempest-ServersAdminNegativeTestJSON-1443014733-project-member] [instance: 9e0a990e-d9ad-4dae-9e2d-6d1f7d999403] Claim successful on node 
domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 570.832531] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c51026e0-ca2e-4d35-b7c3-aafdead9dba9 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 570.841290] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-24b76961-6de0-4069-a46e-af7409b4b731 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 570.874043] env[67169]: DEBUG oslo_concurrency.lockutils [None req-f1a10853-5422-40c7-bd45-9fe494c76ecf tempest-FloatingIPsAssociationTestJSON-908037617 tempest-FloatingIPsAssociationTestJSON-908037617-project-member] Acquiring lock "68cc2368-ac6a-4003-9c19-5f2a4e9b0e03" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 570.875095] env[67169]: DEBUG oslo_concurrency.lockutils [None req-f1a10853-5422-40c7-bd45-9fe494c76ecf tempest-FloatingIPsAssociationTestJSON-908037617 tempest-FloatingIPsAssociationTestJSON-908037617-project-member] Lock "68cc2368-ac6a-4003-9c19-5f2a4e9b0e03" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 570.878680] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cf6418e7-4c3f-42ee-a6b0-faeab9ca5bc8 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 570.884562] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b522a62a-9b12-4498-94fe-a21073d46b9c 
{{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 570.902736] env[67169]: DEBUG nova.compute.provider_tree [None req-1e04032d-0bdc-43df-a263-7859fab3c8e2 tempest-ServersAdminNegativeTestJSON-1443014733 tempest-ServersAdminNegativeTestJSON-1443014733-project-member] Inventory has not changed in ProviderTree for provider: 6570906a-ac37-4859-b1f2-4bbacc48d3f3 {{(pid=67169) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 570.905122] env[67169]: DEBUG nova.compute.manager [None req-f1a10853-5422-40c7-bd45-9fe494c76ecf tempest-FloatingIPsAssociationTestJSON-908037617 tempest-FloatingIPsAssociationTestJSON-908037617-project-member] [instance: 68cc2368-ac6a-4003-9c19-5f2a4e9b0e03] Starting instance... {{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 570.918989] env[67169]: DEBUG nova.scheduler.client.report [None req-1e04032d-0bdc-43df-a263-7859fab3c8e2 tempest-ServersAdminNegativeTestJSON-1443014733 tempest-ServersAdminNegativeTestJSON-1443014733-project-member] Inventory has not changed for provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 171, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67169) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 570.939324] env[67169]: DEBUG oslo_concurrency.lockutils [None req-1e04032d-0bdc-43df-a263-7859fab3c8e2 tempest-ServersAdminNegativeTestJSON-1443014733 tempest-ServersAdminNegativeTestJSON-1443014733-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.324s {{(pid=67169) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 570.939324] env[67169]: DEBUG nova.compute.manager [None req-1e04032d-0bdc-43df-a263-7859fab3c8e2 tempest-ServersAdminNegativeTestJSON-1443014733 tempest-ServersAdminNegativeTestJSON-1443014733-project-member] [instance: 9e0a990e-d9ad-4dae-9e2d-6d1f7d999403] Start building networks asynchronously for instance. {{(pid=67169) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 571.002696] env[67169]: DEBUG nova.compute.utils [None req-1e04032d-0bdc-43df-a263-7859fab3c8e2 tempest-ServersAdminNegativeTestJSON-1443014733 tempest-ServersAdminNegativeTestJSON-1443014733-project-member] Using /dev/sd instead of None {{(pid=67169) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 571.007279] env[67169]: DEBUG nova.compute.manager [None req-1e04032d-0bdc-43df-a263-7859fab3c8e2 tempest-ServersAdminNegativeTestJSON-1443014733 tempest-ServersAdminNegativeTestJSON-1443014733-project-member] [instance: 9e0a990e-d9ad-4dae-9e2d-6d1f7d999403] Allocating IP information in the background. 
{{(pid=67169) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 571.007279] env[67169]: DEBUG nova.network.neutron [None req-1e04032d-0bdc-43df-a263-7859fab3c8e2 tempest-ServersAdminNegativeTestJSON-1443014733 tempest-ServersAdminNegativeTestJSON-1443014733-project-member] [instance: 9e0a990e-d9ad-4dae-9e2d-6d1f7d999403] allocate_for_instance() {{(pid=67169) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 571.020133] env[67169]: DEBUG oslo_concurrency.lockutils [None req-f1a10853-5422-40c7-bd45-9fe494c76ecf tempest-FloatingIPsAssociationTestJSON-908037617 tempest-FloatingIPsAssociationTestJSON-908037617-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 571.020653] env[67169]: DEBUG oslo_concurrency.lockutils [None req-f1a10853-5422-40c7-bd45-9fe494c76ecf tempest-FloatingIPsAssociationTestJSON-908037617 tempest-FloatingIPsAssociationTestJSON-908037617-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 571.022615] env[67169]: INFO nova.compute.claims [None req-f1a10853-5422-40c7-bd45-9fe494c76ecf tempest-FloatingIPsAssociationTestJSON-908037617 tempest-FloatingIPsAssociationTestJSON-908037617-project-member] [instance: 68cc2368-ac6a-4003-9c19-5f2a4e9b0e03] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 571.029097] env[67169]: DEBUG nova.compute.manager [None req-1e04032d-0bdc-43df-a263-7859fab3c8e2 tempest-ServersAdminNegativeTestJSON-1443014733 tempest-ServersAdminNegativeTestJSON-1443014733-project-member] [instance: 9e0a990e-d9ad-4dae-9e2d-6d1f7d999403] Start building block device mappings for instance. 
{{(pid=67169) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 571.130336] env[67169]: DEBUG nova.compute.manager [None req-1e04032d-0bdc-43df-a263-7859fab3c8e2 tempest-ServersAdminNegativeTestJSON-1443014733 tempest-ServersAdminNegativeTestJSON-1443014733-project-member] [instance: 9e0a990e-d9ad-4dae-9e2d-6d1f7d999403] Start spawning the instance on the hypervisor. {{(pid=67169) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 571.141303] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aa7f6082-0fa0-47d4-a100-53b0264d9f34 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 571.149369] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4707a662-3778-4e95-b290-db7da4b405d9 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 571.189288] env[67169]: DEBUG nova.virt.hardware [None req-1e04032d-0bdc-43df-a263-7859fab3c8e2 tempest-ServersAdminNegativeTestJSON-1443014733 tempest-ServersAdminNegativeTestJSON-1443014733-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-03-06T19:55:10Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta 
ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-03-06T19:54:55Z,direct_url=,disk_format='vmdk',id=285931c9-8b83-4997-8c4d-6a79005e36ba,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c31f6504bb73492890b262ff43fdf9bc',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-03-06T19:54:55Z,virtual_size=,visibility=), allow threads: False {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 571.189531] env[67169]: DEBUG nova.virt.hardware [None req-1e04032d-0bdc-43df-a263-7859fab3c8e2 tempest-ServersAdminNegativeTestJSON-1443014733 tempest-ServersAdminNegativeTestJSON-1443014733-project-member] Flavor limits 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 571.189690] env[67169]: DEBUG nova.virt.hardware [None req-1e04032d-0bdc-43df-a263-7859fab3c8e2 tempest-ServersAdminNegativeTestJSON-1443014733 tempest-ServersAdminNegativeTestJSON-1443014733-project-member] Image limits 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 571.189871] env[67169]: DEBUG nova.virt.hardware [None req-1e04032d-0bdc-43df-a263-7859fab3c8e2 tempest-ServersAdminNegativeTestJSON-1443014733 tempest-ServersAdminNegativeTestJSON-1443014733-project-member] Flavor pref 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 571.190027] env[67169]: DEBUG nova.virt.hardware [None req-1e04032d-0bdc-43df-a263-7859fab3c8e2 tempest-ServersAdminNegativeTestJSON-1443014733 tempest-ServersAdminNegativeTestJSON-1443014733-project-member] Image pref 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 571.190185] env[67169]: DEBUG nova.virt.hardware [None req-1e04032d-0bdc-43df-a263-7859fab3c8e2 tempest-ServersAdminNegativeTestJSON-1443014733 
tempest-ServersAdminNegativeTestJSON-1443014733-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 571.190395] env[67169]: DEBUG nova.virt.hardware [None req-1e04032d-0bdc-43df-a263-7859fab3c8e2 tempest-ServersAdminNegativeTestJSON-1443014733 tempest-ServersAdminNegativeTestJSON-1443014733-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 571.190595] env[67169]: DEBUG nova.virt.hardware [None req-1e04032d-0bdc-43df-a263-7859fab3c8e2 tempest-ServersAdminNegativeTestJSON-1443014733 tempest-ServersAdminNegativeTestJSON-1443014733-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67169) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 571.190938] env[67169]: DEBUG nova.virt.hardware [None req-1e04032d-0bdc-43df-a263-7859fab3c8e2 tempest-ServersAdminNegativeTestJSON-1443014733 tempest-ServersAdminNegativeTestJSON-1443014733-project-member] Got 1 possible topologies {{(pid=67169) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 571.191123] env[67169]: DEBUG nova.virt.hardware [None req-1e04032d-0bdc-43df-a263-7859fab3c8e2 tempest-ServersAdminNegativeTestJSON-1443014733 tempest-ServersAdminNegativeTestJSON-1443014733-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 571.191299] env[67169]: DEBUG nova.virt.hardware [None req-1e04032d-0bdc-43df-a263-7859fab3c8e2 tempest-ServersAdminNegativeTestJSON-1443014733 tempest-ServersAdminNegativeTestJSON-1443014733-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67169) 
_get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 571.192188] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a6c99576-f9e4-46df-b4b8-c5798d710925 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 571.195912] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7e8704e0-6ddd-4eee-86a5-3cfb14f06ad6 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 571.203918] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7befc46e-2a20-43cb-90d2-ebf029ffc7e1 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 571.219616] env[67169]: DEBUG nova.compute.provider_tree [None req-f1a10853-5422-40c7-bd45-9fe494c76ecf tempest-FloatingIPsAssociationTestJSON-908037617 tempest-FloatingIPsAssociationTestJSON-908037617-project-member] Inventory has not changed in ProviderTree for provider: 6570906a-ac37-4859-b1f2-4bbacc48d3f3 {{(pid=67169) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 571.224556] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3b5caf23-9794-4b96-9496-edbe6336e60e {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 571.229698] env[67169]: DEBUG nova.scheduler.client.report [None req-f1a10853-5422-40c7-bd45-9fe494c76ecf tempest-FloatingIPsAssociationTestJSON-908037617 tempest-FloatingIPsAssociationTestJSON-908037617-project-member] Inventory has not changed for provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 
'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 171, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67169) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 571.245699] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2103873b-b238-4357-b586-e728416cdec4 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 571.257868] env[67169]: DEBUG oslo_concurrency.lockutils [None req-f1a10853-5422-40c7-bd45-9fe494c76ecf tempest-FloatingIPsAssociationTestJSON-908037617 tempest-FloatingIPsAssociationTestJSON-908037617-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.237s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 571.258698] env[67169]: DEBUG nova.compute.manager [None req-f1a10853-5422-40c7-bd45-9fe494c76ecf tempest-FloatingIPsAssociationTestJSON-908037617 tempest-FloatingIPsAssociationTestJSON-908037617-project-member] [instance: 68cc2368-ac6a-4003-9c19-5f2a4e9b0e03] Start building networks asynchronously for instance. 
{{(pid=67169) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 571.297685] env[67169]: DEBUG nova.compute.utils [None req-f1a10853-5422-40c7-bd45-9fe494c76ecf tempest-FloatingIPsAssociationTestJSON-908037617 tempest-FloatingIPsAssociationTestJSON-908037617-project-member] Using /dev/sd instead of None {{(pid=67169) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 571.299355] env[67169]: DEBUG nova.compute.manager [None req-f1a10853-5422-40c7-bd45-9fe494c76ecf tempest-FloatingIPsAssociationTestJSON-908037617 tempest-FloatingIPsAssociationTestJSON-908037617-project-member] [instance: 68cc2368-ac6a-4003-9c19-5f2a4e9b0e03] Allocating IP information in the background. {{(pid=67169) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 571.299355] env[67169]: DEBUG nova.network.neutron [None req-f1a10853-5422-40c7-bd45-9fe494c76ecf tempest-FloatingIPsAssociationTestJSON-908037617 tempest-FloatingIPsAssociationTestJSON-908037617-project-member] [instance: 68cc2368-ac6a-4003-9c19-5f2a4e9b0e03] allocate_for_instance() {{(pid=67169) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 571.316861] env[67169]: DEBUG nova.compute.manager [None req-f1a10853-5422-40c7-bd45-9fe494c76ecf tempest-FloatingIPsAssociationTestJSON-908037617 tempest-FloatingIPsAssociationTestJSON-908037617-project-member] [instance: 68cc2368-ac6a-4003-9c19-5f2a4e9b0e03] Start building block device mappings for instance. 
{{(pid=67169) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 571.359492] env[67169]: DEBUG nova.policy [None req-1e04032d-0bdc-43df-a263-7859fab3c8e2 tempest-ServersAdminNegativeTestJSON-1443014733 tempest-ServersAdminNegativeTestJSON-1443014733-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '2182624460d8477abd47fb8e38326029', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '6d15d99857a34d1fb291eae17bbb1398', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67169) authorize /opt/stack/nova/nova/policy.py:203}} [ 571.410205] env[67169]: DEBUG nova.compute.manager [None req-f1a10853-5422-40c7-bd45-9fe494c76ecf tempest-FloatingIPsAssociationTestJSON-908037617 tempest-FloatingIPsAssociationTestJSON-908037617-project-member] [instance: 68cc2368-ac6a-4003-9c19-5f2a4e9b0e03] Start spawning the instance on the hypervisor. 
{{(pid=67169) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 571.435817] env[67169]: DEBUG nova.virt.hardware [None req-f1a10853-5422-40c7-bd45-9fe494c76ecf tempest-FloatingIPsAssociationTestJSON-908037617 tempest-FloatingIPsAssociationTestJSON-908037617-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-03-06T19:55:10Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-03-06T19:54:55Z,direct_url=,disk_format='vmdk',id=285931c9-8b83-4997-8c4d-6a79005e36ba,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c31f6504bb73492890b262ff43fdf9bc',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-03-06T19:54:55Z,virtual_size=,visibility=), allow threads: False {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 571.435817] env[67169]: DEBUG nova.virt.hardware [None req-f1a10853-5422-40c7-bd45-9fe494c76ecf tempest-FloatingIPsAssociationTestJSON-908037617 tempest-FloatingIPsAssociationTestJSON-908037617-project-member] Flavor limits 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 571.435817] env[67169]: DEBUG nova.virt.hardware [None req-f1a10853-5422-40c7-bd45-9fe494c76ecf tempest-FloatingIPsAssociationTestJSON-908037617 tempest-FloatingIPsAssociationTestJSON-908037617-project-member] Image limits 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 571.436229] env[67169]: DEBUG nova.virt.hardware [None req-f1a10853-5422-40c7-bd45-9fe494c76ecf tempest-FloatingIPsAssociationTestJSON-908037617 
tempest-FloatingIPsAssociationTestJSON-908037617-project-member] Flavor pref 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 571.436229] env[67169]: DEBUG nova.virt.hardware [None req-f1a10853-5422-40c7-bd45-9fe494c76ecf tempest-FloatingIPsAssociationTestJSON-908037617 tempest-FloatingIPsAssociationTestJSON-908037617-project-member] Image pref 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 571.436229] env[67169]: DEBUG nova.virt.hardware [None req-f1a10853-5422-40c7-bd45-9fe494c76ecf tempest-FloatingIPsAssociationTestJSON-908037617 tempest-FloatingIPsAssociationTestJSON-908037617-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 571.436229] env[67169]: DEBUG nova.virt.hardware [None req-f1a10853-5422-40c7-bd45-9fe494c76ecf tempest-FloatingIPsAssociationTestJSON-908037617 tempest-FloatingIPsAssociationTestJSON-908037617-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 571.436229] env[67169]: DEBUG nova.virt.hardware [None req-f1a10853-5422-40c7-bd45-9fe494c76ecf tempest-FloatingIPsAssociationTestJSON-908037617 tempest-FloatingIPsAssociationTestJSON-908037617-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67169) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 571.436400] env[67169]: DEBUG nova.virt.hardware [None req-f1a10853-5422-40c7-bd45-9fe494c76ecf tempest-FloatingIPsAssociationTestJSON-908037617 tempest-FloatingIPsAssociationTestJSON-908037617-project-member] Got 1 possible topologies {{(pid=67169) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 571.436400] env[67169]: DEBUG 
nova.virt.hardware [None req-f1a10853-5422-40c7-bd45-9fe494c76ecf tempest-FloatingIPsAssociationTestJSON-908037617 tempest-FloatingIPsAssociationTestJSON-908037617-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 571.436400] env[67169]: DEBUG nova.virt.hardware [None req-f1a10853-5422-40c7-bd45-9fe494c76ecf tempest-FloatingIPsAssociationTestJSON-908037617 tempest-FloatingIPsAssociationTestJSON-908037617-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 571.436400] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d2cb0153-294b-49f2-9afe-9da989062aef {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 571.444369] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4a1d9537-a93a-4460-b39f-c6df81e8a844 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 571.618050] env[67169]: DEBUG nova.policy [None req-f1a10853-5422-40c7-bd45-9fe494c76ecf tempest-FloatingIPsAssociationTestJSON-908037617 tempest-FloatingIPsAssociationTestJSON-908037617-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c046e349003249039db184cd6939387d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ad115c3eeb224f229bf4235e085fbfdf', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67169) authorize 
/opt/stack/nova/nova/policy.py:203}} [ 571.998579] env[67169]: DEBUG nova.network.neutron [None req-1e04032d-0bdc-43df-a263-7859fab3c8e2 tempest-ServersAdminNegativeTestJSON-1443014733 tempest-ServersAdminNegativeTestJSON-1443014733-project-member] [instance: 9e0a990e-d9ad-4dae-9e2d-6d1f7d999403] Successfully created port: b7af8248-64a8-42f3-80bc-b1b9813717ab {{(pid=67169) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 572.660025] env[67169]: DEBUG oslo_concurrency.lockutils [None req-48298f54-4654-449a-bec3-0750de8541e2 tempest-ServersAdmin275Test-1422480121 tempest-ServersAdmin275Test-1422480121-project-member] Acquiring lock "958ac621-d0c8-4c04-8a58-11ad0f3cf678" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 572.660025] env[67169]: DEBUG oslo_concurrency.lockutils [None req-48298f54-4654-449a-bec3-0750de8541e2 tempest-ServersAdmin275Test-1422480121 tempest-ServersAdmin275Test-1422480121-project-member] Lock "958ac621-d0c8-4c04-8a58-11ad0f3cf678" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 572.683454] env[67169]: DEBUG nova.compute.manager [None req-48298f54-4654-449a-bec3-0750de8541e2 tempest-ServersAdmin275Test-1422480121 tempest-ServersAdmin275Test-1422480121-project-member] [instance: 958ac621-d0c8-4c04-8a58-11ad0f3cf678] Starting instance... 
{{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 572.775211] env[67169]: DEBUG oslo_concurrency.lockutils [None req-48298f54-4654-449a-bec3-0750de8541e2 tempest-ServersAdmin275Test-1422480121 tempest-ServersAdmin275Test-1422480121-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 572.775211] env[67169]: DEBUG oslo_concurrency.lockutils [None req-48298f54-4654-449a-bec3-0750de8541e2 tempest-ServersAdmin275Test-1422480121 tempest-ServersAdmin275Test-1422480121-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 572.776402] env[67169]: INFO nova.compute.claims [None req-48298f54-4654-449a-bec3-0750de8541e2 tempest-ServersAdmin275Test-1422480121 tempest-ServersAdmin275Test-1422480121-project-member] [instance: 958ac621-d0c8-4c04-8a58-11ad0f3cf678] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 572.937041] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eb72928d-8eb7-4736-8a94-a91c439f2696 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 572.948534] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-03d3caf8-71c1-460b-9be7-faac981246a2 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 572.989128] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-28eb51a3-0a5a-4799-b803-6498e2e45308 {{(pid=67169) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 572.996215] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4ac12e11-fbd5-49f1-9969-3068781ee641 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 573.016884] env[67169]: DEBUG nova.compute.provider_tree [None req-48298f54-4654-449a-bec3-0750de8541e2 tempest-ServersAdmin275Test-1422480121 tempest-ServersAdmin275Test-1422480121-project-member] Inventory has not changed in ProviderTree for provider: 6570906a-ac37-4859-b1f2-4bbacc48d3f3 {{(pid=67169) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 573.035527] env[67169]: DEBUG nova.scheduler.client.report [None req-48298f54-4654-449a-bec3-0750de8541e2 tempest-ServersAdmin275Test-1422480121 tempest-ServersAdmin275Test-1422480121-project-member] Inventory has not changed for provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 171, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67169) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 573.061588] env[67169]: DEBUG oslo_concurrency.lockutils [None req-48298f54-4654-449a-bec3-0750de8541e2 tempest-ServersAdmin275Test-1422480121 tempest-ServersAdmin275Test-1422480121-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.287s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 573.062434] env[67169]: DEBUG nova.compute.manager [None req-48298f54-4654-449a-bec3-0750de8541e2 
tempest-ServersAdmin275Test-1422480121 tempest-ServersAdmin275Test-1422480121-project-member] [instance: 958ac621-d0c8-4c04-8a58-11ad0f3cf678] Start building networks asynchronously for instance. {{(pid=67169) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 573.118829] env[67169]: DEBUG nova.compute.utils [None req-48298f54-4654-449a-bec3-0750de8541e2 tempest-ServersAdmin275Test-1422480121 tempest-ServersAdmin275Test-1422480121-project-member] Using /dev/sd instead of None {{(pid=67169) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 573.122601] env[67169]: DEBUG nova.compute.manager [None req-48298f54-4654-449a-bec3-0750de8541e2 tempest-ServersAdmin275Test-1422480121 tempest-ServersAdmin275Test-1422480121-project-member] [instance: 958ac621-d0c8-4c04-8a58-11ad0f3cf678] Not allocating networking since 'none' was specified. {{(pid=67169) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 573.135987] env[67169]: DEBUG nova.compute.manager [None req-48298f54-4654-449a-bec3-0750de8541e2 tempest-ServersAdmin275Test-1422480121 tempest-ServersAdmin275Test-1422480121-project-member] [instance: 958ac621-d0c8-4c04-8a58-11ad0f3cf678] Start building block device mappings for instance. {{(pid=67169) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 573.258731] env[67169]: DEBUG nova.compute.manager [None req-48298f54-4654-449a-bec3-0750de8541e2 tempest-ServersAdmin275Test-1422480121 tempest-ServersAdmin275Test-1422480121-project-member] [instance: 958ac621-d0c8-4c04-8a58-11ad0f3cf678] Start spawning the instance on the hypervisor. 
{{(pid=67169) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 573.294722] env[67169]: DEBUG nova.virt.hardware [None req-48298f54-4654-449a-bec3-0750de8541e2 tempest-ServersAdmin275Test-1422480121 tempest-ServersAdmin275Test-1422480121-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-03-06T19:55:10Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-03-06T19:54:55Z,direct_url=,disk_format='vmdk',id=285931c9-8b83-4997-8c4d-6a79005e36ba,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c31f6504bb73492890b262ff43fdf9bc',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-03-06T19:54:55Z,virtual_size=,visibility=), allow threads: False {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 573.295023] env[67169]: DEBUG nova.virt.hardware [None req-48298f54-4654-449a-bec3-0750de8541e2 tempest-ServersAdmin275Test-1422480121 tempest-ServersAdmin275Test-1422480121-project-member] Flavor limits 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 573.295152] env[67169]: DEBUG nova.virt.hardware [None req-48298f54-4654-449a-bec3-0750de8541e2 tempest-ServersAdmin275Test-1422480121 tempest-ServersAdmin275Test-1422480121-project-member] Image limits 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 573.295307] env[67169]: DEBUG nova.virt.hardware [None req-48298f54-4654-449a-bec3-0750de8541e2 tempest-ServersAdmin275Test-1422480121 tempest-ServersAdmin275Test-1422480121-project-member] Flavor pref 0:0:0 {{(pid=67169) 
get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 573.296019] env[67169]: DEBUG nova.virt.hardware [None req-48298f54-4654-449a-bec3-0750de8541e2 tempest-ServersAdmin275Test-1422480121 tempest-ServersAdmin275Test-1422480121-project-member] Image pref 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 573.296019] env[67169]: DEBUG nova.virt.hardware [None req-48298f54-4654-449a-bec3-0750de8541e2 tempest-ServersAdmin275Test-1422480121 tempest-ServersAdmin275Test-1422480121-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 573.296019] env[67169]: DEBUG nova.virt.hardware [None req-48298f54-4654-449a-bec3-0750de8541e2 tempest-ServersAdmin275Test-1422480121 tempest-ServersAdmin275Test-1422480121-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 573.296019] env[67169]: DEBUG nova.virt.hardware [None req-48298f54-4654-449a-bec3-0750de8541e2 tempest-ServersAdmin275Test-1422480121 tempest-ServersAdmin275Test-1422480121-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67169) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 573.299264] env[67169]: DEBUG nova.virt.hardware [None req-48298f54-4654-449a-bec3-0750de8541e2 tempest-ServersAdmin275Test-1422480121 tempest-ServersAdmin275Test-1422480121-project-member] Got 1 possible topologies {{(pid=67169) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 573.299264] env[67169]: DEBUG nova.virt.hardware [None req-48298f54-4654-449a-bec3-0750de8541e2 tempest-ServersAdmin275Test-1422480121 tempest-ServersAdmin275Test-1422480121-project-member] Possible topologies 
[VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 573.301542] env[67169]: DEBUG nova.virt.hardware [None req-48298f54-4654-449a-bec3-0750de8541e2 tempest-ServersAdmin275Test-1422480121 tempest-ServersAdmin275Test-1422480121-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 573.302082] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b58e1f7b-cda1-4871-80d2-9c17d1bcba0e {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 573.305704] env[67169]: DEBUG nova.network.neutron [None req-1e04032d-0bdc-43df-a263-7859fab3c8e2 tempest-ServersAdminNegativeTestJSON-1443014733 tempest-ServersAdminNegativeTestJSON-1443014733-project-member] [instance: 9e0a990e-d9ad-4dae-9e2d-6d1f7d999403] Successfully updated port: b7af8248-64a8-42f3-80bc-b1b9813717ab {{(pid=67169) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 573.312478] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a588b437-0c0e-43b2-bfd9-258a28a70970 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 573.332812] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-48298f54-4654-449a-bec3-0750de8541e2 tempest-ServersAdmin275Test-1422480121 tempest-ServersAdmin275Test-1422480121-project-member] [instance: 958ac621-d0c8-4c04-8a58-11ad0f3cf678] Instance VIF info [] {{(pid=67169) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 573.343396] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [None req-48298f54-4654-449a-bec3-0750de8541e2 tempest-ServersAdmin275Test-1422480121 tempest-ServersAdmin275Test-1422480121-project-member] Creating 
folder: OpenStack. Parent ref: group-v4. {{(pid=67169) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 573.343396] env[67169]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-3a86f26e-f577-478b-8e65-d6e00337ea8c {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 573.345691] env[67169]: DEBUG oslo_concurrency.lockutils [None req-1e04032d-0bdc-43df-a263-7859fab3c8e2 tempest-ServersAdminNegativeTestJSON-1443014733 tempest-ServersAdminNegativeTestJSON-1443014733-project-member] Acquiring lock "refresh_cache-9e0a990e-d9ad-4dae-9e2d-6d1f7d999403" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 573.345825] env[67169]: DEBUG oslo_concurrency.lockutils [None req-1e04032d-0bdc-43df-a263-7859fab3c8e2 tempest-ServersAdminNegativeTestJSON-1443014733 tempest-ServersAdminNegativeTestJSON-1443014733-project-member] Acquired lock "refresh_cache-9e0a990e-d9ad-4dae-9e2d-6d1f7d999403" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 573.345975] env[67169]: DEBUG nova.network.neutron [None req-1e04032d-0bdc-43df-a263-7859fab3c8e2 tempest-ServersAdminNegativeTestJSON-1443014733 tempest-ServersAdminNegativeTestJSON-1443014733-project-member] [instance: 9e0a990e-d9ad-4dae-9e2d-6d1f7d999403] Building network info cache for instance {{(pid=67169) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 573.361043] env[67169]: INFO nova.virt.vmwareapi.vm_util [None req-48298f54-4654-449a-bec3-0750de8541e2 tempest-ServersAdmin275Test-1422480121 tempest-ServersAdmin275Test-1422480121-project-member] Created folder: OpenStack in parent group-v4. 
[ 573.361043] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [None req-48298f54-4654-449a-bec3-0750de8541e2 tempest-ServersAdmin275Test-1422480121 tempest-ServersAdmin275Test-1422480121-project-member] Creating folder: Project (a5fde5b96d41419ebf8fd3eeb7feacf8). Parent ref: group-v566843. {{(pid=67169) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 573.361043] env[67169]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-32a59aeb-9e53-4f6d-873b-e8eede80f040 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 573.370265] env[67169]: INFO nova.virt.vmwareapi.vm_util [None req-48298f54-4654-449a-bec3-0750de8541e2 tempest-ServersAdmin275Test-1422480121 tempest-ServersAdmin275Test-1422480121-project-member] Created folder: Project (a5fde5b96d41419ebf8fd3eeb7feacf8) in parent group-v566843. [ 573.370822] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [None req-48298f54-4654-449a-bec3-0750de8541e2 tempest-ServersAdmin275Test-1422480121 tempest-ServersAdmin275Test-1422480121-project-member] Creating folder: Instances. Parent ref: group-v566844. {{(pid=67169) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 573.372507] env[67169]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-7c174710-3b94-4c92-988c-8dfcc598dab2 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 573.383656] env[67169]: INFO nova.virt.vmwareapi.vm_util [None req-48298f54-4654-449a-bec3-0750de8541e2 tempest-ServersAdmin275Test-1422480121 tempest-ServersAdmin275Test-1422480121-project-member] Created folder: Instances in parent group-v566844. 
[ 573.383656] env[67169]: DEBUG oslo.service.loopingcall [None req-48298f54-4654-449a-bec3-0750de8541e2 tempest-ServersAdmin275Test-1422480121 tempest-ServersAdmin275Test-1422480121-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67169) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 573.383656] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 958ac621-d0c8-4c04-8a58-11ad0f3cf678] Creating VM on the ESX host {{(pid=67169) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 573.383984] env[67169]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-cf86ae4d-cdcf-4189-ad1c-99bd5c5bfd11 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 573.403683] env[67169]: DEBUG oslo_concurrency.lockutils [None req-e29f758d-dc96-4350-90de-a6a527eb3483 tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] Acquiring lock "ca7439e3-dbd5-4775-97e8-9927b325766a" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 573.404112] env[67169]: DEBUG oslo_concurrency.lockutils [None req-e29f758d-dc96-4350-90de-a6a527eb3483 tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] Lock "ca7439e3-dbd5-4775-97e8-9927b325766a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 573.406773] env[67169]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 573.406773] env[67169]: value = "task-2819048" [ 573.406773] env[67169]: _type = "Task" [ 573.406773] 
env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 573.418343] env[67169]: DEBUG oslo_vmware.api [-] Task: {'id': task-2819048, 'name': CreateVM_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 573.418343] env[67169]: DEBUG nova.compute.manager [None req-e29f758d-dc96-4350-90de-a6a527eb3483 tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] [instance: ca7439e3-dbd5-4775-97e8-9927b325766a] Starting instance... {{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 573.442728] env[67169]: DEBUG nova.network.neutron [None req-1e04032d-0bdc-43df-a263-7859fab3c8e2 tempest-ServersAdminNegativeTestJSON-1443014733 tempest-ServersAdminNegativeTestJSON-1443014733-project-member] [instance: 9e0a990e-d9ad-4dae-9e2d-6d1f7d999403] Instance cache missing network info. 
{{(pid=67169) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 573.488256] env[67169]: DEBUG oslo_concurrency.lockutils [None req-e29f758d-dc96-4350-90de-a6a527eb3483 tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 573.488541] env[67169]: DEBUG oslo_concurrency.lockutils [None req-e29f758d-dc96-4350-90de-a6a527eb3483 tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 573.493116] env[67169]: INFO nova.compute.claims [None req-e29f758d-dc96-4350-90de-a6a527eb3483 tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] [instance: ca7439e3-dbd5-4775-97e8-9927b325766a] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 573.669221] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-93a49678-224b-46ec-aedf-6b8d36204323 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 573.685776] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-23295d34-6e39-4c24-a282-093747f243c7 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 573.716873] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4b3824c2-fb9b-4cca-925d-01c8b6a0fa5c {{(pid=67169) 
request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 573.724449] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ae7fef29-7b6d-4a58-8559-2832d039b227 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 573.737924] env[67169]: DEBUG nova.compute.provider_tree [None req-e29f758d-dc96-4350-90de-a6a527eb3483 tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] Inventory has not changed in ProviderTree for provider: 6570906a-ac37-4859-b1f2-4bbacc48d3f3 {{(pid=67169) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 573.751015] env[67169]: DEBUG nova.scheduler.client.report [None req-e29f758d-dc96-4350-90de-a6a527eb3483 tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] Inventory has not changed for provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 171, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67169) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 573.768789] env[67169]: DEBUG oslo_concurrency.lockutils [None req-e29f758d-dc96-4350-90de-a6a527eb3483 tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.280s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 573.769310] env[67169]: DEBUG 
nova.compute.manager [None req-e29f758d-dc96-4350-90de-a6a527eb3483 tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] [instance: ca7439e3-dbd5-4775-97e8-9927b325766a] Start building networks asynchronously for instance. {{(pid=67169) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 573.821735] env[67169]: DEBUG nova.compute.utils [None req-e29f758d-dc96-4350-90de-a6a527eb3483 tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] Using /dev/sd instead of None {{(pid=67169) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 573.828792] env[67169]: DEBUG nova.compute.manager [None req-e29f758d-dc96-4350-90de-a6a527eb3483 tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] [instance: ca7439e3-dbd5-4775-97e8-9927b325766a] Allocating IP information in the background. {{(pid=67169) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 573.828792] env[67169]: DEBUG nova.network.neutron [None req-e29f758d-dc96-4350-90de-a6a527eb3483 tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] [instance: ca7439e3-dbd5-4775-97e8-9927b325766a] allocate_for_instance() {{(pid=67169) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 573.844611] env[67169]: DEBUG nova.compute.manager [None req-e29f758d-dc96-4350-90de-a6a527eb3483 tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] [instance: ca7439e3-dbd5-4775-97e8-9927b325766a] Start building block device mappings for instance. {{(pid=67169) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 573.915604] env[67169]: DEBUG oslo_vmware.api [-] Task: {'id': task-2819048, 'name': CreateVM_Task, 'duration_secs': 0.353366} completed successfully. 
{{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 573.915775] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 958ac621-d0c8-4c04-8a58-11ad0f3cf678] Created VM on the ESX host {{(pid=67169) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 573.916809] env[67169]: DEBUG oslo_vmware.service [None req-48298f54-4654-449a-bec3-0750de8541e2 tempest-ServersAdmin275Test-1422480121 tempest-ServersAdmin275Test-1422480121-project-member] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f8ba4938-eaaa-4b36-8dd6-df4c77ab4b01 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 573.922730] env[67169]: DEBUG oslo_concurrency.lockutils [None req-48298f54-4654-449a-bec3-0750de8541e2 tempest-ServersAdmin275Test-1422480121 tempest-ServersAdmin275Test-1422480121-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 573.922888] env[67169]: DEBUG oslo_concurrency.lockutils [None req-48298f54-4654-449a-bec3-0750de8541e2 tempest-ServersAdmin275Test-1422480121 tempest-ServersAdmin275Test-1422480121-project-member] Acquired lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 573.923638] env[67169]: DEBUG oslo_concurrency.lockutils [None req-48298f54-4654-449a-bec3-0750de8541e2 tempest-ServersAdmin275Test-1422480121 tempest-ServersAdmin275Test-1422480121-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 573.923859] env[67169]: DEBUG oslo_vmware.service [-] Invoking 
HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-c5e0d73e-6ed6-413b-8958-69e3c1f7ef7e {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 573.928673] env[67169]: DEBUG oslo_vmware.api [None req-48298f54-4654-449a-bec3-0750de8541e2 tempest-ServersAdmin275Test-1422480121 tempest-ServersAdmin275Test-1422480121-project-member] Waiting for the task: (returnval){ [ 573.928673] env[67169]: value = "session[52518c09-e5f2-035d-4dc1-3e29ddf94015]52bceb92-663a-7e41-de1c-e9a60023d860" [ 573.928673] env[67169]: _type = "Task" [ 573.928673] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 573.936483] env[67169]: DEBUG oslo_vmware.api [None req-48298f54-4654-449a-bec3-0750de8541e2 tempest-ServersAdmin275Test-1422480121 tempest-ServersAdmin275Test-1422480121-project-member] Task: {'id': session[52518c09-e5f2-035d-4dc1-3e29ddf94015]52bceb92-663a-7e41-de1c-e9a60023d860, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 573.938915] env[67169]: DEBUG nova.compute.manager [None req-e29f758d-dc96-4350-90de-a6a527eb3483 tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] [instance: ca7439e3-dbd5-4775-97e8-9927b325766a] Start spawning the instance on the hypervisor. 
{{(pid=67169) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 573.950991] env[67169]: DEBUG nova.network.neutron [None req-f1a10853-5422-40c7-bd45-9fe494c76ecf tempest-FloatingIPsAssociationTestJSON-908037617 tempest-FloatingIPsAssociationTestJSON-908037617-project-member] [instance: 68cc2368-ac6a-4003-9c19-5f2a4e9b0e03] Successfully created port: fc029e34-1d61-425c-aefa-9abfc9d55ff3 {{(pid=67169) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 573.977945] env[67169]: DEBUG nova.virt.hardware [None req-e29f758d-dc96-4350-90de-a6a527eb3483 tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-03-06T19:55:10Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-03-06T19:54:55Z,direct_url=,disk_format='vmdk',id=285931c9-8b83-4997-8c4d-6a79005e36ba,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c31f6504bb73492890b262ff43fdf9bc',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-03-06T19:54:55Z,virtual_size=,visibility=), allow threads: False {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 573.977945] env[67169]: DEBUG nova.virt.hardware [None req-e29f758d-dc96-4350-90de-a6a527eb3483 tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] Flavor limits 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 573.977945] env[67169]: DEBUG nova.virt.hardware [None req-e29f758d-dc96-4350-90de-a6a527eb3483 
tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] Image limits 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 573.978237] env[67169]: DEBUG nova.virt.hardware [None req-e29f758d-dc96-4350-90de-a6a527eb3483 tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] Flavor pref 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 573.978237] env[67169]: DEBUG nova.virt.hardware [None req-e29f758d-dc96-4350-90de-a6a527eb3483 tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] Image pref 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 573.978237] env[67169]: DEBUG nova.virt.hardware [None req-e29f758d-dc96-4350-90de-a6a527eb3483 tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 573.978237] env[67169]: DEBUG nova.virt.hardware [None req-e29f758d-dc96-4350-90de-a6a527eb3483 tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 573.980425] env[67169]: DEBUG nova.virt.hardware [None req-e29f758d-dc96-4350-90de-a6a527eb3483 tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67169) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 573.980425] env[67169]: DEBUG 
nova.virt.hardware [None req-e29f758d-dc96-4350-90de-a6a527eb3483 tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] Got 1 possible topologies {{(pid=67169) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 573.980425] env[67169]: DEBUG nova.virt.hardware [None req-e29f758d-dc96-4350-90de-a6a527eb3483 tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 573.980425] env[67169]: DEBUG nova.virt.hardware [None req-e29f758d-dc96-4350-90de-a6a527eb3483 tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 573.981796] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a2de7099-e875-4cac-b1ed-cca663383853 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 573.994409] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e7fba94b-8e08-48b0-8645-790c9dc8285f {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 574.105408] env[67169]: DEBUG nova.network.neutron [None req-1e04032d-0bdc-43df-a263-7859fab3c8e2 tempest-ServersAdminNegativeTestJSON-1443014733 tempest-ServersAdminNegativeTestJSON-1443014733-project-member] [instance: 9e0a990e-d9ad-4dae-9e2d-6d1f7d999403] Updating instance_info_cache with network_info: [{"id": "b7af8248-64a8-42f3-80bc-b1b9813717ab", "address": "fa:16:3e:43:59:76", "network": {"id": "617508ba-3567-4508-96b5-a01447ece634", 
"bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.170", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "c31f6504bb73492890b262ff43fdf9bc", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "c9bc2632-36f9-4912-8782-8bbb789f909d", "external-id": "nsx-vlan-transportzone-897", "segmentation_id": 897, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb7af8248-64", "ovs_interfaceid": "b7af8248-64a8-42f3-80bc-b1b9813717ab", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 574.122955] env[67169]: DEBUG oslo_concurrency.lockutils [None req-1e04032d-0bdc-43df-a263-7859fab3c8e2 tempest-ServersAdminNegativeTestJSON-1443014733 tempest-ServersAdminNegativeTestJSON-1443014733-project-member] Releasing lock "refresh_cache-9e0a990e-d9ad-4dae-9e2d-6d1f7d999403" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 574.122955] env[67169]: DEBUG nova.compute.manager [None req-1e04032d-0bdc-43df-a263-7859fab3c8e2 tempest-ServersAdminNegativeTestJSON-1443014733 tempest-ServersAdminNegativeTestJSON-1443014733-project-member] [instance: 9e0a990e-d9ad-4dae-9e2d-6d1f7d999403] Instance network_info: |[{"id": "b7af8248-64a8-42f3-80bc-b1b9813717ab", "address": "fa:16:3e:43:59:76", "network": {"id": "617508ba-3567-4508-96b5-a01447ece634", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": 
[], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.170", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "c31f6504bb73492890b262ff43fdf9bc", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "c9bc2632-36f9-4912-8782-8bbb789f909d", "external-id": "nsx-vlan-transportzone-897", "segmentation_id": 897, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb7af8248-64", "ovs_interfaceid": "b7af8248-64a8-42f3-80bc-b1b9813717ab", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67169) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 574.123303] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-1e04032d-0bdc-43df-a263-7859fab3c8e2 tempest-ServersAdminNegativeTestJSON-1443014733 tempest-ServersAdminNegativeTestJSON-1443014733-project-member] [instance: 9e0a990e-d9ad-4dae-9e2d-6d1f7d999403] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:43:59:76', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'c9bc2632-36f9-4912-8782-8bbb789f909d', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'b7af8248-64a8-42f3-80bc-b1b9813717ab', 'vif_model': 'vmxnet3'}] {{(pid=67169) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 574.131440] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [None req-1e04032d-0bdc-43df-a263-7859fab3c8e2 tempest-ServersAdminNegativeTestJSON-1443014733 tempest-ServersAdminNegativeTestJSON-1443014733-project-member] Creating folder: Project (6d15d99857a34d1fb291eae17bbb1398). Parent ref: group-v566843. 
{{(pid=67169) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 574.132147] env[67169]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-5e48ab9a-11b4-4415-8883-1be151cdb38e {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 574.143090] env[67169]: INFO nova.virt.vmwareapi.vm_util [None req-1e04032d-0bdc-43df-a263-7859fab3c8e2 tempest-ServersAdminNegativeTestJSON-1443014733 tempest-ServersAdminNegativeTestJSON-1443014733-project-member] Created folder: Project (6d15d99857a34d1fb291eae17bbb1398) in parent group-v566843. [ 574.143324] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [None req-1e04032d-0bdc-43df-a263-7859fab3c8e2 tempest-ServersAdminNegativeTestJSON-1443014733 tempest-ServersAdminNegativeTestJSON-1443014733-project-member] Creating folder: Instances. Parent ref: group-v566847. {{(pid=67169) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 574.143494] env[67169]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-7d86ebb1-a763-4787-a399-1aea5b86692e {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 574.152049] env[67169]: INFO nova.virt.vmwareapi.vm_util [None req-1e04032d-0bdc-43df-a263-7859fab3c8e2 tempest-ServersAdminNegativeTestJSON-1443014733 tempest-ServersAdminNegativeTestJSON-1443014733-project-member] Created folder: Instances in parent group-v566847. [ 574.152049] env[67169]: DEBUG oslo.service.loopingcall [None req-1e04032d-0bdc-43df-a263-7859fab3c8e2 tempest-ServersAdminNegativeTestJSON-1443014733 tempest-ServersAdminNegativeTestJSON-1443014733-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=67169) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 574.152533] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 9e0a990e-d9ad-4dae-9e2d-6d1f7d999403] Creating VM on the ESX host {{(pid=67169) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 574.152533] env[67169]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-3d0d68b1-6f4c-4275-bde5-08399df89992 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 574.170685] env[67169]: DEBUG nova.policy [None req-e29f758d-dc96-4350-90de-a6a527eb3483 tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '91c0570c9cec4ba2a7d248a66b2f70d5', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '202ec4287c3042809c86951050621ffc', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67169) authorize /opt/stack/nova/nova/policy.py:203}} [ 574.173355] env[67169]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 574.173355] env[67169]: value = "task-2819051" [ 574.173355] env[67169]: _type = "Task" [ 574.173355] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 574.182078] env[67169]: DEBUG oslo_vmware.api [-] Task: {'id': task-2819051, 'name': CreateVM_Task} progress is 0%. 
{{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 574.441535] env[67169]: DEBUG oslo_concurrency.lockutils [None req-48298f54-4654-449a-bec3-0750de8541e2 tempest-ServersAdmin275Test-1422480121 tempest-ServersAdmin275Test-1422480121-project-member] Releasing lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 574.441771] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-48298f54-4654-449a-bec3-0750de8541e2 tempest-ServersAdmin275Test-1422480121 tempest-ServersAdmin275Test-1422480121-project-member] [instance: 958ac621-d0c8-4c04-8a58-11ad0f3cf678] Processing image 285931c9-8b83-4997-8c4d-6a79005e36ba {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 574.442316] env[67169]: DEBUG oslo_concurrency.lockutils [None req-48298f54-4654-449a-bec3-0750de8541e2 tempest-ServersAdmin275Test-1422480121 tempest-ServersAdmin275Test-1422480121-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 574.442316] env[67169]: DEBUG oslo_concurrency.lockutils [None req-48298f54-4654-449a-bec3-0750de8541e2 tempest-ServersAdmin275Test-1422480121 tempest-ServersAdmin275Test-1422480121-project-member] Acquired lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 574.442607] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-48298f54-4654-449a-bec3-0750de8541e2 tempest-ServersAdmin275Test-1422480121 tempest-ServersAdmin275Test-1422480121-project-member] Creating directory with path 
[datastore2] devstack-image-cache_base {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 574.442836] env[67169]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-0e5f7588-fadd-43d7-8f70-cf3a00c923d8 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 574.464223] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-48298f54-4654-449a-bec3-0750de8541e2 tempest-ServersAdmin275Test-1422480121 tempest-ServersAdmin275Test-1422480121-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 574.464223] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-48298f54-4654-449a-bec3-0750de8541e2 tempest-ServersAdmin275Test-1422480121 tempest-ServersAdmin275Test-1422480121-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=67169) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 574.464223] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0d169ed6-a196-40d6-848c-4c5ae4ca6dc9 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 574.473995] env[67169]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-8efd0cfd-37a3-4193-8422-11b30cc0aa70 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 574.483270] env[67169]: DEBUG oslo_vmware.api [None req-48298f54-4654-449a-bec3-0750de8541e2 tempest-ServersAdmin275Test-1422480121 tempest-ServersAdmin275Test-1422480121-project-member] Waiting for the task: (returnval){ [ 574.483270] env[67169]: value = "session[52518c09-e5f2-035d-4dc1-3e29ddf94015]5285e83f-d08b-b23f-cde0-1e3197135eee" [ 574.483270] env[67169]: _type = "Task" [ 
574.483270] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 574.492392] env[67169]: DEBUG oslo_vmware.api [None req-48298f54-4654-449a-bec3-0750de8541e2 tempest-ServersAdmin275Test-1422480121 tempest-ServersAdmin275Test-1422480121-project-member] Task: {'id': session[52518c09-e5f2-035d-4dc1-3e29ddf94015]5285e83f-d08b-b23f-cde0-1e3197135eee, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 574.689068] env[67169]: DEBUG oslo_vmware.api [-] Task: {'id': task-2819051, 'name': CreateVM_Task, 'duration_secs': 0.33874} completed successfully. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 574.691859] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 9e0a990e-d9ad-4dae-9e2d-6d1f7d999403] Created VM on the ESX host {{(pid=67169) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 574.721419] env[67169]: DEBUG oslo_concurrency.lockutils [None req-1e04032d-0bdc-43df-a263-7859fab3c8e2 tempest-ServersAdminNegativeTestJSON-1443014733 tempest-ServersAdminNegativeTestJSON-1443014733-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 574.721996] env[67169]: DEBUG oslo_concurrency.lockutils [None req-1e04032d-0bdc-43df-a263-7859fab3c8e2 tempest-ServersAdminNegativeTestJSON-1443014733 tempest-ServersAdminNegativeTestJSON-1443014733-project-member] Acquired lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 574.722458] env[67169]: DEBUG oslo_concurrency.lockutils [None req-1e04032d-0bdc-43df-a263-7859fab3c8e2 
tempest-ServersAdminNegativeTestJSON-1443014733 tempest-ServersAdminNegativeTestJSON-1443014733-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 574.723447] env[67169]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-c8503188-fa23-4d78-9870-a3bf1fe8e77f {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 574.728936] env[67169]: DEBUG oslo_vmware.api [None req-1e04032d-0bdc-43df-a263-7859fab3c8e2 tempest-ServersAdminNegativeTestJSON-1443014733 tempest-ServersAdminNegativeTestJSON-1443014733-project-member] Waiting for the task: (returnval){ [ 574.728936] env[67169]: value = "session[52518c09-e5f2-035d-4dc1-3e29ddf94015]5225f54d-c9dd-6994-0477-b1987be7b25d" [ 574.728936] env[67169]: _type = "Task" [ 574.728936] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 574.742424] env[67169]: DEBUG oslo_vmware.api [None req-1e04032d-0bdc-43df-a263-7859fab3c8e2 tempest-ServersAdminNegativeTestJSON-1443014733 tempest-ServersAdminNegativeTestJSON-1443014733-project-member] Task: {'id': session[52518c09-e5f2-035d-4dc1-3e29ddf94015]5225f54d-c9dd-6994-0477-b1987be7b25d, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 574.992808] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-48298f54-4654-449a-bec3-0750de8541e2 tempest-ServersAdmin275Test-1422480121 tempest-ServersAdmin275Test-1422480121-project-member] [instance: 958ac621-d0c8-4c04-8a58-11ad0f3cf678] Preparing fetch location {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 574.993070] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-48298f54-4654-449a-bec3-0750de8541e2 tempest-ServersAdmin275Test-1422480121 tempest-ServersAdmin275Test-1422480121-project-member] Creating directory with path [datastore2] vmware_temp/e7231a6a-b295-405c-a42d-6faa5d22e7b0/285931c9-8b83-4997-8c4d-6a79005e36ba {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 574.993384] env[67169]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-33005e87-77fc-403e-9ec9-07a7885816d6 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 575.018413] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-48298f54-4654-449a-bec3-0750de8541e2 tempest-ServersAdmin275Test-1422480121 tempest-ServersAdmin275Test-1422480121-project-member] Created directory with path [datastore2] vmware_temp/e7231a6a-b295-405c-a42d-6faa5d22e7b0/285931c9-8b83-4997-8c4d-6a79005e36ba {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 575.018670] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-48298f54-4654-449a-bec3-0750de8541e2 tempest-ServersAdmin275Test-1422480121 tempest-ServersAdmin275Test-1422480121-project-member] [instance: 958ac621-d0c8-4c04-8a58-11ad0f3cf678] Fetch image to [datastore2] vmware_temp/e7231a6a-b295-405c-a42d-6faa5d22e7b0/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 
575.018820] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-48298f54-4654-449a-bec3-0750de8541e2 tempest-ServersAdmin275Test-1422480121 tempest-ServersAdmin275Test-1422480121-project-member] [instance: 958ac621-d0c8-4c04-8a58-11ad0f3cf678] Downloading image file data 285931c9-8b83-4997-8c4d-6a79005e36ba to [datastore2] vmware_temp/e7231a6a-b295-405c-a42d-6faa5d22e7b0/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk on the data store datastore2 {{(pid=67169) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 575.019680] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-04bfab75-2088-4883-b2e8-ed181dbbe61c {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 575.031194] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c0b4435e-5cc8-4cca-8adb-da874320b196 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 575.049484] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9398782d-d8e1-412a-9e20-ffdf6f50b2b7 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 575.092050] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fd69c89a-9d5b-463d-9ec7-ac8f4425c646 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 575.098769] env[67169]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-489f2837-ad24-4a7c-9666-29998f807212 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 575.130389] env[67169]: DEBUG nova.virt.vmwareapi.images [None 
req-48298f54-4654-449a-bec3-0750de8541e2 tempest-ServersAdmin275Test-1422480121 tempest-ServersAdmin275Test-1422480121-project-member] [instance: 958ac621-d0c8-4c04-8a58-11ad0f3cf678] Downloading image file data 285931c9-8b83-4997-8c4d-6a79005e36ba to the data store datastore2 {{(pid=67169) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 575.206485] env[67169]: DEBUG oslo_vmware.rw_handles [None req-48298f54-4654-449a-bec3-0750de8541e2 tempest-ServersAdmin275Test-1422480121 tempest-ServersAdmin275Test-1422480121-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/e7231a6a-b295-405c-a42d-6faa5d22e7b0/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=67169) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 575.270994] env[67169]: DEBUG oslo_vmware.rw_handles [None req-48298f54-4654-449a-bec3-0750de8541e2 tempest-ServersAdmin275Test-1422480121 tempest-ServersAdmin275Test-1422480121-project-member] Completed reading data from the image iterator. {{(pid=67169) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 575.271886] env[67169]: DEBUG oslo_vmware.rw_handles [None req-48298f54-4654-449a-bec3-0750de8541e2 tempest-ServersAdmin275Test-1422480121 tempest-ServersAdmin275Test-1422480121-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/e7231a6a-b295-405c-a42d-6faa5d22e7b0/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=67169) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 575.276010] env[67169]: DEBUG oslo_concurrency.lockutils [None req-1e04032d-0bdc-43df-a263-7859fab3c8e2 tempest-ServersAdminNegativeTestJSON-1443014733 tempest-ServersAdminNegativeTestJSON-1443014733-project-member] Releasing lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 575.276531] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-1e04032d-0bdc-43df-a263-7859fab3c8e2 tempest-ServersAdminNegativeTestJSON-1443014733 tempest-ServersAdminNegativeTestJSON-1443014733-project-member] [instance: 9e0a990e-d9ad-4dae-9e2d-6d1f7d999403] Processing image 285931c9-8b83-4997-8c4d-6a79005e36ba {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 575.276752] env[67169]: DEBUG oslo_concurrency.lockutils [None req-1e04032d-0bdc-43df-a263-7859fab3c8e2 tempest-ServersAdminNegativeTestJSON-1443014733 tempest-ServersAdminNegativeTestJSON-1443014733-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 575.487616] env[67169]: DEBUG nova.network.neutron [None req-e29f758d-dc96-4350-90de-a6a527eb3483 tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] [instance: ca7439e3-dbd5-4775-97e8-9927b325766a] Successfully created port: 495134d4-26bc-4696-ade1-a578d27d382b {{(pid=67169) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 576.639792] env[67169]: DEBUG oslo_concurrency.lockutils [None req-9f30fd4a-9921-457d-a045-00b3f5f0612e tempest-ImagesNegativeTestJSON-7099264 tempest-ImagesNegativeTestJSON-7099264-project-member] Acquiring lock 
"f5558a78-c91f-4c36-bb22-f94b1bd8cdbc" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 576.640541] env[67169]: DEBUG oslo_concurrency.lockutils [None req-9f30fd4a-9921-457d-a045-00b3f5f0612e tempest-ImagesNegativeTestJSON-7099264 tempest-ImagesNegativeTestJSON-7099264-project-member] Lock "f5558a78-c91f-4c36-bb22-f94b1bd8cdbc" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 576.671454] env[67169]: DEBUG nova.compute.manager [None req-9f30fd4a-9921-457d-a045-00b3f5f0612e tempest-ImagesNegativeTestJSON-7099264 tempest-ImagesNegativeTestJSON-7099264-project-member] [instance: f5558a78-c91f-4c36-bb22-f94b1bd8cdbc] Starting instance... {{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 576.735127] env[67169]: DEBUG nova.compute.manager [req-525fe1bb-e217-4690-9f30-5450289dbbab req-eb0964e9-fe1d-42f2-8130-aece0e0e8942 service nova] [instance: 9e0a990e-d9ad-4dae-9e2d-6d1f7d999403] Received event network-vif-plugged-b7af8248-64a8-42f3-80bc-b1b9813717ab {{(pid=67169) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 576.735375] env[67169]: DEBUG oslo_concurrency.lockutils [req-525fe1bb-e217-4690-9f30-5450289dbbab req-eb0964e9-fe1d-42f2-8130-aece0e0e8942 service nova] Acquiring lock "9e0a990e-d9ad-4dae-9e2d-6d1f7d999403-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 576.735758] env[67169]: DEBUG oslo_concurrency.lockutils [req-525fe1bb-e217-4690-9f30-5450289dbbab req-eb0964e9-fe1d-42f2-8130-aece0e0e8942 service nova] Lock 
"9e0a990e-d9ad-4dae-9e2d-6d1f7d999403-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 576.735758] env[67169]: DEBUG oslo_concurrency.lockutils [req-525fe1bb-e217-4690-9f30-5450289dbbab req-eb0964e9-fe1d-42f2-8130-aece0e0e8942 service nova] Lock "9e0a990e-d9ad-4dae-9e2d-6d1f7d999403-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 576.735916] env[67169]: DEBUG nova.compute.manager [req-525fe1bb-e217-4690-9f30-5450289dbbab req-eb0964e9-fe1d-42f2-8130-aece0e0e8942 service nova] [instance: 9e0a990e-d9ad-4dae-9e2d-6d1f7d999403] No waiting events found dispatching network-vif-plugged-b7af8248-64a8-42f3-80bc-b1b9813717ab {{(pid=67169) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 576.737714] env[67169]: WARNING nova.compute.manager [req-525fe1bb-e217-4690-9f30-5450289dbbab req-eb0964e9-fe1d-42f2-8130-aece0e0e8942 service nova] [instance: 9e0a990e-d9ad-4dae-9e2d-6d1f7d999403] Received unexpected event network-vif-plugged-b7af8248-64a8-42f3-80bc-b1b9813717ab for instance with vm_state building and task_state spawning. 
[ 576.782341] env[67169]: DEBUG oslo_concurrency.lockutils [None req-9f30fd4a-9921-457d-a045-00b3f5f0612e tempest-ImagesNegativeTestJSON-7099264 tempest-ImagesNegativeTestJSON-7099264-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 576.783183] env[67169]: DEBUG oslo_concurrency.lockutils [None req-9f30fd4a-9921-457d-a045-00b3f5f0612e tempest-ImagesNegativeTestJSON-7099264 tempest-ImagesNegativeTestJSON-7099264-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.003s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 576.786563] env[67169]: INFO nova.compute.claims [None req-9f30fd4a-9921-457d-a045-00b3f5f0612e tempest-ImagesNegativeTestJSON-7099264 tempest-ImagesNegativeTestJSON-7099264-project-member] [instance: f5558a78-c91f-4c36-bb22-f94b1bd8cdbc] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 577.005288] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ba99e736-23c1-4bf2-b79d-f9057e2fea87 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 577.015320] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-288065d0-01eb-43df-a927-6564263a3252 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 577.061787] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2e4d692c-a3ff-4302-82a3-806370eff357 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 577.070741] env[67169]: DEBUG 
oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-57e77285-7626-4240-b0c4-274c646b9620 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 577.090369] env[67169]: DEBUG nova.compute.provider_tree [None req-9f30fd4a-9921-457d-a045-00b3f5f0612e tempest-ImagesNegativeTestJSON-7099264 tempest-ImagesNegativeTestJSON-7099264-project-member] Inventory has not changed in ProviderTree for provider: 6570906a-ac37-4859-b1f2-4bbacc48d3f3 {{(pid=67169) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 577.106816] env[67169]: DEBUG nova.scheduler.client.report [None req-9f30fd4a-9921-457d-a045-00b3f5f0612e tempest-ImagesNegativeTestJSON-7099264 tempest-ImagesNegativeTestJSON-7099264-project-member] Inventory has not changed for provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 171, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67169) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 577.129221] env[67169]: DEBUG oslo_concurrency.lockutils [None req-9f30fd4a-9921-457d-a045-00b3f5f0612e tempest-ImagesNegativeTestJSON-7099264 tempest-ImagesNegativeTestJSON-7099264-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.346s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 577.130367] env[67169]: DEBUG nova.compute.manager [None req-9f30fd4a-9921-457d-a045-00b3f5f0612e tempest-ImagesNegativeTestJSON-7099264 tempest-ImagesNegativeTestJSON-7099264-project-member] [instance: 
f5558a78-c91f-4c36-bb22-f94b1bd8cdbc] Start building networks asynchronously for instance. {{(pid=67169) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 577.188033] env[67169]: DEBUG nova.compute.utils [None req-9f30fd4a-9921-457d-a045-00b3f5f0612e tempest-ImagesNegativeTestJSON-7099264 tempest-ImagesNegativeTestJSON-7099264-project-member] Using /dev/sd instead of None {{(pid=67169) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 577.188033] env[67169]: DEBUG nova.compute.manager [None req-9f30fd4a-9921-457d-a045-00b3f5f0612e tempest-ImagesNegativeTestJSON-7099264 tempest-ImagesNegativeTestJSON-7099264-project-member] [instance: f5558a78-c91f-4c36-bb22-f94b1bd8cdbc] Allocating IP information in the background. {{(pid=67169) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 577.188033] env[67169]: DEBUG nova.network.neutron [None req-9f30fd4a-9921-457d-a045-00b3f5f0612e tempest-ImagesNegativeTestJSON-7099264 tempest-ImagesNegativeTestJSON-7099264-project-member] [instance: f5558a78-c91f-4c36-bb22-f94b1bd8cdbc] allocate_for_instance() {{(pid=67169) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 577.204342] env[67169]: DEBUG nova.compute.manager [None req-9f30fd4a-9921-457d-a045-00b3f5f0612e tempest-ImagesNegativeTestJSON-7099264 tempest-ImagesNegativeTestJSON-7099264-project-member] [instance: f5558a78-c91f-4c36-bb22-f94b1bd8cdbc] Start building block device mappings for instance. 
{{(pid=67169) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 577.279143] env[67169]: DEBUG nova.network.neutron [None req-f1a10853-5422-40c7-bd45-9fe494c76ecf tempest-FloatingIPsAssociationTestJSON-908037617 tempest-FloatingIPsAssociationTestJSON-908037617-project-member] [instance: 68cc2368-ac6a-4003-9c19-5f2a4e9b0e03] Successfully updated port: fc029e34-1d61-425c-aefa-9abfc9d55ff3 {{(pid=67169) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 577.299211] env[67169]: DEBUG oslo_concurrency.lockutils [None req-f1a10853-5422-40c7-bd45-9fe494c76ecf tempest-FloatingIPsAssociationTestJSON-908037617 tempest-FloatingIPsAssociationTestJSON-908037617-project-member] Acquiring lock "refresh_cache-68cc2368-ac6a-4003-9c19-5f2a4e9b0e03" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 577.299430] env[67169]: DEBUG oslo_concurrency.lockutils [None req-f1a10853-5422-40c7-bd45-9fe494c76ecf tempest-FloatingIPsAssociationTestJSON-908037617 tempest-FloatingIPsAssociationTestJSON-908037617-project-member] Acquired lock "refresh_cache-68cc2368-ac6a-4003-9c19-5f2a4e9b0e03" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 577.299684] env[67169]: DEBUG nova.network.neutron [None req-f1a10853-5422-40c7-bd45-9fe494c76ecf tempest-FloatingIPsAssociationTestJSON-908037617 tempest-FloatingIPsAssociationTestJSON-908037617-project-member] [instance: 68cc2368-ac6a-4003-9c19-5f2a4e9b0e03] Building network info cache for instance {{(pid=67169) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 577.324697] env[67169]: DEBUG nova.compute.manager [None req-9f30fd4a-9921-457d-a045-00b3f5f0612e tempest-ImagesNegativeTestJSON-7099264 tempest-ImagesNegativeTestJSON-7099264-project-member] [instance: f5558a78-c91f-4c36-bb22-f94b1bd8cdbc] Start spawning the instance on the hypervisor. 
{{(pid=67169) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 577.370345] env[67169]: DEBUG nova.virt.hardware [None req-9f30fd4a-9921-457d-a045-00b3f5f0612e tempest-ImagesNegativeTestJSON-7099264 tempest-ImagesNegativeTestJSON-7099264-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-03-06T19:55:10Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-03-06T19:54:55Z,direct_url=,disk_format='vmdk',id=285931c9-8b83-4997-8c4d-6a79005e36ba,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c31f6504bb73492890b262ff43fdf9bc',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-03-06T19:54:55Z,virtual_size=,visibility=), allow threads: False {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 577.370632] env[67169]: DEBUG nova.virt.hardware [None req-9f30fd4a-9921-457d-a045-00b3f5f0612e tempest-ImagesNegativeTestJSON-7099264 tempest-ImagesNegativeTestJSON-7099264-project-member] Flavor limits 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 577.370756] env[67169]: DEBUG nova.virt.hardware [None req-9f30fd4a-9921-457d-a045-00b3f5f0612e tempest-ImagesNegativeTestJSON-7099264 tempest-ImagesNegativeTestJSON-7099264-project-member] Image limits 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 577.371060] env[67169]: DEBUG nova.virt.hardware [None req-9f30fd4a-9921-457d-a045-00b3f5f0612e tempest-ImagesNegativeTestJSON-7099264 tempest-ImagesNegativeTestJSON-7099264-project-member] Flavor pref 0:0:0 {{(pid=67169) 
get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 577.375107] env[67169]: DEBUG nova.virt.hardware [None req-9f30fd4a-9921-457d-a045-00b3f5f0612e tempest-ImagesNegativeTestJSON-7099264 tempest-ImagesNegativeTestJSON-7099264-project-member] Image pref 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 577.375107] env[67169]: DEBUG nova.virt.hardware [None req-9f30fd4a-9921-457d-a045-00b3f5f0612e tempest-ImagesNegativeTestJSON-7099264 tempest-ImagesNegativeTestJSON-7099264-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 577.375107] env[67169]: DEBUG nova.virt.hardware [None req-9f30fd4a-9921-457d-a045-00b3f5f0612e tempest-ImagesNegativeTestJSON-7099264 tempest-ImagesNegativeTestJSON-7099264-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 577.375107] env[67169]: DEBUG nova.virt.hardware [None req-9f30fd4a-9921-457d-a045-00b3f5f0612e tempest-ImagesNegativeTestJSON-7099264 tempest-ImagesNegativeTestJSON-7099264-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67169) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 577.375107] env[67169]: DEBUG nova.virt.hardware [None req-9f30fd4a-9921-457d-a045-00b3f5f0612e tempest-ImagesNegativeTestJSON-7099264 tempest-ImagesNegativeTestJSON-7099264-project-member] Got 1 possible topologies {{(pid=67169) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 577.375532] env[67169]: DEBUG nova.virt.hardware [None req-9f30fd4a-9921-457d-a045-00b3f5f0612e tempest-ImagesNegativeTestJSON-7099264 tempest-ImagesNegativeTestJSON-7099264-project-member] Possible topologies 
[VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 577.375532] env[67169]: DEBUG nova.virt.hardware [None req-9f30fd4a-9921-457d-a045-00b3f5f0612e tempest-ImagesNegativeTestJSON-7099264 tempest-ImagesNegativeTestJSON-7099264-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 577.375603] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3425feb9-3c3f-4629-9896-192a389df306 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 577.388377] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-67283190-7b2c-44ee-993c-7ee3a652a79b {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 577.464212] env[67169]: DEBUG nova.policy [None req-9f30fd4a-9921-457d-a045-00b3f5f0612e tempest-ImagesNegativeTestJSON-7099264 tempest-ImagesNegativeTestJSON-7099264-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e69ca428041144cfabe598d1cdb9bd8f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'a4b6efcbf27141c7bee056e8f28d732d', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67169) authorize /opt/stack/nova/nova/policy.py:203}} [ 577.676486] env[67169]: DEBUG nova.network.neutron [None req-f1a10853-5422-40c7-bd45-9fe494c76ecf tempest-FloatingIPsAssociationTestJSON-908037617 tempest-FloatingIPsAssociationTestJSON-908037617-project-member] [instance: 
68cc2368-ac6a-4003-9c19-5f2a4e9b0e03] Instance cache missing network info. {{(pid=67169) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 577.706829] env[67169]: DEBUG nova.network.neutron [None req-e29f758d-dc96-4350-90de-a6a527eb3483 tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] [instance: ca7439e3-dbd5-4775-97e8-9927b325766a] Successfully updated port: 495134d4-26bc-4696-ade1-a578d27d382b {{(pid=67169) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 577.721927] env[67169]: DEBUG oslo_concurrency.lockutils [None req-e29f758d-dc96-4350-90de-a6a527eb3483 tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] Acquiring lock "refresh_cache-ca7439e3-dbd5-4775-97e8-9927b325766a" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 577.722113] env[67169]: DEBUG oslo_concurrency.lockutils [None req-e29f758d-dc96-4350-90de-a6a527eb3483 tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] Acquired lock "refresh_cache-ca7439e3-dbd5-4775-97e8-9927b325766a" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 577.722268] env[67169]: DEBUG nova.network.neutron [None req-e29f758d-dc96-4350-90de-a6a527eb3483 tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] [instance: ca7439e3-dbd5-4775-97e8-9927b325766a] Building network info cache for instance {{(pid=67169) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 577.853179] env[67169]: DEBUG nova.network.neutron [None req-e29f758d-dc96-4350-90de-a6a527eb3483 tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] [instance: ca7439e3-dbd5-4775-97e8-9927b325766a] Instance cache missing network info. 
{{(pid=67169) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 578.457470] env[67169]: DEBUG nova.network.neutron [None req-e29f758d-dc96-4350-90de-a6a527eb3483 tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] [instance: ca7439e3-dbd5-4775-97e8-9927b325766a] Updating instance_info_cache with network_info: [{"id": "495134d4-26bc-4696-ade1-a578d27d382b", "address": "fa:16:3e:53:2f:f2", "network": {"id": "617508ba-3567-4508-96b5-a01447ece634", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.196", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "c31f6504bb73492890b262ff43fdf9bc", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "c9bc2632-36f9-4912-8782-8bbb789f909d", "external-id": "nsx-vlan-transportzone-897", "segmentation_id": 897, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap495134d4-26", "ovs_interfaceid": "495134d4-26bc-4696-ade1-a578d27d382b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 578.476158] env[67169]: DEBUG oslo_concurrency.lockutils [None req-e29f758d-dc96-4350-90de-a6a527eb3483 tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] Releasing lock "refresh_cache-ca7439e3-dbd5-4775-97e8-9927b325766a" {{(pid=67169) lock 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 578.476469] env[67169]: DEBUG nova.compute.manager [None req-e29f758d-dc96-4350-90de-a6a527eb3483 tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] [instance: ca7439e3-dbd5-4775-97e8-9927b325766a] Instance network_info: |[{"id": "495134d4-26bc-4696-ade1-a578d27d382b", "address": "fa:16:3e:53:2f:f2", "network": {"id": "617508ba-3567-4508-96b5-a01447ece634", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.196", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "c31f6504bb73492890b262ff43fdf9bc", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "c9bc2632-36f9-4912-8782-8bbb789f909d", "external-id": "nsx-vlan-transportzone-897", "segmentation_id": 897, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap495134d4-26", "ovs_interfaceid": "495134d4-26bc-4696-ade1-a578d27d382b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67169) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}}
[ 578.476856] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-e29f758d-dc96-4350-90de-a6a527eb3483 tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] [instance: ca7439e3-dbd5-4775-97e8-9927b325766a] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:53:2f:f2', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'c9bc2632-36f9-4912-8782-8bbb789f909d', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '495134d4-26bc-4696-ade1-a578d27d382b', 'vif_model': 'vmxnet3'}] {{(pid=67169) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}}
[ 578.490265] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [None req-e29f758d-dc96-4350-90de-a6a527eb3483 tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] Creating folder: Project (202ec4287c3042809c86951050621ffc). Parent ref: group-v566843. {{(pid=67169) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 578.490265] env[67169]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-d55f0430-f4cd-4ae2-b57b-52cbc5ad898e {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 578.501441] env[67169]: INFO nova.virt.vmwareapi.vm_util [None req-e29f758d-dc96-4350-90de-a6a527eb3483 tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] Created folder: Project (202ec4287c3042809c86951050621ffc) in parent group-v566843.
[ 578.501636] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [None req-e29f758d-dc96-4350-90de-a6a527eb3483 tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] Creating folder: Instances. Parent ref: group-v566850. {{(pid=67169) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 578.501874] env[67169]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-9f669c4a-a9e0-4358-b96f-550031f18113 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 578.510723] env[67169]: INFO nova.virt.vmwareapi.vm_util [None req-e29f758d-dc96-4350-90de-a6a527eb3483 tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] Created folder: Instances in parent group-v566850.
[ 578.510961] env[67169]: DEBUG oslo.service.loopingcall [None req-e29f758d-dc96-4350-90de-a6a527eb3483 tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67169) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 578.513116] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: ca7439e3-dbd5-4775-97e8-9927b325766a] Creating VM on the ESX host {{(pid=67169) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}}
[ 578.513116] env[67169]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-f8db225b-0e05-4d0f-ae92-9816e53eb17e {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 578.530473] env[67169]: DEBUG nova.network.neutron [None req-9f30fd4a-9921-457d-a045-00b3f5f0612e tempest-ImagesNegativeTestJSON-7099264 tempest-ImagesNegativeTestJSON-7099264-project-member] [instance: f5558a78-c91f-4c36-bb22-f94b1bd8cdbc] Successfully created port: ad705aee-0261-4e31-8077-3894179a56a3 {{(pid=67169) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 578.536110] env[67169]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){
[ 578.536110] env[67169]: value = "task-2819054"
[ 578.536110] env[67169]: _type = "Task"
[ 578.536110] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 578.545870] env[67169]: DEBUG oslo_vmware.api [-] Task: {'id': task-2819054, 'name': CreateVM_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 579.047083] env[67169]: DEBUG oslo_vmware.api [-] Task: {'id': task-2819054, 'name': CreateVM_Task, 'duration_secs': 0.305098} completed successfully. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 579.047083] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: ca7439e3-dbd5-4775-97e8-9927b325766a] Created VM on the ESX host {{(pid=67169) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}}
[ 579.048417] env[67169]: DEBUG oslo_concurrency.lockutils [None req-e29f758d-dc96-4350-90de-a6a527eb3483 tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 579.048599] env[67169]: DEBUG oslo_concurrency.lockutils [None req-e29f758d-dc96-4350-90de-a6a527eb3483 tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] Acquired lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 579.048939] env[67169]: DEBUG oslo_concurrency.lockutils [None req-e29f758d-dc96-4350-90de-a6a527eb3483 tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}}
[ 579.049217] env[67169]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-69b23beb-d562-498a-8192-a2c3602f29b8 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 579.054440] env[67169]: DEBUG oslo_vmware.api [None req-e29f758d-dc96-4350-90de-a6a527eb3483 tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] Waiting for the task: (returnval){
[ 579.054440] env[67169]: value = "session[52518c09-e5f2-035d-4dc1-3e29ddf94015]52351659-697d-42e7-691a-62342e0c6860"
[ 579.054440] env[67169]: _type = "Task"
[ 579.054440] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 579.061899] env[67169]: DEBUG nova.network.neutron [None req-f1a10853-5422-40c7-bd45-9fe494c76ecf tempest-FloatingIPsAssociationTestJSON-908037617 tempest-FloatingIPsAssociationTestJSON-908037617-project-member] [instance: 68cc2368-ac6a-4003-9c19-5f2a4e9b0e03] Updating instance_info_cache with network_info: [{"id": "fc029e34-1d61-425c-aefa-9abfc9d55ff3", "address": "fa:16:3e:5f:2f:7f", "network": {"id": "3e2e473c-02a3-456e-965d-245899ebb61c", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-1872181337-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "ad115c3eeb224f229bf4235e085fbfdf", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "5e839c46-1ae9-43b7-9518-8f18f48100dd", "external-id": "nsx-vlan-transportzone-666", "segmentation_id": 666, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapfc029e34-1d", "ovs_interfaceid": "fc029e34-1d61-425c-aefa-9abfc9d55ff3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 579.070365] env[67169]: DEBUG oslo_vmware.api [None req-e29f758d-dc96-4350-90de-a6a527eb3483 tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] Task: {'id': session[52518c09-e5f2-035d-4dc1-3e29ddf94015]52351659-697d-42e7-691a-62342e0c6860, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 579.077701] env[67169]: DEBUG oslo_concurrency.lockutils [None req-f1a10853-5422-40c7-bd45-9fe494c76ecf tempest-FloatingIPsAssociationTestJSON-908037617 tempest-FloatingIPsAssociationTestJSON-908037617-project-member] Releasing lock "refresh_cache-68cc2368-ac6a-4003-9c19-5f2a4e9b0e03" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 579.078069] env[67169]: DEBUG nova.compute.manager [None req-f1a10853-5422-40c7-bd45-9fe494c76ecf tempest-FloatingIPsAssociationTestJSON-908037617 tempest-FloatingIPsAssociationTestJSON-908037617-project-member] [instance: 68cc2368-ac6a-4003-9c19-5f2a4e9b0e03] Instance network_info: |[{"id": "fc029e34-1d61-425c-aefa-9abfc9d55ff3", "address": "fa:16:3e:5f:2f:7f", "network": {"id": "3e2e473c-02a3-456e-965d-245899ebb61c", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-1872181337-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "ad115c3eeb224f229bf4235e085fbfdf", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "5e839c46-1ae9-43b7-9518-8f18f48100dd", "external-id": "nsx-vlan-transportzone-666", "segmentation_id": 666, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapfc029e34-1d", "ovs_interfaceid": "fc029e34-1d61-425c-aefa-9abfc9d55ff3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67169) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}}
[ 579.078743] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-f1a10853-5422-40c7-bd45-9fe494c76ecf tempest-FloatingIPsAssociationTestJSON-908037617 tempest-FloatingIPsAssociationTestJSON-908037617-project-member] [instance: 68cc2368-ac6a-4003-9c19-5f2a4e9b0e03] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:5f:2f:7f', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '5e839c46-1ae9-43b7-9518-8f18f48100dd', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'fc029e34-1d61-425c-aefa-9abfc9d55ff3', 'vif_model': 'vmxnet3'}] {{(pid=67169) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}}
[ 579.087353] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [None req-f1a10853-5422-40c7-bd45-9fe494c76ecf tempest-FloatingIPsAssociationTestJSON-908037617 tempest-FloatingIPsAssociationTestJSON-908037617-project-member] Creating folder: Project (ad115c3eeb224f229bf4235e085fbfdf). Parent ref: group-v566843. {{(pid=67169) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 579.087847] env[67169]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-b44913bb-536e-4865-9142-72000e5a0732 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 579.098683] env[67169]: INFO nova.virt.vmwareapi.vm_util [None req-f1a10853-5422-40c7-bd45-9fe494c76ecf tempest-FloatingIPsAssociationTestJSON-908037617 tempest-FloatingIPsAssociationTestJSON-908037617-project-member] Created folder: Project (ad115c3eeb224f229bf4235e085fbfdf) in parent group-v566843.
[ 579.098890] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [None req-f1a10853-5422-40c7-bd45-9fe494c76ecf tempest-FloatingIPsAssociationTestJSON-908037617 tempest-FloatingIPsAssociationTestJSON-908037617-project-member] Creating folder: Instances. Parent ref: group-v566853. {{(pid=67169) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 579.099238] env[67169]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-a9104f02-4089-4d49-8687-84470a03cc01 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 579.110780] env[67169]: INFO nova.virt.vmwareapi.vm_util [None req-f1a10853-5422-40c7-bd45-9fe494c76ecf tempest-FloatingIPsAssociationTestJSON-908037617 tempest-FloatingIPsAssociationTestJSON-908037617-project-member] Created folder: Instances in parent group-v566853.
[ 579.111082] env[67169]: DEBUG oslo.service.loopingcall [None req-f1a10853-5422-40c7-bd45-9fe494c76ecf tempest-FloatingIPsAssociationTestJSON-908037617 tempest-FloatingIPsAssociationTestJSON-908037617-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67169) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 579.111476] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 68cc2368-ac6a-4003-9c19-5f2a4e9b0e03] Creating VM on the ESX host {{(pid=67169) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}}
[ 579.111586] env[67169]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-f27fa64f-b1a5-45b2-83d7-93a468001ab7 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 579.133398] env[67169]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){
[ 579.133398] env[67169]: value = "task-2819057"
[ 579.133398] env[67169]: _type = "Task"
[ 579.133398] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 579.141619] env[67169]: DEBUG oslo_vmware.api [-] Task: {'id': task-2819057, 'name': CreateVM_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 579.574954] env[67169]: DEBUG oslo_concurrency.lockutils [None req-e29f758d-dc96-4350-90de-a6a527eb3483 tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] Releasing lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 579.575271] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-e29f758d-dc96-4350-90de-a6a527eb3483 tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] [instance: ca7439e3-dbd5-4775-97e8-9927b325766a] Processing image 285931c9-8b83-4997-8c4d-6a79005e36ba {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}}
[ 579.575810] env[67169]: DEBUG oslo_concurrency.lockutils [None req-e29f758d-dc96-4350-90de-a6a527eb3483 tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 579.649713] env[67169]: DEBUG oslo_vmware.api [-] Task: {'id': task-2819057, 'name': CreateVM_Task} progress is 99%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 579.813610] env[67169]: DEBUG nova.network.neutron [None req-9f30fd4a-9921-457d-a045-00b3f5f0612e tempest-ImagesNegativeTestJSON-7099264 tempest-ImagesNegativeTestJSON-7099264-project-member] [instance: f5558a78-c91f-4c36-bb22-f94b1bd8cdbc] Successfully updated port: ad705aee-0261-4e31-8077-3894179a56a3 {{(pid=67169) _update_port /opt/stack/nova/nova/network/neutron.py:586}}
[ 579.822778] env[67169]: DEBUG oslo_concurrency.lockutils [None req-9f30fd4a-9921-457d-a045-00b3f5f0612e tempest-ImagesNegativeTestJSON-7099264 tempest-ImagesNegativeTestJSON-7099264-project-member] Acquiring lock "refresh_cache-f5558a78-c91f-4c36-bb22-f94b1bd8cdbc" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 579.822836] env[67169]: DEBUG oslo_concurrency.lockutils [None req-9f30fd4a-9921-457d-a045-00b3f5f0612e tempest-ImagesNegativeTestJSON-7099264 tempest-ImagesNegativeTestJSON-7099264-project-member] Acquired lock "refresh_cache-f5558a78-c91f-4c36-bb22-f94b1bd8cdbc" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 579.822951] env[67169]: DEBUG nova.network.neutron [None req-9f30fd4a-9921-457d-a045-00b3f5f0612e tempest-ImagesNegativeTestJSON-7099264 tempest-ImagesNegativeTestJSON-7099264-project-member] [instance: f5558a78-c91f-4c36-bb22-f94b1bd8cdbc] Building network info cache for instance {{(pid=67169) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 579.907143] env[67169]: DEBUG nova.network.neutron [None req-9f30fd4a-9921-457d-a045-00b3f5f0612e tempest-ImagesNegativeTestJSON-7099264 tempest-ImagesNegativeTestJSON-7099264-project-member] [instance: f5558a78-c91f-4c36-bb22-f94b1bd8cdbc] Instance cache missing network info. {{(pid=67169) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 580.145101] env[67169]: DEBUG oslo_vmware.api [-] Task: {'id': task-2819057, 'name': CreateVM_Task} progress is 99%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 580.429782] env[67169]: DEBUG oslo_concurrency.lockutils [None req-3c83de1c-bff9-4913-ac4c-cf997776c0d4 tempest-ServersTestFqdnHostnames-427175016 tempest-ServersTestFqdnHostnames-427175016-project-member] Acquiring lock "11e90c91-26ca-4397-81a4-975a1d714d19" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 580.429782] env[67169]: DEBUG oslo_concurrency.lockutils [None req-3c83de1c-bff9-4913-ac4c-cf997776c0d4 tempest-ServersTestFqdnHostnames-427175016 tempest-ServersTestFqdnHostnames-427175016-project-member] Lock "11e90c91-26ca-4397-81a4-975a1d714d19" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 580.447060] env[67169]: DEBUG nova.compute.manager [None req-3c83de1c-bff9-4913-ac4c-cf997776c0d4 tempest-ServersTestFqdnHostnames-427175016 tempest-ServersTestFqdnHostnames-427175016-project-member] [instance: 11e90c91-26ca-4397-81a4-975a1d714d19] Starting instance... {{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}}
[ 580.502290] env[67169]: DEBUG nova.network.neutron [None req-9f30fd4a-9921-457d-a045-00b3f5f0612e tempest-ImagesNegativeTestJSON-7099264 tempest-ImagesNegativeTestJSON-7099264-project-member] [instance: f5558a78-c91f-4c36-bb22-f94b1bd8cdbc] Updating instance_info_cache with network_info: [{"id": "ad705aee-0261-4e31-8077-3894179a56a3", "address": "fa:16:3e:5c:9e:9c", "network": {"id": "00259946-9c33-41e4-a705-bcd33b92337c", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-1592441058-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "a4b6efcbf27141c7bee056e8f28d732d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "3734b156-0f7d-4721-b23c-d000412ec2eb", "external-id": "nsx-vlan-transportzone-560", "segmentation_id": 560, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapad705aee-02", "ovs_interfaceid": "ad705aee-0261-4e31-8077-3894179a56a3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 580.518711] env[67169]: DEBUG oslo_concurrency.lockutils [None req-9f30fd4a-9921-457d-a045-00b3f5f0612e tempest-ImagesNegativeTestJSON-7099264 tempest-ImagesNegativeTestJSON-7099264-project-member] Releasing lock "refresh_cache-f5558a78-c91f-4c36-bb22-f94b1bd8cdbc" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 580.519141] env[67169]: DEBUG nova.compute.manager [None req-9f30fd4a-9921-457d-a045-00b3f5f0612e tempest-ImagesNegativeTestJSON-7099264 tempest-ImagesNegativeTestJSON-7099264-project-member] [instance: f5558a78-c91f-4c36-bb22-f94b1bd8cdbc] Instance network_info: |[{"id": "ad705aee-0261-4e31-8077-3894179a56a3", "address": "fa:16:3e:5c:9e:9c", "network": {"id": "00259946-9c33-41e4-a705-bcd33b92337c", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-1592441058-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "a4b6efcbf27141c7bee056e8f28d732d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "3734b156-0f7d-4721-b23c-d000412ec2eb", "external-id": "nsx-vlan-transportzone-560", "segmentation_id": 560, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapad705aee-02", "ovs_interfaceid": "ad705aee-0261-4e31-8077-3894179a56a3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67169) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}}
[ 580.519507] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-9f30fd4a-9921-457d-a045-00b3f5f0612e tempest-ImagesNegativeTestJSON-7099264 tempest-ImagesNegativeTestJSON-7099264-project-member] [instance: f5558a78-c91f-4c36-bb22-f94b1bd8cdbc] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:5c:9e:9c', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '3734b156-0f7d-4721-b23c-d000412ec2eb', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'ad705aee-0261-4e31-8077-3894179a56a3', 'vif_model': 'vmxnet3'}] {{(pid=67169) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}}
[ 580.531205] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [None req-9f30fd4a-9921-457d-a045-00b3f5f0612e tempest-ImagesNegativeTestJSON-7099264 tempest-ImagesNegativeTestJSON-7099264-project-member] Creating folder: Project (a4b6efcbf27141c7bee056e8f28d732d). Parent ref: group-v566843. {{(pid=67169) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 580.532846] env[67169]: DEBUG oslo_concurrency.lockutils [None req-3c83de1c-bff9-4913-ac4c-cf997776c0d4 tempest-ServersTestFqdnHostnames-427175016 tempest-ServersTestFqdnHostnames-427175016-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 580.533091] env[67169]: DEBUG oslo_concurrency.lockutils [None req-3c83de1c-bff9-4913-ac4c-cf997776c0d4 tempest-ServersTestFqdnHostnames-427175016 tempest-ServersTestFqdnHostnames-427175016-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 580.534929] env[67169]: INFO nova.compute.claims [None req-3c83de1c-bff9-4913-ac4c-cf997776c0d4 tempest-ServersTestFqdnHostnames-427175016 tempest-ServersTestFqdnHostnames-427175016-project-member] [instance: 11e90c91-26ca-4397-81a4-975a1d714d19] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 580.538487] env[67169]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-b6ee32fe-49b0-439a-854a-ba573d803513 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 580.552753] env[67169]: INFO nova.virt.vmwareapi.vm_util [None req-9f30fd4a-9921-457d-a045-00b3f5f0612e tempest-ImagesNegativeTestJSON-7099264 tempest-ImagesNegativeTestJSON-7099264-project-member] Created folder: Project (a4b6efcbf27141c7bee056e8f28d732d) in parent group-v566843.
[ 580.552961] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [None req-9f30fd4a-9921-457d-a045-00b3f5f0612e tempest-ImagesNegativeTestJSON-7099264 tempest-ImagesNegativeTestJSON-7099264-project-member] Creating folder: Instances. Parent ref: group-v566856. {{(pid=67169) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 580.554185] env[67169]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-95d8253e-8776-4509-8040-958e23553136 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 580.565694] env[67169]: INFO nova.virt.vmwareapi.vm_util [None req-9f30fd4a-9921-457d-a045-00b3f5f0612e tempest-ImagesNegativeTestJSON-7099264 tempest-ImagesNegativeTestJSON-7099264-project-member] Created folder: Instances in parent group-v566856.
[ 580.565980] env[67169]: DEBUG oslo.service.loopingcall [None req-9f30fd4a-9921-457d-a045-00b3f5f0612e tempest-ImagesNegativeTestJSON-7099264 tempest-ImagesNegativeTestJSON-7099264-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67169) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 580.566137] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: f5558a78-c91f-4c36-bb22-f94b1bd8cdbc] Creating VM on the ESX host {{(pid=67169) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}}
[ 580.568017] env[67169]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-aec4a8aa-5263-407c-acd4-e6d41ea3a8f2 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 580.594134] env[67169]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){
[ 580.594134] env[67169]: value = "task-2819060"
[ 580.594134] env[67169]: _type = "Task"
[ 580.594134] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 580.602209] env[67169]: DEBUG oslo_vmware.api [-] Task: {'id': task-2819060, 'name': CreateVM_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 580.650542] env[67169]: DEBUG oslo_vmware.api [-] Task: {'id': task-2819057, 'name': CreateVM_Task, 'duration_secs': 1.405942} completed successfully. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 580.653285] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 68cc2368-ac6a-4003-9c19-5f2a4e9b0e03] Created VM on the ESX host {{(pid=67169) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}}
[ 580.654148] env[67169]: DEBUG oslo_concurrency.lockutils [None req-f1a10853-5422-40c7-bd45-9fe494c76ecf tempest-FloatingIPsAssociationTestJSON-908037617 tempest-FloatingIPsAssociationTestJSON-908037617-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 580.654311] env[67169]: DEBUG oslo_concurrency.lockutils [None req-f1a10853-5422-40c7-bd45-9fe494c76ecf tempest-FloatingIPsAssociationTestJSON-908037617 tempest-FloatingIPsAssociationTestJSON-908037617-project-member] Acquired lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 580.654631] env[67169]: DEBUG oslo_concurrency.lockutils [None req-f1a10853-5422-40c7-bd45-9fe494c76ecf tempest-FloatingIPsAssociationTestJSON-908037617 tempest-FloatingIPsAssociationTestJSON-908037617-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}}
[ 580.655205] env[67169]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-88be0e60-62d0-493a-9447-086932e95c52 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 580.662655] env[67169]: DEBUG oslo_vmware.api [None req-f1a10853-5422-40c7-bd45-9fe494c76ecf tempest-FloatingIPsAssociationTestJSON-908037617 tempest-FloatingIPsAssociationTestJSON-908037617-project-member] Waiting for the task: (returnval){
[ 580.662655] env[67169]: value = "session[52518c09-e5f2-035d-4dc1-3e29ddf94015]52a6d699-5cf3-5edc-c158-c0374752db08"
[ 580.662655] env[67169]: _type = "Task"
[ 580.662655] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 580.674776] env[67169]: DEBUG oslo_vmware.api [None req-f1a10853-5422-40c7-bd45-9fe494c76ecf tempest-FloatingIPsAssociationTestJSON-908037617 tempest-FloatingIPsAssociationTestJSON-908037617-project-member] Task: {'id': session[52518c09-e5f2-035d-4dc1-3e29ddf94015]52a6d699-5cf3-5edc-c158-c0374752db08, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 580.748179] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ed62ff1a-594a-4092-8d97-f629005a11d7 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 580.757309] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-280d85ff-5f20-4392-ab12-8da0eff300a7 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 580.812891] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f3208612-ac77-4728-bc4a-06664ea4a76d {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 580.830033] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-20b9f8aa-24e6-48bd-aa7b-82d53cd20cff {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 580.855998] env[67169]: DEBUG nova.compute.provider_tree [None req-3c83de1c-bff9-4913-ac4c-cf997776c0d4 tempest-ServersTestFqdnHostnames-427175016 tempest-ServersTestFqdnHostnames-427175016-project-member] Inventory has not changed in ProviderTree for provider: 6570906a-ac37-4859-b1f2-4bbacc48d3f3 {{(pid=67169) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 580.885127] env[67169]: DEBUG nova.scheduler.client.report [None req-3c83de1c-bff9-4913-ac4c-cf997776c0d4 tempest-ServersTestFqdnHostnames-427175016 tempest-ServersTestFqdnHostnames-427175016-project-member] Inventory has not changed for provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 171, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67169) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 580.903662] env[67169]: DEBUG oslo_concurrency.lockutils [None req-3c83de1c-bff9-4913-ac4c-cf997776c0d4 tempest-ServersTestFqdnHostnames-427175016 tempest-ServersTestFqdnHostnames-427175016-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.370s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 580.904213] env[67169]: DEBUG nova.compute.manager [None req-3c83de1c-bff9-4913-ac4c-cf997776c0d4 tempest-ServersTestFqdnHostnames-427175016 tempest-ServersTestFqdnHostnames-427175016-project-member] [instance: 11e90c91-26ca-4397-81a4-975a1d714d19] Start building networks asynchronously for instance. {{(pid=67169) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}}
[ 580.975093] env[67169]: DEBUG nova.compute.utils [None req-3c83de1c-bff9-4913-ac4c-cf997776c0d4 tempest-ServersTestFqdnHostnames-427175016 tempest-ServersTestFqdnHostnames-427175016-project-member] Using /dev/sd instead of None {{(pid=67169) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 580.976216] env[67169]: DEBUG nova.compute.manager [None req-3c83de1c-bff9-4913-ac4c-cf997776c0d4 tempest-ServersTestFqdnHostnames-427175016 tempest-ServersTestFqdnHostnames-427175016-project-member] [instance: 11e90c91-26ca-4397-81a4-975a1d714d19] Allocating IP information in the background. {{(pid=67169) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}}
[ 580.976319] env[67169]: DEBUG nova.network.neutron [None req-3c83de1c-bff9-4913-ac4c-cf997776c0d4 tempest-ServersTestFqdnHostnames-427175016 tempest-ServersTestFqdnHostnames-427175016-project-member] [instance: 11e90c91-26ca-4397-81a4-975a1d714d19] allocate_for_instance() {{(pid=67169) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 580.992029] env[67169]: DEBUG nova.compute.manager [None req-3c83de1c-bff9-4913-ac4c-cf997776c0d4 tempest-ServersTestFqdnHostnames-427175016 tempest-ServersTestFqdnHostnames-427175016-project-member] [instance: 11e90c91-26ca-4397-81a4-975a1d714d19] Start building block device mappings for instance. {{(pid=67169) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}}
[ 581.065953] env[67169]: DEBUG nova.compute.manager [req-ad421e8e-3f3a-4db1-8b19-24c6cb57ffcc req-a757be86-2ae7-425a-8dfa-bfac55cc5017 service nova] [instance: 68cc2368-ac6a-4003-9c19-5f2a4e9b0e03] Received event network-vif-plugged-fc029e34-1d61-425c-aefa-9abfc9d55ff3 {{(pid=67169) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}}
[ 581.068301] env[67169]: DEBUG oslo_concurrency.lockutils [req-ad421e8e-3f3a-4db1-8b19-24c6cb57ffcc req-a757be86-2ae7-425a-8dfa-bfac55cc5017 service nova] Acquiring lock "68cc2368-ac6a-4003-9c19-5f2a4e9b0e03-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 581.068530] env[67169]: DEBUG oslo_concurrency.lockutils [req-ad421e8e-3f3a-4db1-8b19-24c6cb57ffcc req-a757be86-2ae7-425a-8dfa-bfac55cc5017 service nova] Lock "68cc2368-ac6a-4003-9c19-5f2a4e9b0e03-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.002s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 581.068709] env[67169]: DEBUG oslo_concurrency.lockutils [req-ad421e8e-3f3a-4db1-8b19-24c6cb57ffcc req-a757be86-2ae7-425a-8dfa-bfac55cc5017 service nova] Lock "68cc2368-ac6a-4003-9c19-5f2a4e9b0e03-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 581.068883] env[67169]: DEBUG nova.compute.manager [req-ad421e8e-3f3a-4db1-8b19-24c6cb57ffcc req-a757be86-2ae7-425a-8dfa-bfac55cc5017 service nova] [instance: 68cc2368-ac6a-4003-9c19-5f2a4e9b0e03] No waiting events found dispatching network-vif-plugged-fc029e34-1d61-425c-aefa-9abfc9d55ff3 {{(pid=67169) pop_instance_event
/opt/stack/nova/nova/compute/manager.py:320}} [ 581.069059] env[67169]: WARNING nova.compute.manager [req-ad421e8e-3f3a-4db1-8b19-24c6cb57ffcc req-a757be86-2ae7-425a-8dfa-bfac55cc5017 service nova] [instance: 68cc2368-ac6a-4003-9c19-5f2a4e9b0e03] Received unexpected event network-vif-plugged-fc029e34-1d61-425c-aefa-9abfc9d55ff3 for instance with vm_state building and task_state spawning. [ 581.084470] env[67169]: DEBUG nova.compute.manager [None req-3c83de1c-bff9-4913-ac4c-cf997776c0d4 tempest-ServersTestFqdnHostnames-427175016 tempest-ServersTestFqdnHostnames-427175016-project-member] [instance: 11e90c91-26ca-4397-81a4-975a1d714d19] Start spawning the instance on the hypervisor. {{(pid=67169) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 581.105598] env[67169]: DEBUG oslo_vmware.api [-] Task: {'id': task-2819060, 'name': CreateVM_Task, 'duration_secs': 0.316146} completed successfully. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 581.105598] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: f5558a78-c91f-4c36-bb22-f94b1bd8cdbc] Created VM on the ESX host {{(pid=67169) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 581.106249] env[67169]: DEBUG oslo_concurrency.lockutils [None req-9f30fd4a-9921-457d-a045-00b3f5f0612e tempest-ImagesNegativeTestJSON-7099264 tempest-ImagesNegativeTestJSON-7099264-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 581.122941] env[67169]: DEBUG nova.virt.hardware [None req-3c83de1c-bff9-4913-ac4c-cf997776c0d4 tempest-ServersTestFqdnHostnames-427175016 tempest-ServersTestFqdnHostnames-427175016-project-member] Getting desirable topologies for flavor 
Flavor(created_at=2025-03-06T19:55:10Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-03-06T19:54:55Z,direct_url=,disk_format='vmdk',id=285931c9-8b83-4997-8c4d-6a79005e36ba,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c31f6504bb73492890b262ff43fdf9bc',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-03-06T19:54:55Z,virtual_size=,visibility=), allow threads: False {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 581.123206] env[67169]: DEBUG nova.virt.hardware [None req-3c83de1c-bff9-4913-ac4c-cf997776c0d4 tempest-ServersTestFqdnHostnames-427175016 tempest-ServersTestFqdnHostnames-427175016-project-member] Flavor limits 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 581.123369] env[67169]: DEBUG nova.virt.hardware [None req-3c83de1c-bff9-4913-ac4c-cf997776c0d4 tempest-ServersTestFqdnHostnames-427175016 tempest-ServersTestFqdnHostnames-427175016-project-member] Image limits 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 581.123589] env[67169]: DEBUG nova.virt.hardware [None req-3c83de1c-bff9-4913-ac4c-cf997776c0d4 tempest-ServersTestFqdnHostnames-427175016 tempest-ServersTestFqdnHostnames-427175016-project-member] Flavor pref 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 581.123744] env[67169]: DEBUG nova.virt.hardware [None req-3c83de1c-bff9-4913-ac4c-cf997776c0d4 tempest-ServersTestFqdnHostnames-427175016 tempest-ServersTestFqdnHostnames-427175016-project-member] Image pref 0:0:0 {{(pid=67169) 
get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 581.123900] env[67169]: DEBUG nova.virt.hardware [None req-3c83de1c-bff9-4913-ac4c-cf997776c0d4 tempest-ServersTestFqdnHostnames-427175016 tempest-ServersTestFqdnHostnames-427175016-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 581.127872] env[67169]: DEBUG nova.virt.hardware [None req-3c83de1c-bff9-4913-ac4c-cf997776c0d4 tempest-ServersTestFqdnHostnames-427175016 tempest-ServersTestFqdnHostnames-427175016-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 581.128343] env[67169]: DEBUG nova.virt.hardware [None req-3c83de1c-bff9-4913-ac4c-cf997776c0d4 tempest-ServersTestFqdnHostnames-427175016 tempest-ServersTestFqdnHostnames-427175016-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67169) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 581.128343] env[67169]: DEBUG nova.virt.hardware [None req-3c83de1c-bff9-4913-ac4c-cf997776c0d4 tempest-ServersTestFqdnHostnames-427175016 tempest-ServersTestFqdnHostnames-427175016-project-member] Got 1 possible topologies {{(pid=67169) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 581.128667] env[67169]: DEBUG nova.virt.hardware [None req-3c83de1c-bff9-4913-ac4c-cf997776c0d4 tempest-ServersTestFqdnHostnames-427175016 tempest-ServersTestFqdnHostnames-427175016-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 581.128667] env[67169]: DEBUG nova.virt.hardware [None req-3c83de1c-bff9-4913-ac4c-cf997776c0d4 
tempest-ServersTestFqdnHostnames-427175016 tempest-ServersTestFqdnHostnames-427175016-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 581.130271] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fd4c7181-8397-4434-badd-252ba683d38c {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 581.134293] env[67169]: DEBUG nova.policy [None req-3c83de1c-bff9-4913-ac4c-cf997776c0d4 tempest-ServersTestFqdnHostnames-427175016 tempest-ServersTestFqdnHostnames-427175016-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '6a148871206247ea8fa8f6ceae924586', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'f6f398d7f457401bae611864b041480b', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67169) authorize /opt/stack/nova/nova/policy.py:203}} [ 581.142370] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ad69973e-e5af-4319-8482-26fc4125fea7 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 581.176212] env[67169]: DEBUG oslo_concurrency.lockutils [None req-f1a10853-5422-40c7-bd45-9fe494c76ecf tempest-FloatingIPsAssociationTestJSON-908037617 tempest-FloatingIPsAssociationTestJSON-908037617-project-member] Releasing lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 581.176505] env[67169]: 
DEBUG nova.virt.vmwareapi.vmops [None req-f1a10853-5422-40c7-bd45-9fe494c76ecf tempest-FloatingIPsAssociationTestJSON-908037617 tempest-FloatingIPsAssociationTestJSON-908037617-project-member] [instance: 68cc2368-ac6a-4003-9c19-5f2a4e9b0e03] Processing image 285931c9-8b83-4997-8c4d-6a79005e36ba {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 581.176733] env[67169]: DEBUG oslo_concurrency.lockutils [None req-f1a10853-5422-40c7-bd45-9fe494c76ecf tempest-FloatingIPsAssociationTestJSON-908037617 tempest-FloatingIPsAssociationTestJSON-908037617-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 581.176921] env[67169]: DEBUG oslo_concurrency.lockutils [None req-9f30fd4a-9921-457d-a045-00b3f5f0612e tempest-ImagesNegativeTestJSON-7099264 tempest-ImagesNegativeTestJSON-7099264-project-member] Acquired lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 581.177240] env[67169]: DEBUG oslo_concurrency.lockutils [None req-9f30fd4a-9921-457d-a045-00b3f5f0612e tempest-ImagesNegativeTestJSON-7099264 tempest-ImagesNegativeTestJSON-7099264-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 581.177494] env[67169]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-55e0e1b0-b28e-4d60-9074-769ffb613f0c {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 581.183671] env[67169]: DEBUG oslo_vmware.api [None 
req-9f30fd4a-9921-457d-a045-00b3f5f0612e tempest-ImagesNegativeTestJSON-7099264 tempest-ImagesNegativeTestJSON-7099264-project-member] Waiting for the task: (returnval){ [ 581.183671] env[67169]: value = "session[52518c09-e5f2-035d-4dc1-3e29ddf94015]5250b7f4-d547-064d-c585-ac1c583175bd" [ 581.183671] env[67169]: _type = "Task" [ 581.183671] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 581.194342] env[67169]: DEBUG oslo_vmware.api [None req-9f30fd4a-9921-457d-a045-00b3f5f0612e tempest-ImagesNegativeTestJSON-7099264 tempest-ImagesNegativeTestJSON-7099264-project-member] Task: {'id': session[52518c09-e5f2-035d-4dc1-3e29ddf94015]5250b7f4-d547-064d-c585-ac1c583175bd, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 581.697330] env[67169]: DEBUG oslo_concurrency.lockutils [None req-9f30fd4a-9921-457d-a045-00b3f5f0612e tempest-ImagesNegativeTestJSON-7099264 tempest-ImagesNegativeTestJSON-7099264-project-member] Releasing lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 581.698220] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-9f30fd4a-9921-457d-a045-00b3f5f0612e tempest-ImagesNegativeTestJSON-7099264 tempest-ImagesNegativeTestJSON-7099264-project-member] [instance: f5558a78-c91f-4c36-bb22-f94b1bd8cdbc] Processing image 285931c9-8b83-4997-8c4d-6a79005e36ba {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 581.698220] env[67169]: DEBUG oslo_concurrency.lockutils [None req-9f30fd4a-9921-457d-a045-00b3f5f0612e tempest-ImagesNegativeTestJSON-7099264 tempest-ImagesNegativeTestJSON-7099264-project-member] Acquiring lock "[datastore2] 
devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 581.707427] env[67169]: DEBUG oslo_concurrency.lockutils [None req-368ce919-6530-42e6-9ef4-1d52f0d0483a tempest-AttachInterfacesUnderV243Test-137593631 tempest-AttachInterfacesUnderV243Test-137593631-project-member] Acquiring lock "36781827-5846-49a4-8913-d98676af0b74" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 581.707427] env[67169]: DEBUG oslo_concurrency.lockutils [None req-368ce919-6530-42e6-9ef4-1d52f0d0483a tempest-AttachInterfacesUnderV243Test-137593631 tempest-AttachInterfacesUnderV243Test-137593631-project-member] Lock "36781827-5846-49a4-8913-d98676af0b74" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 581.721091] env[67169]: DEBUG nova.compute.manager [None req-368ce919-6530-42e6-9ef4-1d52f0d0483a tempest-AttachInterfacesUnderV243Test-137593631 tempest-AttachInterfacesUnderV243Test-137593631-project-member] [instance: 36781827-5846-49a4-8913-d98676af0b74] Starting instance... 
{{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 581.803774] env[67169]: DEBUG oslo_concurrency.lockutils [None req-368ce919-6530-42e6-9ef4-1d52f0d0483a tempest-AttachInterfacesUnderV243Test-137593631 tempest-AttachInterfacesUnderV243Test-137593631-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 581.804058] env[67169]: DEBUG oslo_concurrency.lockutils [None req-368ce919-6530-42e6-9ef4-1d52f0d0483a tempest-AttachInterfacesUnderV243Test-137593631 tempest-AttachInterfacesUnderV243Test-137593631-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 581.805625] env[67169]: INFO nova.compute.claims [None req-368ce919-6530-42e6-9ef4-1d52f0d0483a tempest-AttachInterfacesUnderV243Test-137593631 tempest-AttachInterfacesUnderV243Test-137593631-project-member] [instance: 36781827-5846-49a4-8913-d98676af0b74] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 582.022944] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a3ecd0b5-9fe0-49d8-8136-4f47a6d53801 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 582.033056] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6f2c4641-fca7-4a94-899f-a5927358c558 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 582.073430] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-31edbd28-c51e-474c-ae36-2b5f6adc804e 
{{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 582.083189] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cf7103a1-8ef6-4c5d-ac5f-07ba53b8b3bf {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 582.096252] env[67169]: DEBUG nova.compute.provider_tree [None req-368ce919-6530-42e6-9ef4-1d52f0d0483a tempest-AttachInterfacesUnderV243Test-137593631 tempest-AttachInterfacesUnderV243Test-137593631-project-member] Inventory has not changed in ProviderTree for provider: 6570906a-ac37-4859-b1f2-4bbacc48d3f3 {{(pid=67169) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 582.107073] env[67169]: DEBUG nova.scheduler.client.report [None req-368ce919-6530-42e6-9ef4-1d52f0d0483a tempest-AttachInterfacesUnderV243Test-137593631 tempest-AttachInterfacesUnderV243Test-137593631-project-member] Inventory has not changed for provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 171, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67169) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 582.123515] env[67169]: DEBUG oslo_concurrency.lockutils [None req-368ce919-6530-42e6-9ef4-1d52f0d0483a tempest-AttachInterfacesUnderV243Test-137593631 tempest-AttachInterfacesUnderV243Test-137593631-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.319s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 582.124024] 
env[67169]: DEBUG nova.compute.manager [None req-368ce919-6530-42e6-9ef4-1d52f0d0483a tempest-AttachInterfacesUnderV243Test-137593631 tempest-AttachInterfacesUnderV243Test-137593631-project-member] [instance: 36781827-5846-49a4-8913-d98676af0b74] Start building networks asynchronously for instance. {{(pid=67169) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 582.170397] env[67169]: DEBUG nova.compute.utils [None req-368ce919-6530-42e6-9ef4-1d52f0d0483a tempest-AttachInterfacesUnderV243Test-137593631 tempest-AttachInterfacesUnderV243Test-137593631-project-member] Using /dev/sd instead of None {{(pid=67169) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 582.175323] env[67169]: DEBUG nova.compute.manager [None req-368ce919-6530-42e6-9ef4-1d52f0d0483a tempest-AttachInterfacesUnderV243Test-137593631 tempest-AttachInterfacesUnderV243Test-137593631-project-member] [instance: 36781827-5846-49a4-8913-d98676af0b74] Allocating IP information in the background. {{(pid=67169) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 582.175323] env[67169]: DEBUG nova.network.neutron [None req-368ce919-6530-42e6-9ef4-1d52f0d0483a tempest-AttachInterfacesUnderV243Test-137593631 tempest-AttachInterfacesUnderV243Test-137593631-project-member] [instance: 36781827-5846-49a4-8913-d98676af0b74] allocate_for_instance() {{(pid=67169) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 582.190164] env[67169]: DEBUG nova.compute.manager [None req-368ce919-6530-42e6-9ef4-1d52f0d0483a tempest-AttachInterfacesUnderV243Test-137593631 tempest-AttachInterfacesUnderV243Test-137593631-project-member] [instance: 36781827-5846-49a4-8913-d98676af0b74] Start building block device mappings for instance. 
{{(pid=67169) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 582.274574] env[67169]: DEBUG nova.compute.manager [req-b378090b-a17d-4c2c-a38f-92288130f0e5 req-c09b0dad-bb56-46ed-a621-dc9d3ca8b0e2 service nova] [instance: 9e0a990e-d9ad-4dae-9e2d-6d1f7d999403] Received event network-changed-b7af8248-64a8-42f3-80bc-b1b9813717ab {{(pid=67169) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 582.274788] env[67169]: DEBUG nova.compute.manager [req-b378090b-a17d-4c2c-a38f-92288130f0e5 req-c09b0dad-bb56-46ed-a621-dc9d3ca8b0e2 service nova] [instance: 9e0a990e-d9ad-4dae-9e2d-6d1f7d999403] Refreshing instance network info cache due to event network-changed-b7af8248-64a8-42f3-80bc-b1b9813717ab. {{(pid=67169) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 582.274989] env[67169]: DEBUG oslo_concurrency.lockutils [req-b378090b-a17d-4c2c-a38f-92288130f0e5 req-c09b0dad-bb56-46ed-a621-dc9d3ca8b0e2 service nova] Acquiring lock "refresh_cache-9e0a990e-d9ad-4dae-9e2d-6d1f7d999403" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 582.275992] env[67169]: DEBUG oslo_concurrency.lockutils [req-b378090b-a17d-4c2c-a38f-92288130f0e5 req-c09b0dad-bb56-46ed-a621-dc9d3ca8b0e2 service nova] Acquired lock "refresh_cache-9e0a990e-d9ad-4dae-9e2d-6d1f7d999403" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 582.276204] env[67169]: DEBUG nova.network.neutron [req-b378090b-a17d-4c2c-a38f-92288130f0e5 req-c09b0dad-bb56-46ed-a621-dc9d3ca8b0e2 service nova] [instance: 9e0a990e-d9ad-4dae-9e2d-6d1f7d999403] Refreshing network info cache for port b7af8248-64a8-42f3-80bc-b1b9813717ab {{(pid=67169) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 582.283377] env[67169]: DEBUG nova.policy [None req-368ce919-6530-42e6-9ef4-1d52f0d0483a tempest-AttachInterfacesUnderV243Test-137593631 
tempest-AttachInterfacesUnderV243Test-137593631-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ab5614b68701476c828ab7b8f1e5a481', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '77c22397eae5494baab363e296329d7e', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67169) authorize /opt/stack/nova/nova/policy.py:203}} [ 582.303373] env[67169]: DEBUG nova.compute.manager [None req-368ce919-6530-42e6-9ef4-1d52f0d0483a tempest-AttachInterfacesUnderV243Test-137593631 tempest-AttachInterfacesUnderV243Test-137593631-project-member] [instance: 36781827-5846-49a4-8913-d98676af0b74] Start spawning the instance on the hypervisor. {{(pid=67169) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 582.344782] env[67169]: DEBUG nova.virt.hardware [None req-368ce919-6530-42e6-9ef4-1d52f0d0483a tempest-AttachInterfacesUnderV243Test-137593631 tempest-AttachInterfacesUnderV243Test-137593631-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-03-06T19:55:10Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-03-06T19:54:55Z,direct_url=,disk_format='vmdk',id=285931c9-8b83-4997-8c4d-6a79005e36ba,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c31f6504bb73492890b262ff43fdf9bc',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-03-06T19:54:55Z,virtual_size=,visibility=), 
allow threads: False {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 582.344782] env[67169]: DEBUG nova.virt.hardware [None req-368ce919-6530-42e6-9ef4-1d52f0d0483a tempest-AttachInterfacesUnderV243Test-137593631 tempest-AttachInterfacesUnderV243Test-137593631-project-member] Flavor limits 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 582.344782] env[67169]: DEBUG nova.virt.hardware [None req-368ce919-6530-42e6-9ef4-1d52f0d0483a tempest-AttachInterfacesUnderV243Test-137593631 tempest-AttachInterfacesUnderV243Test-137593631-project-member] Image limits 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 582.345044] env[67169]: DEBUG nova.virt.hardware [None req-368ce919-6530-42e6-9ef4-1d52f0d0483a tempest-AttachInterfacesUnderV243Test-137593631 tempest-AttachInterfacesUnderV243Test-137593631-project-member] Flavor pref 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 582.345044] env[67169]: DEBUG nova.virt.hardware [None req-368ce919-6530-42e6-9ef4-1d52f0d0483a tempest-AttachInterfacesUnderV243Test-137593631 tempest-AttachInterfacesUnderV243Test-137593631-project-member] Image pref 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 582.345044] env[67169]: DEBUG nova.virt.hardware [None req-368ce919-6530-42e6-9ef4-1d52f0d0483a tempest-AttachInterfacesUnderV243Test-137593631 tempest-AttachInterfacesUnderV243Test-137593631-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 582.345044] env[67169]: DEBUG nova.virt.hardware [None req-368ce919-6530-42e6-9ef4-1d52f0d0483a tempest-AttachInterfacesUnderV243Test-137593631 tempest-AttachInterfacesUnderV243Test-137593631-project-member] Topology preferred 
VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 582.345044] env[67169]: DEBUG nova.virt.hardware [None req-368ce919-6530-42e6-9ef4-1d52f0d0483a tempest-AttachInterfacesUnderV243Test-137593631 tempest-AttachInterfacesUnderV243Test-137593631-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67169) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 582.345201] env[67169]: DEBUG nova.virt.hardware [None req-368ce919-6530-42e6-9ef4-1d52f0d0483a tempest-AttachInterfacesUnderV243Test-137593631 tempest-AttachInterfacesUnderV243Test-137593631-project-member] Got 1 possible topologies {{(pid=67169) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 582.345201] env[67169]: DEBUG nova.virt.hardware [None req-368ce919-6530-42e6-9ef4-1d52f0d0483a tempest-AttachInterfacesUnderV243Test-137593631 tempest-AttachInterfacesUnderV243Test-137593631-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 582.345369] env[67169]: DEBUG nova.virt.hardware [None req-368ce919-6530-42e6-9ef4-1d52f0d0483a tempest-AttachInterfacesUnderV243Test-137593631 tempest-AttachInterfacesUnderV243Test-137593631-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 582.346568] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e48a7867-1028-4294-a3c9-90ae494446ab {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 582.357730] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-3d9f350c-bfd7-4db9-a676-8fae0f4918a6 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 582.578205] env[67169]: DEBUG nova.network.neutron [None req-3c83de1c-bff9-4913-ac4c-cf997776c0d4 tempest-ServersTestFqdnHostnames-427175016 tempest-ServersTestFqdnHostnames-427175016-project-member] [instance: 11e90c91-26ca-4397-81a4-975a1d714d19] Successfully created port: 8c4bb9ee-0205-4da0-8ec4-d483b70d9fb8 {{(pid=67169) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 583.259777] env[67169]: DEBUG nova.network.neutron [req-b378090b-a17d-4c2c-a38f-92288130f0e5 req-c09b0dad-bb56-46ed-a621-dc9d3ca8b0e2 service nova] [instance: 9e0a990e-d9ad-4dae-9e2d-6d1f7d999403] Updated VIF entry in instance network info cache for port b7af8248-64a8-42f3-80bc-b1b9813717ab. {{(pid=67169) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 583.259777] env[67169]: DEBUG nova.network.neutron [req-b378090b-a17d-4c2c-a38f-92288130f0e5 req-c09b0dad-bb56-46ed-a621-dc9d3ca8b0e2 service nova] [instance: 9e0a990e-d9ad-4dae-9e2d-6d1f7d999403] Updating instance_info_cache with network_info: [{"id": "b7af8248-64a8-42f3-80bc-b1b9813717ab", "address": "fa:16:3e:43:59:76", "network": {"id": "617508ba-3567-4508-96b5-a01447ece634", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.170", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "c31f6504bb73492890b262ff43fdf9bc", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "c9bc2632-36f9-4912-8782-8bbb789f909d", "external-id": 
"nsx-vlan-transportzone-897", "segmentation_id": 897, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb7af8248-64", "ovs_interfaceid": "b7af8248-64a8-42f3-80bc-b1b9813717ab", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 583.283053] env[67169]: DEBUG oslo_concurrency.lockutils [req-b378090b-a17d-4c2c-a38f-92288130f0e5 req-c09b0dad-bb56-46ed-a621-dc9d3ca8b0e2 service nova] Releasing lock "refresh_cache-9e0a990e-d9ad-4dae-9e2d-6d1f7d999403" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 583.283835] env[67169]: DEBUG nova.compute.manager [req-b378090b-a17d-4c2c-a38f-92288130f0e5 req-c09b0dad-bb56-46ed-a621-dc9d3ca8b0e2 service nova] [instance: ca7439e3-dbd5-4775-97e8-9927b325766a] Received event network-vif-plugged-495134d4-26bc-4696-ade1-a578d27d382b {{(pid=67169) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 583.284537] env[67169]: DEBUG oslo_concurrency.lockutils [req-b378090b-a17d-4c2c-a38f-92288130f0e5 req-c09b0dad-bb56-46ed-a621-dc9d3ca8b0e2 service nova] Acquiring lock "ca7439e3-dbd5-4775-97e8-9927b325766a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 583.285170] env[67169]: DEBUG oslo_concurrency.lockutils [req-b378090b-a17d-4c2c-a38f-92288130f0e5 req-c09b0dad-bb56-46ed-a621-dc9d3ca8b0e2 service nova] Lock "ca7439e3-dbd5-4775-97e8-9927b325766a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 583.285482] env[67169]: DEBUG oslo_concurrency.lockutils 
[req-b378090b-a17d-4c2c-a38f-92288130f0e5 req-c09b0dad-bb56-46ed-a621-dc9d3ca8b0e2 service nova] Lock "ca7439e3-dbd5-4775-97e8-9927b325766a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 583.286273] env[67169]: DEBUG nova.compute.manager [req-b378090b-a17d-4c2c-a38f-92288130f0e5 req-c09b0dad-bb56-46ed-a621-dc9d3ca8b0e2 service nova] [instance: ca7439e3-dbd5-4775-97e8-9927b325766a] No waiting events found dispatching network-vif-plugged-495134d4-26bc-4696-ade1-a578d27d382b {{(pid=67169) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 583.286897] env[67169]: WARNING nova.compute.manager [req-b378090b-a17d-4c2c-a38f-92288130f0e5 req-c09b0dad-bb56-46ed-a621-dc9d3ca8b0e2 service nova] [instance: ca7439e3-dbd5-4775-97e8-9927b325766a] Received unexpected event network-vif-plugged-495134d4-26bc-4696-ade1-a578d27d382b for instance with vm_state building and task_state spawning. [ 583.289285] env[67169]: DEBUG nova.compute.manager [req-b378090b-a17d-4c2c-a38f-92288130f0e5 req-c09b0dad-bb56-46ed-a621-dc9d3ca8b0e2 service nova] [instance: ca7439e3-dbd5-4775-97e8-9927b325766a] Received event network-changed-495134d4-26bc-4696-ade1-a578d27d382b {{(pid=67169) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 583.289285] env[67169]: DEBUG nova.compute.manager [req-b378090b-a17d-4c2c-a38f-92288130f0e5 req-c09b0dad-bb56-46ed-a621-dc9d3ca8b0e2 service nova] [instance: ca7439e3-dbd5-4775-97e8-9927b325766a] Refreshing instance network info cache due to event network-changed-495134d4-26bc-4696-ade1-a578d27d382b. 
{{(pid=67169) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 583.289285] env[67169]: DEBUG oslo_concurrency.lockutils [req-b378090b-a17d-4c2c-a38f-92288130f0e5 req-c09b0dad-bb56-46ed-a621-dc9d3ca8b0e2 service nova] Acquiring lock "refresh_cache-ca7439e3-dbd5-4775-97e8-9927b325766a" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 583.289285] env[67169]: DEBUG oslo_concurrency.lockutils [req-b378090b-a17d-4c2c-a38f-92288130f0e5 req-c09b0dad-bb56-46ed-a621-dc9d3ca8b0e2 service nova] Acquired lock "refresh_cache-ca7439e3-dbd5-4775-97e8-9927b325766a" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 583.289285] env[67169]: DEBUG nova.network.neutron [req-b378090b-a17d-4c2c-a38f-92288130f0e5 req-c09b0dad-bb56-46ed-a621-dc9d3ca8b0e2 service nova] [instance: ca7439e3-dbd5-4775-97e8-9927b325766a] Refreshing network info cache for port 495134d4-26bc-4696-ade1-a578d27d382b {{(pid=67169) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 583.562344] env[67169]: DEBUG nova.network.neutron [None req-368ce919-6530-42e6-9ef4-1d52f0d0483a tempest-AttachInterfacesUnderV243Test-137593631 tempest-AttachInterfacesUnderV243Test-137593631-project-member] [instance: 36781827-5846-49a4-8913-d98676af0b74] Successfully created port: c7de1a8c-b1a6-4a34-9641-75295e554f2e {{(pid=67169) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 583.864805] env[67169]: DEBUG oslo_concurrency.lockutils [None req-96da241f-f17f-4952-914f-45ae126f7058 tempest-FloatingIPsAssociationNegativeTestJSON-216368034 tempest-FloatingIPsAssociationNegativeTestJSON-216368034-project-member] Acquiring lock "835bf8da-8d8f-4dfd-b0a9-fab02796f39e" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67169) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 583.865110] env[67169]: DEBUG oslo_concurrency.lockutils [None req-96da241f-f17f-4952-914f-45ae126f7058 tempest-FloatingIPsAssociationNegativeTestJSON-216368034 tempest-FloatingIPsAssociationNegativeTestJSON-216368034-project-member] Lock "835bf8da-8d8f-4dfd-b0a9-fab02796f39e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 583.884705] env[67169]: DEBUG nova.compute.manager [None req-96da241f-f17f-4952-914f-45ae126f7058 tempest-FloatingIPsAssociationNegativeTestJSON-216368034 tempest-FloatingIPsAssociationNegativeTestJSON-216368034-project-member] [instance: 835bf8da-8d8f-4dfd-b0a9-fab02796f39e] Starting instance... {{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 583.969895] env[67169]: DEBUG oslo_concurrency.lockutils [None req-96da241f-f17f-4952-914f-45ae126f7058 tempest-FloatingIPsAssociationNegativeTestJSON-216368034 tempest-FloatingIPsAssociationNegativeTestJSON-216368034-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 583.969895] env[67169]: DEBUG oslo_concurrency.lockutils [None req-96da241f-f17f-4952-914f-45ae126f7058 tempest-FloatingIPsAssociationNegativeTestJSON-216368034 tempest-FloatingIPsAssociationNegativeTestJSON-216368034-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 583.974424] env[67169]: INFO nova.compute.claims [None req-96da241f-f17f-4952-914f-45ae126f7058 
tempest-FloatingIPsAssociationNegativeTestJSON-216368034 tempest-FloatingIPsAssociationNegativeTestJSON-216368034-project-member] [instance: 835bf8da-8d8f-4dfd-b0a9-fab02796f39e] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 584.200652] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7bb38452-5017-4a50-a14d-18a9d0262b19 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 584.212057] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f3a462b8-8ad5-4deb-8cb2-6267de3d4d1f {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 584.245374] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-99e491fc-8999-45cd-97cb-2b09e5c4a20a {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 584.253935] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9aefbfef-b8de-413c-88bf-e5e6316d740a {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 584.274657] env[67169]: DEBUG nova.compute.provider_tree [None req-96da241f-f17f-4952-914f-45ae126f7058 tempest-FloatingIPsAssociationNegativeTestJSON-216368034 tempest-FloatingIPsAssociationNegativeTestJSON-216368034-project-member] Inventory has not changed in ProviderTree for provider: 6570906a-ac37-4859-b1f2-4bbacc48d3f3 {{(pid=67169) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 584.284176] env[67169]: DEBUG nova.scheduler.client.report [None req-96da241f-f17f-4952-914f-45ae126f7058 tempest-FloatingIPsAssociationNegativeTestJSON-216368034 tempest-FloatingIPsAssociationNegativeTestJSON-216368034-project-member] Inventory 
has not changed for provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 171, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67169) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 584.311031] env[67169]: DEBUG oslo_concurrency.lockutils [None req-96da241f-f17f-4952-914f-45ae126f7058 tempest-FloatingIPsAssociationNegativeTestJSON-216368034 tempest-FloatingIPsAssociationNegativeTestJSON-216368034-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.343s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 584.311031] env[67169]: DEBUG nova.compute.manager [None req-96da241f-f17f-4952-914f-45ae126f7058 tempest-FloatingIPsAssociationNegativeTestJSON-216368034 tempest-FloatingIPsAssociationNegativeTestJSON-216368034-project-member] [instance: 835bf8da-8d8f-4dfd-b0a9-fab02796f39e] Start building networks asynchronously for instance. 
{{(pid=67169) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 584.363115] env[67169]: DEBUG nova.compute.utils [None req-96da241f-f17f-4952-914f-45ae126f7058 tempest-FloatingIPsAssociationNegativeTestJSON-216368034 tempest-FloatingIPsAssociationNegativeTestJSON-216368034-project-member] Using /dev/sd instead of None {{(pid=67169) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 584.364709] env[67169]: DEBUG nova.compute.manager [None req-96da241f-f17f-4952-914f-45ae126f7058 tempest-FloatingIPsAssociationNegativeTestJSON-216368034 tempest-FloatingIPsAssociationNegativeTestJSON-216368034-project-member] [instance: 835bf8da-8d8f-4dfd-b0a9-fab02796f39e] Allocating IP information in the background. {{(pid=67169) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 584.368262] env[67169]: DEBUG nova.network.neutron [None req-96da241f-f17f-4952-914f-45ae126f7058 tempest-FloatingIPsAssociationNegativeTestJSON-216368034 tempest-FloatingIPsAssociationNegativeTestJSON-216368034-project-member] [instance: 835bf8da-8d8f-4dfd-b0a9-fab02796f39e] allocate_for_instance() {{(pid=67169) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 584.379681] env[67169]: DEBUG nova.compute.manager [None req-96da241f-f17f-4952-914f-45ae126f7058 tempest-FloatingIPsAssociationNegativeTestJSON-216368034 tempest-FloatingIPsAssociationNegativeTestJSON-216368034-project-member] [instance: 835bf8da-8d8f-4dfd-b0a9-fab02796f39e] Start building block device mappings for instance. {{(pid=67169) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 584.486540] env[67169]: DEBUG nova.compute.manager [None req-96da241f-f17f-4952-914f-45ae126f7058 tempest-FloatingIPsAssociationNegativeTestJSON-216368034 tempest-FloatingIPsAssociationNegativeTestJSON-216368034-project-member] [instance: 835bf8da-8d8f-4dfd-b0a9-fab02796f39e] Start spawning the instance on the hypervisor. 
{{(pid=67169) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 584.499015] env[67169]: DEBUG nova.policy [None req-96da241f-f17f-4952-914f-45ae126f7058 tempest-FloatingIPsAssociationNegativeTestJSON-216368034 tempest-FloatingIPsAssociationNegativeTestJSON-216368034-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '5af808760795416f98374db84df00215', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'f87ea93e96194031b75ed324e0acc94d', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67169) authorize /opt/stack/nova/nova/policy.py:203}} [ 584.521187] env[67169]: DEBUG nova.virt.hardware [None req-96da241f-f17f-4952-914f-45ae126f7058 tempest-FloatingIPsAssociationNegativeTestJSON-216368034 tempest-FloatingIPsAssociationNegativeTestJSON-216368034-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-03-06T19:55:10Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-03-06T19:54:55Z,direct_url=,disk_format='vmdk',id=285931c9-8b83-4997-8c4d-6a79005e36ba,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c31f6504bb73492890b262ff43fdf9bc',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-03-06T19:54:55Z,virtual_size=,visibility=), allow threads: False {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 584.521187] env[67169]: 
DEBUG nova.virt.hardware [None req-96da241f-f17f-4952-914f-45ae126f7058 tempest-FloatingIPsAssociationNegativeTestJSON-216368034 tempest-FloatingIPsAssociationNegativeTestJSON-216368034-project-member] Flavor limits 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 584.521187] env[67169]: DEBUG nova.virt.hardware [None req-96da241f-f17f-4952-914f-45ae126f7058 tempest-FloatingIPsAssociationNegativeTestJSON-216368034 tempest-FloatingIPsAssociationNegativeTestJSON-216368034-project-member] Image limits 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 584.521591] env[67169]: DEBUG nova.virt.hardware [None req-96da241f-f17f-4952-914f-45ae126f7058 tempest-FloatingIPsAssociationNegativeTestJSON-216368034 tempest-FloatingIPsAssociationNegativeTestJSON-216368034-project-member] Flavor pref 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 584.521591] env[67169]: DEBUG nova.virt.hardware [None req-96da241f-f17f-4952-914f-45ae126f7058 tempest-FloatingIPsAssociationNegativeTestJSON-216368034 tempest-FloatingIPsAssociationNegativeTestJSON-216368034-project-member] Image pref 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 584.521591] env[67169]: DEBUG nova.virt.hardware [None req-96da241f-f17f-4952-914f-45ae126f7058 tempest-FloatingIPsAssociationNegativeTestJSON-216368034 tempest-FloatingIPsAssociationNegativeTestJSON-216368034-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 584.521591] env[67169]: DEBUG nova.virt.hardware [None req-96da241f-f17f-4952-914f-45ae126f7058 tempest-FloatingIPsAssociationNegativeTestJSON-216368034 tempest-FloatingIPsAssociationNegativeTestJSON-216368034-project-member] Topology preferred 
VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 584.521749] env[67169]: DEBUG nova.virt.hardware [None req-96da241f-f17f-4952-914f-45ae126f7058 tempest-FloatingIPsAssociationNegativeTestJSON-216368034 tempest-FloatingIPsAssociationNegativeTestJSON-216368034-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67169) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 584.522439] env[67169]: DEBUG nova.virt.hardware [None req-96da241f-f17f-4952-914f-45ae126f7058 tempest-FloatingIPsAssociationNegativeTestJSON-216368034 tempest-FloatingIPsAssociationNegativeTestJSON-216368034-project-member] Got 1 possible topologies {{(pid=67169) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 584.522803] env[67169]: DEBUG nova.virt.hardware [None req-96da241f-f17f-4952-914f-45ae126f7058 tempest-FloatingIPsAssociationNegativeTestJSON-216368034 tempest-FloatingIPsAssociationNegativeTestJSON-216368034-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 584.523121] env[67169]: DEBUG nova.virt.hardware [None req-96da241f-f17f-4952-914f-45ae126f7058 tempest-FloatingIPsAssociationNegativeTestJSON-216368034 tempest-FloatingIPsAssociationNegativeTestJSON-216368034-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 584.524238] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cdda7715-9093-452b-bf28-3b032ca65980 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 584.534780] env[67169]: DEBUG oslo_vmware.service [-] 
Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d7f0a281-28e3-4b7f-9bac-d93fc747b1e3 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 584.772656] env[67169]: DEBUG nova.network.neutron [None req-368ce919-6530-42e6-9ef4-1d52f0d0483a tempest-AttachInterfacesUnderV243Test-137593631 tempest-AttachInterfacesUnderV243Test-137593631-project-member] [instance: 36781827-5846-49a4-8913-d98676af0b74] Successfully updated port: c7de1a8c-b1a6-4a34-9641-75295e554f2e {{(pid=67169) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 584.796106] env[67169]: DEBUG oslo_concurrency.lockutils [None req-368ce919-6530-42e6-9ef4-1d52f0d0483a tempest-AttachInterfacesUnderV243Test-137593631 tempest-AttachInterfacesUnderV243Test-137593631-project-member] Acquiring lock "refresh_cache-36781827-5846-49a4-8913-d98676af0b74" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 584.796106] env[67169]: DEBUG oslo_concurrency.lockutils [None req-368ce919-6530-42e6-9ef4-1d52f0d0483a tempest-AttachInterfacesUnderV243Test-137593631 tempest-AttachInterfacesUnderV243Test-137593631-project-member] Acquired lock "refresh_cache-36781827-5846-49a4-8913-d98676af0b74" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 584.796867] env[67169]: DEBUG nova.network.neutron [None req-368ce919-6530-42e6-9ef4-1d52f0d0483a tempest-AttachInterfacesUnderV243Test-137593631 tempest-AttachInterfacesUnderV243Test-137593631-project-member] [instance: 36781827-5846-49a4-8913-d98676af0b74] Building network info cache for instance {{(pid=67169) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 584.889129] env[67169]: DEBUG nova.network.neutron [None req-368ce919-6530-42e6-9ef4-1d52f0d0483a tempest-AttachInterfacesUnderV243Test-137593631 tempest-AttachInterfacesUnderV243Test-137593631-project-member] 
[instance: 36781827-5846-49a4-8913-d98676af0b74] Instance cache missing network info. {{(pid=67169) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 585.041130] env[67169]: DEBUG nova.network.neutron [req-b378090b-a17d-4c2c-a38f-92288130f0e5 req-c09b0dad-bb56-46ed-a621-dc9d3ca8b0e2 service nova] [instance: ca7439e3-dbd5-4775-97e8-9927b325766a] Updated VIF entry in instance network info cache for port 495134d4-26bc-4696-ade1-a578d27d382b. {{(pid=67169) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 585.041496] env[67169]: DEBUG nova.network.neutron [req-b378090b-a17d-4c2c-a38f-92288130f0e5 req-c09b0dad-bb56-46ed-a621-dc9d3ca8b0e2 service nova] [instance: ca7439e3-dbd5-4775-97e8-9927b325766a] Updating instance_info_cache with network_info: [{"id": "495134d4-26bc-4696-ade1-a578d27d382b", "address": "fa:16:3e:53:2f:f2", "network": {"id": "617508ba-3567-4508-96b5-a01447ece634", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.196", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "c31f6504bb73492890b262ff43fdf9bc", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "c9bc2632-36f9-4912-8782-8bbb789f909d", "external-id": "nsx-vlan-transportzone-897", "segmentation_id": 897, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap495134d4-26", "ovs_interfaceid": "495134d4-26bc-4696-ade1-a578d27d382b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67169) update_instance_cache_with_nw_info 
/opt/stack/nova/nova/network/neutron.py:116}} [ 585.059819] env[67169]: DEBUG oslo_concurrency.lockutils [req-b378090b-a17d-4c2c-a38f-92288130f0e5 req-c09b0dad-bb56-46ed-a621-dc9d3ca8b0e2 service nova] Releasing lock "refresh_cache-ca7439e3-dbd5-4775-97e8-9927b325766a" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 585.432799] env[67169]: DEBUG oslo_concurrency.lockutils [None req-91fc429c-9045-417c-aff0-aeb3f0fac0f7 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Acquiring lock "85978a3b-052a-4a05-84e6-75c723d49bd8" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 585.433102] env[67169]: DEBUG oslo_concurrency.lockutils [None req-91fc429c-9045-417c-aff0-aeb3f0fac0f7 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Lock "85978a3b-052a-4a05-84e6-75c723d49bd8" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 585.448899] env[67169]: DEBUG nova.compute.manager [None req-91fc429c-9045-417c-aff0-aeb3f0fac0f7 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: 85978a3b-052a-4a05-84e6-75c723d49bd8] Starting instance... 
{{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 585.474348] env[67169]: DEBUG nova.network.neutron [None req-3c83de1c-bff9-4913-ac4c-cf997776c0d4 tempest-ServersTestFqdnHostnames-427175016 tempest-ServersTestFqdnHostnames-427175016-project-member] [instance: 11e90c91-26ca-4397-81a4-975a1d714d19] Successfully updated port: 8c4bb9ee-0205-4da0-8ec4-d483b70d9fb8 {{(pid=67169) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 585.490423] env[67169]: DEBUG oslo_concurrency.lockutils [None req-3c83de1c-bff9-4913-ac4c-cf997776c0d4 tempest-ServersTestFqdnHostnames-427175016 tempest-ServersTestFqdnHostnames-427175016-project-member] Acquiring lock "refresh_cache-11e90c91-26ca-4397-81a4-975a1d714d19" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 585.491165] env[67169]: DEBUG oslo_concurrency.lockutils [None req-3c83de1c-bff9-4913-ac4c-cf997776c0d4 tempest-ServersTestFqdnHostnames-427175016 tempest-ServersTestFqdnHostnames-427175016-project-member] Acquired lock "refresh_cache-11e90c91-26ca-4397-81a4-975a1d714d19" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 585.491165] env[67169]: DEBUG nova.network.neutron [None req-3c83de1c-bff9-4913-ac4c-cf997776c0d4 tempest-ServersTestFqdnHostnames-427175016 tempest-ServersTestFqdnHostnames-427175016-project-member] [instance: 11e90c91-26ca-4397-81a4-975a1d714d19] Building network info cache for instance {{(pid=67169) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 585.521592] env[67169]: DEBUG oslo_concurrency.lockutils [None req-91fc429c-9045-417c-aff0-aeb3f0fac0f7 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67169) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 585.521892] env[67169]: DEBUG oslo_concurrency.lockutils [None req-91fc429c-9045-417c-aff0-aeb3f0fac0f7 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 585.525527] env[67169]: INFO nova.compute.claims [None req-91fc429c-9045-417c-aff0-aeb3f0fac0f7 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: 85978a3b-052a-4a05-84e6-75c723d49bd8] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 585.530141] env[67169]: DEBUG nova.network.neutron [None req-368ce919-6530-42e6-9ef4-1d52f0d0483a tempest-AttachInterfacesUnderV243Test-137593631 tempest-AttachInterfacesUnderV243Test-137593631-project-member] [instance: 36781827-5846-49a4-8913-d98676af0b74] Updating instance_info_cache with network_info: [{"id": "c7de1a8c-b1a6-4a34-9641-75295e554f2e", "address": "fa:16:3e:8c:e4:ff", "network": {"id": "9ac75aef-d146-4422-a66f-17031648021e", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-976127093-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "77c22397eae5494baab363e296329d7e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "19598cc1-e105-4565-906a-09dde75e3fbe", "external-id": 
"nsx-vlan-transportzone-371", "segmentation_id": 371, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapc7de1a8c-b1", "ovs_interfaceid": "c7de1a8c-b1a6-4a34-9641-75295e554f2e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 585.546595] env[67169]: DEBUG oslo_concurrency.lockutils [None req-368ce919-6530-42e6-9ef4-1d52f0d0483a tempest-AttachInterfacesUnderV243Test-137593631 tempest-AttachInterfacesUnderV243Test-137593631-project-member] Releasing lock "refresh_cache-36781827-5846-49a4-8913-d98676af0b74" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 585.549502] env[67169]: DEBUG nova.compute.manager [None req-368ce919-6530-42e6-9ef4-1d52f0d0483a tempest-AttachInterfacesUnderV243Test-137593631 tempest-AttachInterfacesUnderV243Test-137593631-project-member] [instance: 36781827-5846-49a4-8913-d98676af0b74] Instance network_info: |[{"id": "c7de1a8c-b1a6-4a34-9641-75295e554f2e", "address": "fa:16:3e:8c:e4:ff", "network": {"id": "9ac75aef-d146-4422-a66f-17031648021e", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-976127093-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "77c22397eae5494baab363e296329d7e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "19598cc1-e105-4565-906a-09dde75e3fbe", "external-id": "nsx-vlan-transportzone-371", 
"segmentation_id": 371, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapc7de1a8c-b1", "ovs_interfaceid": "c7de1a8c-b1a6-4a34-9641-75295e554f2e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67169) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 585.551050] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-368ce919-6530-42e6-9ef4-1d52f0d0483a tempest-AttachInterfacesUnderV243Test-137593631 tempest-AttachInterfacesUnderV243Test-137593631-project-member] [instance: 36781827-5846-49a4-8913-d98676af0b74] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:8c:e4:ff', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '19598cc1-e105-4565-906a-09dde75e3fbe', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'c7de1a8c-b1a6-4a34-9641-75295e554f2e', 'vif_model': 'vmxnet3'}] {{(pid=67169) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 585.563225] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [None req-368ce919-6530-42e6-9ef4-1d52f0d0483a tempest-AttachInterfacesUnderV243Test-137593631 tempest-AttachInterfacesUnderV243Test-137593631-project-member] Creating folder: Project (77c22397eae5494baab363e296329d7e). Parent ref: group-v566843. 
{{(pid=67169) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 585.564213] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 585.564697] env[67169]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-c0ff1ba5-473c-4124-be28-a1ae1411343a {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 585.567844] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 585.568084] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Starting heal instance info cache {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 585.568249] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Rebuilding the list of instances to heal {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 585.581038] env[67169]: INFO nova.virt.vmwareapi.vm_util [None req-368ce919-6530-42e6-9ef4-1d52f0d0483a tempest-AttachInterfacesUnderV243Test-137593631 tempest-AttachInterfacesUnderV243Test-137593631-project-member] Created folder: Project (77c22397eae5494baab363e296329d7e) in parent group-v566843. [ 585.581367] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [None req-368ce919-6530-42e6-9ef4-1d52f0d0483a tempest-AttachInterfacesUnderV243Test-137593631 tempest-AttachInterfacesUnderV243Test-137593631-project-member] Creating folder: Instances. Parent ref: group-v566859. 
{{(pid=67169) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 585.581822] env[67169]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-22158fda-82b5-4522-999a-b3950f30e327 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 585.591817] env[67169]: INFO nova.virt.vmwareapi.vm_util [None req-368ce919-6530-42e6-9ef4-1d52f0d0483a tempest-AttachInterfacesUnderV243Test-137593631 tempest-AttachInterfacesUnderV243Test-137593631-project-member] Created folder: Instances in parent group-v566859. [ 585.593134] env[67169]: DEBUG oslo.service.loopingcall [None req-368ce919-6530-42e6-9ef4-1d52f0d0483a tempest-AttachInterfacesUnderV243Test-137593631 tempest-AttachInterfacesUnderV243Test-137593631-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67169) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 585.593134] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 36781827-5846-49a4-8913-d98676af0b74] Creating VM on the ESX host {{(pid=67169) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 585.595015] env[67169]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-71b0ca1a-1fdf-49e0-8340-93cab8f7b241 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 585.622407] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 9e0a990e-d9ad-4dae-9e2d-6d1f7d999403] Skipping network cache update for instance because it is Building. 
{{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 585.622571] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 68cc2368-ac6a-4003-9c19-5f2a4e9b0e03] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 585.623143] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 958ac621-d0c8-4c04-8a58-11ad0f3cf678] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 585.623143] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: ca7439e3-dbd5-4775-97e8-9927b325766a] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 585.623143] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: f5558a78-c91f-4c36-bb22-f94b1bd8cdbc] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 585.623143] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 11e90c91-26ca-4397-81a4-975a1d714d19] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 585.623460] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 36781827-5846-49a4-8913-d98676af0b74] Skipping network cache update for instance because it is Building. 
{{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 585.623460] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 835bf8da-8d8f-4dfd-b0a9-fab02796f39e] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 585.623460] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Didn't find any instances for network info cache update. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 585.627099] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 585.627397] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 585.627876] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 585.627944] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 585.629051] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task 
ComputeManager._poll_volume_usage {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 585.629051] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._sync_power_states {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 585.631139] env[67169]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 585.631139] env[67169]: value = "task-2819063" [ 585.631139] env[67169]: _type = "Task" [ 585.631139] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 585.638156] env[67169]: DEBUG nova.network.neutron [None req-3c83de1c-bff9-4913-ac4c-cf997776c0d4 tempest-ServersTestFqdnHostnames-427175016 tempest-ServersTestFqdnHostnames-427175016-project-member] [instance: 11e90c91-26ca-4397-81a4-975a1d714d19] Instance cache missing network info. {{(pid=67169) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 585.643668] env[67169]: DEBUG oslo_vmware.api [-] Task: {'id': task-2819063, 'name': CreateVM_Task} progress is 0%. 
{{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 585.658747] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Getting list of instances from cluster (obj){ [ 585.658747] env[67169]: value = "domain-c8" [ 585.658747] env[67169]: _type = "ClusterComputeResource" [ 585.658747] env[67169]: } {{(pid=67169) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 585.661182] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-39de312f-a5ba-46a0-98ea-95924dd1a193 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 585.680260] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Got total of 5 instances {{(pid=67169) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 585.680260] env[67169]: WARNING nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] While synchronizing instance power states, found 9 instances in the database and 5 instances on the hypervisor. 
[ 585.680260] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Triggering sync for uuid 9e0a990e-d9ad-4dae-9e2d-6d1f7d999403 {{(pid=67169) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 585.680669] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Triggering sync for uuid 68cc2368-ac6a-4003-9c19-5f2a4e9b0e03 {{(pid=67169) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 585.680669] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Triggering sync for uuid 958ac621-d0c8-4c04-8a58-11ad0f3cf678 {{(pid=67169) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 585.680783] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Triggering sync for uuid ca7439e3-dbd5-4775-97e8-9927b325766a {{(pid=67169) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 585.681280] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Triggering sync for uuid f5558a78-c91f-4c36-bb22-f94b1bd8cdbc {{(pid=67169) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 585.681280] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Triggering sync for uuid 11e90c91-26ca-4397-81a4-975a1d714d19 {{(pid=67169) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 585.681280] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Triggering sync for uuid 36781827-5846-49a4-8913-d98676af0b74 {{(pid=67169) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 585.681474] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Triggering sync for uuid 835bf8da-8d8f-4dfd-b0a9-fab02796f39e {{(pid=67169) _sync_power_states 
/opt/stack/nova/nova/compute/manager.py:10321}} [ 585.681636] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Triggering sync for uuid 85978a3b-052a-4a05-84e6-75c723d49bd8 {{(pid=67169) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 585.684750] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Acquiring lock "9e0a990e-d9ad-4dae-9e2d-6d1f7d999403" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 585.686216] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Acquiring lock "68cc2368-ac6a-4003-9c19-5f2a4e9b0e03" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 585.686216] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Acquiring lock "958ac621-d0c8-4c04-8a58-11ad0f3cf678" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 585.686216] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Acquiring lock "ca7439e3-dbd5-4775-97e8-9927b325766a" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 585.686216] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Acquiring lock "f5558a78-c91f-4c36-bb22-f94b1bd8cdbc" by 
"nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 585.686505] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Acquiring lock "11e90c91-26ca-4397-81a4-975a1d714d19" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 585.686505] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Acquiring lock "36781827-5846-49a4-8913-d98676af0b74" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 585.686505] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Acquiring lock "835bf8da-8d8f-4dfd-b0a9-fab02796f39e" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 585.686505] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Acquiring lock "85978a3b-052a-4a05-84e6-75c723d49bd8" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 585.686633] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67169) run_periodic_tasks 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 585.686910] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=67169) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 585.687551] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 585.701723] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 585.836695] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b4a0344e-c35b-478d-89eb-d754d678e212 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 585.845237] env[67169]: DEBUG nova.compute.manager [req-7ebdb467-9368-4fe3-bf27-3d099eb2b172 req-37eb9872-4e21-4729-852f-a09b6aefecc5 service nova] [instance: 68cc2368-ac6a-4003-9c19-5f2a4e9b0e03] Received event network-changed-fc029e34-1d61-425c-aefa-9abfc9d55ff3 {{(pid=67169) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 585.845364] env[67169]: DEBUG nova.compute.manager [req-7ebdb467-9368-4fe3-bf27-3d099eb2b172 req-37eb9872-4e21-4729-852f-a09b6aefecc5 service nova] [instance: 68cc2368-ac6a-4003-9c19-5f2a4e9b0e03] Refreshing instance network info cache due to event network-changed-fc029e34-1d61-425c-aefa-9abfc9d55ff3. 
{{(pid=67169) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 585.845649] env[67169]: DEBUG oslo_concurrency.lockutils [req-7ebdb467-9368-4fe3-bf27-3d099eb2b172 req-37eb9872-4e21-4729-852f-a09b6aefecc5 service nova] Acquiring lock "refresh_cache-68cc2368-ac6a-4003-9c19-5f2a4e9b0e03" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 585.845803] env[67169]: DEBUG oslo_concurrency.lockutils [req-7ebdb467-9368-4fe3-bf27-3d099eb2b172 req-37eb9872-4e21-4729-852f-a09b6aefecc5 service nova] Acquired lock "refresh_cache-68cc2368-ac6a-4003-9c19-5f2a4e9b0e03" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 585.845980] env[67169]: DEBUG nova.network.neutron [req-7ebdb467-9368-4fe3-bf27-3d099eb2b172 req-37eb9872-4e21-4729-852f-a09b6aefecc5 service nova] [instance: 68cc2368-ac6a-4003-9c19-5f2a4e9b0e03] Refreshing network info cache for port fc029e34-1d61-425c-aefa-9abfc9d55ff3 {{(pid=67169) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 585.854661] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6cd5dfdb-1c35-41f0-ab73-94ae0e9121c9 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 585.891586] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d1c5d52d-81ed-4942-b12e-2cf3ae12e845 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 585.900295] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d2039ec1-292d-48cc-96d3-5196b16c8d6a {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 585.914734] env[67169]: DEBUG nova.compute.provider_tree [None 
req-91fc429c-9045-417c-aff0-aeb3f0fac0f7 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Inventory has not changed in ProviderTree for provider: 6570906a-ac37-4859-b1f2-4bbacc48d3f3 {{(pid=67169) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 585.924767] env[67169]: DEBUG nova.scheduler.client.report [None req-91fc429c-9045-417c-aff0-aeb3f0fac0f7 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Inventory has not changed for provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 171, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67169) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 585.957748] env[67169]: DEBUG oslo_concurrency.lockutils [None req-91fc429c-9045-417c-aff0-aeb3f0fac0f7 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.435s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 585.958038] env[67169]: DEBUG nova.compute.manager [None req-91fc429c-9045-417c-aff0-aeb3f0fac0f7 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: 85978a3b-052a-4a05-84e6-75c723d49bd8] Start building networks asynchronously for instance. 
{{(pid=67169) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 585.960712] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.259s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 585.960901] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 585.961073] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67169) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 585.962673] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6283bc43-e2ab-4e47-ac1d-9e742ba9e94a {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 585.973045] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-504d0f05-7b5f-4a29-a7da-7357a583bbba {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 585.994862] env[67169]: DEBUG nova.network.neutron [None req-96da241f-f17f-4952-914f-45ae126f7058 tempest-FloatingIPsAssociationNegativeTestJSON-216368034 tempest-FloatingIPsAssociationNegativeTestJSON-216368034-project-member] [instance: 835bf8da-8d8f-4dfd-b0a9-fab02796f39e] Successfully created port: 51962410-9c63-4dd9-bd3f-7bd3f0d51122 
{{(pid=67169) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 585.999532] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cd958804-346d-42ff-9314-74a7620ef152 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 586.008377] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6c19b912-2185-4f12-b993-e73b921ad747 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 586.014419] env[67169]: DEBUG nova.compute.utils [None req-91fc429c-9045-417c-aff0-aeb3f0fac0f7 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Using /dev/sd instead of None {{(pid=67169) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 586.016228] env[67169]: DEBUG nova.compute.manager [None req-91fc429c-9045-417c-aff0-aeb3f0fac0f7 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: 85978a3b-052a-4a05-84e6-75c723d49bd8] Allocating IP information in the background. 
{{(pid=67169) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 586.017943] env[67169]: DEBUG nova.network.neutron [None req-91fc429c-9045-417c-aff0-aeb3f0fac0f7 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: 85978a3b-052a-4a05-84e6-75c723d49bd8] allocate_for_instance() {{(pid=67169) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 586.046656] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181012MB free_disk=171GB free_vcpus=48 pci_devices=None {{(pid=67169) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 586.046744] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 586.046938] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 586.050763] env[67169]: DEBUG nova.compute.manager [None req-91fc429c-9045-417c-aff0-aeb3f0fac0f7 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: 85978a3b-052a-4a05-84e6-75c723d49bd8] Start building block device mappings for instance. 
{{(pid=67169) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 586.141770] env[67169]: DEBUG oslo_vmware.api [-] Task: {'id': task-2819063, 'name': CreateVM_Task, 'duration_secs': 0.358181} completed successfully. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 586.142524] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 36781827-5846-49a4-8913-d98676af0b74] Created VM on the ESX host {{(pid=67169) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 586.147568] env[67169]: DEBUG oslo_concurrency.lockutils [None req-368ce919-6530-42e6-9ef4-1d52f0d0483a tempest-AttachInterfacesUnderV243Test-137593631 tempest-AttachInterfacesUnderV243Test-137593631-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 586.147568] env[67169]: DEBUG oslo_concurrency.lockutils [None req-368ce919-6530-42e6-9ef4-1d52f0d0483a tempest-AttachInterfacesUnderV243Test-137593631 tempest-AttachInterfacesUnderV243Test-137593631-project-member] Acquired lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 586.147691] env[67169]: DEBUG oslo_concurrency.lockutils [None req-368ce919-6530-42e6-9ef4-1d52f0d0483a tempest-AttachInterfacesUnderV243Test-137593631 tempest-AttachInterfacesUnderV243Test-137593631-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 586.149463] env[67169]: DEBUG nova.compute.manager [None req-91fc429c-9045-417c-aff0-aeb3f0fac0f7 tempest-DeleteServersTestJSON-867121436 
tempest-DeleteServersTestJSON-867121436-project-member] [instance: 85978a3b-052a-4a05-84e6-75c723d49bd8] Start spawning the instance on the hypervisor. {{(pid=67169) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 586.151723] env[67169]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-3d98e7a1-99b0-4a8c-8816-eaad43eb3010 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 586.157021] env[67169]: DEBUG oslo_vmware.api [None req-368ce919-6530-42e6-9ef4-1d52f0d0483a tempest-AttachInterfacesUnderV243Test-137593631 tempest-AttachInterfacesUnderV243Test-137593631-project-member] Waiting for the task: (returnval){ [ 586.157021] env[67169]: value = "session[52518c09-e5f2-035d-4dc1-3e29ddf94015]52a9d6cc-dd59-81c8-34d3-8293f71c25f1" [ 586.157021] env[67169]: _type = "Task" [ 586.157021] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 586.166019] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 9e0a990e-d9ad-4dae-9e2d-6d1f7d999403 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 586.166181] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 68cc2368-ac6a-4003-9c19-5f2a4e9b0e03 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 586.166321] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 958ac621-d0c8-4c04-8a58-11ad0f3cf678 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 586.166475] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance ca7439e3-dbd5-4775-97e8-9927b325766a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 586.166611] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance f5558a78-c91f-4c36-bb22-f94b1bd8cdbc actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 586.166731] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 11e90c91-26ca-4397-81a4-975a1d714d19 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 586.166847] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 36781827-5846-49a4-8913-d98676af0b74 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 586.166963] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 835bf8da-8d8f-4dfd-b0a9-fab02796f39e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 586.167096] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 85978a3b-052a-4a05-84e6-75c723d49bd8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 586.167384] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Total usable vcpus: 48, total allocated vcpus: 9 {{(pid=67169) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 586.167605] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1664MB phys_disk=200GB used_disk=9GB total_vcpus=48 used_vcpus=9 pci_stats=[] {{(pid=67169) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 586.179988] env[67169]: DEBUG oslo_vmware.api [None req-368ce919-6530-42e6-9ef4-1d52f0d0483a tempest-AttachInterfacesUnderV243Test-137593631 tempest-AttachInterfacesUnderV243Test-137593631-project-member] Task: {'id': session[52518c09-e5f2-035d-4dc1-3e29ddf94015]52a9d6cc-dd59-81c8-34d3-8293f71c25f1, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 586.186985] env[67169]: DEBUG nova.virt.hardware [None req-91fc429c-9045-417c-aff0-aeb3f0fac0f7 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-03-06T19:55:10Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-03-06T19:54:55Z,direct_url=,disk_format='vmdk',id=285931c9-8b83-4997-8c4d-6a79005e36ba,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c31f6504bb73492890b262ff43fdf9bc',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-03-06T19:54:55Z,virtual_size=,visibility=), allow threads: False {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 586.187265] env[67169]: DEBUG nova.virt.hardware [None req-91fc429c-9045-417c-aff0-aeb3f0fac0f7 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Flavor limits 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 586.187491] env[67169]: DEBUG nova.virt.hardware [None req-91fc429c-9045-417c-aff0-aeb3f0fac0f7 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Image limits 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 586.187901] env[67169]: DEBUG nova.virt.hardware [None req-91fc429c-9045-417c-aff0-aeb3f0fac0f7 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Flavor pref 0:0:0 
{{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 586.187901] env[67169]: DEBUG nova.virt.hardware [None req-91fc429c-9045-417c-aff0-aeb3f0fac0f7 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Image pref 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 586.187996] env[67169]: DEBUG nova.virt.hardware [None req-91fc429c-9045-417c-aff0-aeb3f0fac0f7 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 586.188202] env[67169]: DEBUG nova.virt.hardware [None req-91fc429c-9045-417c-aff0-aeb3f0fac0f7 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 586.188392] env[67169]: DEBUG nova.virt.hardware [None req-91fc429c-9045-417c-aff0-aeb3f0fac0f7 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67169) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 586.188529] env[67169]: DEBUG nova.virt.hardware [None req-91fc429c-9045-417c-aff0-aeb3f0fac0f7 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Got 1 possible topologies {{(pid=67169) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 586.188692] env[67169]: DEBUG nova.virt.hardware [None req-91fc429c-9045-417c-aff0-aeb3f0fac0f7 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Possible 
topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 586.188861] env[67169]: DEBUG nova.virt.hardware [None req-91fc429c-9045-417c-aff0-aeb3f0fac0f7 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 586.190943] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4fc93f48-0394-4ba2-9d61-90b72e5d10f6 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 586.198466] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-27694fe5-a027-4750-a969-852b5ed98821 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 586.332755] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d07d8e4f-0942-45d2-bf76-c33f72fd11e1 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 586.342647] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9cbd4336-5e04-4b17-81d7-1dab4f940a05 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 586.349252] env[67169]: DEBUG nova.compute.manager [req-cadd8598-f9d4-424f-8dda-deabede2e7a1 req-dfbdc44d-4dba-4dc5-9090-c2384c8bc564 service nova] [instance: f5558a78-c91f-4c36-bb22-f94b1bd8cdbc] Received event network-vif-plugged-ad705aee-0261-4e31-8077-3894179a56a3 {{(pid=67169) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 586.349770] env[67169]: DEBUG 
oslo_concurrency.lockutils [req-cadd8598-f9d4-424f-8dda-deabede2e7a1 req-dfbdc44d-4dba-4dc5-9090-c2384c8bc564 service nova] Acquiring lock "f5558a78-c91f-4c36-bb22-f94b1bd8cdbc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 586.350067] env[67169]: DEBUG oslo_concurrency.lockutils [req-cadd8598-f9d4-424f-8dda-deabede2e7a1 req-dfbdc44d-4dba-4dc5-9090-c2384c8bc564 service nova] Lock "f5558a78-c91f-4c36-bb22-f94b1bd8cdbc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 586.350254] env[67169]: DEBUG oslo_concurrency.lockutils [req-cadd8598-f9d4-424f-8dda-deabede2e7a1 req-dfbdc44d-4dba-4dc5-9090-c2384c8bc564 service nova] Lock "f5558a78-c91f-4c36-bb22-f94b1bd8cdbc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 586.350468] env[67169]: DEBUG nova.compute.manager [req-cadd8598-f9d4-424f-8dda-deabede2e7a1 req-dfbdc44d-4dba-4dc5-9090-c2384c8bc564 service nova] [instance: f5558a78-c91f-4c36-bb22-f94b1bd8cdbc] No waiting events found dispatching network-vif-plugged-ad705aee-0261-4e31-8077-3894179a56a3 {{(pid=67169) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 586.350837] env[67169]: WARNING nova.compute.manager [req-cadd8598-f9d4-424f-8dda-deabede2e7a1 req-dfbdc44d-4dba-4dc5-9090-c2384c8bc564 service nova] [instance: f5558a78-c91f-4c36-bb22-f94b1bd8cdbc] Received unexpected event network-vif-plugged-ad705aee-0261-4e31-8077-3894179a56a3 for instance with vm_state building and task_state spawning. 
[ 586.350925] env[67169]: DEBUG nova.compute.manager [req-cadd8598-f9d4-424f-8dda-deabede2e7a1 req-dfbdc44d-4dba-4dc5-9090-c2384c8bc564 service nova] [instance: f5558a78-c91f-4c36-bb22-f94b1bd8cdbc] Received event network-changed-ad705aee-0261-4e31-8077-3894179a56a3 {{(pid=67169) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 586.351162] env[67169]: DEBUG nova.compute.manager [req-cadd8598-f9d4-424f-8dda-deabede2e7a1 req-dfbdc44d-4dba-4dc5-9090-c2384c8bc564 service nova] [instance: f5558a78-c91f-4c36-bb22-f94b1bd8cdbc] Refreshing instance network info cache due to event network-changed-ad705aee-0261-4e31-8077-3894179a56a3. {{(pid=67169) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 586.351416] env[67169]: DEBUG oslo_concurrency.lockutils [req-cadd8598-f9d4-424f-8dda-deabede2e7a1 req-dfbdc44d-4dba-4dc5-9090-c2384c8bc564 service nova] Acquiring lock "refresh_cache-f5558a78-c91f-4c36-bb22-f94b1bd8cdbc" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 586.351496] env[67169]: DEBUG oslo_concurrency.lockutils [req-cadd8598-f9d4-424f-8dda-deabede2e7a1 req-dfbdc44d-4dba-4dc5-9090-c2384c8bc564 service nova] Acquired lock "refresh_cache-f5558a78-c91f-4c36-bb22-f94b1bd8cdbc" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 586.351657] env[67169]: DEBUG nova.network.neutron [req-cadd8598-f9d4-424f-8dda-deabede2e7a1 req-dfbdc44d-4dba-4dc5-9090-c2384c8bc564 service nova] [instance: f5558a78-c91f-4c36-bb22-f94b1bd8cdbc] Refreshing network info cache for port ad705aee-0261-4e31-8077-3894179a56a3 {{(pid=67169) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 586.393735] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-375dee2b-aa18-49b1-b309-bbcd9f843134 {{(pid=67169) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 586.412120] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-12e0dd7b-fe32-4862-83c1-dd53a3d64c98 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 586.428813] env[67169]: DEBUG nova.compute.provider_tree [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Inventory has not changed in ProviderTree for provider: 6570906a-ac37-4859-b1f2-4bbacc48d3f3 {{(pid=67169) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 586.445932] env[67169]: DEBUG nova.scheduler.client.report [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Inventory has not changed for provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 171, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67169) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 586.487038] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67169) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 586.489017] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.440s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 586.493016] env[67169]: DEBUG 
nova.policy [None req-91fc429c-9045-417c-aff0-aeb3f0fac0f7 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '615c1061ae884c3b91ce1b072249717c', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b1162bad4f2e4722aed4ff2c657e9dc9', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67169) authorize /opt/stack/nova/nova/policy.py:203}} [ 586.494589] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._cleanup_running_deleted_instances {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 586.498098] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Getting list of instances from cluster (obj){ [ 586.498098] env[67169]: value = "domain-c8" [ 586.498098] env[67169]: _type = "ClusterComputeResource" [ 586.498098] env[67169]: } {{(pid=67169) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 586.499975] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3aa12e6b-e0d2-43da-847a-e24d4fa39629 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 586.519510] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Got total of 6 instances {{(pid=67169) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 586.521753] env[67169]: DEBUG nova.network.neutron [req-7ebdb467-9368-4fe3-bf27-3d099eb2b172 
req-37eb9872-4e21-4729-852f-a09b6aefecc5 service nova] [instance: 68cc2368-ac6a-4003-9c19-5f2a4e9b0e03] Updated VIF entry in instance network info cache for port fc029e34-1d61-425c-aefa-9abfc9d55ff3. {{(pid=67169) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 586.521884] env[67169]: DEBUG nova.network.neutron [req-7ebdb467-9368-4fe3-bf27-3d099eb2b172 req-37eb9872-4e21-4729-852f-a09b6aefecc5 service nova] [instance: 68cc2368-ac6a-4003-9c19-5f2a4e9b0e03] Updating instance_info_cache with network_info: [{"id": "fc029e34-1d61-425c-aefa-9abfc9d55ff3", "address": "fa:16:3e:5f:2f:7f", "network": {"id": "3e2e473c-02a3-456e-965d-245899ebb61c", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-1872181337-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "ad115c3eeb224f229bf4235e085fbfdf", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "5e839c46-1ae9-43b7-9518-8f18f48100dd", "external-id": "nsx-vlan-transportzone-666", "segmentation_id": 666, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapfc029e34-1d", "ovs_interfaceid": "fc029e34-1d61-425c-aefa-9abfc9d55ff3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 586.535865] env[67169]: DEBUG oslo_concurrency.lockutils [req-7ebdb467-9368-4fe3-bf27-3d099eb2b172 req-37eb9872-4e21-4729-852f-a09b6aefecc5 service nova] Releasing lock 
"refresh_cache-68cc2368-ac6a-4003-9c19-5f2a4e9b0e03" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 586.671269] env[67169]: DEBUG oslo_concurrency.lockutils [None req-368ce919-6530-42e6-9ef4-1d52f0d0483a tempest-AttachInterfacesUnderV243Test-137593631 tempest-AttachInterfacesUnderV243Test-137593631-project-member] Releasing lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 586.671506] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-368ce919-6530-42e6-9ef4-1d52f0d0483a tempest-AttachInterfacesUnderV243Test-137593631 tempest-AttachInterfacesUnderV243Test-137593631-project-member] [instance: 36781827-5846-49a4-8913-d98676af0b74] Processing image 285931c9-8b83-4997-8c4d-6a79005e36ba {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 586.671788] env[67169]: DEBUG oslo_concurrency.lockutils [None req-368ce919-6530-42e6-9ef4-1d52f0d0483a tempest-AttachInterfacesUnderV243Test-137593631 tempest-AttachInterfacesUnderV243Test-137593631-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 586.676544] env[67169]: DEBUG nova.network.neutron [None req-3c83de1c-bff9-4913-ac4c-cf997776c0d4 tempest-ServersTestFqdnHostnames-427175016 tempest-ServersTestFqdnHostnames-427175016-project-member] [instance: 11e90c91-26ca-4397-81a4-975a1d714d19] Updating instance_info_cache with network_info: [{"id": "8c4bb9ee-0205-4da0-8ec4-d483b70d9fb8", "address": "fa:16:3e:7e:9c:35", "network": {"id": "bce6a929-07ab-4b0f-b086-50bdaa278431", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-1367241572-network", "subnets": [{"cidr": 
"192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "f6f398d7f457401bae611864b041480b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "a4b6ddb2-2e19-4031-9b22-add90d41a114", "external-id": "nsx-vlan-transportzone-921", "segmentation_id": 921, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap8c4bb9ee-02", "ovs_interfaceid": "8c4bb9ee-0205-4da0-8ec4-d483b70d9fb8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 586.693187] env[67169]: DEBUG oslo_concurrency.lockutils [None req-3c83de1c-bff9-4913-ac4c-cf997776c0d4 tempest-ServersTestFqdnHostnames-427175016 tempest-ServersTestFqdnHostnames-427175016-project-member] Releasing lock "refresh_cache-11e90c91-26ca-4397-81a4-975a1d714d19" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 586.693520] env[67169]: DEBUG nova.compute.manager [None req-3c83de1c-bff9-4913-ac4c-cf997776c0d4 tempest-ServersTestFqdnHostnames-427175016 tempest-ServersTestFqdnHostnames-427175016-project-member] [instance: 11e90c91-26ca-4397-81a4-975a1d714d19] Instance network_info: |[{"id": "8c4bb9ee-0205-4da0-8ec4-d483b70d9fb8", "address": "fa:16:3e:7e:9c:35", "network": {"id": "bce6a929-07ab-4b0f-b086-50bdaa278431", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-1367241572-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": 
"192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "f6f398d7f457401bae611864b041480b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "a4b6ddb2-2e19-4031-9b22-add90d41a114", "external-id": "nsx-vlan-transportzone-921", "segmentation_id": 921, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap8c4bb9ee-02", "ovs_interfaceid": "8c4bb9ee-0205-4da0-8ec4-d483b70d9fb8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67169) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 586.693920] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-3c83de1c-bff9-4913-ac4c-cf997776c0d4 tempest-ServersTestFqdnHostnames-427175016 tempest-ServersTestFqdnHostnames-427175016-project-member] [instance: 11e90c91-26ca-4397-81a4-975a1d714d19] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:7e:9c:35', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'a4b6ddb2-2e19-4031-9b22-add90d41a114', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '8c4bb9ee-0205-4da0-8ec4-d483b70d9fb8', 'vif_model': 'vmxnet3'}] {{(pid=67169) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 586.704752] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [None req-3c83de1c-bff9-4913-ac4c-cf997776c0d4 tempest-ServersTestFqdnHostnames-427175016 tempest-ServersTestFqdnHostnames-427175016-project-member] Creating folder: Project (f6f398d7f457401bae611864b041480b). Parent ref: group-v566843. 
{{(pid=67169) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 586.705643] env[67169]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-b0fbe9a3-901b-4e7e-b273-c9cbc626933d {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 586.716967] env[67169]: INFO nova.virt.vmwareapi.vm_util [None req-3c83de1c-bff9-4913-ac4c-cf997776c0d4 tempest-ServersTestFqdnHostnames-427175016 tempest-ServersTestFqdnHostnames-427175016-project-member] Created folder: Project (f6f398d7f457401bae611864b041480b) in parent group-v566843. [ 586.717241] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [None req-3c83de1c-bff9-4913-ac4c-cf997776c0d4 tempest-ServersTestFqdnHostnames-427175016 tempest-ServersTestFqdnHostnames-427175016-project-member] Creating folder: Instances. Parent ref: group-v566862. {{(pid=67169) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 586.717545] env[67169]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-9ebd8b50-3851-4e9a-b313-b434c6f1acf8 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 586.726381] env[67169]: INFO nova.virt.vmwareapi.vm_util [None req-3c83de1c-bff9-4913-ac4c-cf997776c0d4 tempest-ServersTestFqdnHostnames-427175016 tempest-ServersTestFqdnHostnames-427175016-project-member] Created folder: Instances in parent group-v566862. [ 586.726634] env[67169]: DEBUG oslo.service.loopingcall [None req-3c83de1c-bff9-4913-ac4c-cf997776c0d4 tempest-ServersTestFqdnHostnames-427175016 tempest-ServersTestFqdnHostnames-427175016-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=67169) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 586.726837] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 11e90c91-26ca-4397-81a4-975a1d714d19] Creating VM on the ESX host {{(pid=67169) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 586.727144] env[67169]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-ad483c38-8264-470b-bf59-dfda9574373b {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 586.751233] env[67169]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 586.751233] env[67169]: value = "task-2819066" [ 586.751233] env[67169]: _type = "Task" [ 586.751233] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 586.763808] env[67169]: DEBUG oslo_vmware.api [-] Task: {'id': task-2819066, 'name': CreateVM_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 587.154493] env[67169]: DEBUG nova.network.neutron [req-cadd8598-f9d4-424f-8dda-deabede2e7a1 req-dfbdc44d-4dba-4dc5-9090-c2384c8bc564 service nova] [instance: f5558a78-c91f-4c36-bb22-f94b1bd8cdbc] Updated VIF entry in instance network info cache for port ad705aee-0261-4e31-8077-3894179a56a3. 
{{(pid=67169) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 587.154493] env[67169]: DEBUG nova.network.neutron [req-cadd8598-f9d4-424f-8dda-deabede2e7a1 req-dfbdc44d-4dba-4dc5-9090-c2384c8bc564 service nova] [instance: f5558a78-c91f-4c36-bb22-f94b1bd8cdbc] Updating instance_info_cache with network_info: [{"id": "ad705aee-0261-4e31-8077-3894179a56a3", "address": "fa:16:3e:5c:9e:9c", "network": {"id": "00259946-9c33-41e4-a705-bcd33b92337c", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-1592441058-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "a4b6efcbf27141c7bee056e8f28d732d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "3734b156-0f7d-4721-b23c-d000412ec2eb", "external-id": "nsx-vlan-transportzone-560", "segmentation_id": 560, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapad705aee-02", "ovs_interfaceid": "ad705aee-0261-4e31-8077-3894179a56a3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 587.163088] env[67169]: DEBUG oslo_concurrency.lockutils [req-cadd8598-f9d4-424f-8dda-deabede2e7a1 req-dfbdc44d-4dba-4dc5-9090-c2384c8bc564 service nova] Releasing lock "refresh_cache-f5558a78-c91f-4c36-bb22-f94b1bd8cdbc" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 587.809032] env[67169]: DEBUG nova.network.neutron [None 
req-91fc429c-9045-417c-aff0-aeb3f0fac0f7 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: 85978a3b-052a-4a05-84e6-75c723d49bd8] Successfully created port: 1d1a691a-2602-4187-9819-c1ea4583b421 {{(pid=67169) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 587.818765] env[67169]: DEBUG oslo_vmware.api [-] Task: {'id': task-2819066, 'name': CreateVM_Task} progress is 99%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 587.819972] env[67169]: WARNING oslo_vmware.common.loopingcall [-] task run outlasted interval by 0.06669099999999994 sec [ 587.830600] env[67169]: DEBUG oslo_vmware.api [-] Task: {'id': task-2819066, 'name': CreateVM_Task} progress is 99%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 588.333942] env[67169]: DEBUG oslo_vmware.api [-] Task: {'id': task-2819066, 'name': CreateVM_Task, 'duration_secs': 1.353714} completed successfully. 
{{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 588.334262] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 11e90c91-26ca-4397-81a4-975a1d714d19] Created VM on the ESX host {{(pid=67169) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 588.335922] env[67169]: DEBUG oslo_concurrency.lockutils [None req-3c83de1c-bff9-4913-ac4c-cf997776c0d4 tempest-ServersTestFqdnHostnames-427175016 tempest-ServersTestFqdnHostnames-427175016-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 588.336760] env[67169]: DEBUG oslo_concurrency.lockutils [None req-3c83de1c-bff9-4913-ac4c-cf997776c0d4 tempest-ServersTestFqdnHostnames-427175016 tempest-ServersTestFqdnHostnames-427175016-project-member] Acquired lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 588.337768] env[67169]: DEBUG oslo_concurrency.lockutils [None req-3c83de1c-bff9-4913-ac4c-cf997776c0d4 tempest-ServersTestFqdnHostnames-427175016 tempest-ServersTestFqdnHostnames-427175016-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 588.338276] env[67169]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-d9342f99-692b-4047-a0a2-24b3d20dfa5b {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 588.347903] env[67169]: DEBUG oslo_vmware.api [None req-3c83de1c-bff9-4913-ac4c-cf997776c0d4 tempest-ServersTestFqdnHostnames-427175016 
tempest-ServersTestFqdnHostnames-427175016-project-member] Waiting for the task: (returnval){ [ 588.347903] env[67169]: value = "session[52518c09-e5f2-035d-4dc1-3e29ddf94015]526ca1d9-f09a-0092-dc69-bfa1307c829a" [ 588.347903] env[67169]: _type = "Task" [ 588.347903] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 588.363288] env[67169]: DEBUG oslo_concurrency.lockutils [None req-3c83de1c-bff9-4913-ac4c-cf997776c0d4 tempest-ServersTestFqdnHostnames-427175016 tempest-ServersTestFqdnHostnames-427175016-project-member] Releasing lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 588.363712] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-3c83de1c-bff9-4913-ac4c-cf997776c0d4 tempest-ServersTestFqdnHostnames-427175016 tempest-ServersTestFqdnHostnames-427175016-project-member] [instance: 11e90c91-26ca-4397-81a4-975a1d714d19] Processing image 285931c9-8b83-4997-8c4d-6a79005e36ba {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 588.364056] env[67169]: DEBUG oslo_concurrency.lockutils [None req-3c83de1c-bff9-4913-ac4c-cf997776c0d4 tempest-ServersTestFqdnHostnames-427175016 tempest-ServersTestFqdnHostnames-427175016-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 588.401440] env[67169]: DEBUG nova.network.neutron [None req-96da241f-f17f-4952-914f-45ae126f7058 tempest-FloatingIPsAssociationNegativeTestJSON-216368034 tempest-FloatingIPsAssociationNegativeTestJSON-216368034-project-member] [instance: 835bf8da-8d8f-4dfd-b0a9-fab02796f39e] Successfully updated port: 51962410-9c63-4dd9-bd3f-7bd3f0d51122 
{{(pid=67169) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 588.416318] env[67169]: DEBUG oslo_concurrency.lockutils [None req-96da241f-f17f-4952-914f-45ae126f7058 tempest-FloatingIPsAssociationNegativeTestJSON-216368034 tempest-FloatingIPsAssociationNegativeTestJSON-216368034-project-member] Acquiring lock "refresh_cache-835bf8da-8d8f-4dfd-b0a9-fab02796f39e" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 588.416462] env[67169]: DEBUG oslo_concurrency.lockutils [None req-96da241f-f17f-4952-914f-45ae126f7058 tempest-FloatingIPsAssociationNegativeTestJSON-216368034 tempest-FloatingIPsAssociationNegativeTestJSON-216368034-project-member] Acquired lock "refresh_cache-835bf8da-8d8f-4dfd-b0a9-fab02796f39e" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 588.416571] env[67169]: DEBUG nova.network.neutron [None req-96da241f-f17f-4952-914f-45ae126f7058 tempest-FloatingIPsAssociationNegativeTestJSON-216368034 tempest-FloatingIPsAssociationNegativeTestJSON-216368034-project-member] [instance: 835bf8da-8d8f-4dfd-b0a9-fab02796f39e] Building network info cache for instance {{(pid=67169) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 588.493299] env[67169]: DEBUG nova.network.neutron [None req-96da241f-f17f-4952-914f-45ae126f7058 tempest-FloatingIPsAssociationNegativeTestJSON-216368034 tempest-FloatingIPsAssociationNegativeTestJSON-216368034-project-member] [instance: 835bf8da-8d8f-4dfd-b0a9-fab02796f39e] Instance cache missing network info. 
{{(pid=67169) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 589.544338] env[67169]: DEBUG nova.network.neutron [None req-96da241f-f17f-4952-914f-45ae126f7058 tempest-FloatingIPsAssociationNegativeTestJSON-216368034 tempest-FloatingIPsAssociationNegativeTestJSON-216368034-project-member] [instance: 835bf8da-8d8f-4dfd-b0a9-fab02796f39e] Updating instance_info_cache with network_info: [{"id": "51962410-9c63-4dd9-bd3f-7bd3f0d51122", "address": "fa:16:3e:71:fe:88", "network": {"id": "17df7149-0d5d-49b5-b846-c2f7cd59a89a", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-661207153-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "f87ea93e96194031b75ed324e0acc94d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f9be6786-e9a7-4138-b7b5-b7696f6cb1e1", "external-id": "nsx-vlan-transportzone-626", "segmentation_id": 626, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap51962410-9c", "ovs_interfaceid": "51962410-9c63-4dd9-bd3f-7bd3f0d51122", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 589.563017] env[67169]: DEBUG oslo_concurrency.lockutils [None req-96da241f-f17f-4952-914f-45ae126f7058 tempest-FloatingIPsAssociationNegativeTestJSON-216368034 tempest-FloatingIPsAssociationNegativeTestJSON-216368034-project-member] Releasing lock 
"refresh_cache-835bf8da-8d8f-4dfd-b0a9-fab02796f39e" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 589.565114] env[67169]: DEBUG nova.compute.manager [None req-96da241f-f17f-4952-914f-45ae126f7058 tempest-FloatingIPsAssociationNegativeTestJSON-216368034 tempest-FloatingIPsAssociationNegativeTestJSON-216368034-project-member] [instance: 835bf8da-8d8f-4dfd-b0a9-fab02796f39e] Instance network_info: |[{"id": "51962410-9c63-4dd9-bd3f-7bd3f0d51122", "address": "fa:16:3e:71:fe:88", "network": {"id": "17df7149-0d5d-49b5-b846-c2f7cd59a89a", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-661207153-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "f87ea93e96194031b75ed324e0acc94d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f9be6786-e9a7-4138-b7b5-b7696f6cb1e1", "external-id": "nsx-vlan-transportzone-626", "segmentation_id": 626, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap51962410-9c", "ovs_interfaceid": "51962410-9c63-4dd9-bd3f-7bd3f0d51122", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67169) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 589.565235] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-96da241f-f17f-4952-914f-45ae126f7058 tempest-FloatingIPsAssociationNegativeTestJSON-216368034 tempest-FloatingIPsAssociationNegativeTestJSON-216368034-project-member] [instance: 
835bf8da-8d8f-4dfd-b0a9-fab02796f39e] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:71:fe:88', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'f9be6786-e9a7-4138-b7b5-b7696f6cb1e1', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '51962410-9c63-4dd9-bd3f-7bd3f0d51122', 'vif_model': 'vmxnet3'}] {{(pid=67169) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 589.573360] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [None req-96da241f-f17f-4952-914f-45ae126f7058 tempest-FloatingIPsAssociationNegativeTestJSON-216368034 tempest-FloatingIPsAssociationNegativeTestJSON-216368034-project-member] Creating folder: Project (f87ea93e96194031b75ed324e0acc94d). Parent ref: group-v566843. {{(pid=67169) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 589.575174] env[67169]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-b5b4899f-34c0-45c4-9dc5-4f9c92641cb6 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 589.586943] env[67169]: INFO nova.virt.vmwareapi.vm_util [None req-96da241f-f17f-4952-914f-45ae126f7058 tempest-FloatingIPsAssociationNegativeTestJSON-216368034 tempest-FloatingIPsAssociationNegativeTestJSON-216368034-project-member] Created folder: Project (f87ea93e96194031b75ed324e0acc94d) in parent group-v566843. [ 589.587175] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [None req-96da241f-f17f-4952-914f-45ae126f7058 tempest-FloatingIPsAssociationNegativeTestJSON-216368034 tempest-FloatingIPsAssociationNegativeTestJSON-216368034-project-member] Creating folder: Instances. Parent ref: group-v566865. 
{{(pid=67169) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 589.587455] env[67169]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-368ab188-de4a-473c-89d6-59246da18756 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 589.597024] env[67169]: INFO nova.virt.vmwareapi.vm_util [None req-96da241f-f17f-4952-914f-45ae126f7058 tempest-FloatingIPsAssociationNegativeTestJSON-216368034 tempest-FloatingIPsAssociationNegativeTestJSON-216368034-project-member] Created folder: Instances in parent group-v566865. [ 589.597312] env[67169]: DEBUG oslo.service.loopingcall [None req-96da241f-f17f-4952-914f-45ae126f7058 tempest-FloatingIPsAssociationNegativeTestJSON-216368034 tempest-FloatingIPsAssociationNegativeTestJSON-216368034-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67169) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 589.597552] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 835bf8da-8d8f-4dfd-b0a9-fab02796f39e] Creating VM on the ESX host {{(pid=67169) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 589.598054] env[67169]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-ba1c005c-308c-4525-bbfe-4eb9052d5853 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 589.620140] env[67169]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 589.620140] env[67169]: value = "task-2819069" [ 589.620140] env[67169]: _type = "Task" [ 589.620140] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 589.631041] env[67169]: DEBUG oslo_vmware.api [-] Task: {'id': task-2819069, 'name': CreateVM_Task} progress is 0%. 
{{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 590.133808] env[67169]: DEBUG oslo_vmware.api [-] Task: {'id': task-2819069, 'name': CreateVM_Task, 'duration_secs': 0.327242} completed successfully. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 590.134452] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 835bf8da-8d8f-4dfd-b0a9-fab02796f39e] Created VM on the ESX host {{(pid=67169) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 590.139256] env[67169]: DEBUG oslo_concurrency.lockutils [None req-96da241f-f17f-4952-914f-45ae126f7058 tempest-FloatingIPsAssociationNegativeTestJSON-216368034 tempest-FloatingIPsAssociationNegativeTestJSON-216368034-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 590.139256] env[67169]: DEBUG oslo_concurrency.lockutils [None req-96da241f-f17f-4952-914f-45ae126f7058 tempest-FloatingIPsAssociationNegativeTestJSON-216368034 tempest-FloatingIPsAssociationNegativeTestJSON-216368034-project-member] Acquired lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 590.139256] env[67169]: DEBUG oslo_concurrency.lockutils [None req-96da241f-f17f-4952-914f-45ae126f7058 tempest-FloatingIPsAssociationNegativeTestJSON-216368034 tempest-FloatingIPsAssociationNegativeTestJSON-216368034-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 590.139256] env[67169]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with 
opID=oslo.vmware-48c784c9-c594-45b6-81f8-604b12873a76 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 590.142680] env[67169]: DEBUG oslo_vmware.api [None req-96da241f-f17f-4952-914f-45ae126f7058 tempest-FloatingIPsAssociationNegativeTestJSON-216368034 tempest-FloatingIPsAssociationNegativeTestJSON-216368034-project-member] Waiting for the task: (returnval){ [ 590.142680] env[67169]: value = "session[52518c09-e5f2-035d-4dc1-3e29ddf94015]52d93dc9-cc7f-4dc9-b08c-a9b2358e08cb" [ 590.142680] env[67169]: _type = "Task" [ 590.142680] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 590.153263] env[67169]: DEBUG oslo_vmware.api [None req-96da241f-f17f-4952-914f-45ae126f7058 tempest-FloatingIPsAssociationNegativeTestJSON-216368034 tempest-FloatingIPsAssociationNegativeTestJSON-216368034-project-member] Task: {'id': session[52518c09-e5f2-035d-4dc1-3e29ddf94015]52d93dc9-cc7f-4dc9-b08c-a9b2358e08cb, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 590.186716] env[67169]: DEBUG nova.network.neutron [None req-91fc429c-9045-417c-aff0-aeb3f0fac0f7 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: 85978a3b-052a-4a05-84e6-75c723d49bd8] Successfully updated port: 1d1a691a-2602-4187-9819-c1ea4583b421 {{(pid=67169) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 590.206501] env[67169]: DEBUG oslo_concurrency.lockutils [None req-91fc429c-9045-417c-aff0-aeb3f0fac0f7 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Acquiring lock "refresh_cache-85978a3b-052a-4a05-84e6-75c723d49bd8" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 590.206848] env[67169]: DEBUG oslo_concurrency.lockutils [None req-91fc429c-9045-417c-aff0-aeb3f0fac0f7 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Acquired lock "refresh_cache-85978a3b-052a-4a05-84e6-75c723d49bd8" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 590.207068] env[67169]: DEBUG nova.network.neutron [None req-91fc429c-9045-417c-aff0-aeb3f0fac0f7 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: 85978a3b-052a-4a05-84e6-75c723d49bd8] Building network info cache for instance {{(pid=67169) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 590.312748] env[67169]: DEBUG nova.network.neutron [None req-91fc429c-9045-417c-aff0-aeb3f0fac0f7 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: 85978a3b-052a-4a05-84e6-75c723d49bd8] Instance cache missing network info. 
{{(pid=67169) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 590.524625] env[67169]: DEBUG nova.compute.manager [req-d152c18a-f30b-4665-bc75-b2ef6348ae45 req-ac19bc16-03a1-4aef-8bf7-a2677f559502 service nova] [instance: 36781827-5846-49a4-8913-d98676af0b74] Received event network-vif-plugged-c7de1a8c-b1a6-4a34-9641-75295e554f2e {{(pid=67169) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 590.525108] env[67169]: DEBUG oslo_concurrency.lockutils [req-d152c18a-f30b-4665-bc75-b2ef6348ae45 req-ac19bc16-03a1-4aef-8bf7-a2677f559502 service nova] Acquiring lock "36781827-5846-49a4-8913-d98676af0b74-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 590.525108] env[67169]: DEBUG oslo_concurrency.lockutils [req-d152c18a-f30b-4665-bc75-b2ef6348ae45 req-ac19bc16-03a1-4aef-8bf7-a2677f559502 service nova] Lock "36781827-5846-49a4-8913-d98676af0b74-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 590.525364] env[67169]: DEBUG oslo_concurrency.lockutils [req-d152c18a-f30b-4665-bc75-b2ef6348ae45 req-ac19bc16-03a1-4aef-8bf7-a2677f559502 service nova] Lock "36781827-5846-49a4-8913-d98676af0b74-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 590.525580] env[67169]: DEBUG nova.compute.manager [req-d152c18a-f30b-4665-bc75-b2ef6348ae45 req-ac19bc16-03a1-4aef-8bf7-a2677f559502 service nova] [instance: 36781827-5846-49a4-8913-d98676af0b74] No waiting events found dispatching network-vif-plugged-c7de1a8c-b1a6-4a34-9641-75295e554f2e {{(pid=67169) pop_instance_event 
/opt/stack/nova/nova/compute/manager.py:320}} [ 590.525768] env[67169]: WARNING nova.compute.manager [req-d152c18a-f30b-4665-bc75-b2ef6348ae45 req-ac19bc16-03a1-4aef-8bf7-a2677f559502 service nova] [instance: 36781827-5846-49a4-8913-d98676af0b74] Received unexpected event network-vif-plugged-c7de1a8c-b1a6-4a34-9641-75295e554f2e for instance with vm_state building and task_state spawning. [ 590.526255] env[67169]: DEBUG nova.compute.manager [req-d152c18a-f30b-4665-bc75-b2ef6348ae45 req-ac19bc16-03a1-4aef-8bf7-a2677f559502 service nova] [instance: 36781827-5846-49a4-8913-d98676af0b74] Received event network-changed-c7de1a8c-b1a6-4a34-9641-75295e554f2e {{(pid=67169) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 590.526255] env[67169]: DEBUG nova.compute.manager [req-d152c18a-f30b-4665-bc75-b2ef6348ae45 req-ac19bc16-03a1-4aef-8bf7-a2677f559502 service nova] [instance: 36781827-5846-49a4-8913-d98676af0b74] Refreshing instance network info cache due to event network-changed-c7de1a8c-b1a6-4a34-9641-75295e554f2e. 
{{(pid=67169) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 590.526393] env[67169]: DEBUG oslo_concurrency.lockutils [req-d152c18a-f30b-4665-bc75-b2ef6348ae45 req-ac19bc16-03a1-4aef-8bf7-a2677f559502 service nova] Acquiring lock "refresh_cache-36781827-5846-49a4-8913-d98676af0b74" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 590.526566] env[67169]: DEBUG oslo_concurrency.lockutils [req-d152c18a-f30b-4665-bc75-b2ef6348ae45 req-ac19bc16-03a1-4aef-8bf7-a2677f559502 service nova] Acquired lock "refresh_cache-36781827-5846-49a4-8913-d98676af0b74" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 590.526752] env[67169]: DEBUG nova.network.neutron [req-d152c18a-f30b-4665-bc75-b2ef6348ae45 req-ac19bc16-03a1-4aef-8bf7-a2677f559502 service nova] [instance: 36781827-5846-49a4-8913-d98676af0b74] Refreshing network info cache for port c7de1a8c-b1a6-4a34-9641-75295e554f2e {{(pid=67169) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 590.661634] env[67169]: DEBUG oslo_concurrency.lockutils [None req-96da241f-f17f-4952-914f-45ae126f7058 tempest-FloatingIPsAssociationNegativeTestJSON-216368034 tempest-FloatingIPsAssociationNegativeTestJSON-216368034-project-member] Releasing lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 590.661634] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-96da241f-f17f-4952-914f-45ae126f7058 tempest-FloatingIPsAssociationNegativeTestJSON-216368034 tempest-FloatingIPsAssociationNegativeTestJSON-216368034-project-member] [instance: 835bf8da-8d8f-4dfd-b0a9-fab02796f39e] Processing image 285931c9-8b83-4997-8c4d-6a79005e36ba {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 590.661634] env[67169]: DEBUG 
oslo_concurrency.lockutils [None req-96da241f-f17f-4952-914f-45ae126f7058 tempest-FloatingIPsAssociationNegativeTestJSON-216368034 tempest-FloatingIPsAssociationNegativeTestJSON-216368034-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 590.698807] env[67169]: DEBUG nova.compute.manager [req-55d0dc4f-c27a-4244-898b-f33a53dea7b4 req-cccec276-1f64-4598-849d-e873fd052d16 service nova] [instance: 11e90c91-26ca-4397-81a4-975a1d714d19] Received event network-vif-plugged-8c4bb9ee-0205-4da0-8ec4-d483b70d9fb8 {{(pid=67169) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 590.698807] env[67169]: DEBUG oslo_concurrency.lockutils [req-55d0dc4f-c27a-4244-898b-f33a53dea7b4 req-cccec276-1f64-4598-849d-e873fd052d16 service nova] Acquiring lock "11e90c91-26ca-4397-81a4-975a1d714d19-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 590.700129] env[67169]: DEBUG oslo_concurrency.lockutils [req-55d0dc4f-c27a-4244-898b-f33a53dea7b4 req-cccec276-1f64-4598-849d-e873fd052d16 service nova] Lock "11e90c91-26ca-4397-81a4-975a1d714d19-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 590.700558] env[67169]: DEBUG oslo_concurrency.lockutils [req-55d0dc4f-c27a-4244-898b-f33a53dea7b4 req-cccec276-1f64-4598-849d-e873fd052d16 service nova] Lock "11e90c91-26ca-4397-81a4-975a1d714d19-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.002s {{(pid=67169) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 590.700914] env[67169]: DEBUG nova.compute.manager [req-55d0dc4f-c27a-4244-898b-f33a53dea7b4 req-cccec276-1f64-4598-849d-e873fd052d16 service nova] [instance: 11e90c91-26ca-4397-81a4-975a1d714d19] No waiting events found dispatching network-vif-plugged-8c4bb9ee-0205-4da0-8ec4-d483b70d9fb8 {{(pid=67169) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 590.701442] env[67169]: WARNING nova.compute.manager [req-55d0dc4f-c27a-4244-898b-f33a53dea7b4 req-cccec276-1f64-4598-849d-e873fd052d16 service nova] [instance: 11e90c91-26ca-4397-81a4-975a1d714d19] Received unexpected event network-vif-plugged-8c4bb9ee-0205-4da0-8ec4-d483b70d9fb8 for instance with vm_state building and task_state spawning. [ 590.701953] env[67169]: DEBUG nova.compute.manager [req-55d0dc4f-c27a-4244-898b-f33a53dea7b4 req-cccec276-1f64-4598-849d-e873fd052d16 service nova] [instance: 11e90c91-26ca-4397-81a4-975a1d714d19] Received event network-changed-8c4bb9ee-0205-4da0-8ec4-d483b70d9fb8 {{(pid=67169) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 590.702863] env[67169]: DEBUG nova.compute.manager [req-55d0dc4f-c27a-4244-898b-f33a53dea7b4 req-cccec276-1f64-4598-849d-e873fd052d16 service nova] [instance: 11e90c91-26ca-4397-81a4-975a1d714d19] Refreshing instance network info cache due to event network-changed-8c4bb9ee-0205-4da0-8ec4-d483b70d9fb8. 
{{(pid=67169) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 590.702863] env[67169]: DEBUG oslo_concurrency.lockutils [req-55d0dc4f-c27a-4244-898b-f33a53dea7b4 req-cccec276-1f64-4598-849d-e873fd052d16 service nova] Acquiring lock "refresh_cache-11e90c91-26ca-4397-81a4-975a1d714d19" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 590.702863] env[67169]: DEBUG oslo_concurrency.lockutils [req-55d0dc4f-c27a-4244-898b-f33a53dea7b4 req-cccec276-1f64-4598-849d-e873fd052d16 service nova] Acquired lock "refresh_cache-11e90c91-26ca-4397-81a4-975a1d714d19" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 590.703625] env[67169]: DEBUG nova.network.neutron [req-55d0dc4f-c27a-4244-898b-f33a53dea7b4 req-cccec276-1f64-4598-849d-e873fd052d16 service nova] [instance: 11e90c91-26ca-4397-81a4-975a1d714d19] Refreshing network info cache for port 8c4bb9ee-0205-4da0-8ec4-d483b70d9fb8 {{(pid=67169) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 590.872587] env[67169]: DEBUG nova.network.neutron [None req-91fc429c-9045-417c-aff0-aeb3f0fac0f7 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: 85978a3b-052a-4a05-84e6-75c723d49bd8] Updating instance_info_cache with network_info: [{"id": "1d1a691a-2602-4187-9819-c1ea4583b421", "address": "fa:16:3e:33:2d:f7", "network": {"id": "05c41aa5-dcb7-46fa-ba23-2f4b7685b6a9", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1740060268-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": 
"b1162bad4f2e4722aed4ff2c657e9dc9", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "24210a23-d8ac-4f4f-84ac-dc0636de9a72", "external-id": "nsx-vlan-transportzone-257", "segmentation_id": 257, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1d1a691a-26", "ovs_interfaceid": "1d1a691a-2602-4187-9819-c1ea4583b421", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 590.891926] env[67169]: DEBUG oslo_concurrency.lockutils [None req-91fc429c-9045-417c-aff0-aeb3f0fac0f7 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Releasing lock "refresh_cache-85978a3b-052a-4a05-84e6-75c723d49bd8" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 590.891926] env[67169]: DEBUG nova.compute.manager [None req-91fc429c-9045-417c-aff0-aeb3f0fac0f7 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: 85978a3b-052a-4a05-84e6-75c723d49bd8] Instance network_info: |[{"id": "1d1a691a-2602-4187-9819-c1ea4583b421", "address": "fa:16:3e:33:2d:f7", "network": {"id": "05c41aa5-dcb7-46fa-ba23-2f4b7685b6a9", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1740060268-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "b1162bad4f2e4722aed4ff2c657e9dc9", "mtu": 8950, "physical_network": "default", 
"tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "24210a23-d8ac-4f4f-84ac-dc0636de9a72", "external-id": "nsx-vlan-transportzone-257", "segmentation_id": 257, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1d1a691a-26", "ovs_interfaceid": "1d1a691a-2602-4187-9819-c1ea4583b421", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67169) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 590.892909] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-91fc429c-9045-417c-aff0-aeb3f0fac0f7 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: 85978a3b-052a-4a05-84e6-75c723d49bd8] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:33:2d:f7', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '24210a23-d8ac-4f4f-84ac-dc0636de9a72', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '1d1a691a-2602-4187-9819-c1ea4583b421', 'vif_model': 'vmxnet3'}] {{(pid=67169) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 590.902798] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [None req-91fc429c-9045-417c-aff0-aeb3f0fac0f7 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Creating folder: Project (b1162bad4f2e4722aed4ff2c657e9dc9). Parent ref: group-v566843. 
{{(pid=67169) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 590.903476] env[67169]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-fcf15c8b-b097-4e01-b06d-41c7143973e3 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 590.915397] env[67169]: INFO nova.virt.vmwareapi.vm_util [None req-91fc429c-9045-417c-aff0-aeb3f0fac0f7 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Created folder: Project (b1162bad4f2e4722aed4ff2c657e9dc9) in parent group-v566843. [ 590.915889] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [None req-91fc429c-9045-417c-aff0-aeb3f0fac0f7 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Creating folder: Instances. Parent ref: group-v566868. {{(pid=67169) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 590.916338] env[67169]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-cc59ab3a-46ac-4636-8fb2-88c0698c0387 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 590.926944] env[67169]: INFO nova.virt.vmwareapi.vm_util [None req-91fc429c-9045-417c-aff0-aeb3f0fac0f7 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Created folder: Instances in parent group-v566868. [ 590.927321] env[67169]: DEBUG oslo.service.loopingcall [None req-91fc429c-9045-417c-aff0-aeb3f0fac0f7 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=67169) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 590.931365] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 85978a3b-052a-4a05-84e6-75c723d49bd8] Creating VM on the ESX host {{(pid=67169) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 590.931613] env[67169]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-cccc65cb-4464-4761-ab2a-f3bac3915034 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 590.957121] env[67169]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 590.957121] env[67169]: value = "task-2819072" [ 590.957121] env[67169]: _type = "Task" [ 590.957121] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 590.964815] env[67169]: DEBUG oslo_vmware.api [-] Task: {'id': task-2819072, 'name': CreateVM_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 591.464459] env[67169]: DEBUG nova.network.neutron [req-d152c18a-f30b-4665-bc75-b2ef6348ae45 req-ac19bc16-03a1-4aef-8bf7-a2677f559502 service nova] [instance: 36781827-5846-49a4-8913-d98676af0b74] Updated VIF entry in instance network info cache for port c7de1a8c-b1a6-4a34-9641-75295e554f2e. 
{{(pid=67169) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 591.464459] env[67169]: DEBUG nova.network.neutron [req-d152c18a-f30b-4665-bc75-b2ef6348ae45 req-ac19bc16-03a1-4aef-8bf7-a2677f559502 service nova] [instance: 36781827-5846-49a4-8913-d98676af0b74] Updating instance_info_cache with network_info: [{"id": "c7de1a8c-b1a6-4a34-9641-75295e554f2e", "address": "fa:16:3e:8c:e4:ff", "network": {"id": "9ac75aef-d146-4422-a66f-17031648021e", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-976127093-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "77c22397eae5494baab363e296329d7e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "19598cc1-e105-4565-906a-09dde75e3fbe", "external-id": "nsx-vlan-transportzone-371", "segmentation_id": 371, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapc7de1a8c-b1", "ovs_interfaceid": "c7de1a8c-b1a6-4a34-9641-75295e554f2e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 591.472565] env[67169]: DEBUG oslo_vmware.api [-] Task: {'id': task-2819072, 'name': CreateVM_Task, 'duration_secs': 0.395629} completed successfully. 
{{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 591.472723] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 85978a3b-052a-4a05-84e6-75c723d49bd8] Created VM on the ESX host {{(pid=67169) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 591.474450] env[67169]: DEBUG oslo_concurrency.lockutils [None req-91fc429c-9045-417c-aff0-aeb3f0fac0f7 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 591.474450] env[67169]: DEBUG oslo_concurrency.lockutils [None req-91fc429c-9045-417c-aff0-aeb3f0fac0f7 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Acquired lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 591.474450] env[67169]: DEBUG oslo_concurrency.lockutils [None req-91fc429c-9045-417c-aff0-aeb3f0fac0f7 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 591.474450] env[67169]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-497382e8-316e-432d-93cb-dc764cdbf448 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 591.478889] env[67169]: DEBUG oslo_vmware.api [None req-91fc429c-9045-417c-aff0-aeb3f0fac0f7 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] 
Waiting for the task: (returnval){ [ 591.478889] env[67169]: value = "session[52518c09-e5f2-035d-4dc1-3e29ddf94015]52ac2043-680b-9302-64c1-959cc883964f" [ 591.478889] env[67169]: _type = "Task" [ 591.478889] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 591.486536] env[67169]: DEBUG oslo_concurrency.lockutils [req-d152c18a-f30b-4665-bc75-b2ef6348ae45 req-ac19bc16-03a1-4aef-8bf7-a2677f559502 service nova] Releasing lock "refresh_cache-36781827-5846-49a4-8913-d98676af0b74" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 591.493287] env[67169]: DEBUG oslo_vmware.api [None req-91fc429c-9045-417c-aff0-aeb3f0fac0f7 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Task: {'id': session[52518c09-e5f2-035d-4dc1-3e29ddf94015]52ac2043-680b-9302-64c1-959cc883964f, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 591.627470] env[67169]: DEBUG nova.network.neutron [req-55d0dc4f-c27a-4244-898b-f33a53dea7b4 req-cccec276-1f64-4598-849d-e873fd052d16 service nova] [instance: 11e90c91-26ca-4397-81a4-975a1d714d19] Updated VIF entry in instance network info cache for port 8c4bb9ee-0205-4da0-8ec4-d483b70d9fb8. 
{{(pid=67169) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 591.628286] env[67169]: DEBUG nova.network.neutron [req-55d0dc4f-c27a-4244-898b-f33a53dea7b4 req-cccec276-1f64-4598-849d-e873fd052d16 service nova] [instance: 11e90c91-26ca-4397-81a4-975a1d714d19] Updating instance_info_cache with network_info: [{"id": "8c4bb9ee-0205-4da0-8ec4-d483b70d9fb8", "address": "fa:16:3e:7e:9c:35", "network": {"id": "bce6a929-07ab-4b0f-b086-50bdaa278431", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-1367241572-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "f6f398d7f457401bae611864b041480b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "a4b6ddb2-2e19-4031-9b22-add90d41a114", "external-id": "nsx-vlan-transportzone-921", "segmentation_id": 921, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap8c4bb9ee-02", "ovs_interfaceid": "8c4bb9ee-0205-4da0-8ec4-d483b70d9fb8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 591.641236] env[67169]: DEBUG oslo_concurrency.lockutils [req-55d0dc4f-c27a-4244-898b-f33a53dea7b4 req-cccec276-1f64-4598-849d-e873fd052d16 service nova] Releasing lock "refresh_cache-11e90c91-26ca-4397-81a4-975a1d714d19" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 591.641493] env[67169]: DEBUG nova.compute.manager 
[req-55d0dc4f-c27a-4244-898b-f33a53dea7b4 req-cccec276-1f64-4598-849d-e873fd052d16 service nova] [instance: 835bf8da-8d8f-4dfd-b0a9-fab02796f39e] Received event network-vif-plugged-51962410-9c63-4dd9-bd3f-7bd3f0d51122 {{(pid=67169) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 591.641711] env[67169]: DEBUG oslo_concurrency.lockutils [req-55d0dc4f-c27a-4244-898b-f33a53dea7b4 req-cccec276-1f64-4598-849d-e873fd052d16 service nova] Acquiring lock "835bf8da-8d8f-4dfd-b0a9-fab02796f39e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 591.641994] env[67169]: DEBUG oslo_concurrency.lockutils [req-55d0dc4f-c27a-4244-898b-f33a53dea7b4 req-cccec276-1f64-4598-849d-e873fd052d16 service nova] Lock "835bf8da-8d8f-4dfd-b0a9-fab02796f39e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 591.642161] env[67169]: DEBUG oslo_concurrency.lockutils [req-55d0dc4f-c27a-4244-898b-f33a53dea7b4 req-cccec276-1f64-4598-849d-e873fd052d16 service nova] Lock "835bf8da-8d8f-4dfd-b0a9-fab02796f39e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 591.642329] env[67169]: DEBUG nova.compute.manager [req-55d0dc4f-c27a-4244-898b-f33a53dea7b4 req-cccec276-1f64-4598-849d-e873fd052d16 service nova] [instance: 835bf8da-8d8f-4dfd-b0a9-fab02796f39e] No waiting events found dispatching network-vif-plugged-51962410-9c63-4dd9-bd3f-7bd3f0d51122 {{(pid=67169) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 591.642498] env[67169]: WARNING nova.compute.manager [req-55d0dc4f-c27a-4244-898b-f33a53dea7b4 
req-cccec276-1f64-4598-849d-e873fd052d16 service nova] [instance: 835bf8da-8d8f-4dfd-b0a9-fab02796f39e] Received unexpected event network-vif-plugged-51962410-9c63-4dd9-bd3f-7bd3f0d51122 for instance with vm_state building and task_state spawning. [ 591.995266] env[67169]: DEBUG oslo_concurrency.lockutils [None req-91fc429c-9045-417c-aff0-aeb3f0fac0f7 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Releasing lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 591.995266] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-91fc429c-9045-417c-aff0-aeb3f0fac0f7 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: 85978a3b-052a-4a05-84e6-75c723d49bd8] Processing image 285931c9-8b83-4997-8c4d-6a79005e36ba {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 591.995266] env[67169]: DEBUG oslo_concurrency.lockutils [None req-91fc429c-9045-417c-aff0-aeb3f0fac0f7 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 594.166950] env[67169]: DEBUG oslo_concurrency.lockutils [None req-efa53bcc-a9ef-46a2-ad67-84fb4c5f88c0 tempest-ServerDiagnosticsV248Test-1196521581 tempest-ServerDiagnosticsV248Test-1196521581-project-member] Acquiring lock "e2e52693-153a-43dd-b786-dd0758caabe2" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 594.167294] env[67169]: DEBUG 
oslo_concurrency.lockutils [None req-efa53bcc-a9ef-46a2-ad67-84fb4c5f88c0 tempest-ServerDiagnosticsV248Test-1196521581 tempest-ServerDiagnosticsV248Test-1196521581-project-member] Lock "e2e52693-153a-43dd-b786-dd0758caabe2" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 594.181924] env[67169]: DEBUG nova.compute.manager [None req-efa53bcc-a9ef-46a2-ad67-84fb4c5f88c0 tempest-ServerDiagnosticsV248Test-1196521581 tempest-ServerDiagnosticsV248Test-1196521581-project-member] [instance: e2e52693-153a-43dd-b786-dd0758caabe2] Starting instance... {{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 594.254113] env[67169]: DEBUG oslo_concurrency.lockutils [None req-efa53bcc-a9ef-46a2-ad67-84fb4c5f88c0 tempest-ServerDiagnosticsV248Test-1196521581 tempest-ServerDiagnosticsV248Test-1196521581-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 594.254113] env[67169]: DEBUG oslo_concurrency.lockutils [None req-efa53bcc-a9ef-46a2-ad67-84fb4c5f88c0 tempest-ServerDiagnosticsV248Test-1196521581 tempest-ServerDiagnosticsV248Test-1196521581-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 594.254113] env[67169]: INFO nova.compute.claims [None req-efa53bcc-a9ef-46a2-ad67-84fb4c5f88c0 tempest-ServerDiagnosticsV248Test-1196521581 tempest-ServerDiagnosticsV248Test-1196521581-project-member] [instance: e2e52693-153a-43dd-b786-dd0758caabe2] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 
594.298422] env[67169]: DEBUG nova.compute.manager [req-53c33ee2-eba8-465d-a3fd-3bbeeb223f20 req-3fb40696-f5ad-4d49-b3df-857acd85f11f service nova] [instance: 835bf8da-8d8f-4dfd-b0a9-fab02796f39e] Received event network-changed-51962410-9c63-4dd9-bd3f-7bd3f0d51122 {{(pid=67169) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 594.298648] env[67169]: DEBUG nova.compute.manager [req-53c33ee2-eba8-465d-a3fd-3bbeeb223f20 req-3fb40696-f5ad-4d49-b3df-857acd85f11f service nova] [instance: 835bf8da-8d8f-4dfd-b0a9-fab02796f39e] Refreshing instance network info cache due to event network-changed-51962410-9c63-4dd9-bd3f-7bd3f0d51122. {{(pid=67169) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 594.298920] env[67169]: DEBUG oslo_concurrency.lockutils [req-53c33ee2-eba8-465d-a3fd-3bbeeb223f20 req-3fb40696-f5ad-4d49-b3df-857acd85f11f service nova] Acquiring lock "refresh_cache-835bf8da-8d8f-4dfd-b0a9-fab02796f39e" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 594.299073] env[67169]: DEBUG oslo_concurrency.lockutils [req-53c33ee2-eba8-465d-a3fd-3bbeeb223f20 req-3fb40696-f5ad-4d49-b3df-857acd85f11f service nova] Acquired lock "refresh_cache-835bf8da-8d8f-4dfd-b0a9-fab02796f39e" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 594.299237] env[67169]: DEBUG nova.network.neutron [req-53c33ee2-eba8-465d-a3fd-3bbeeb223f20 req-3fb40696-f5ad-4d49-b3df-857acd85f11f service nova] [instance: 835bf8da-8d8f-4dfd-b0a9-fab02796f39e] Refreshing network info cache for port 51962410-9c63-4dd9-bd3f-7bd3f0d51122 {{(pid=67169) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 594.469638] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2df33468-3b8f-4568-a068-d070079236bb {{(pid=67169) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 594.478128] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d4e8c07b-aea9-479d-a342-26fa0565f6e6 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 594.510033] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f771e440-8d4c-4bdc-a1ba-a0fed78f16fe {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 594.517798] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-31f843a7-d2ec-495e-84a5-05e2f53729cd {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 594.533288] env[67169]: DEBUG nova.compute.provider_tree [None req-efa53bcc-a9ef-46a2-ad67-84fb4c5f88c0 tempest-ServerDiagnosticsV248Test-1196521581 tempest-ServerDiagnosticsV248Test-1196521581-project-member] Inventory has not changed in ProviderTree for provider: 6570906a-ac37-4859-b1f2-4bbacc48d3f3 {{(pid=67169) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 594.543343] env[67169]: DEBUG nova.scheduler.client.report [None req-efa53bcc-a9ef-46a2-ad67-84fb4c5f88c0 tempest-ServerDiagnosticsV248Test-1196521581 tempest-ServerDiagnosticsV248Test-1196521581-project-member] Inventory has not changed for provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 171, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67169) set_inventory_for_provider 
/opt/stack/nova/nova/scheduler/client/report.py:940}} [ 594.559259] env[67169]: DEBUG oslo_concurrency.lockutils [None req-efa53bcc-a9ef-46a2-ad67-84fb4c5f88c0 tempest-ServerDiagnosticsV248Test-1196521581 tempest-ServerDiagnosticsV248Test-1196521581-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.308s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 594.559765] env[67169]: DEBUG nova.compute.manager [None req-efa53bcc-a9ef-46a2-ad67-84fb4c5f88c0 tempest-ServerDiagnosticsV248Test-1196521581 tempest-ServerDiagnosticsV248Test-1196521581-project-member] [instance: e2e52693-153a-43dd-b786-dd0758caabe2] Start building networks asynchronously for instance. {{(pid=67169) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 594.614020] env[67169]: DEBUG nova.compute.utils [None req-efa53bcc-a9ef-46a2-ad67-84fb4c5f88c0 tempest-ServerDiagnosticsV248Test-1196521581 tempest-ServerDiagnosticsV248Test-1196521581-project-member] Using /dev/sd instead of None {{(pid=67169) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 594.616096] env[67169]: DEBUG nova.compute.manager [None req-efa53bcc-a9ef-46a2-ad67-84fb4c5f88c0 tempest-ServerDiagnosticsV248Test-1196521581 tempest-ServerDiagnosticsV248Test-1196521581-project-member] [instance: e2e52693-153a-43dd-b786-dd0758caabe2] Not allocating networking since 'none' was specified. {{(pid=67169) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 594.630954] env[67169]: DEBUG nova.compute.manager [None req-efa53bcc-a9ef-46a2-ad67-84fb4c5f88c0 tempest-ServerDiagnosticsV248Test-1196521581 tempest-ServerDiagnosticsV248Test-1196521581-project-member] [instance: e2e52693-153a-43dd-b786-dd0758caabe2] Start building block device mappings for instance. 
{{(pid=67169) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 594.734302] env[67169]: DEBUG nova.compute.manager [None req-efa53bcc-a9ef-46a2-ad67-84fb4c5f88c0 tempest-ServerDiagnosticsV248Test-1196521581 tempest-ServerDiagnosticsV248Test-1196521581-project-member] [instance: e2e52693-153a-43dd-b786-dd0758caabe2] Start spawning the instance on the hypervisor. {{(pid=67169) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 594.769989] env[67169]: DEBUG nova.virt.hardware [None req-efa53bcc-a9ef-46a2-ad67-84fb4c5f88c0 tempest-ServerDiagnosticsV248Test-1196521581 tempest-ServerDiagnosticsV248Test-1196521581-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-03-06T19:55:10Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-03-06T19:54:55Z,direct_url=,disk_format='vmdk',id=285931c9-8b83-4997-8c4d-6a79005e36ba,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c31f6504bb73492890b262ff43fdf9bc',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-03-06T19:54:55Z,virtual_size=,visibility=), allow threads: False {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 594.770350] env[67169]: DEBUG nova.virt.hardware [None req-efa53bcc-a9ef-46a2-ad67-84fb4c5f88c0 tempest-ServerDiagnosticsV248Test-1196521581 tempest-ServerDiagnosticsV248Test-1196521581-project-member] Flavor limits 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 594.771580] env[67169]: DEBUG nova.virt.hardware [None req-efa53bcc-a9ef-46a2-ad67-84fb4c5f88c0 
tempest-ServerDiagnosticsV248Test-1196521581 tempest-ServerDiagnosticsV248Test-1196521581-project-member] Image limits 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 594.772155] env[67169]: DEBUG nova.virt.hardware [None req-efa53bcc-a9ef-46a2-ad67-84fb4c5f88c0 tempest-ServerDiagnosticsV248Test-1196521581 tempest-ServerDiagnosticsV248Test-1196521581-project-member] Flavor pref 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 594.772155] env[67169]: DEBUG nova.virt.hardware [None req-efa53bcc-a9ef-46a2-ad67-84fb4c5f88c0 tempest-ServerDiagnosticsV248Test-1196521581 tempest-ServerDiagnosticsV248Test-1196521581-project-member] Image pref 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 594.772155] env[67169]: DEBUG nova.virt.hardware [None req-efa53bcc-a9ef-46a2-ad67-84fb4c5f88c0 tempest-ServerDiagnosticsV248Test-1196521581 tempest-ServerDiagnosticsV248Test-1196521581-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 594.772862] env[67169]: DEBUG nova.virt.hardware [None req-efa53bcc-a9ef-46a2-ad67-84fb4c5f88c0 tempest-ServerDiagnosticsV248Test-1196521581 tempest-ServerDiagnosticsV248Test-1196521581-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 594.773100] env[67169]: DEBUG nova.virt.hardware [None req-efa53bcc-a9ef-46a2-ad67-84fb4c5f88c0 tempest-ServerDiagnosticsV248Test-1196521581 tempest-ServerDiagnosticsV248Test-1196521581-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67169) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 594.773267] env[67169]: DEBUG 
nova.virt.hardware [None req-efa53bcc-a9ef-46a2-ad67-84fb4c5f88c0 tempest-ServerDiagnosticsV248Test-1196521581 tempest-ServerDiagnosticsV248Test-1196521581-project-member] Got 1 possible topologies {{(pid=67169) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 594.773431] env[67169]: DEBUG nova.virt.hardware [None req-efa53bcc-a9ef-46a2-ad67-84fb4c5f88c0 tempest-ServerDiagnosticsV248Test-1196521581 tempest-ServerDiagnosticsV248Test-1196521581-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 594.773605] env[67169]: DEBUG nova.virt.hardware [None req-efa53bcc-a9ef-46a2-ad67-84fb4c5f88c0 tempest-ServerDiagnosticsV248Test-1196521581 tempest-ServerDiagnosticsV248Test-1196521581-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 594.777637] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-10657b47-daaf-48ca-8de6-ce6b6c657574 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 594.786245] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dcc375b5-7989-46ba-b7db-f567eb7ff8d5 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 594.808264] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-efa53bcc-a9ef-46a2-ad67-84fb4c5f88c0 tempest-ServerDiagnosticsV248Test-1196521581 tempest-ServerDiagnosticsV248Test-1196521581-project-member] [instance: e2e52693-153a-43dd-b786-dd0758caabe2] Instance VIF info [] {{(pid=67169) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 594.814245] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [None 
req-efa53bcc-a9ef-46a2-ad67-84fb4c5f88c0 tempest-ServerDiagnosticsV248Test-1196521581 tempest-ServerDiagnosticsV248Test-1196521581-project-member] Creating folder: Project (ae1539dd7099411db426b3fc34954e98). Parent ref: group-v566843. {{(pid=67169) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 594.814912] env[67169]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-8586db09-7d8d-475e-aed6-cd975aa52f68 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 594.824268] env[67169]: INFO nova.virt.vmwareapi.vm_util [None req-efa53bcc-a9ef-46a2-ad67-84fb4c5f88c0 tempest-ServerDiagnosticsV248Test-1196521581 tempest-ServerDiagnosticsV248Test-1196521581-project-member] Created folder: Project (ae1539dd7099411db426b3fc34954e98) in parent group-v566843. [ 594.826221] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [None req-efa53bcc-a9ef-46a2-ad67-84fb4c5f88c0 tempest-ServerDiagnosticsV248Test-1196521581 tempest-ServerDiagnosticsV248Test-1196521581-project-member] Creating folder: Instances. Parent ref: group-v566871. {{(pid=67169) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 594.826221] env[67169]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-8139791e-d9bd-47e5-b715-0c95ac13423b {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 594.836418] env[67169]: INFO nova.virt.vmwareapi.vm_util [None req-efa53bcc-a9ef-46a2-ad67-84fb4c5f88c0 tempest-ServerDiagnosticsV248Test-1196521581 tempest-ServerDiagnosticsV248Test-1196521581-project-member] Created folder: Instances in parent group-v566871. 
[ 594.837415] env[67169]: DEBUG oslo.service.loopingcall [None req-efa53bcc-a9ef-46a2-ad67-84fb4c5f88c0 tempest-ServerDiagnosticsV248Test-1196521581 tempest-ServerDiagnosticsV248Test-1196521581-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67169) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 594.837415] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: e2e52693-153a-43dd-b786-dd0758caabe2] Creating VM on the ESX host {{(pid=67169) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 594.837542] env[67169]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-8a22d8d2-b7b9-43aa-b32f-a140749d3798 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 594.857082] env[67169]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 594.857082] env[67169]: value = "task-2819075" [ 594.857082] env[67169]: _type = "Task" [ 594.857082] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 594.866155] env[67169]: DEBUG oslo_vmware.api [-] Task: {'id': task-2819075, 'name': CreateVM_Task} progress is 0%. 
{{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 595.046168] env[67169]: DEBUG oslo_concurrency.lockutils [None req-1d27fda4-25e5-4bf7-a655-5b02544fb343 tempest-ServersAdminTestJSON-608063042 tempest-ServersAdminTestJSON-608063042-project-member] Acquiring lock "84577fe1-6a7f-4f1e-a262-0ea7c0576cc4" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 595.046668] env[67169]: DEBUG oslo_concurrency.lockutils [None req-1d27fda4-25e5-4bf7-a655-5b02544fb343 tempest-ServersAdminTestJSON-608063042 tempest-ServersAdminTestJSON-608063042-project-member] Lock "84577fe1-6a7f-4f1e-a262-0ea7c0576cc4" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 595.143065] env[67169]: DEBUG nova.network.neutron [req-53c33ee2-eba8-465d-a3fd-3bbeeb223f20 req-3fb40696-f5ad-4d49-b3df-857acd85f11f service nova] [instance: 835bf8da-8d8f-4dfd-b0a9-fab02796f39e] Updated VIF entry in instance network info cache for port 51962410-9c63-4dd9-bd3f-7bd3f0d51122. 
{{(pid=67169) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 595.143065] env[67169]: DEBUG nova.network.neutron [req-53c33ee2-eba8-465d-a3fd-3bbeeb223f20 req-3fb40696-f5ad-4d49-b3df-857acd85f11f service nova] [instance: 835bf8da-8d8f-4dfd-b0a9-fab02796f39e] Updating instance_info_cache with network_info: [{"id": "51962410-9c63-4dd9-bd3f-7bd3f0d51122", "address": "fa:16:3e:71:fe:88", "network": {"id": "17df7149-0d5d-49b5-b846-c2f7cd59a89a", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-661207153-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "f87ea93e96194031b75ed324e0acc94d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f9be6786-e9a7-4138-b7b5-b7696f6cb1e1", "external-id": "nsx-vlan-transportzone-626", "segmentation_id": 626, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap51962410-9c", "ovs_interfaceid": "51962410-9c63-4dd9-bd3f-7bd3f0d51122", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 595.159419] env[67169]: DEBUG oslo_concurrency.lockutils [req-53c33ee2-eba8-465d-a3fd-3bbeeb223f20 req-3fb40696-f5ad-4d49-b3df-857acd85f11f service nova] Releasing lock "refresh_cache-835bf8da-8d8f-4dfd-b0a9-fab02796f39e" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 595.159635] env[67169]: DEBUG 
nova.compute.manager [req-53c33ee2-eba8-465d-a3fd-3bbeeb223f20 req-3fb40696-f5ad-4d49-b3df-857acd85f11f service nova] [instance: 85978a3b-052a-4a05-84e6-75c723d49bd8] Received event network-vif-plugged-1d1a691a-2602-4187-9819-c1ea4583b421 {{(pid=67169) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 595.159700] env[67169]: DEBUG oslo_concurrency.lockutils [req-53c33ee2-eba8-465d-a3fd-3bbeeb223f20 req-3fb40696-f5ad-4d49-b3df-857acd85f11f service nova] Acquiring lock "85978a3b-052a-4a05-84e6-75c723d49bd8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 595.159842] env[67169]: DEBUG oslo_concurrency.lockutils [req-53c33ee2-eba8-465d-a3fd-3bbeeb223f20 req-3fb40696-f5ad-4d49-b3df-857acd85f11f service nova] Lock "85978a3b-052a-4a05-84e6-75c723d49bd8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 595.160131] env[67169]: DEBUG oslo_concurrency.lockutils [req-53c33ee2-eba8-465d-a3fd-3bbeeb223f20 req-3fb40696-f5ad-4d49-b3df-857acd85f11f service nova] Lock "85978a3b-052a-4a05-84e6-75c723d49bd8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 595.160330] env[67169]: DEBUG nova.compute.manager [req-53c33ee2-eba8-465d-a3fd-3bbeeb223f20 req-3fb40696-f5ad-4d49-b3df-857acd85f11f service nova] [instance: 85978a3b-052a-4a05-84e6-75c723d49bd8] No waiting events found dispatching network-vif-plugged-1d1a691a-2602-4187-9819-c1ea4583b421 {{(pid=67169) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 595.160494] env[67169]: WARNING nova.compute.manager 
[req-53c33ee2-eba8-465d-a3fd-3bbeeb223f20 req-3fb40696-f5ad-4d49-b3df-857acd85f11f service nova] [instance: 85978a3b-052a-4a05-84e6-75c723d49bd8] Received unexpected event network-vif-plugged-1d1a691a-2602-4187-9819-c1ea4583b421 for instance with vm_state building and task_state spawning. [ 595.165202] env[67169]: DEBUG nova.compute.manager [req-53c33ee2-eba8-465d-a3fd-3bbeeb223f20 req-3fb40696-f5ad-4d49-b3df-857acd85f11f service nova] [instance: 85978a3b-052a-4a05-84e6-75c723d49bd8] Received event network-changed-1d1a691a-2602-4187-9819-c1ea4583b421 {{(pid=67169) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 595.165452] env[67169]: DEBUG nova.compute.manager [req-53c33ee2-eba8-465d-a3fd-3bbeeb223f20 req-3fb40696-f5ad-4d49-b3df-857acd85f11f service nova] [instance: 85978a3b-052a-4a05-84e6-75c723d49bd8] Refreshing instance network info cache due to event network-changed-1d1a691a-2602-4187-9819-c1ea4583b421. {{(pid=67169) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 595.165701] env[67169]: DEBUG oslo_concurrency.lockutils [req-53c33ee2-eba8-465d-a3fd-3bbeeb223f20 req-3fb40696-f5ad-4d49-b3df-857acd85f11f service nova] Acquiring lock "refresh_cache-85978a3b-052a-4a05-84e6-75c723d49bd8" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 595.165807] env[67169]: DEBUG oslo_concurrency.lockutils [req-53c33ee2-eba8-465d-a3fd-3bbeeb223f20 req-3fb40696-f5ad-4d49-b3df-857acd85f11f service nova] Acquired lock "refresh_cache-85978a3b-052a-4a05-84e6-75c723d49bd8" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 595.166287] env[67169]: DEBUG nova.network.neutron [req-53c33ee2-eba8-465d-a3fd-3bbeeb223f20 req-3fb40696-f5ad-4d49-b3df-857acd85f11f service nova] [instance: 85978a3b-052a-4a05-84e6-75c723d49bd8] Refreshing network info cache for port 1d1a691a-2602-4187-9819-c1ea4583b421 {{(pid=67169) 
_get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 595.368523] env[67169]: DEBUG oslo_vmware.api [-] Task: {'id': task-2819075, 'name': CreateVM_Task, 'duration_secs': 0.414129} completed successfully. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 595.368523] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: e2e52693-153a-43dd-b786-dd0758caabe2] Created VM on the ESX host {{(pid=67169) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 595.369225] env[67169]: DEBUG oslo_concurrency.lockutils [None req-efa53bcc-a9ef-46a2-ad67-84fb4c5f88c0 tempest-ServerDiagnosticsV248Test-1196521581 tempest-ServerDiagnosticsV248Test-1196521581-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 595.369225] env[67169]: DEBUG oslo_concurrency.lockutils [None req-efa53bcc-a9ef-46a2-ad67-84fb4c5f88c0 tempest-ServerDiagnosticsV248Test-1196521581 tempest-ServerDiagnosticsV248Test-1196521581-project-member] Acquired lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 595.369419] env[67169]: DEBUG oslo_concurrency.lockutils [None req-efa53bcc-a9ef-46a2-ad67-84fb4c5f88c0 tempest-ServerDiagnosticsV248Test-1196521581 tempest-ServerDiagnosticsV248Test-1196521581-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 595.369700] env[67169]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-7059fb92-a633-42bd-a69e-3292d5ace798 {{(pid=67169) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 595.374411] env[67169]: DEBUG oslo_vmware.api [None req-efa53bcc-a9ef-46a2-ad67-84fb4c5f88c0 tempest-ServerDiagnosticsV248Test-1196521581 tempest-ServerDiagnosticsV248Test-1196521581-project-member] Waiting for the task: (returnval){ [ 595.374411] env[67169]: value = "session[52518c09-e5f2-035d-4dc1-3e29ddf94015]52569b06-c177-ea0b-f3e9-8f1d1b872e53" [ 595.374411] env[67169]: _type = "Task" [ 595.374411] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 595.388103] env[67169]: DEBUG oslo_vmware.api [None req-efa53bcc-a9ef-46a2-ad67-84fb4c5f88c0 tempest-ServerDiagnosticsV248Test-1196521581 tempest-ServerDiagnosticsV248Test-1196521581-project-member] Task: {'id': session[52518c09-e5f2-035d-4dc1-3e29ddf94015]52569b06-c177-ea0b-f3e9-8f1d1b872e53, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 595.890952] env[67169]: DEBUG oslo_concurrency.lockutils [None req-efa53bcc-a9ef-46a2-ad67-84fb4c5f88c0 tempest-ServerDiagnosticsV248Test-1196521581 tempest-ServerDiagnosticsV248Test-1196521581-project-member] Releasing lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 595.891443] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-efa53bcc-a9ef-46a2-ad67-84fb4c5f88c0 tempest-ServerDiagnosticsV248Test-1196521581 tempest-ServerDiagnosticsV248Test-1196521581-project-member] [instance: e2e52693-153a-43dd-b786-dd0758caabe2] Processing image 285931c9-8b83-4997-8c4d-6a79005e36ba {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 595.891948] env[67169]: DEBUG oslo_concurrency.lockutils [None req-efa53bcc-a9ef-46a2-ad67-84fb4c5f88c0 
tempest-ServerDiagnosticsV248Test-1196521581 tempest-ServerDiagnosticsV248Test-1196521581-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 596.910536] env[67169]: DEBUG nova.network.neutron [req-53c33ee2-eba8-465d-a3fd-3bbeeb223f20 req-3fb40696-f5ad-4d49-b3df-857acd85f11f service nova] [instance: 85978a3b-052a-4a05-84e6-75c723d49bd8] Updated VIF entry in instance network info cache for port 1d1a691a-2602-4187-9819-c1ea4583b421. {{(pid=67169) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 596.912368] env[67169]: DEBUG nova.network.neutron [req-53c33ee2-eba8-465d-a3fd-3bbeeb223f20 req-3fb40696-f5ad-4d49-b3df-857acd85f11f service nova] [instance: 85978a3b-052a-4a05-84e6-75c723d49bd8] Updating instance_info_cache with network_info: [{"id": "1d1a691a-2602-4187-9819-c1ea4583b421", "address": "fa:16:3e:33:2d:f7", "network": {"id": "05c41aa5-dcb7-46fa-ba23-2f4b7685b6a9", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1740060268-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "b1162bad4f2e4722aed4ff2c657e9dc9", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "24210a23-d8ac-4f4f-84ac-dc0636de9a72", "external-id": "nsx-vlan-transportzone-257", "segmentation_id": 257, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1d1a691a-26", "ovs_interfaceid": "1d1a691a-2602-4187-9819-c1ea4583b421", 
"qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 596.923258] env[67169]: DEBUG oslo_concurrency.lockutils [req-53c33ee2-eba8-465d-a3fd-3bbeeb223f20 req-3fb40696-f5ad-4d49-b3df-857acd85f11f service nova] Releasing lock "refresh_cache-85978a3b-052a-4a05-84e6-75c723d49bd8" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 597.606734] env[67169]: DEBUG oslo_concurrency.lockutils [None req-f4bfbd84-193f-4b41-bf8e-c81aff254f89 tempest-VolumesAssistedSnapshotsTest-1676378715 tempest-VolumesAssistedSnapshotsTest-1676378715-project-member] Acquiring lock "28552f70-695d-40cc-8dfa-bf40d6113220" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 597.607219] env[67169]: DEBUG oslo_concurrency.lockutils [None req-f4bfbd84-193f-4b41-bf8e-c81aff254f89 tempest-VolumesAssistedSnapshotsTest-1676378715 tempest-VolumesAssistedSnapshotsTest-1676378715-project-member] Lock "28552f70-695d-40cc-8dfa-bf40d6113220" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 598.293362] env[67169]: DEBUG oslo_concurrency.lockutils [None req-97f5b735-959c-46f5-9133-f471ad4178d8 tempest-MigrationsAdminTest-2069277652 tempest-MigrationsAdminTest-2069277652-project-member] Acquiring lock "7a42aeb9-0518-448d-a3a6-8e68d6497922" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67169) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 598.293723] env[67169]: DEBUG oslo_concurrency.lockutils [None req-97f5b735-959c-46f5-9133-f471ad4178d8 tempest-MigrationsAdminTest-2069277652 tempest-MigrationsAdminTest-2069277652-project-member] Lock "7a42aeb9-0518-448d-a3a6-8e68d6497922" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 598.715334] env[67169]: DEBUG oslo_concurrency.lockutils [None req-8ae57ef2-fe50-47ae-b868-f7b19963d43c tempest-ServersWithSpecificFlavorTestJSON-890966063 tempest-ServersWithSpecificFlavorTestJSON-890966063-project-member] Acquiring lock "43b73a7c-eda8-4239-885f-d4fb8fa6f28a" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 598.715334] env[67169]: DEBUG oslo_concurrency.lockutils [None req-8ae57ef2-fe50-47ae-b868-f7b19963d43c tempest-ServersWithSpecificFlavorTestJSON-890966063 tempest-ServersWithSpecificFlavorTestJSON-890966063-project-member] Lock "43b73a7c-eda8-4239-885f-d4fb8fa6f28a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 600.350452] env[67169]: DEBUG oslo_concurrency.lockutils [None req-421b8c1a-27a4-4ab8-86b0-47b05f78ea77 tempest-ServersAdminTestJSON-608063042 tempest-ServersAdminTestJSON-608063042-project-member] Acquiring lock "3ac7cc70-7667-43b0-a3b8-0c791ef7ccd2" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67169) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 600.350760] env[67169]: DEBUG oslo_concurrency.lockutils [None req-421b8c1a-27a4-4ab8-86b0-47b05f78ea77 tempest-ServersAdminTestJSON-608063042 tempest-ServersAdminTestJSON-608063042-project-member] Lock "3ac7cc70-7667-43b0-a3b8-0c791ef7ccd2" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 600.931786] env[67169]: DEBUG oslo_concurrency.lockutils [None req-b97c6bbc-2563-48b6-bf6d-a435a705c13f tempest-ServerActionsTestJSON-1763696561 tempest-ServerActionsTestJSON-1763696561-project-member] Acquiring lock "485cf92f-cf20-4f94-8a18-1a82501a829f" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 600.932037] env[67169]: DEBUG oslo_concurrency.lockutils [None req-b97c6bbc-2563-48b6-bf6d-a435a705c13f tempest-ServerActionsTestJSON-1763696561 tempest-ServerActionsTestJSON-1763696561-project-member] Lock "485cf92f-cf20-4f94-8a18-1a82501a829f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 603.127171] env[67169]: DEBUG oslo_concurrency.lockutils [None req-5838ee6b-fce2-451c-88ae-bf4e0492c429 tempest-InstanceActionsV221TestJSON-96340564 tempest-InstanceActionsV221TestJSON-96340564-project-member] Acquiring lock "a6427d1b-e915-4e3a-a4dd-6758fde2bc56" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 603.127421] env[67169]: 
DEBUG oslo_concurrency.lockutils [None req-5838ee6b-fce2-451c-88ae-bf4e0492c429 tempest-InstanceActionsV221TestJSON-96340564 tempest-InstanceActionsV221TestJSON-96340564-project-member] Lock "a6427d1b-e915-4e3a-a4dd-6758fde2bc56" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 605.502190] env[67169]: DEBUG oslo_concurrency.lockutils [None req-ca250af0-c6e9-4d98-90c3-c0b84601a3ce tempest-ServerActionsTestOtherA-909278241 tempest-ServerActionsTestOtherA-909278241-project-member] Acquiring lock "bee054d2-ac3c-47cc-a946-90bebf23f925" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 605.502554] env[67169]: DEBUG oslo_concurrency.lockutils [None req-ca250af0-c6e9-4d98-90c3-c0b84601a3ce tempest-ServerActionsTestOtherA-909278241 tempest-ServerActionsTestOtherA-909278241-project-member] Lock "bee054d2-ac3c-47cc-a946-90bebf23f925" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 606.446719] env[67169]: DEBUG oslo_concurrency.lockutils [None req-2f6d228f-1abd-4d97-91fe-a0b7fbfa7070 tempest-ServerDiagnosticsTest-1296817303 tempest-ServerDiagnosticsTest-1296817303-project-member] Acquiring lock "e7195659-b834-49e4-a9bd-6b2b7c7d4a20" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 606.446964] env[67169]: DEBUG oslo_concurrency.lockutils [None req-2f6d228f-1abd-4d97-91fe-a0b7fbfa7070 
tempest-ServerDiagnosticsTest-1296817303 tempest-ServerDiagnosticsTest-1296817303-project-member] Lock "e7195659-b834-49e4-a9bd-6b2b7c7d4a20" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 606.760909] env[67169]: DEBUG oslo_concurrency.lockutils [None req-b1b25f70-abf0-437d-a7db-a2d66ddc73cb tempest-ServerShowV247Test-1484429168 tempest-ServerShowV247Test-1484429168-project-member] Acquiring lock "f001b21d-59fb-4a9e-9c28-7c15892facfa" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 606.761232] env[67169]: DEBUG oslo_concurrency.lockutils [None req-b1b25f70-abf0-437d-a7db-a2d66ddc73cb tempest-ServerShowV247Test-1484429168 tempest-ServerShowV247Test-1484429168-project-member] Lock "f001b21d-59fb-4a9e-9c28-7c15892facfa" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 607.381866] env[67169]: DEBUG oslo_concurrency.lockutils [None req-18fa1f7c-17e9-4b29-85ee-a4ba563bb86f tempest-ServerDiagnosticsNegativeTest-1101645295 tempest-ServerDiagnosticsNegativeTest-1101645295-project-member] Acquiring lock "a7165020-4b1d-44e5-83d9-53eafbef74e7" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 607.382135] env[67169]: DEBUG oslo_concurrency.lockutils [None req-18fa1f7c-17e9-4b29-85ee-a4ba563bb86f tempest-ServerDiagnosticsNegativeTest-1101645295 
tempest-ServerDiagnosticsNegativeTest-1101645295-project-member] Lock "a7165020-4b1d-44e5-83d9-53eafbef74e7" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 608.456608] env[67169]: DEBUG oslo_concurrency.lockutils [None req-6086110c-8031-47b1-b002-03e1428679d4 tempest-ServerShowV247Test-1484429168 tempest-ServerShowV247Test-1484429168-project-member] Acquiring lock "45ce5bfe-4a85-4c26-914e-b85478fc45a4" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 608.456976] env[67169]: DEBUG oslo_concurrency.lockutils [None req-6086110c-8031-47b1-b002-03e1428679d4 tempest-ServerShowV247Test-1484429168 tempest-ServerShowV247Test-1484429168-project-member] Lock "45ce5bfe-4a85-4c26-914e-b85478fc45a4" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 616.279725] env[67169]: DEBUG oslo_concurrency.lockutils [None req-9f3f0ffd-b8a1-48f6-873e-a0c5292157ed tempest-AttachVolumeShelveTestJSON-1479191573 tempest-AttachVolumeShelveTestJSON-1479191573-project-member] Acquiring lock "529ac98d-1e5c-4bcd-bb3d-7a7158e952cb" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 616.279725] env[67169]: DEBUG oslo_concurrency.lockutils [None req-9f3f0ffd-b8a1-48f6-873e-a0c5292157ed tempest-AttachVolumeShelveTestJSON-1479191573 tempest-AttachVolumeShelveTestJSON-1479191573-project-member] Lock 
"529ac98d-1e5c-4bcd-bb3d-7a7158e952cb" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 618.020704] env[67169]: DEBUG oslo_concurrency.lockutils [None req-4c32cf04-40f3-4f28-9587-80c7302bd097 tempest-ImagesOneServerTestJSON-1174559401 tempest-ImagesOneServerTestJSON-1174559401-project-member] Acquiring lock "774358a1-c887-497e-b2d8-59a7c10e2329" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 618.021476] env[67169]: DEBUG oslo_concurrency.lockutils [None req-4c32cf04-40f3-4f28-9587-80c7302bd097 tempest-ImagesOneServerTestJSON-1174559401 tempest-ImagesOneServerTestJSON-1174559401-project-member] Lock "774358a1-c887-497e-b2d8-59a7c10e2329" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 619.414978] env[67169]: DEBUG oslo_concurrency.lockutils [None req-bb146bb8-6c16-497f-804c-ba73c7d144b7 tempest-ListImageFiltersTestJSON-1430329864 tempest-ListImageFiltersTestJSON-1430329864-project-member] Acquiring lock "a7434fa2-563e-4f77-ba3e-40ad6bab0de3" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 619.415353] env[67169]: DEBUG oslo_concurrency.lockutils [None req-bb146bb8-6c16-497f-804c-ba73c7d144b7 tempest-ListImageFiltersTestJSON-1430329864 tempest-ListImageFiltersTestJSON-1430329864-project-member] Lock "a7434fa2-563e-4f77-ba3e-40ad6bab0de3" acquired by 
"nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 621.295815] env[67169]: DEBUG oslo_concurrency.lockutils [None req-e69cba99-3a03-4f76-b886-d9e0ce370610 tempest-ListImageFiltersTestJSON-1430329864 tempest-ListImageFiltersTestJSON-1430329864-project-member] Acquiring lock "fd4c9c56-1608-4390-8b41-736b3aa590ed" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 621.295815] env[67169]: DEBUG oslo_concurrency.lockutils [None req-e69cba99-3a03-4f76-b886-d9e0ce370610 tempest-ListImageFiltersTestJSON-1430329864 tempest-ListImageFiltersTestJSON-1430329864-project-member] Lock "fd4c9c56-1608-4390-8b41-736b3aa590ed" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 623.670157] env[67169]: WARNING oslo_vmware.rw_handles [None req-48298f54-4654-449a-bec3-0750de8541e2 tempest-ServersAdmin275Test-1422480121 tempest-ServersAdmin275Test-1422480121-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 623.670157] env[67169]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 623.670157] env[67169]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 623.670157] env[67169]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 623.670157] env[67169]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 623.670157] env[67169]: ERROR 
oslo_vmware.rw_handles response.begin() [ 623.670157] env[67169]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 623.670157] env[67169]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 623.670157] env[67169]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 623.670157] env[67169]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 623.670157] env[67169]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 623.670157] env[67169]: ERROR oslo_vmware.rw_handles [ 623.670157] env[67169]: DEBUG nova.virt.vmwareapi.images [None req-48298f54-4654-449a-bec3-0750de8541e2 tempest-ServersAdmin275Test-1422480121 tempest-ServersAdmin275Test-1422480121-project-member] [instance: 958ac621-d0c8-4c04-8a58-11ad0f3cf678] Downloaded image file data 285931c9-8b83-4997-8c4d-6a79005e36ba to vmware_temp/e7231a6a-b295-405c-a42d-6faa5d22e7b0/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk on the data store datastore2 {{(pid=67169) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 623.671127] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-48298f54-4654-449a-bec3-0750de8541e2 tempest-ServersAdmin275Test-1422480121 tempest-ServersAdmin275Test-1422480121-project-member] [instance: 958ac621-d0c8-4c04-8a58-11ad0f3cf678] Caching image {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 623.671127] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [None req-48298f54-4654-449a-bec3-0750de8541e2 tempest-ServersAdmin275Test-1422480121 tempest-ServersAdmin275Test-1422480121-project-member] Copying Virtual Disk [datastore2] vmware_temp/e7231a6a-b295-405c-a42d-6faa5d22e7b0/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk to [datastore2] 
vmware_temp/e7231a6a-b295-405c-a42d-6faa5d22e7b0/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk {{(pid=67169) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 623.671643] env[67169]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-529115a7-a3cc-42e2-90e3-c83f7eb5f527 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 623.684807] env[67169]: DEBUG oslo_vmware.api [None req-48298f54-4654-449a-bec3-0750de8541e2 tempest-ServersAdmin275Test-1422480121 tempest-ServersAdmin275Test-1422480121-project-member] Waiting for the task: (returnval){ [ 623.684807] env[67169]: value = "task-2819076" [ 623.684807] env[67169]: _type = "Task" [ 623.684807] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 623.697837] env[67169]: DEBUG oslo_vmware.api [None req-48298f54-4654-449a-bec3-0750de8541e2 tempest-ServersAdmin275Test-1422480121 tempest-ServersAdmin275Test-1422480121-project-member] Task: {'id': task-2819076, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 624.202892] env[67169]: DEBUG oslo_vmware.exceptions [None req-48298f54-4654-449a-bec3-0750de8541e2 tempest-ServersAdmin275Test-1422480121 tempest-ServersAdmin275Test-1422480121-project-member] Fault InvalidArgument not matched. 
{{(pid=67169) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 624.203193] env[67169]: DEBUG oslo_concurrency.lockutils [None req-48298f54-4654-449a-bec3-0750de8541e2 tempest-ServersAdmin275Test-1422480121 tempest-ServersAdmin275Test-1422480121-project-member] Releasing lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 624.206839] env[67169]: ERROR nova.compute.manager [None req-48298f54-4654-449a-bec3-0750de8541e2 tempest-ServersAdmin275Test-1422480121 tempest-ServersAdmin275Test-1422480121-project-member] [instance: 958ac621-d0c8-4c04-8a58-11ad0f3cf678] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 624.206839] env[67169]: Faults: ['InvalidArgument'] [ 624.206839] env[67169]: ERROR nova.compute.manager [instance: 958ac621-d0c8-4c04-8a58-11ad0f3cf678] Traceback (most recent call last): [ 624.206839] env[67169]: ERROR nova.compute.manager [instance: 958ac621-d0c8-4c04-8a58-11ad0f3cf678] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 624.206839] env[67169]: ERROR nova.compute.manager [instance: 958ac621-d0c8-4c04-8a58-11ad0f3cf678] yield resources [ 624.206839] env[67169]: ERROR nova.compute.manager [instance: 958ac621-d0c8-4c04-8a58-11ad0f3cf678] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 624.206839] env[67169]: ERROR nova.compute.manager [instance: 958ac621-d0c8-4c04-8a58-11ad0f3cf678] self.driver.spawn(context, instance, image_meta, [ 624.206839] env[67169]: ERROR nova.compute.manager [instance: 958ac621-d0c8-4c04-8a58-11ad0f3cf678] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 624.206839] env[67169]: ERROR nova.compute.manager [instance: 
958ac621-d0c8-4c04-8a58-11ad0f3cf678] self._vmops.spawn(context, instance, image_meta, injected_files, [ 624.206839] env[67169]: ERROR nova.compute.manager [instance: 958ac621-d0c8-4c04-8a58-11ad0f3cf678] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 624.206839] env[67169]: ERROR nova.compute.manager [instance: 958ac621-d0c8-4c04-8a58-11ad0f3cf678] self._fetch_image_if_missing(context, vi) [ 624.206839] env[67169]: ERROR nova.compute.manager [instance: 958ac621-d0c8-4c04-8a58-11ad0f3cf678] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 624.207270] env[67169]: ERROR nova.compute.manager [instance: 958ac621-d0c8-4c04-8a58-11ad0f3cf678] image_cache(vi, tmp_image_ds_loc) [ 624.207270] env[67169]: ERROR nova.compute.manager [instance: 958ac621-d0c8-4c04-8a58-11ad0f3cf678] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 624.207270] env[67169]: ERROR nova.compute.manager [instance: 958ac621-d0c8-4c04-8a58-11ad0f3cf678] vm_util.copy_virtual_disk( [ 624.207270] env[67169]: ERROR nova.compute.manager [instance: 958ac621-d0c8-4c04-8a58-11ad0f3cf678] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 624.207270] env[67169]: ERROR nova.compute.manager [instance: 958ac621-d0c8-4c04-8a58-11ad0f3cf678] session._wait_for_task(vmdk_copy_task) [ 624.207270] env[67169]: ERROR nova.compute.manager [instance: 958ac621-d0c8-4c04-8a58-11ad0f3cf678] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 624.207270] env[67169]: ERROR nova.compute.manager [instance: 958ac621-d0c8-4c04-8a58-11ad0f3cf678] return self.wait_for_task(task_ref) [ 624.207270] env[67169]: ERROR nova.compute.manager [instance: 958ac621-d0c8-4c04-8a58-11ad0f3cf678] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 624.207270] env[67169]: ERROR nova.compute.manager [instance: 
958ac621-d0c8-4c04-8a58-11ad0f3cf678] return evt.wait() [ 624.207270] env[67169]: ERROR nova.compute.manager [instance: 958ac621-d0c8-4c04-8a58-11ad0f3cf678] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 624.207270] env[67169]: ERROR nova.compute.manager [instance: 958ac621-d0c8-4c04-8a58-11ad0f3cf678] result = hub.switch() [ 624.207270] env[67169]: ERROR nova.compute.manager [instance: 958ac621-d0c8-4c04-8a58-11ad0f3cf678] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 624.207270] env[67169]: ERROR nova.compute.manager [instance: 958ac621-d0c8-4c04-8a58-11ad0f3cf678] return self.greenlet.switch() [ 624.207660] env[67169]: ERROR nova.compute.manager [instance: 958ac621-d0c8-4c04-8a58-11ad0f3cf678] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 624.207660] env[67169]: ERROR nova.compute.manager [instance: 958ac621-d0c8-4c04-8a58-11ad0f3cf678] self.f(*self.args, **self.kw) [ 624.207660] env[67169]: ERROR nova.compute.manager [instance: 958ac621-d0c8-4c04-8a58-11ad0f3cf678] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 624.207660] env[67169]: ERROR nova.compute.manager [instance: 958ac621-d0c8-4c04-8a58-11ad0f3cf678] raise exceptions.translate_fault(task_info.error) [ 624.207660] env[67169]: ERROR nova.compute.manager [instance: 958ac621-d0c8-4c04-8a58-11ad0f3cf678] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 624.207660] env[67169]: ERROR nova.compute.manager [instance: 958ac621-d0c8-4c04-8a58-11ad0f3cf678] Faults: ['InvalidArgument'] [ 624.207660] env[67169]: ERROR nova.compute.manager [instance: 958ac621-d0c8-4c04-8a58-11ad0f3cf678] [ 624.207660] env[67169]: INFO nova.compute.manager [None req-48298f54-4654-449a-bec3-0750de8541e2 tempest-ServersAdmin275Test-1422480121 
tempest-ServersAdmin275Test-1422480121-project-member] [instance: 958ac621-d0c8-4c04-8a58-11ad0f3cf678] Terminating instance [ 624.211165] env[67169]: DEBUG oslo_concurrency.lockutils [None req-1e04032d-0bdc-43df-a263-7859fab3c8e2 tempest-ServersAdminNegativeTestJSON-1443014733 tempest-ServersAdminNegativeTestJSON-1443014733-project-member] Acquired lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 624.211165] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-1e04032d-0bdc-43df-a263-7859fab3c8e2 tempest-ServersAdminNegativeTestJSON-1443014733 tempest-ServersAdminNegativeTestJSON-1443014733-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 624.211165] env[67169]: DEBUG oslo_concurrency.lockutils [None req-48298f54-4654-449a-bec3-0750de8541e2 tempest-ServersAdmin275Test-1422480121 tempest-ServersAdmin275Test-1422480121-project-member] Acquiring lock "refresh_cache-958ac621-d0c8-4c04-8a58-11ad0f3cf678" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 624.211165] env[67169]: DEBUG oslo_concurrency.lockutils [None req-48298f54-4654-449a-bec3-0750de8541e2 tempest-ServersAdmin275Test-1422480121 tempest-ServersAdmin275Test-1422480121-project-member] Acquired lock "refresh_cache-958ac621-d0c8-4c04-8a58-11ad0f3cf678" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 624.212834] env[67169]: DEBUG nova.network.neutron [None req-48298f54-4654-449a-bec3-0750de8541e2 tempest-ServersAdmin275Test-1422480121 tempest-ServersAdmin275Test-1422480121-project-member] [instance: 958ac621-d0c8-4c04-8a58-11ad0f3cf678] Building network info cache for instance {{(pid=67169) 
_get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 624.212834] env[67169]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-73fe89ee-504a-4a27-abfb-3a0c5fae4072 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 624.229796] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-1e04032d-0bdc-43df-a263-7859fab3c8e2 tempest-ServersAdminNegativeTestJSON-1443014733 tempest-ServersAdminNegativeTestJSON-1443014733-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 624.232171] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-1e04032d-0bdc-43df-a263-7859fab3c8e2 tempest-ServersAdminNegativeTestJSON-1443014733 tempest-ServersAdminNegativeTestJSON-1443014733-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=67169) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 624.237176] env[67169]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-c3a1ae6c-fca7-4067-9b6c-2d289468441c {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 624.242423] env[67169]: DEBUG oslo_vmware.api [None req-1e04032d-0bdc-43df-a263-7859fab3c8e2 tempest-ServersAdminNegativeTestJSON-1443014733 tempest-ServersAdminNegativeTestJSON-1443014733-project-member] Waiting for the task: (returnval){ [ 624.242423] env[67169]: value = "session[52518c09-e5f2-035d-4dc1-3e29ddf94015]52aeddaf-869d-0a8f-e550-ac80073ea32e" [ 624.242423] env[67169]: _type = "Task" [ 624.242423] env[67169]: } to complete. 
{{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 624.251565] env[67169]: DEBUG oslo_vmware.api [None req-1e04032d-0bdc-43df-a263-7859fab3c8e2 tempest-ServersAdminNegativeTestJSON-1443014733 tempest-ServersAdminNegativeTestJSON-1443014733-project-member] Task: {'id': session[52518c09-e5f2-035d-4dc1-3e29ddf94015]52aeddaf-869d-0a8f-e550-ac80073ea32e, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 624.288305] env[67169]: DEBUG nova.network.neutron [None req-48298f54-4654-449a-bec3-0750de8541e2 tempest-ServersAdmin275Test-1422480121 tempest-ServersAdmin275Test-1422480121-project-member] [instance: 958ac621-d0c8-4c04-8a58-11ad0f3cf678] Instance cache missing network info. {{(pid=67169) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 624.463344] env[67169]: DEBUG nova.network.neutron [None req-48298f54-4654-449a-bec3-0750de8541e2 tempest-ServersAdmin275Test-1422480121 tempest-ServersAdmin275Test-1422480121-project-member] [instance: 958ac621-d0c8-4c04-8a58-11ad0f3cf678] Updating instance_info_cache with network_info: [] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 624.480218] env[67169]: DEBUG oslo_concurrency.lockutils [None req-48298f54-4654-449a-bec3-0750de8541e2 tempest-ServersAdmin275Test-1422480121 tempest-ServersAdmin275Test-1422480121-project-member] Releasing lock "refresh_cache-958ac621-d0c8-4c04-8a58-11ad0f3cf678" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 624.480534] env[67169]: DEBUG nova.compute.manager [None req-48298f54-4654-449a-bec3-0750de8541e2 tempest-ServersAdmin275Test-1422480121 tempest-ServersAdmin275Test-1422480121-project-member] [instance: 958ac621-d0c8-4c04-8a58-11ad0f3cf678] Start destroying the instance on the hypervisor. 
{{(pid=67169) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 624.480739] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-48298f54-4654-449a-bec3-0750de8541e2 tempest-ServersAdmin275Test-1422480121 tempest-ServersAdmin275Test-1422480121-project-member] [instance: 958ac621-d0c8-4c04-8a58-11ad0f3cf678] Destroying instance {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 624.481899] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-beae37fe-8109-4962-907c-d986769a232a {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 624.494280] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-48298f54-4654-449a-bec3-0750de8541e2 tempest-ServersAdmin275Test-1422480121 tempest-ServersAdmin275Test-1422480121-project-member] [instance: 958ac621-d0c8-4c04-8a58-11ad0f3cf678] Unregistering the VM {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 624.494280] env[67169]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-499ec906-50d3-4829-b895-2b7e55d16e71 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 624.529504] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-48298f54-4654-449a-bec3-0750de8541e2 tempest-ServersAdmin275Test-1422480121 tempest-ServersAdmin275Test-1422480121-project-member] [instance: 958ac621-d0c8-4c04-8a58-11ad0f3cf678] Unregistered the VM {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 624.529723] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-48298f54-4654-449a-bec3-0750de8541e2 tempest-ServersAdmin275Test-1422480121 tempest-ServersAdmin275Test-1422480121-project-member] [instance: 958ac621-d0c8-4c04-8a58-11ad0f3cf678] Deleting contents of the VM from datastore datastore2 {{(pid=67169) _destroy_instance 
/opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 624.530760] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-48298f54-4654-449a-bec3-0750de8541e2 tempest-ServersAdmin275Test-1422480121 tempest-ServersAdmin275Test-1422480121-project-member] Deleting the datastore file [datastore2] 958ac621-d0c8-4c04-8a58-11ad0f3cf678 {{(pid=67169) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 624.531145] env[67169]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-4e668c73-47c3-4d93-8c5f-9f91f4fcaaa6 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 624.538503] env[67169]: DEBUG oslo_vmware.api [None req-48298f54-4654-449a-bec3-0750de8541e2 tempest-ServersAdmin275Test-1422480121 tempest-ServersAdmin275Test-1422480121-project-member] Waiting for the task: (returnval){ [ 624.538503] env[67169]: value = "task-2819078" [ 624.538503] env[67169]: _type = "Task" [ 624.538503] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 624.550828] env[67169]: DEBUG oslo_vmware.api [None req-48298f54-4654-449a-bec3-0750de8541e2 tempest-ServersAdmin275Test-1422480121 tempest-ServersAdmin275Test-1422480121-project-member] Task: {'id': task-2819078, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 624.752835] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-1e04032d-0bdc-43df-a263-7859fab3c8e2 tempest-ServersAdminNegativeTestJSON-1443014733 tempest-ServersAdminNegativeTestJSON-1443014733-project-member] [instance: 9e0a990e-d9ad-4dae-9e2d-6d1f7d999403] Preparing fetch location {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 624.753121] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-1e04032d-0bdc-43df-a263-7859fab3c8e2 tempest-ServersAdminNegativeTestJSON-1443014733 tempest-ServersAdminNegativeTestJSON-1443014733-project-member] Creating directory with path [datastore2] vmware_temp/f788de02-74a0-47d4-a666-40a1ee7278b2/285931c9-8b83-4997-8c4d-6a79005e36ba {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 624.753350] env[67169]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-000e4eac-482b-46af-9f52-4ba9337fe709 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 624.764162] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-1e04032d-0bdc-43df-a263-7859fab3c8e2 tempest-ServersAdminNegativeTestJSON-1443014733 tempest-ServersAdminNegativeTestJSON-1443014733-project-member] Created directory with path [datastore2] vmware_temp/f788de02-74a0-47d4-a666-40a1ee7278b2/285931c9-8b83-4997-8c4d-6a79005e36ba {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 624.764361] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-1e04032d-0bdc-43df-a263-7859fab3c8e2 tempest-ServersAdminNegativeTestJSON-1443014733 tempest-ServersAdminNegativeTestJSON-1443014733-project-member] [instance: 9e0a990e-d9ad-4dae-9e2d-6d1f7d999403] Fetch image to [datastore2] vmware_temp/f788de02-74a0-47d4-a666-40a1ee7278b2/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk {{(pid=67169) 
_fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 624.764529] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-1e04032d-0bdc-43df-a263-7859fab3c8e2 tempest-ServersAdminNegativeTestJSON-1443014733 tempest-ServersAdminNegativeTestJSON-1443014733-project-member] [instance: 9e0a990e-d9ad-4dae-9e2d-6d1f7d999403] Downloading image file data 285931c9-8b83-4997-8c4d-6a79005e36ba to [datastore2] vmware_temp/f788de02-74a0-47d4-a666-40a1ee7278b2/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk on the data store datastore2 {{(pid=67169) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 624.765324] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4dcf6dd0-6968-43e2-8c72-14143fbca463 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 624.772789] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7df58d44-985f-4eae-af82-a6d95232cf34 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 624.782229] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5c7b3d36-01c6-4f7c-85cd-fa7482661d19 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 624.816163] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-91253093-9bbf-42dd-a7e5-8d4923724eb3 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 624.823343] env[67169]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-6a635a31-8a66-4f90-b8de-570cc654fe74 {{(pid=67169) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 624.855202] env[67169]: DEBUG nova.virt.vmwareapi.images [None req-1e04032d-0bdc-43df-a263-7859fab3c8e2 tempest-ServersAdminNegativeTestJSON-1443014733 tempest-ServersAdminNegativeTestJSON-1443014733-project-member] [instance: 9e0a990e-d9ad-4dae-9e2d-6d1f7d999403] Downloading image file data 285931c9-8b83-4997-8c4d-6a79005e36ba to the data store datastore2 {{(pid=67169) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 624.938082] env[67169]: DEBUG oslo_vmware.rw_handles [None req-1e04032d-0bdc-43df-a263-7859fab3c8e2 tempest-ServersAdminNegativeTestJSON-1443014733 tempest-ServersAdminNegativeTestJSON-1443014733-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/f788de02-74a0-47d4-a666-40a1ee7278b2/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=67169) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 625.006237] env[67169]: DEBUG oslo_vmware.rw_handles [None req-1e04032d-0bdc-43df-a263-7859fab3c8e2 tempest-ServersAdminNegativeTestJSON-1443014733 tempest-ServersAdminNegativeTestJSON-1443014733-project-member] Completed reading data from the image iterator. {{(pid=67169) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 625.006430] env[67169]: DEBUG oslo_vmware.rw_handles [None req-1e04032d-0bdc-43df-a263-7859fab3c8e2 tempest-ServersAdminNegativeTestJSON-1443014733 tempest-ServersAdminNegativeTestJSON-1443014733-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/f788de02-74a0-47d4-a666-40a1ee7278b2/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=67169) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 625.052356] env[67169]: DEBUG oslo_vmware.api [None req-48298f54-4654-449a-bec3-0750de8541e2 tempest-ServersAdmin275Test-1422480121 tempest-ServersAdmin275Test-1422480121-project-member] Task: {'id': task-2819078, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.035724} completed successfully. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 625.052680] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-48298f54-4654-449a-bec3-0750de8541e2 tempest-ServersAdmin275Test-1422480121 tempest-ServersAdmin275Test-1422480121-project-member] Deleted the datastore file {{(pid=67169) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 625.052868] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-48298f54-4654-449a-bec3-0750de8541e2 tempest-ServersAdmin275Test-1422480121 tempest-ServersAdmin275Test-1422480121-project-member] [instance: 958ac621-d0c8-4c04-8a58-11ad0f3cf678] Deleted contents of the VM from datastore datastore2 {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 625.053056] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-48298f54-4654-449a-bec3-0750de8541e2 tempest-ServersAdmin275Test-1422480121 tempest-ServersAdmin275Test-1422480121-project-member] [instance: 958ac621-d0c8-4c04-8a58-11ad0f3cf678] Instance destroyed {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 625.053235] env[67169]: INFO nova.compute.manager [None req-48298f54-4654-449a-bec3-0750de8541e2 tempest-ServersAdmin275Test-1422480121 tempest-ServersAdmin275Test-1422480121-project-member] [instance: 958ac621-d0c8-4c04-8a58-11ad0f3cf678] Took 0.57 seconds to destroy the instance on the hypervisor. 
[ 625.053471] env[67169]: DEBUG oslo.service.loopingcall [None req-48298f54-4654-449a-bec3-0750de8541e2 tempest-ServersAdmin275Test-1422480121 tempest-ServersAdmin275Test-1422480121-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=67169) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 625.053674] env[67169]: DEBUG nova.compute.manager [-] [instance: 958ac621-d0c8-4c04-8a58-11ad0f3cf678] Skipping network deallocation for instance since networking was not requested. {{(pid=67169) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 625.058018] env[67169]: DEBUG nova.compute.claims [None req-48298f54-4654-449a-bec3-0750de8541e2 tempest-ServersAdmin275Test-1422480121 tempest-ServersAdmin275Test-1422480121-project-member] [instance: 958ac621-d0c8-4c04-8a58-11ad0f3cf678] Aborting claim: {{(pid=67169) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 625.058018] env[67169]: DEBUG oslo_concurrency.lockutils [None req-48298f54-4654-449a-bec3-0750de8541e2 tempest-ServersAdmin275Test-1422480121 tempest-ServersAdmin275Test-1422480121-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 625.058018] env[67169]: DEBUG oslo_concurrency.lockutils [None req-48298f54-4654-449a-bec3-0750de8541e2 tempest-ServersAdmin275Test-1422480121 tempest-ServersAdmin275Test-1422480121-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 625.603347] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-6d76eba8-a003-4582-ab0a-ec6fe149936b {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 625.616375] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fa571ee6-2cb9-44e5-bbcf-19acd2345068 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 625.650329] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-86bf223b-5ec6-40b4-8a56-708896b289b4 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 625.658643] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-22b76073-e14e-4e55-a003-b3d2b2547e99 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 625.675154] env[67169]: DEBUG nova.compute.provider_tree [None req-48298f54-4654-449a-bec3-0750de8541e2 tempest-ServersAdmin275Test-1422480121 tempest-ServersAdmin275Test-1422480121-project-member] Inventory has not changed in ProviderTree for provider: 6570906a-ac37-4859-b1f2-4bbacc48d3f3 {{(pid=67169) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 625.691061] env[67169]: DEBUG nova.scheduler.client.report [None req-48298f54-4654-449a-bec3-0750de8541e2 tempest-ServersAdmin275Test-1422480121 tempest-ServersAdmin275Test-1422480121-project-member] Inventory has not changed for provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 171, 'step_size': 1, 
'allocation_ratio': 1.0}} {{(pid=67169) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 625.719921] env[67169]: DEBUG oslo_concurrency.lockutils [None req-48298f54-4654-449a-bec3-0750de8541e2 tempest-ServersAdmin275Test-1422480121 tempest-ServersAdmin275Test-1422480121-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.663s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 625.720426] env[67169]: ERROR nova.compute.manager [None req-48298f54-4654-449a-bec3-0750de8541e2 tempest-ServersAdmin275Test-1422480121 tempest-ServersAdmin275Test-1422480121-project-member] [instance: 958ac621-d0c8-4c04-8a58-11ad0f3cf678] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 625.720426] env[67169]: Faults: ['InvalidArgument'] [ 625.720426] env[67169]: ERROR nova.compute.manager [instance: 958ac621-d0c8-4c04-8a58-11ad0f3cf678] Traceback (most recent call last): [ 625.720426] env[67169]: ERROR nova.compute.manager [instance: 958ac621-d0c8-4c04-8a58-11ad0f3cf678] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 625.720426] env[67169]: ERROR nova.compute.manager [instance: 958ac621-d0c8-4c04-8a58-11ad0f3cf678] self.driver.spawn(context, instance, image_meta, [ 625.720426] env[67169]: ERROR nova.compute.manager [instance: 958ac621-d0c8-4c04-8a58-11ad0f3cf678] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 625.720426] env[67169]: ERROR nova.compute.manager [instance: 958ac621-d0c8-4c04-8a58-11ad0f3cf678] self._vmops.spawn(context, instance, image_meta, injected_files, [ 625.720426] env[67169]: ERROR nova.compute.manager [instance: 958ac621-d0c8-4c04-8a58-11ad0f3cf678] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 625.720426] env[67169]: ERROR 
nova.compute.manager [instance: 958ac621-d0c8-4c04-8a58-11ad0f3cf678] self._fetch_image_if_missing(context, vi) [ 625.720426] env[67169]: ERROR nova.compute.manager [instance: 958ac621-d0c8-4c04-8a58-11ad0f3cf678] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 625.720426] env[67169]: ERROR nova.compute.manager [instance: 958ac621-d0c8-4c04-8a58-11ad0f3cf678] image_cache(vi, tmp_image_ds_loc) [ 625.720426] env[67169]: ERROR nova.compute.manager [instance: 958ac621-d0c8-4c04-8a58-11ad0f3cf678] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 625.720999] env[67169]: ERROR nova.compute.manager [instance: 958ac621-d0c8-4c04-8a58-11ad0f3cf678] vm_util.copy_virtual_disk( [ 625.720999] env[67169]: ERROR nova.compute.manager [instance: 958ac621-d0c8-4c04-8a58-11ad0f3cf678] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 625.720999] env[67169]: ERROR nova.compute.manager [instance: 958ac621-d0c8-4c04-8a58-11ad0f3cf678] session._wait_for_task(vmdk_copy_task) [ 625.720999] env[67169]: ERROR nova.compute.manager [instance: 958ac621-d0c8-4c04-8a58-11ad0f3cf678] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 625.720999] env[67169]: ERROR nova.compute.manager [instance: 958ac621-d0c8-4c04-8a58-11ad0f3cf678] return self.wait_for_task(task_ref) [ 625.720999] env[67169]: ERROR nova.compute.manager [instance: 958ac621-d0c8-4c04-8a58-11ad0f3cf678] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 625.720999] env[67169]: ERROR nova.compute.manager [instance: 958ac621-d0c8-4c04-8a58-11ad0f3cf678] return evt.wait() [ 625.720999] env[67169]: ERROR nova.compute.manager [instance: 958ac621-d0c8-4c04-8a58-11ad0f3cf678] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 625.720999] env[67169]: ERROR nova.compute.manager [instance: 
958ac621-d0c8-4c04-8a58-11ad0f3cf678] result = hub.switch() [ 625.720999] env[67169]: ERROR nova.compute.manager [instance: 958ac621-d0c8-4c04-8a58-11ad0f3cf678] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 625.720999] env[67169]: ERROR nova.compute.manager [instance: 958ac621-d0c8-4c04-8a58-11ad0f3cf678] return self.greenlet.switch() [ 625.720999] env[67169]: ERROR nova.compute.manager [instance: 958ac621-d0c8-4c04-8a58-11ad0f3cf678] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 625.720999] env[67169]: ERROR nova.compute.manager [instance: 958ac621-d0c8-4c04-8a58-11ad0f3cf678] self.f(*self.args, **self.kw) [ 625.722761] env[67169]: ERROR nova.compute.manager [instance: 958ac621-d0c8-4c04-8a58-11ad0f3cf678] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 625.722761] env[67169]: ERROR nova.compute.manager [instance: 958ac621-d0c8-4c04-8a58-11ad0f3cf678] raise exceptions.translate_fault(task_info.error) [ 625.722761] env[67169]: ERROR nova.compute.manager [instance: 958ac621-d0c8-4c04-8a58-11ad0f3cf678] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 625.722761] env[67169]: ERROR nova.compute.manager [instance: 958ac621-d0c8-4c04-8a58-11ad0f3cf678] Faults: ['InvalidArgument'] [ 625.722761] env[67169]: ERROR nova.compute.manager [instance: 958ac621-d0c8-4c04-8a58-11ad0f3cf678] [ 625.722761] env[67169]: DEBUG nova.compute.utils [None req-48298f54-4654-449a-bec3-0750de8541e2 tempest-ServersAdmin275Test-1422480121 tempest-ServersAdmin275Test-1422480121-project-member] [instance: 958ac621-d0c8-4c04-8a58-11ad0f3cf678] VimFaultException {{(pid=67169) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 625.725882] env[67169]: DEBUG nova.compute.manager [None req-48298f54-4654-449a-bec3-0750de8541e2 
tempest-ServersAdmin275Test-1422480121 tempest-ServersAdmin275Test-1422480121-project-member] [instance: 958ac621-d0c8-4c04-8a58-11ad0f3cf678] Build of instance 958ac621-d0c8-4c04-8a58-11ad0f3cf678 was re-scheduled: A specified parameter was not correct: fileType [ 625.725882] env[67169]: Faults: ['InvalidArgument'] {{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 625.726554] env[67169]: DEBUG nova.compute.manager [None req-48298f54-4654-449a-bec3-0750de8541e2 tempest-ServersAdmin275Test-1422480121 tempest-ServersAdmin275Test-1422480121-project-member] [instance: 958ac621-d0c8-4c04-8a58-11ad0f3cf678] Unplugging VIFs for instance {{(pid=67169) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 625.726694] env[67169]: DEBUG oslo_concurrency.lockutils [None req-48298f54-4654-449a-bec3-0750de8541e2 tempest-ServersAdmin275Test-1422480121 tempest-ServersAdmin275Test-1422480121-project-member] Acquiring lock "refresh_cache-958ac621-d0c8-4c04-8a58-11ad0f3cf678" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 625.726847] env[67169]: DEBUG oslo_concurrency.lockutils [None req-48298f54-4654-449a-bec3-0750de8541e2 tempest-ServersAdmin275Test-1422480121 tempest-ServersAdmin275Test-1422480121-project-member] Acquired lock "refresh_cache-958ac621-d0c8-4c04-8a58-11ad0f3cf678" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 625.727042] env[67169]: DEBUG nova.network.neutron [None req-48298f54-4654-449a-bec3-0750de8541e2 tempest-ServersAdmin275Test-1422480121 tempest-ServersAdmin275Test-1422480121-project-member] [instance: 958ac621-d0c8-4c04-8a58-11ad0f3cf678] Building network info cache for instance {{(pid=67169) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 625.804781] env[67169]: DEBUG nova.network.neutron [None req-48298f54-4654-449a-bec3-0750de8541e2 
tempest-ServersAdmin275Test-1422480121 tempest-ServersAdmin275Test-1422480121-project-member] [instance: 958ac621-d0c8-4c04-8a58-11ad0f3cf678] Instance cache missing network info. {{(pid=67169) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 626.233107] env[67169]: DEBUG nova.network.neutron [None req-48298f54-4654-449a-bec3-0750de8541e2 tempest-ServersAdmin275Test-1422480121 tempest-ServersAdmin275Test-1422480121-project-member] [instance: 958ac621-d0c8-4c04-8a58-11ad0f3cf678] Updating instance_info_cache with network_info: [] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 626.249303] env[67169]: DEBUG oslo_concurrency.lockutils [None req-48298f54-4654-449a-bec3-0750de8541e2 tempest-ServersAdmin275Test-1422480121 tempest-ServersAdmin275Test-1422480121-project-member] Releasing lock "refresh_cache-958ac621-d0c8-4c04-8a58-11ad0f3cf678" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 626.249616] env[67169]: DEBUG nova.compute.manager [None req-48298f54-4654-449a-bec3-0750de8541e2 tempest-ServersAdmin275Test-1422480121 tempest-ServersAdmin275Test-1422480121-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=67169) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 626.249798] env[67169]: DEBUG nova.compute.manager [None req-48298f54-4654-449a-bec3-0750de8541e2 tempest-ServersAdmin275Test-1422480121 tempest-ServersAdmin275Test-1422480121-project-member] [instance: 958ac621-d0c8-4c04-8a58-11ad0f3cf678] Skipping network deallocation for instance since networking was not requested. 
{{(pid=67169) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 626.393669] env[67169]: INFO nova.scheduler.client.report [None req-48298f54-4654-449a-bec3-0750de8541e2 tempest-ServersAdmin275Test-1422480121 tempest-ServersAdmin275Test-1422480121-project-member] Deleted allocations for instance 958ac621-d0c8-4c04-8a58-11ad0f3cf678 [ 626.440812] env[67169]: DEBUG oslo_concurrency.lockutils [None req-48298f54-4654-449a-bec3-0750de8541e2 tempest-ServersAdmin275Test-1422480121 tempest-ServersAdmin275Test-1422480121-project-member] Lock "958ac621-d0c8-4c04-8a58-11ad0f3cf678" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 53.782s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 626.441873] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "958ac621-d0c8-4c04-8a58-11ad0f3cf678" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 40.757s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 626.442077] env[67169]: INFO nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 958ac621-d0c8-4c04-8a58-11ad0f3cf678] During sync_power_state the instance has a pending task (spawning). Skip. 
[ 626.442253] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "958ac621-d0c8-4c04-8a58-11ad0f3cf678" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 626.475969] env[67169]: DEBUG nova.compute.manager [None req-1d27fda4-25e5-4bf7-a655-5b02544fb343 tempest-ServersAdminTestJSON-608063042 tempest-ServersAdminTestJSON-608063042-project-member] [instance: 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4] Starting instance... {{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 626.562938] env[67169]: DEBUG oslo_concurrency.lockutils [None req-1d27fda4-25e5-4bf7-a655-5b02544fb343 tempest-ServersAdminTestJSON-608063042 tempest-ServersAdminTestJSON-608063042-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 626.563204] env[67169]: DEBUG oslo_concurrency.lockutils [None req-1d27fda4-25e5-4bf7-a655-5b02544fb343 tempest-ServersAdminTestJSON-608063042 tempest-ServersAdminTestJSON-608063042-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 626.564680] env[67169]: INFO nova.compute.claims [None req-1d27fda4-25e5-4bf7-a655-5b02544fb343 tempest-ServersAdminTestJSON-608063042 tempest-ServersAdminTestJSON-608063042-project-member] [instance: 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 627.117624] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-f9efe114-a0cb-4430-9aab-a7b38f3e089a {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 627.131551] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cf0e4271-9c8c-4e05-b73e-3c874e9c3560 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 627.173084] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6252dd27-8db5-4994-bd9d-c86af3564008 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 627.179947] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-84d9b2fe-255d-438e-b7e2-fb0fcc8a1e5c {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 627.194173] env[67169]: DEBUG nova.compute.provider_tree [None req-1d27fda4-25e5-4bf7-a655-5b02544fb343 tempest-ServersAdminTestJSON-608063042 tempest-ServersAdminTestJSON-608063042-project-member] Inventory has not changed in ProviderTree for provider: 6570906a-ac37-4859-b1f2-4bbacc48d3f3 {{(pid=67169) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 627.214218] env[67169]: DEBUG nova.scheduler.client.report [None req-1d27fda4-25e5-4bf7-a655-5b02544fb343 tempest-ServersAdminTestJSON-608063042 tempest-ServersAdminTestJSON-608063042-project-member] Inventory has not changed for provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 171, 'step_size': 1, 
'allocation_ratio': 1.0}} {{(pid=67169) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 627.236437] env[67169]: DEBUG oslo_concurrency.lockutils [None req-1d27fda4-25e5-4bf7-a655-5b02544fb343 tempest-ServersAdminTestJSON-608063042 tempest-ServersAdminTestJSON-608063042-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.673s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 627.236975] env[67169]: DEBUG nova.compute.manager [None req-1d27fda4-25e5-4bf7-a655-5b02544fb343 tempest-ServersAdminTestJSON-608063042 tempest-ServersAdminTestJSON-608063042-project-member] [instance: 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4] Start building networks asynchronously for instance. {{(pid=67169) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 627.288774] env[67169]: DEBUG nova.compute.utils [None req-1d27fda4-25e5-4bf7-a655-5b02544fb343 tempest-ServersAdminTestJSON-608063042 tempest-ServersAdminTestJSON-608063042-project-member] Using /dev/sd instead of None {{(pid=67169) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 627.290280] env[67169]: DEBUG nova.compute.manager [None req-1d27fda4-25e5-4bf7-a655-5b02544fb343 tempest-ServersAdminTestJSON-608063042 tempest-ServersAdminTestJSON-608063042-project-member] [instance: 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4] Allocating IP information in the background. 
{{(pid=67169) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 627.290280] env[67169]: DEBUG nova.network.neutron [None req-1d27fda4-25e5-4bf7-a655-5b02544fb343 tempest-ServersAdminTestJSON-608063042 tempest-ServersAdminTestJSON-608063042-project-member] [instance: 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4] allocate_for_instance() {{(pid=67169) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 627.305813] env[67169]: DEBUG nova.compute.manager [None req-1d27fda4-25e5-4bf7-a655-5b02544fb343 tempest-ServersAdminTestJSON-608063042 tempest-ServersAdminTestJSON-608063042-project-member] [instance: 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4] Start building block device mappings for instance. {{(pid=67169) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 627.376233] env[67169]: DEBUG nova.policy [None req-1d27fda4-25e5-4bf7-a655-5b02544fb343 tempest-ServersAdminTestJSON-608063042 tempest-ServersAdminTestJSON-608063042-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e5381fb74f754009a655ce8d7406295f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '02fc09aa28084858a7344b492278e6c3', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67169) authorize /opt/stack/nova/nova/policy.py:203}} [ 627.391374] env[67169]: DEBUG nova.compute.manager [None req-1d27fda4-25e5-4bf7-a655-5b02544fb343 tempest-ServersAdminTestJSON-608063042 tempest-ServersAdminTestJSON-608063042-project-member] [instance: 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4] Start spawning the instance on the hypervisor. 
{{(pid=67169) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 627.422989] env[67169]: DEBUG nova.virt.hardware [None req-1d27fda4-25e5-4bf7-a655-5b02544fb343 tempest-ServersAdminTestJSON-608063042 tempest-ServersAdminTestJSON-608063042-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-03-06T19:55:10Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-03-06T19:54:55Z,direct_url=,disk_format='vmdk',id=285931c9-8b83-4997-8c4d-6a79005e36ba,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c31f6504bb73492890b262ff43fdf9bc',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-03-06T19:54:55Z,virtual_size=,visibility=), allow threads: False {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 627.423255] env[67169]: DEBUG nova.virt.hardware [None req-1d27fda4-25e5-4bf7-a655-5b02544fb343 tempest-ServersAdminTestJSON-608063042 tempest-ServersAdminTestJSON-608063042-project-member] Flavor limits 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 627.423412] env[67169]: DEBUG nova.virt.hardware [None req-1d27fda4-25e5-4bf7-a655-5b02544fb343 tempest-ServersAdminTestJSON-608063042 tempest-ServersAdminTestJSON-608063042-project-member] Image limits 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 627.423592] env[67169]: DEBUG nova.virt.hardware [None req-1d27fda4-25e5-4bf7-a655-5b02544fb343 tempest-ServersAdminTestJSON-608063042 tempest-ServersAdminTestJSON-608063042-project-member] Flavor pref 0:0:0 {{(pid=67169) 
get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 627.423734] env[67169]: DEBUG nova.virt.hardware [None req-1d27fda4-25e5-4bf7-a655-5b02544fb343 tempest-ServersAdminTestJSON-608063042 tempest-ServersAdminTestJSON-608063042-project-member] Image pref 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 627.423876] env[67169]: DEBUG nova.virt.hardware [None req-1d27fda4-25e5-4bf7-a655-5b02544fb343 tempest-ServersAdminTestJSON-608063042 tempest-ServersAdminTestJSON-608063042-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 627.424097] env[67169]: DEBUG nova.virt.hardware [None req-1d27fda4-25e5-4bf7-a655-5b02544fb343 tempest-ServersAdminTestJSON-608063042 tempest-ServersAdminTestJSON-608063042-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 627.424257] env[67169]: DEBUG nova.virt.hardware [None req-1d27fda4-25e5-4bf7-a655-5b02544fb343 tempest-ServersAdminTestJSON-608063042 tempest-ServersAdminTestJSON-608063042-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67169) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 627.424421] env[67169]: DEBUG nova.virt.hardware [None req-1d27fda4-25e5-4bf7-a655-5b02544fb343 tempest-ServersAdminTestJSON-608063042 tempest-ServersAdminTestJSON-608063042-project-member] Got 1 possible topologies {{(pid=67169) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 627.424582] env[67169]: DEBUG nova.virt.hardware [None req-1d27fda4-25e5-4bf7-a655-5b02544fb343 tempest-ServersAdminTestJSON-608063042 tempest-ServersAdminTestJSON-608063042-project-member] Possible topologies 
[VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 627.424788] env[67169]: DEBUG nova.virt.hardware [None req-1d27fda4-25e5-4bf7-a655-5b02544fb343 tempest-ServersAdminTestJSON-608063042 tempest-ServersAdminTestJSON-608063042-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 627.425625] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c4f7498b-5335-4712-b6b5-5d4bdb9f3663 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 627.435188] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6f0cfd81-bf6c-45af-9477-8bb7f757fd2e {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 627.989295] env[67169]: DEBUG nova.network.neutron [None req-1d27fda4-25e5-4bf7-a655-5b02544fb343 tempest-ServersAdminTestJSON-608063042 tempest-ServersAdminTestJSON-608063042-project-member] [instance: 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4] Successfully created port: bdcfb26c-d26b-4246-985b-75db349e1601 {{(pid=67169) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 628.391600] env[67169]: DEBUG oslo_concurrency.lockutils [None req-ec8ff093-ec56-4835-a968-0cce4238d83b tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Acquiring lock "1e43c263-c527-4349-8e9c-3f4a3ffc9d8b" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 628.391600] env[67169]: DEBUG oslo_concurrency.lockutils [None req-ec8ff093-ec56-4835-a968-0cce4238d83b 
tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Lock "1e43c263-c527-4349-8e9c-3f4a3ffc9d8b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 629.376842] env[67169]: DEBUG nova.network.neutron [None req-1d27fda4-25e5-4bf7-a655-5b02544fb343 tempest-ServersAdminTestJSON-608063042 tempest-ServersAdminTestJSON-608063042-project-member] [instance: 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4] Successfully updated port: bdcfb26c-d26b-4246-985b-75db349e1601 {{(pid=67169) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 629.391528] env[67169]: DEBUG oslo_concurrency.lockutils [None req-1d27fda4-25e5-4bf7-a655-5b02544fb343 tempest-ServersAdminTestJSON-608063042 tempest-ServersAdminTestJSON-608063042-project-member] Acquiring lock "refresh_cache-84577fe1-6a7f-4f1e-a262-0ea7c0576cc4" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 629.391528] env[67169]: DEBUG oslo_concurrency.lockutils [None req-1d27fda4-25e5-4bf7-a655-5b02544fb343 tempest-ServersAdminTestJSON-608063042 tempest-ServersAdminTestJSON-608063042-project-member] Acquired lock "refresh_cache-84577fe1-6a7f-4f1e-a262-0ea7c0576cc4" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 629.391717] env[67169]: DEBUG nova.network.neutron [None req-1d27fda4-25e5-4bf7-a655-5b02544fb343 tempest-ServersAdminTestJSON-608063042 tempest-ServersAdminTestJSON-608063042-project-member] [instance: 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4] Building network info cache for instance {{(pid=67169) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 629.458751] env[67169]: DEBUG nova.network.neutron [None req-1d27fda4-25e5-4bf7-a655-5b02544fb343 tempest-ServersAdminTestJSON-608063042 
tempest-ServersAdminTestJSON-608063042-project-member] [instance: 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4] Instance cache missing network info. {{(pid=67169) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 629.833923] env[67169]: DEBUG nova.network.neutron [None req-1d27fda4-25e5-4bf7-a655-5b02544fb343 tempest-ServersAdminTestJSON-608063042 tempest-ServersAdminTestJSON-608063042-project-member] [instance: 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4] Updating instance_info_cache with network_info: [{"id": "bdcfb26c-d26b-4246-985b-75db349e1601", "address": "fa:16:3e:ca:73:ab", "network": {"id": "5dc16b5b-3197-4486-b761-b3a17d9477b2", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-120288819-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "02fc09aa28084858a7344b492278e6c3", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "c66a277b-e3bf-43b8-a632-04fdd0720b91", "external-id": "nsx-vlan-transportzone-665", "segmentation_id": 665, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapbdcfb26c-d2", "ovs_interfaceid": "bdcfb26c-d26b-4246-985b-75db349e1601", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 629.845885] env[67169]: DEBUG oslo_concurrency.lockutils [None req-1d27fda4-25e5-4bf7-a655-5b02544fb343 tempest-ServersAdminTestJSON-608063042 tempest-ServersAdminTestJSON-608063042-project-member] 
Releasing lock "refresh_cache-84577fe1-6a7f-4f1e-a262-0ea7c0576cc4" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 629.846203] env[67169]: DEBUG nova.compute.manager [None req-1d27fda4-25e5-4bf7-a655-5b02544fb343 tempest-ServersAdminTestJSON-608063042 tempest-ServersAdminTestJSON-608063042-project-member] [instance: 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4] Instance network_info: |[{"id": "bdcfb26c-d26b-4246-985b-75db349e1601", "address": "fa:16:3e:ca:73:ab", "network": {"id": "5dc16b5b-3197-4486-b761-b3a17d9477b2", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-120288819-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "02fc09aa28084858a7344b492278e6c3", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "c66a277b-e3bf-43b8-a632-04fdd0720b91", "external-id": "nsx-vlan-transportzone-665", "segmentation_id": 665, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapbdcfb26c-d2", "ovs_interfaceid": "bdcfb26c-d26b-4246-985b-75db349e1601", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67169) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 629.846596] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-1d27fda4-25e5-4bf7-a655-5b02544fb343 tempest-ServersAdminTestJSON-608063042 tempest-ServersAdminTestJSON-608063042-project-member] [instance: 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4] Instance VIF info [{'network_name': 'br-int', 
'mac_address': 'fa:16:3e:ca:73:ab', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'c66a277b-e3bf-43b8-a632-04fdd0720b91', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'bdcfb26c-d26b-4246-985b-75db349e1601', 'vif_model': 'vmxnet3'}] {{(pid=67169) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 629.857532] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [None req-1d27fda4-25e5-4bf7-a655-5b02544fb343 tempest-ServersAdminTestJSON-608063042 tempest-ServersAdminTestJSON-608063042-project-member] Creating folder: Project (02fc09aa28084858a7344b492278e6c3). Parent ref: group-v566843. {{(pid=67169) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 629.860022] env[67169]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-97b7ec3d-8abb-416e-ac01-36588ff4ffb5 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 629.870314] env[67169]: INFO nova.virt.vmwareapi.vm_util [None req-1d27fda4-25e5-4bf7-a655-5b02544fb343 tempest-ServersAdminTestJSON-608063042 tempest-ServersAdminTestJSON-608063042-project-member] Created folder: Project (02fc09aa28084858a7344b492278e6c3) in parent group-v566843. [ 629.870591] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [None req-1d27fda4-25e5-4bf7-a655-5b02544fb343 tempest-ServersAdminTestJSON-608063042 tempest-ServersAdminTestJSON-608063042-project-member] Creating folder: Instances. Parent ref: group-v566877. 
{{(pid=67169) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 629.871087] env[67169]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-ff7e20bf-0242-419c-a5d0-bb80cc07b847 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 629.883706] env[67169]: INFO nova.virt.vmwareapi.vm_util [None req-1d27fda4-25e5-4bf7-a655-5b02544fb343 tempest-ServersAdminTestJSON-608063042 tempest-ServersAdminTestJSON-608063042-project-member] Created folder: Instances in parent group-v566877. [ 629.883706] env[67169]: DEBUG oslo.service.loopingcall [None req-1d27fda4-25e5-4bf7-a655-5b02544fb343 tempest-ServersAdminTestJSON-608063042 tempest-ServersAdminTestJSON-608063042-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67169) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 629.883968] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4] Creating VM on the ESX host {{(pid=67169) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 629.884285] env[67169]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-a5312657-a438-4b05-9c1e-5137d231d4af {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 629.909650] env[67169]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 629.909650] env[67169]: value = "task-2819085" [ 629.909650] env[67169]: _type = "Task" [ 629.909650] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 629.922489] env[67169]: DEBUG oslo_vmware.api [-] Task: {'id': task-2819085, 'name': CreateVM_Task} progress is 0%. 
{{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 630.083176] env[67169]: DEBUG nova.compute.manager [req-12c63896-0012-42a6-876d-9146ca637c19 req-dee6fe93-554f-47f1-b26c-7a358d8b31ce service nova] [instance: 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4] Received event network-vif-plugged-bdcfb26c-d26b-4246-985b-75db349e1601 {{(pid=67169) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 630.083699] env[67169]: DEBUG oslo_concurrency.lockutils [req-12c63896-0012-42a6-876d-9146ca637c19 req-dee6fe93-554f-47f1-b26c-7a358d8b31ce service nova] Acquiring lock "84577fe1-6a7f-4f1e-a262-0ea7c0576cc4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 630.083932] env[67169]: DEBUG oslo_concurrency.lockutils [req-12c63896-0012-42a6-876d-9146ca637c19 req-dee6fe93-554f-47f1-b26c-7a358d8b31ce service nova] Lock "84577fe1-6a7f-4f1e-a262-0ea7c0576cc4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 630.084130] env[67169]: DEBUG oslo_concurrency.lockutils [req-12c63896-0012-42a6-876d-9146ca637c19 req-dee6fe93-554f-47f1-b26c-7a358d8b31ce service nova] Lock "84577fe1-6a7f-4f1e-a262-0ea7c0576cc4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 630.084305] env[67169]: DEBUG nova.compute.manager [req-12c63896-0012-42a6-876d-9146ca637c19 req-dee6fe93-554f-47f1-b26c-7a358d8b31ce service nova] [instance: 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4] No waiting events found dispatching network-vif-plugged-bdcfb26c-d26b-4246-985b-75db349e1601 {{(pid=67169) pop_instance_event 
/opt/stack/nova/nova/compute/manager.py:320}} [ 630.085034] env[67169]: WARNING nova.compute.manager [req-12c63896-0012-42a6-876d-9146ca637c19 req-dee6fe93-554f-47f1-b26c-7a358d8b31ce service nova] [instance: 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4] Received unexpected event network-vif-plugged-bdcfb26c-d26b-4246-985b-75db349e1601 for instance with vm_state building and task_state spawning. [ 630.424744] env[67169]: DEBUG oslo_vmware.api [-] Task: {'id': task-2819085, 'name': CreateVM_Task, 'duration_secs': 0.38639} completed successfully. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 630.425120] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4] Created VM on the ESX host {{(pid=67169) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 630.426944] env[67169]: DEBUG oslo_concurrency.lockutils [None req-1d27fda4-25e5-4bf7-a655-5b02544fb343 tempest-ServersAdminTestJSON-608063042 tempest-ServersAdminTestJSON-608063042-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 630.427271] env[67169]: DEBUG oslo_concurrency.lockutils [None req-1d27fda4-25e5-4bf7-a655-5b02544fb343 tempest-ServersAdminTestJSON-608063042 tempest-ServersAdminTestJSON-608063042-project-member] Acquired lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 630.427765] env[67169]: DEBUG oslo_concurrency.lockutils [None req-1d27fda4-25e5-4bf7-a655-5b02544fb343 tempest-ServersAdminTestJSON-608063042 tempest-ServersAdminTestJSON-608063042-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 630.428059] env[67169]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-d916e2b3-87ec-41e0-b422-f7fe07d13add {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 630.436518] env[67169]: DEBUG oslo_vmware.api [None req-1d27fda4-25e5-4bf7-a655-5b02544fb343 tempest-ServersAdminTestJSON-608063042 tempest-ServersAdminTestJSON-608063042-project-member] Waiting for the task: (returnval){ [ 630.436518] env[67169]: value = "session[52518c09-e5f2-035d-4dc1-3e29ddf94015]52d92d35-56b0-61f1-e12c-aa73922cfd3f" [ 630.436518] env[67169]: _type = "Task" [ 630.436518] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 630.448853] env[67169]: DEBUG oslo_vmware.api [None req-1d27fda4-25e5-4bf7-a655-5b02544fb343 tempest-ServersAdminTestJSON-608063042 tempest-ServersAdminTestJSON-608063042-project-member] Task: {'id': session[52518c09-e5f2-035d-4dc1-3e29ddf94015]52d92d35-56b0-61f1-e12c-aa73922cfd3f, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 630.946525] env[67169]: DEBUG oslo_concurrency.lockutils [None req-1d27fda4-25e5-4bf7-a655-5b02544fb343 tempest-ServersAdminTestJSON-608063042 tempest-ServersAdminTestJSON-608063042-project-member] Releasing lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 630.946874] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-1d27fda4-25e5-4bf7-a655-5b02544fb343 tempest-ServersAdminTestJSON-608063042 tempest-ServersAdminTestJSON-608063042-project-member] [instance: 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4] Processing image 285931c9-8b83-4997-8c4d-6a79005e36ba {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 630.947176] env[67169]: DEBUG oslo_concurrency.lockutils [None req-1d27fda4-25e5-4bf7-a655-5b02544fb343 tempest-ServersAdminTestJSON-608063042 tempest-ServersAdminTestJSON-608063042-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 631.553879] env[67169]: DEBUG oslo_concurrency.lockutils [None req-1866c171-8b5b-476e-9836-38ffb422c083 tempest-ServersTestJSON-1279073989 tempest-ServersTestJSON-1279073989-project-member] Acquiring lock "6b21ae30-8734-4c38-a8ae-c3fe03b6c36a" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 631.554573] env[67169]: DEBUG oslo_concurrency.lockutils [None req-1866c171-8b5b-476e-9836-38ffb422c083 tempest-ServersTestJSON-1279073989 tempest-ServersTestJSON-1279073989-project-member] Lock 
"6b21ae30-8734-4c38-a8ae-c3fe03b6c36a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 632.379101] env[67169]: DEBUG nova.compute.manager [req-5cbaa87a-06f2-4a14-846b-81a392e545a1 req-4b4e42ab-44fb-47dc-9d23-fe610dfb5e9f service nova] [instance: 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4] Received event network-changed-bdcfb26c-d26b-4246-985b-75db349e1601 {{(pid=67169) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 632.379101] env[67169]: DEBUG nova.compute.manager [req-5cbaa87a-06f2-4a14-846b-81a392e545a1 req-4b4e42ab-44fb-47dc-9d23-fe610dfb5e9f service nova] [instance: 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4] Refreshing instance network info cache due to event network-changed-bdcfb26c-d26b-4246-985b-75db349e1601. {{(pid=67169) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 632.379101] env[67169]: DEBUG oslo_concurrency.lockutils [req-5cbaa87a-06f2-4a14-846b-81a392e545a1 req-4b4e42ab-44fb-47dc-9d23-fe610dfb5e9f service nova] Acquiring lock "refresh_cache-84577fe1-6a7f-4f1e-a262-0ea7c0576cc4" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 632.379101] env[67169]: DEBUG oslo_concurrency.lockutils [req-5cbaa87a-06f2-4a14-846b-81a392e545a1 req-4b4e42ab-44fb-47dc-9d23-fe610dfb5e9f service nova] Acquired lock "refresh_cache-84577fe1-6a7f-4f1e-a262-0ea7c0576cc4" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 632.379371] env[67169]: DEBUG nova.network.neutron [req-5cbaa87a-06f2-4a14-846b-81a392e545a1 req-4b4e42ab-44fb-47dc-9d23-fe610dfb5e9f service nova] [instance: 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4] Refreshing network info cache for port bdcfb26c-d26b-4246-985b-75db349e1601 {{(pid=67169) _get_instance_nw_info 
/opt/stack/nova/nova/network/neutron.py:2007}} [ 633.167360] env[67169]: DEBUG nova.network.neutron [req-5cbaa87a-06f2-4a14-846b-81a392e545a1 req-4b4e42ab-44fb-47dc-9d23-fe610dfb5e9f service nova] [instance: 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4] Updated VIF entry in instance network info cache for port bdcfb26c-d26b-4246-985b-75db349e1601. {{(pid=67169) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 633.167360] env[67169]: DEBUG nova.network.neutron [req-5cbaa87a-06f2-4a14-846b-81a392e545a1 req-4b4e42ab-44fb-47dc-9d23-fe610dfb5e9f service nova] [instance: 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4] Updating instance_info_cache with network_info: [{"id": "bdcfb26c-d26b-4246-985b-75db349e1601", "address": "fa:16:3e:ca:73:ab", "network": {"id": "5dc16b5b-3197-4486-b761-b3a17d9477b2", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-120288819-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "02fc09aa28084858a7344b492278e6c3", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "c66a277b-e3bf-43b8-a632-04fdd0720b91", "external-id": "nsx-vlan-transportzone-665", "segmentation_id": 665, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapbdcfb26c-d2", "ovs_interfaceid": "bdcfb26c-d26b-4246-985b-75db349e1601", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 633.180086] env[67169]: DEBUG 
oslo_concurrency.lockutils [req-5cbaa87a-06f2-4a14-846b-81a392e545a1 req-4b4e42ab-44fb-47dc-9d23-fe610dfb5e9f service nova] Releasing lock "refresh_cache-84577fe1-6a7f-4f1e-a262-0ea7c0576cc4" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 636.239090] env[67169]: DEBUG oslo_concurrency.lockutils [None req-e145ebc9-92d9-4148-9de6-53e5b64ff837 tempest-SecurityGroupsTestJSON-292577190 tempest-SecurityGroupsTestJSON-292577190-project-member] Acquiring lock "a578f813-807b-46bc-987f-5c9e9368c04b" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 636.239425] env[67169]: DEBUG oslo_concurrency.lockutils [None req-e145ebc9-92d9-4148-9de6-53e5b64ff837 tempest-SecurityGroupsTestJSON-292577190 tempest-SecurityGroupsTestJSON-292577190-project-member] Lock "a578f813-807b-46bc-987f-5c9e9368c04b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 638.543988] env[67169]: DEBUG oslo_concurrency.lockutils [None req-2bdba451-044d-4432-8f1d-46c6fa3dd51d tempest-ServersTestManualDisk-573119586 tempest-ServersTestManualDisk-573119586-project-member] Acquiring lock "f384f4a3-d34d-4d45-b063-79b25ea3c66c" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 638.544322] env[67169]: DEBUG oslo_concurrency.lockutils [None req-2bdba451-044d-4432-8f1d-46c6fa3dd51d tempest-ServersTestManualDisk-573119586 tempest-ServersTestManualDisk-573119586-project-member] Lock "f384f4a3-d34d-4d45-b063-79b25ea3c66c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 643.631527] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 643.631527] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 643.659290] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 643.661760] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Starting heal instance info cache {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}}
[ 643.661760] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Rebuilding the list of instances to heal {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}}
[ 643.679980] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 9e0a990e-d9ad-4dae-9e2d-6d1f7d999403] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 643.680197] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 68cc2368-ac6a-4003-9c19-5f2a4e9b0e03] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 643.680312] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: ca7439e3-dbd5-4775-97e8-9927b325766a] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 643.680440] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: f5558a78-c91f-4c36-bb22-f94b1bd8cdbc] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 643.680567] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 11e90c91-26ca-4397-81a4-975a1d714d19] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 643.680689] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 36781827-5846-49a4-8913-d98676af0b74] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 643.680810] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 835bf8da-8d8f-4dfd-b0a9-fab02796f39e] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 643.680929] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 85978a3b-052a-4a05-84e6-75c723d49bd8] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 643.681059] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: e2e52693-153a-43dd-b786-dd0758caabe2] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 643.681178] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 643.681293] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Didn't find any instances for network info cache update. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}}
[ 643.681702] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 643.681871] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 643.682031] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 643.682183] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 643.682323] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 643.682464] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 643.682592] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=67169) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}}
[ 644.660022] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 644.672701] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 644.672948] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 644.673135] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 644.673300] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67169) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}}
[ 644.674663] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6ada0d18-8de7-40e5-9ec1-295de86f9fd0 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 644.684116] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2cf57e76-ae71-4e5a-bf61-a18c186d3d41 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 644.699303] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d9c5e958-e123-4c77-bd01-1413dcf85476 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 644.706760] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3522d59d-4728-474b-a5d6-767a80c9c382 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 644.738318] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181016MB free_disk=171GB free_vcpus=48 pci_devices=None {{(pid=67169) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}}
[ 644.738518] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 644.738707] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 644.877662] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 9e0a990e-d9ad-4dae-9e2d-6d1f7d999403 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 644.877825] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 68cc2368-ac6a-4003-9c19-5f2a4e9b0e03 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 644.877949] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance ca7439e3-dbd5-4775-97e8-9927b325766a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 644.878348] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance f5558a78-c91f-4c36-bb22-f94b1bd8cdbc actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 644.878348] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 11e90c91-26ca-4397-81a4-975a1d714d19 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 644.878465] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 36781827-5846-49a4-8913-d98676af0b74 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 644.878501] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 835bf8da-8d8f-4dfd-b0a9-fab02796f39e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 644.878596] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 85978a3b-052a-4a05-84e6-75c723d49bd8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 644.878713] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance e2e52693-153a-43dd-b786-dd0758caabe2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 644.878830] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 644.906127] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 28552f70-695d-40cc-8dfa-bf40d6113220 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 644.942812] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 7a42aeb9-0518-448d-a3a6-8e68d6497922 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 644.955239] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 43b73a7c-eda8-4239-885f-d4fb8fa6f28a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 644.968708] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 3ac7cc70-7667-43b0-a3b8-0c791ef7ccd2 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 644.980911] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 485cf92f-cf20-4f94-8a18-1a82501a829f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 644.994107] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance a6427d1b-e915-4e3a-a4dd-6758fde2bc56 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 645.005857] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance bee054d2-ac3c-47cc-a946-90bebf23f925 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 645.018854] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance e7195659-b834-49e4-a9bd-6b2b7c7d4a20 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 645.030377] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance f001b21d-59fb-4a9e-9c28-7c15892facfa has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 645.047295] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance a7165020-4b1d-44e5-83d9-53eafbef74e7 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 645.062585] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 45ce5bfe-4a85-4c26-914e-b85478fc45a4 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 645.077448] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 529ac98d-1e5c-4bcd-bb3d-7a7158e952cb has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 645.094208] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 774358a1-c887-497e-b2d8-59a7c10e2329 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 645.112306] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance a7434fa2-563e-4f77-ba3e-40ad6bab0de3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 645.131624] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance fd4c9c56-1608-4390-8b41-736b3aa590ed has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 645.144628] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 645.162366] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 6b21ae30-8734-4c38-a8ae-c3fe03b6c36a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 645.174297] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance a578f813-807b-46bc-987f-5c9e9368c04b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 645.187026] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance f384f4a3-d34d-4d45-b063-79b25ea3c66c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 645.187345] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67169) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}}
[ 645.188940] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67169) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}}
[ 645.571178] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2dd47a3e-e919-45c0-9f45-b13dadfe88dd {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 645.579459] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-97832c20-7fec-4d88-95d6-59f5a49f5aa2 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 645.610590] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-871e4fec-cb14-430e-8054-6dbb45547c69 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 645.620322] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4d7d05d1-f3d3-4cae-b434-51785b930441 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 645.635952] env[67169]: DEBUG nova.compute.provider_tree [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Inventory has not changed in ProviderTree for provider: 6570906a-ac37-4859-b1f2-4bbacc48d3f3 {{(pid=67169) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 645.647872] env[67169]: DEBUG nova.scheduler.client.report [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Inventory has not changed for provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 171, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67169) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 645.665848] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67169) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}}
[ 645.666212] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.927s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 647.387410] env[67169]: DEBUG oslo_concurrency.lockutils [None req-6aeff5e1-4ca3-4e66-9b16-082d97563389 tempest-ServerActionsV293TestJSON-1935842596 tempest-ServerActionsV293TestJSON-1935842596-project-member] Acquiring lock "0aa23be1-af16-4c0b-bfd0-4db5e927cfc4" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 647.387806] env[67169]: DEBUG oslo_concurrency.lockutils [None req-6aeff5e1-4ca3-4e66-9b16-082d97563389 tempest-ServerActionsV293TestJSON-1935842596 tempest-ServerActionsV293TestJSON-1935842596-project-member] Lock "0aa23be1-af16-4c0b-bfd0-4db5e927cfc4" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 647.588264] env[67169]: DEBUG oslo_concurrency.lockutils [None req-179a1fab-b15e-400e-ab43-2abae1ed8c7d tempest-VolumesAdminNegativeTest-1217570915 tempest-VolumesAdminNegativeTest-1217570915-project-member] Acquiring lock "a58a8711-b060-4036-ac43-897017f68d21" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 647.588540] env[67169]: DEBUG oslo_concurrency.lockutils [None req-179a1fab-b15e-400e-ab43-2abae1ed8c7d tempest-VolumesAdminNegativeTest-1217570915 tempest-VolumesAdminNegativeTest-1217570915-project-member] Lock "a58a8711-b060-4036-ac43-897017f68d21" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 649.063834] env[67169]: DEBUG oslo_concurrency.lockutils [None req-5efb4fdc-d030-4885-b530-ccb865c8d016 tempest-ServersAaction247Test-1711594299 tempest-ServersAaction247Test-1711594299-project-member] Acquiring lock "79d35dc5-e515-4f6f-9160-534d84f534bd" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 649.064259] env[67169]: DEBUG oslo_concurrency.lockutils [None req-5efb4fdc-d030-4885-b530-ccb865c8d016 tempest-ServersAaction247Test-1711594299 tempest-ServersAaction247Test-1711594299-project-member] Lock "79d35dc5-e515-4f6f-9160-534d84f534bd" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 657.524798] env[67169]: DEBUG oslo_concurrency.lockutils [None req-f06012ed-5139-43e2-a09d-91608d74a15b tempest-ListServersNegativeTestJSON-1270486368 tempest-ListServersNegativeTestJSON-1270486368-project-member] Acquiring lock "db96dc14-cb08-4302-8aa3-81cf0c47fc73" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 657.525208] env[67169]: DEBUG oslo_concurrency.lockutils [None req-f06012ed-5139-43e2-a09d-91608d74a15b tempest-ListServersNegativeTestJSON-1270486368 tempest-ListServersNegativeTestJSON-1270486368-project-member] Lock "db96dc14-cb08-4302-8aa3-81cf0c47fc73" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 657.550957] env[67169]: DEBUG oslo_concurrency.lockutils [None req-f06012ed-5139-43e2-a09d-91608d74a15b tempest-ListServersNegativeTestJSON-1270486368 tempest-ListServersNegativeTestJSON-1270486368-project-member] Acquiring lock "04d7477c-c3ff-42c7-9107-f54327c2f4b2" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 657.551269] env[67169]: DEBUG oslo_concurrency.lockutils [None req-f06012ed-5139-43e2-a09d-91608d74a15b tempest-ListServersNegativeTestJSON-1270486368 tempest-ListServersNegativeTestJSON-1270486368-project-member] Lock "04d7477c-c3ff-42c7-9107-f54327c2f4b2" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 657.578514] env[67169]: DEBUG oslo_concurrency.lockutils [None req-f06012ed-5139-43e2-a09d-91608d74a15b tempest-ListServersNegativeTestJSON-1270486368 tempest-ListServersNegativeTestJSON-1270486368-project-member] Acquiring lock "1fe4f2aa-0784-4356-aa4c-593666f22971" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 657.578757] env[67169]: DEBUG oslo_concurrency.lockutils [None req-f06012ed-5139-43e2-a09d-91608d74a15b tempest-ListServersNegativeTestJSON-1270486368 tempest-ListServersNegativeTestJSON-1270486368-project-member] Lock "1fe4f2aa-0784-4356-aa4c-593666f22971" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 670.365039] env[67169]: WARNING oslo_vmware.rw_handles [None req-1e04032d-0bdc-43df-a263-7859fab3c8e2 tempest-ServersAdminNegativeTestJSON-1443014733 tempest-ServersAdminNegativeTestJSON-1443014733-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response
[ 670.365039] env[67169]: ERROR oslo_vmware.rw_handles Traceback (most recent call last):
[ 670.365039] env[67169]: ERROR oslo_vmware.rw_handles   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close
[ 670.365039] env[67169]: ERROR oslo_vmware.rw_handles     self._conn.getresponse()
[ 670.365039] env[67169]: ERROR oslo_vmware.rw_handles   File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse
[ 670.365039] env[67169]: ERROR oslo_vmware.rw_handles     response.begin()
[ 670.365039] env[67169]: ERROR oslo_vmware.rw_handles   File "/usr/lib/python3.10/http/client.py", line 318, in begin
[ 670.365039] env[67169]: ERROR oslo_vmware.rw_handles     version, status, reason = self._read_status()
[ 670.365039] env[67169]: ERROR oslo_vmware.rw_handles   File "/usr/lib/python3.10/http/client.py", line 287, in _read_status
[ 670.365039] env[67169]: ERROR oslo_vmware.rw_handles     raise RemoteDisconnected("Remote end closed connection without"
[ 670.365039] env[67169]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response
[ 670.365039] env[67169]: ERROR oslo_vmware.rw_handles
[ 670.365039] env[67169]: DEBUG nova.virt.vmwareapi.images [None req-1e04032d-0bdc-43df-a263-7859fab3c8e2 tempest-ServersAdminNegativeTestJSON-1443014733 tempest-ServersAdminNegativeTestJSON-1443014733-project-member] [instance: 9e0a990e-d9ad-4dae-9e2d-6d1f7d999403] Downloaded image file data 285931c9-8b83-4997-8c4d-6a79005e36ba to vmware_temp/f788de02-74a0-47d4-a666-40a1ee7278b2/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk on the data store datastore2 {{(pid=67169) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}}
[ 670.366657] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-1e04032d-0bdc-43df-a263-7859fab3c8e2 tempest-ServersAdminNegativeTestJSON-1443014733 tempest-ServersAdminNegativeTestJSON-1443014733-project-member] [instance: 9e0a990e-d9ad-4dae-9e2d-6d1f7d999403] Caching image {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}}
[ 670.366905] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [None req-1e04032d-0bdc-43df-a263-7859fab3c8e2 tempest-ServersAdminNegativeTestJSON-1443014733 tempest-ServersAdminNegativeTestJSON-1443014733-project-member] Copying Virtual Disk [datastore2] vmware_temp/f788de02-74a0-47d4-a666-40a1ee7278b2/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk to [datastore2] vmware_temp/f788de02-74a0-47d4-a666-40a1ee7278b2/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk {{(pid=67169) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}}
[ 670.367213] env[67169]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-5d36c807-e7f6-4fc2-a019-e8c6846b693e {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 670.375676] env[67169]: DEBUG oslo_vmware.api [None req-1e04032d-0bdc-43df-a263-7859fab3c8e2 tempest-ServersAdminNegativeTestJSON-1443014733 tempest-ServersAdminNegativeTestJSON-1443014733-project-member] Waiting for the task: (returnval){
[ 670.375676] env[67169]: value = "task-2819093"
[ 670.375676] env[67169]: _type = "Task"
[ 670.375676] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 670.384131] env[67169]: DEBUG oslo_vmware.api [None req-1e04032d-0bdc-43df-a263-7859fab3c8e2 tempest-ServersAdminNegativeTestJSON-1443014733 tempest-ServersAdminNegativeTestJSON-1443014733-project-member] Task: {'id': task-2819093, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 670.886046] env[67169]: DEBUG oslo_vmware.exceptions [None req-1e04032d-0bdc-43df-a263-7859fab3c8e2 tempest-ServersAdminNegativeTestJSON-1443014733 tempest-ServersAdminNegativeTestJSON-1443014733-project-member] Fault InvalidArgument not matched.
{{(pid=67169) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 670.886353] env[67169]: DEBUG oslo_concurrency.lockutils [None req-1e04032d-0bdc-43df-a263-7859fab3c8e2 tempest-ServersAdminNegativeTestJSON-1443014733 tempest-ServersAdminNegativeTestJSON-1443014733-project-member] Releasing lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 670.886913] env[67169]: ERROR nova.compute.manager [None req-1e04032d-0bdc-43df-a263-7859fab3c8e2 tempest-ServersAdminNegativeTestJSON-1443014733 tempest-ServersAdminNegativeTestJSON-1443014733-project-member] [instance: 9e0a990e-d9ad-4dae-9e2d-6d1f7d999403] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 670.886913] env[67169]: Faults: ['InvalidArgument'] [ 670.886913] env[67169]: ERROR nova.compute.manager [instance: 9e0a990e-d9ad-4dae-9e2d-6d1f7d999403] Traceback (most recent call last): [ 670.886913] env[67169]: ERROR nova.compute.manager [instance: 9e0a990e-d9ad-4dae-9e2d-6d1f7d999403] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 670.886913] env[67169]: ERROR nova.compute.manager [instance: 9e0a990e-d9ad-4dae-9e2d-6d1f7d999403] yield resources [ 670.886913] env[67169]: ERROR nova.compute.manager [instance: 9e0a990e-d9ad-4dae-9e2d-6d1f7d999403] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 670.886913] env[67169]: ERROR nova.compute.manager [instance: 9e0a990e-d9ad-4dae-9e2d-6d1f7d999403] self.driver.spawn(context, instance, image_meta, [ 670.886913] env[67169]: ERROR nova.compute.manager [instance: 9e0a990e-d9ad-4dae-9e2d-6d1f7d999403] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 670.886913] env[67169]: ERROR 
nova.compute.manager [instance: 9e0a990e-d9ad-4dae-9e2d-6d1f7d999403] self._vmops.spawn(context, instance, image_meta, injected_files, [ 670.886913] env[67169]: ERROR nova.compute.manager [instance: 9e0a990e-d9ad-4dae-9e2d-6d1f7d999403] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 670.886913] env[67169]: ERROR nova.compute.manager [instance: 9e0a990e-d9ad-4dae-9e2d-6d1f7d999403] self._fetch_image_if_missing(context, vi) [ 670.886913] env[67169]: ERROR nova.compute.manager [instance: 9e0a990e-d9ad-4dae-9e2d-6d1f7d999403] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 670.887217] env[67169]: ERROR nova.compute.manager [instance: 9e0a990e-d9ad-4dae-9e2d-6d1f7d999403] image_cache(vi, tmp_image_ds_loc) [ 670.887217] env[67169]: ERROR nova.compute.manager [instance: 9e0a990e-d9ad-4dae-9e2d-6d1f7d999403] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 670.887217] env[67169]: ERROR nova.compute.manager [instance: 9e0a990e-d9ad-4dae-9e2d-6d1f7d999403] vm_util.copy_virtual_disk( [ 670.887217] env[67169]: ERROR nova.compute.manager [instance: 9e0a990e-d9ad-4dae-9e2d-6d1f7d999403] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 670.887217] env[67169]: ERROR nova.compute.manager [instance: 9e0a990e-d9ad-4dae-9e2d-6d1f7d999403] session._wait_for_task(vmdk_copy_task) [ 670.887217] env[67169]: ERROR nova.compute.manager [instance: 9e0a990e-d9ad-4dae-9e2d-6d1f7d999403] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 670.887217] env[67169]: ERROR nova.compute.manager [instance: 9e0a990e-d9ad-4dae-9e2d-6d1f7d999403] return self.wait_for_task(task_ref) [ 670.887217] env[67169]: ERROR nova.compute.manager [instance: 9e0a990e-d9ad-4dae-9e2d-6d1f7d999403] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 670.887217] env[67169]: ERROR 
nova.compute.manager [instance: 9e0a990e-d9ad-4dae-9e2d-6d1f7d999403] return evt.wait() [ 670.887217] env[67169]: ERROR nova.compute.manager [instance: 9e0a990e-d9ad-4dae-9e2d-6d1f7d999403] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 670.887217] env[67169]: ERROR nova.compute.manager [instance: 9e0a990e-d9ad-4dae-9e2d-6d1f7d999403] result = hub.switch() [ 670.887217] env[67169]: ERROR nova.compute.manager [instance: 9e0a990e-d9ad-4dae-9e2d-6d1f7d999403] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 670.887217] env[67169]: ERROR nova.compute.manager [instance: 9e0a990e-d9ad-4dae-9e2d-6d1f7d999403] return self.greenlet.switch() [ 670.887528] env[67169]: ERROR nova.compute.manager [instance: 9e0a990e-d9ad-4dae-9e2d-6d1f7d999403] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 670.887528] env[67169]: ERROR nova.compute.manager [instance: 9e0a990e-d9ad-4dae-9e2d-6d1f7d999403] self.f(*self.args, **self.kw) [ 670.887528] env[67169]: ERROR nova.compute.manager [instance: 9e0a990e-d9ad-4dae-9e2d-6d1f7d999403] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 670.887528] env[67169]: ERROR nova.compute.manager [instance: 9e0a990e-d9ad-4dae-9e2d-6d1f7d999403] raise exceptions.translate_fault(task_info.error) [ 670.887528] env[67169]: ERROR nova.compute.manager [instance: 9e0a990e-d9ad-4dae-9e2d-6d1f7d999403] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 670.887528] env[67169]: ERROR nova.compute.manager [instance: 9e0a990e-d9ad-4dae-9e2d-6d1f7d999403] Faults: ['InvalidArgument'] [ 670.887528] env[67169]: ERROR nova.compute.manager [instance: 9e0a990e-d9ad-4dae-9e2d-6d1f7d999403] [ 670.887528] env[67169]: INFO nova.compute.manager [None req-1e04032d-0bdc-43df-a263-7859fab3c8e2 
tempest-ServersAdminNegativeTestJSON-1443014733 tempest-ServersAdminNegativeTestJSON-1443014733-project-member] [instance: 9e0a990e-d9ad-4dae-9e2d-6d1f7d999403] Terminating instance [ 670.888778] env[67169]: DEBUG oslo_concurrency.lockutils [None req-e29f758d-dc96-4350-90de-a6a527eb3483 tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] Acquired lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 670.888979] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-e29f758d-dc96-4350-90de-a6a527eb3483 tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 670.889616] env[67169]: DEBUG nova.compute.manager [None req-1e04032d-0bdc-43df-a263-7859fab3c8e2 tempest-ServersAdminNegativeTestJSON-1443014733 tempest-ServersAdminNegativeTestJSON-1443014733-project-member] [instance: 9e0a990e-d9ad-4dae-9e2d-6d1f7d999403] Start destroying the instance on the hypervisor. 
{{(pid=67169) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 670.889807] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-1e04032d-0bdc-43df-a263-7859fab3c8e2 tempest-ServersAdminNegativeTestJSON-1443014733 tempest-ServersAdminNegativeTestJSON-1443014733-project-member] [instance: 9e0a990e-d9ad-4dae-9e2d-6d1f7d999403] Destroying instance {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 670.890042] env[67169]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-4aa9fba6-1374-4174-90b2-73f0e6c64947 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 670.892369] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3c7af318-f8c0-48ad-9df1-cf6d064be77f {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 670.899215] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-1e04032d-0bdc-43df-a263-7859fab3c8e2 tempest-ServersAdminNegativeTestJSON-1443014733 tempest-ServersAdminNegativeTestJSON-1443014733-project-member] [instance: 9e0a990e-d9ad-4dae-9e2d-6d1f7d999403] Unregistering the VM {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 670.899436] env[67169]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-94a268b8-1810-4b54-94ad-6e39a4d9fe66 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 670.901673] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-e29f758d-dc96-4350-90de-a6a527eb3483 tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 670.901868] env[67169]: DEBUG 
nova.virt.vmwareapi.vmops [None req-e29f758d-dc96-4350-90de-a6a527eb3483 tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=67169) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 670.902805] env[67169]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-76731296-ff7f-4a94-98f0-c742c8e1e6af {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 670.907185] env[67169]: DEBUG oslo_vmware.api [None req-e29f758d-dc96-4350-90de-a6a527eb3483 tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] Waiting for the task: (returnval){ [ 670.907185] env[67169]: value = "session[52518c09-e5f2-035d-4dc1-3e29ddf94015]523610c1-6750-8668-f9f5-ef4dad259d80" [ 670.907185] env[67169]: _type = "Task" [ 670.907185] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 670.915578] env[67169]: DEBUG oslo_vmware.api [None req-e29f758d-dc96-4350-90de-a6a527eb3483 tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] Task: {'id': session[52518c09-e5f2-035d-4dc1-3e29ddf94015]523610c1-6750-8668-f9f5-ef4dad259d80, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 670.974210] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-1e04032d-0bdc-43df-a263-7859fab3c8e2 tempest-ServersAdminNegativeTestJSON-1443014733 tempest-ServersAdminNegativeTestJSON-1443014733-project-member] [instance: 9e0a990e-d9ad-4dae-9e2d-6d1f7d999403] Unregistered the VM {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 670.974210] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-1e04032d-0bdc-43df-a263-7859fab3c8e2 tempest-ServersAdminNegativeTestJSON-1443014733 tempest-ServersAdminNegativeTestJSON-1443014733-project-member] [instance: 9e0a990e-d9ad-4dae-9e2d-6d1f7d999403] Deleting contents of the VM from datastore datastore2 {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 670.974210] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-1e04032d-0bdc-43df-a263-7859fab3c8e2 tempest-ServersAdminNegativeTestJSON-1443014733 tempest-ServersAdminNegativeTestJSON-1443014733-project-member] Deleting the datastore file [datastore2] 9e0a990e-d9ad-4dae-9e2d-6d1f7d999403 {{(pid=67169) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 670.974210] env[67169]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-3d6d9b52-f920-4e98-b849-10fca9ce0502 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 670.980691] env[67169]: DEBUG oslo_vmware.api [None req-1e04032d-0bdc-43df-a263-7859fab3c8e2 tempest-ServersAdminNegativeTestJSON-1443014733 tempest-ServersAdminNegativeTestJSON-1443014733-project-member] Waiting for the task: (returnval){ [ 670.980691] env[67169]: value = "task-2819095" [ 670.980691] env[67169]: _type = "Task" [ 670.980691] env[67169]: } to complete. 
{{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 670.988680] env[67169]: DEBUG oslo_vmware.api [None req-1e04032d-0bdc-43df-a263-7859fab3c8e2 tempest-ServersAdminNegativeTestJSON-1443014733 tempest-ServersAdminNegativeTestJSON-1443014733-project-member] Task: {'id': task-2819095, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 671.418431] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-e29f758d-dc96-4350-90de-a6a527eb3483 tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] [instance: ca7439e3-dbd5-4775-97e8-9927b325766a] Preparing fetch location {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 671.418733] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-e29f758d-dc96-4350-90de-a6a527eb3483 tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] Creating directory with path [datastore2] vmware_temp/da9602a4-bc3c-4535-b299-596d3863899b/285931c9-8b83-4997-8c4d-6a79005e36ba {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 671.418942] env[67169]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-4c7bfe76-1520-4fc8-aff1-35c34e3ba0b4 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 671.430772] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-e29f758d-dc96-4350-90de-a6a527eb3483 tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] Created directory with path [datastore2] vmware_temp/da9602a4-bc3c-4535-b299-596d3863899b/285931c9-8b83-4997-8c4d-6a79005e36ba {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 671.430901] env[67169]: DEBUG 
nova.virt.vmwareapi.vmops [None req-e29f758d-dc96-4350-90de-a6a527eb3483 tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] [instance: ca7439e3-dbd5-4775-97e8-9927b325766a] Fetch image to [datastore2] vmware_temp/da9602a4-bc3c-4535-b299-596d3863899b/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 671.431161] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-e29f758d-dc96-4350-90de-a6a527eb3483 tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] [instance: ca7439e3-dbd5-4775-97e8-9927b325766a] Downloading image file data 285931c9-8b83-4997-8c4d-6a79005e36ba to [datastore2] vmware_temp/da9602a4-bc3c-4535-b299-596d3863899b/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk on the data store datastore2 {{(pid=67169) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 671.431824] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a8fa269d-f12f-419d-98dc-2a70c1ab1e23 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 671.442037] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0f30f049-1cec-4738-964b-e767fb054e1b {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 671.452518] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-df78d51e-6df1-4f33-a309-db0589c5dba3 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 671.487703] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-65d7fa49-12f0-4a4f-9484-85f482089989 {{(pid=67169) 
request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 671.495328] env[67169]: DEBUG oslo_vmware.api [None req-1e04032d-0bdc-43df-a263-7859fab3c8e2 tempest-ServersAdminNegativeTestJSON-1443014733 tempest-ServersAdminNegativeTestJSON-1443014733-project-member] Task: {'id': task-2819095, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.068235} completed successfully. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 671.496956] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-1e04032d-0bdc-43df-a263-7859fab3c8e2 tempest-ServersAdminNegativeTestJSON-1443014733 tempest-ServersAdminNegativeTestJSON-1443014733-project-member] Deleted the datastore file {{(pid=67169) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 671.497179] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-1e04032d-0bdc-43df-a263-7859fab3c8e2 tempest-ServersAdminNegativeTestJSON-1443014733 tempest-ServersAdminNegativeTestJSON-1443014733-project-member] [instance: 9e0a990e-d9ad-4dae-9e2d-6d1f7d999403] Deleted contents of the VM from datastore datastore2 {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 671.497388] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-1e04032d-0bdc-43df-a263-7859fab3c8e2 tempest-ServersAdminNegativeTestJSON-1443014733 tempest-ServersAdminNegativeTestJSON-1443014733-project-member] [instance: 9e0a990e-d9ad-4dae-9e2d-6d1f7d999403] Instance destroyed {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 671.497575] env[67169]: INFO nova.compute.manager [None req-1e04032d-0bdc-43df-a263-7859fab3c8e2 tempest-ServersAdminNegativeTestJSON-1443014733 tempest-ServersAdminNegativeTestJSON-1443014733-project-member] [instance: 9e0a990e-d9ad-4dae-9e2d-6d1f7d999403] Took 0.61 seconds to destroy the instance on the hypervisor. 
[ 671.499396] env[67169]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-0a5b1968-1b00-4d3a-b733-e0d895339c88 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 671.501397] env[67169]: DEBUG nova.compute.claims [None req-1e04032d-0bdc-43df-a263-7859fab3c8e2 tempest-ServersAdminNegativeTestJSON-1443014733 tempest-ServersAdminNegativeTestJSON-1443014733-project-member] [instance: 9e0a990e-d9ad-4dae-9e2d-6d1f7d999403] Aborting claim: {{(pid=67169) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 671.501580] env[67169]: DEBUG oslo_concurrency.lockutils [None req-1e04032d-0bdc-43df-a263-7859fab3c8e2 tempest-ServersAdminNegativeTestJSON-1443014733 tempest-ServersAdminNegativeTestJSON-1443014733-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 671.501791] env[67169]: DEBUG oslo_concurrency.lockutils [None req-1e04032d-0bdc-43df-a263-7859fab3c8e2 tempest-ServersAdminNegativeTestJSON-1443014733 tempest-ServersAdminNegativeTestJSON-1443014733-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 671.523660] env[67169]: DEBUG nova.virt.vmwareapi.images [None req-e29f758d-dc96-4350-90de-a6a527eb3483 tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] [instance: ca7439e3-dbd5-4775-97e8-9927b325766a] Downloading image file data 285931c9-8b83-4997-8c4d-6a79005e36ba to the data store datastore2 {{(pid=67169) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 671.588274] env[67169]: DEBUG oslo_vmware.rw_handles [None 
req-e29f758d-dc96-4350-90de-a6a527eb3483 tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/da9602a4-bc3c-4535-b299-596d3863899b/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=67169) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 671.650762] env[67169]: DEBUG oslo_vmware.rw_handles [None req-e29f758d-dc96-4350-90de-a6a527eb3483 tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] Completed reading data from the image iterator. {{(pid=67169) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 671.650973] env[67169]: DEBUG oslo_vmware.rw_handles [None req-e29f758d-dc96-4350-90de-a6a527eb3483 tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/da9602a4-bc3c-4535-b299-596d3863899b/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=67169) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 672.060820] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d469ede2-2a2d-44e0-bc53-631ab2e282bc {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 672.069129] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0a2c6659-908a-4270-9d64-971427423a46 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 672.107516] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8331da8b-e290-42c5-b6c4-575d45d7d9ff {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 672.117588] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-41e58156-1afb-4978-95e4-cc34d5c7b8e2 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 672.139598] env[67169]: DEBUG nova.compute.provider_tree [None req-1e04032d-0bdc-43df-a263-7859fab3c8e2 tempest-ServersAdminNegativeTestJSON-1443014733 tempest-ServersAdminNegativeTestJSON-1443014733-project-member] Inventory has not changed in ProviderTree for provider: 6570906a-ac37-4859-b1f2-4bbacc48d3f3 {{(pid=67169) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 672.151629] env[67169]: DEBUG nova.scheduler.client.report [None req-1e04032d-0bdc-43df-a263-7859fab3c8e2 tempest-ServersAdminNegativeTestJSON-1443014733 tempest-ServersAdminNegativeTestJSON-1443014733-project-member] Inventory has not changed for provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 
'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 171, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67169) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 672.169408] env[67169]: DEBUG oslo_concurrency.lockutils [None req-1e04032d-0bdc-43df-a263-7859fab3c8e2 tempest-ServersAdminNegativeTestJSON-1443014733 tempest-ServersAdminNegativeTestJSON-1443014733-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.667s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 672.170255] env[67169]: ERROR nova.compute.manager [None req-1e04032d-0bdc-43df-a263-7859fab3c8e2 tempest-ServersAdminNegativeTestJSON-1443014733 tempest-ServersAdminNegativeTestJSON-1443014733-project-member] [instance: 9e0a990e-d9ad-4dae-9e2d-6d1f7d999403] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 672.170255] env[67169]: Faults: ['InvalidArgument'] [ 672.170255] env[67169]: ERROR nova.compute.manager [instance: 9e0a990e-d9ad-4dae-9e2d-6d1f7d999403] Traceback (most recent call last): [ 672.170255] env[67169]: ERROR nova.compute.manager [instance: 9e0a990e-d9ad-4dae-9e2d-6d1f7d999403] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 672.170255] env[67169]: ERROR nova.compute.manager [instance: 9e0a990e-d9ad-4dae-9e2d-6d1f7d999403] self.driver.spawn(context, instance, image_meta, [ 672.170255] env[67169]: ERROR nova.compute.manager [instance: 9e0a990e-d9ad-4dae-9e2d-6d1f7d999403] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 672.170255] env[67169]: ERROR nova.compute.manager [instance: 9e0a990e-d9ad-4dae-9e2d-6d1f7d999403] 
self._vmops.spawn(context, instance, image_meta, injected_files,
[ 672.170255] env[67169]: ERROR nova.compute.manager [instance: 9e0a990e-d9ad-4dae-9e2d-6d1f7d999403] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 672.170255] env[67169]: ERROR nova.compute.manager [instance: 9e0a990e-d9ad-4dae-9e2d-6d1f7d999403] self._fetch_image_if_missing(context, vi)
[ 672.170255] env[67169]: ERROR nova.compute.manager [instance: 9e0a990e-d9ad-4dae-9e2d-6d1f7d999403] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 672.170255] env[67169]: ERROR nova.compute.manager [instance: 9e0a990e-d9ad-4dae-9e2d-6d1f7d999403] image_cache(vi, tmp_image_ds_loc)
[ 672.170255] env[67169]: ERROR nova.compute.manager [instance: 9e0a990e-d9ad-4dae-9e2d-6d1f7d999403] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 672.170630] env[67169]: ERROR nova.compute.manager [instance: 9e0a990e-d9ad-4dae-9e2d-6d1f7d999403] vm_util.copy_virtual_disk(
[ 672.170630] env[67169]: ERROR nova.compute.manager [instance: 9e0a990e-d9ad-4dae-9e2d-6d1f7d999403] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 672.170630] env[67169]: ERROR nova.compute.manager [instance: 9e0a990e-d9ad-4dae-9e2d-6d1f7d999403] session._wait_for_task(vmdk_copy_task)
[ 672.170630] env[67169]: ERROR nova.compute.manager [instance: 9e0a990e-d9ad-4dae-9e2d-6d1f7d999403] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 672.170630] env[67169]: ERROR nova.compute.manager [instance: 9e0a990e-d9ad-4dae-9e2d-6d1f7d999403] return self.wait_for_task(task_ref)
[ 672.170630] env[67169]: ERROR nova.compute.manager [instance: 9e0a990e-d9ad-4dae-9e2d-6d1f7d999403] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 672.170630] env[67169]: ERROR nova.compute.manager [instance: 9e0a990e-d9ad-4dae-9e2d-6d1f7d999403] return evt.wait()
[ 672.170630] env[67169]: ERROR nova.compute.manager [instance: 9e0a990e-d9ad-4dae-9e2d-6d1f7d999403] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 672.170630] env[67169]: ERROR nova.compute.manager [instance: 9e0a990e-d9ad-4dae-9e2d-6d1f7d999403] result = hub.switch()
[ 672.170630] env[67169]: ERROR nova.compute.manager [instance: 9e0a990e-d9ad-4dae-9e2d-6d1f7d999403] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 672.170630] env[67169]: ERROR nova.compute.manager [instance: 9e0a990e-d9ad-4dae-9e2d-6d1f7d999403] return self.greenlet.switch()
[ 672.170630] env[67169]: ERROR nova.compute.manager [instance: 9e0a990e-d9ad-4dae-9e2d-6d1f7d999403] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 672.170630] env[67169]: ERROR nova.compute.manager [instance: 9e0a990e-d9ad-4dae-9e2d-6d1f7d999403] self.f(*self.args, **self.kw)
[ 672.170951] env[67169]: ERROR nova.compute.manager [instance: 9e0a990e-d9ad-4dae-9e2d-6d1f7d999403] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 672.170951] env[67169]: ERROR nova.compute.manager [instance: 9e0a990e-d9ad-4dae-9e2d-6d1f7d999403] raise exceptions.translate_fault(task_info.error)
[ 672.170951] env[67169]: ERROR nova.compute.manager [instance: 9e0a990e-d9ad-4dae-9e2d-6d1f7d999403] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 672.170951] env[67169]: ERROR nova.compute.manager [instance: 9e0a990e-d9ad-4dae-9e2d-6d1f7d999403] Faults: ['InvalidArgument']
[ 672.170951] env[67169]: ERROR nova.compute.manager [instance: 9e0a990e-d9ad-4dae-9e2d-6d1f7d999403]
[ 672.171329] env[67169]: DEBUG nova.compute.utils [None req-1e04032d-0bdc-43df-a263-7859fab3c8e2 tempest-ServersAdminNegativeTestJSON-1443014733 tempest-ServersAdminNegativeTestJSON-1443014733-project-member] [instance: 9e0a990e-d9ad-4dae-9e2d-6d1f7d999403] VimFaultException {{(pid=67169) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 672.173530] env[67169]: DEBUG nova.compute.manager [None req-1e04032d-0bdc-43df-a263-7859fab3c8e2 tempest-ServersAdminNegativeTestJSON-1443014733 tempest-ServersAdminNegativeTestJSON-1443014733-project-member] [instance: 9e0a990e-d9ad-4dae-9e2d-6d1f7d999403] Build of instance 9e0a990e-d9ad-4dae-9e2d-6d1f7d999403 was re-scheduled: A specified parameter was not correct: fileType
[ 672.173530] env[67169]: Faults: ['InvalidArgument'] {{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}}
[ 672.174110] env[67169]: DEBUG nova.compute.manager [None req-1e04032d-0bdc-43df-a263-7859fab3c8e2 tempest-ServersAdminNegativeTestJSON-1443014733 tempest-ServersAdminNegativeTestJSON-1443014733-project-member] [instance: 9e0a990e-d9ad-4dae-9e2d-6d1f7d999403] Unplugging VIFs for instance {{(pid=67169) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}}
[ 672.174380] env[67169]: DEBUG nova.compute.manager [None req-1e04032d-0bdc-43df-a263-7859fab3c8e2 tempest-ServersAdminNegativeTestJSON-1443014733 tempest-ServersAdminNegativeTestJSON-1443014733-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=67169) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}}
[ 672.174638] env[67169]: DEBUG nova.compute.manager [None req-1e04032d-0bdc-43df-a263-7859fab3c8e2 tempest-ServersAdminNegativeTestJSON-1443014733 tempest-ServersAdminNegativeTestJSON-1443014733-project-member] [instance: 9e0a990e-d9ad-4dae-9e2d-6d1f7d999403] Deallocating network for instance {{(pid=67169) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}}
[ 672.174896] env[67169]: DEBUG nova.network.neutron [None req-1e04032d-0bdc-43df-a263-7859fab3c8e2 tempest-ServersAdminNegativeTestJSON-1443014733 tempest-ServersAdminNegativeTestJSON-1443014733-project-member] [instance: 9e0a990e-d9ad-4dae-9e2d-6d1f7d999403] deallocate_for_instance() {{(pid=67169) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 673.053205] env[67169]: DEBUG nova.network.neutron [None req-1e04032d-0bdc-43df-a263-7859fab3c8e2 tempest-ServersAdminNegativeTestJSON-1443014733 tempest-ServersAdminNegativeTestJSON-1443014733-project-member] [instance: 9e0a990e-d9ad-4dae-9e2d-6d1f7d999403] Updating instance_info_cache with network_info: [] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 673.066649] env[67169]: INFO nova.compute.manager [None req-1e04032d-0bdc-43df-a263-7859fab3c8e2 tempest-ServersAdminNegativeTestJSON-1443014733 tempest-ServersAdminNegativeTestJSON-1443014733-project-member] [instance: 9e0a990e-d9ad-4dae-9e2d-6d1f7d999403] Took 0.89 seconds to deallocate network for instance.
[ 673.166254] env[67169]: INFO nova.scheduler.client.report [None req-1e04032d-0bdc-43df-a263-7859fab3c8e2 tempest-ServersAdminNegativeTestJSON-1443014733 tempest-ServersAdminNegativeTestJSON-1443014733-project-member] Deleted allocations for instance 9e0a990e-d9ad-4dae-9e2d-6d1f7d999403
[ 673.185031] env[67169]: DEBUG oslo_concurrency.lockutils [None req-1e04032d-0bdc-43df-a263-7859fab3c8e2 tempest-ServersAdminNegativeTestJSON-1443014733 tempest-ServersAdminNegativeTestJSON-1443014733-project-member] Lock "9e0a990e-d9ad-4dae-9e2d-6d1f7d999403" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 102.752s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 673.186139] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "9e0a990e-d9ad-4dae-9e2d-6d1f7d999403" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 87.501s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 673.186396] env[67169]: INFO nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 9e0a990e-d9ad-4dae-9e2d-6d1f7d999403] During sync_power_state the instance has a pending task (spawning). Skip.
[ 673.186633] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "9e0a990e-d9ad-4dae-9e2d-6d1f7d999403" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.001s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 673.214751] env[67169]: DEBUG nova.compute.manager [None req-f4bfbd84-193f-4b41-bf8e-c81aff254f89 tempest-VolumesAssistedSnapshotsTest-1676378715 tempest-VolumesAssistedSnapshotsTest-1676378715-project-member] [instance: 28552f70-695d-40cc-8dfa-bf40d6113220] Starting instance... {{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}}
[ 673.269070] env[67169]: DEBUG oslo_concurrency.lockutils [None req-f4bfbd84-193f-4b41-bf8e-c81aff254f89 tempest-VolumesAssistedSnapshotsTest-1676378715 tempest-VolumesAssistedSnapshotsTest-1676378715-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 673.269324] env[67169]: DEBUG oslo_concurrency.lockutils [None req-f4bfbd84-193f-4b41-bf8e-c81aff254f89 tempest-VolumesAssistedSnapshotsTest-1676378715 tempest-VolumesAssistedSnapshotsTest-1676378715-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 673.271210] env[67169]: INFO nova.compute.claims [None req-f4bfbd84-193f-4b41-bf8e-c81aff254f89 tempest-VolumesAssistedSnapshotsTest-1676378715 tempest-VolumesAssistedSnapshotsTest-1676378715-project-member] [instance: 28552f70-695d-40cc-8dfa-bf40d6113220] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 673.747695] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f80adaa3-2746-418e-85e4-02f1351682f6 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 673.755720] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-47ba0f19-f0b2-4220-9f0c-64913068dbeb {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 673.786657] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f8061f5c-1d4b-48ce-96b7-5db4b73dc6b5 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 673.794136] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e1c23381-d6b1-42f0-b131-2f88e98b15d4 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 673.807949] env[67169]: DEBUG nova.compute.provider_tree [None req-f4bfbd84-193f-4b41-bf8e-c81aff254f89 tempest-VolumesAssistedSnapshotsTest-1676378715 tempest-VolumesAssistedSnapshotsTest-1676378715-project-member] Inventory has not changed in ProviderTree for provider: 6570906a-ac37-4859-b1f2-4bbacc48d3f3 {{(pid=67169) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 673.815835] env[67169]: DEBUG nova.scheduler.client.report [None req-f4bfbd84-193f-4b41-bf8e-c81aff254f89 tempest-VolumesAssistedSnapshotsTest-1676378715 tempest-VolumesAssistedSnapshotsTest-1676378715-project-member] Inventory has not changed for provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 171, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67169) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 673.828787] env[67169]: DEBUG oslo_concurrency.lockutils [None req-f4bfbd84-193f-4b41-bf8e-c81aff254f89 tempest-VolumesAssistedSnapshotsTest-1676378715 tempest-VolumesAssistedSnapshotsTest-1676378715-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.559s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 673.829257] env[67169]: DEBUG nova.compute.manager [None req-f4bfbd84-193f-4b41-bf8e-c81aff254f89 tempest-VolumesAssistedSnapshotsTest-1676378715 tempest-VolumesAssistedSnapshotsTest-1676378715-project-member] [instance: 28552f70-695d-40cc-8dfa-bf40d6113220] Start building networks asynchronously for instance. {{(pid=67169) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}}
[ 673.863441] env[67169]: DEBUG nova.compute.utils [None req-f4bfbd84-193f-4b41-bf8e-c81aff254f89 tempest-VolumesAssistedSnapshotsTest-1676378715 tempest-VolumesAssistedSnapshotsTest-1676378715-project-member] Using /dev/sd instead of None {{(pid=67169) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 673.865045] env[67169]: DEBUG nova.compute.manager [None req-f4bfbd84-193f-4b41-bf8e-c81aff254f89 tempest-VolumesAssistedSnapshotsTest-1676378715 tempest-VolumesAssistedSnapshotsTest-1676378715-project-member] [instance: 28552f70-695d-40cc-8dfa-bf40d6113220] Allocating IP information in the background. {{(pid=67169) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}}
[ 673.865195] env[67169]: DEBUG nova.network.neutron [None req-f4bfbd84-193f-4b41-bf8e-c81aff254f89 tempest-VolumesAssistedSnapshotsTest-1676378715 tempest-VolumesAssistedSnapshotsTest-1676378715-project-member] [instance: 28552f70-695d-40cc-8dfa-bf40d6113220] allocate_for_instance() {{(pid=67169) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 673.873113] env[67169]: DEBUG nova.compute.manager [None req-f4bfbd84-193f-4b41-bf8e-c81aff254f89 tempest-VolumesAssistedSnapshotsTest-1676378715 tempest-VolumesAssistedSnapshotsTest-1676378715-project-member] [instance: 28552f70-695d-40cc-8dfa-bf40d6113220] Start building block device mappings for instance. {{(pid=67169) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}}
[ 673.943132] env[67169]: DEBUG nova.compute.manager [None req-f4bfbd84-193f-4b41-bf8e-c81aff254f89 tempest-VolumesAssistedSnapshotsTest-1676378715 tempest-VolumesAssistedSnapshotsTest-1676378715-project-member] [instance: 28552f70-695d-40cc-8dfa-bf40d6113220] Start spawning the instance on the hypervisor. {{(pid=67169) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}}
[ 673.966855] env[67169]: DEBUG nova.virt.hardware [None req-f4bfbd84-193f-4b41-bf8e-c81aff254f89 tempest-VolumesAssistedSnapshotsTest-1676378715 tempest-VolumesAssistedSnapshotsTest-1676378715-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-03-06T19:55:10Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-03-06T19:54:55Z,direct_url=,disk_format='vmdk',id=285931c9-8b83-4997-8c4d-6a79005e36ba,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c31f6504bb73492890b262ff43fdf9bc',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-03-06T19:54:55Z,virtual_size=,visibility=), allow threads: False {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 673.967335] env[67169]: DEBUG nova.virt.hardware [None req-f4bfbd84-193f-4b41-bf8e-c81aff254f89 tempest-VolumesAssistedSnapshotsTest-1676378715 tempest-VolumesAssistedSnapshotsTest-1676378715-project-member] Flavor limits 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 673.967796] env[67169]: DEBUG nova.virt.hardware [None req-f4bfbd84-193f-4b41-bf8e-c81aff254f89 tempest-VolumesAssistedSnapshotsTest-1676378715 tempest-VolumesAssistedSnapshotsTest-1676378715-project-member] Image limits 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 673.972018] env[67169]: DEBUG nova.virt.hardware [None req-f4bfbd84-193f-4b41-bf8e-c81aff254f89 tempest-VolumesAssistedSnapshotsTest-1676378715 tempest-VolumesAssistedSnapshotsTest-1676378715-project-member] Flavor pref 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 673.972018] env[67169]: DEBUG nova.virt.hardware [None req-f4bfbd84-193f-4b41-bf8e-c81aff254f89 tempest-VolumesAssistedSnapshotsTest-1676378715 tempest-VolumesAssistedSnapshotsTest-1676378715-project-member] Image pref 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 673.972018] env[67169]: DEBUG nova.virt.hardware [None req-f4bfbd84-193f-4b41-bf8e-c81aff254f89 tempest-VolumesAssistedSnapshotsTest-1676378715 tempest-VolumesAssistedSnapshotsTest-1676378715-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 673.972018] env[67169]: DEBUG nova.virt.hardware [None req-f4bfbd84-193f-4b41-bf8e-c81aff254f89 tempest-VolumesAssistedSnapshotsTest-1676378715 tempest-VolumesAssistedSnapshotsTest-1676378715-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 673.972018] env[67169]: DEBUG nova.virt.hardware [None req-f4bfbd84-193f-4b41-bf8e-c81aff254f89 tempest-VolumesAssistedSnapshotsTest-1676378715 tempest-VolumesAssistedSnapshotsTest-1676378715-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67169) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 673.972271] env[67169]: DEBUG nova.virt.hardware [None req-f4bfbd84-193f-4b41-bf8e-c81aff254f89 tempest-VolumesAssistedSnapshotsTest-1676378715 tempest-VolumesAssistedSnapshotsTest-1676378715-project-member] Got 1 possible topologies {{(pid=67169) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 673.972271] env[67169]: DEBUG nova.virt.hardware [None req-f4bfbd84-193f-4b41-bf8e-c81aff254f89 tempest-VolumesAssistedSnapshotsTest-1676378715 tempest-VolumesAssistedSnapshotsTest-1676378715-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 673.972271] env[67169]: DEBUG nova.virt.hardware [None req-f4bfbd84-193f-4b41-bf8e-c81aff254f89 tempest-VolumesAssistedSnapshotsTest-1676378715 tempest-VolumesAssistedSnapshotsTest-1676378715-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 673.972271] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e82ecb6e-d67b-4315-90b9-dfe7e692a4fe {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 673.975581] env[67169]: DEBUG nova.policy [None req-f4bfbd84-193f-4b41-bf8e-c81aff254f89 tempest-VolumesAssistedSnapshotsTest-1676378715 tempest-VolumesAssistedSnapshotsTest-1676378715-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd87129deed454f1b9f30f9045eefec74', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '78e2544636b346d692a70b7776f49dd4', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67169) authorize /opt/stack/nova/nova/policy.py:203}}
[ 673.982077] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7babb05b-b13f-44e7-9958-f2f5488f2239 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 674.475431] env[67169]: DEBUG nova.network.neutron [None req-f4bfbd84-193f-4b41-bf8e-c81aff254f89 tempest-VolumesAssistedSnapshotsTest-1676378715 tempest-VolumesAssistedSnapshotsTest-1676378715-project-member] [instance: 28552f70-695d-40cc-8dfa-bf40d6113220] Successfully created port: 1ff7911b-67be-4679-b95d-34c16239e684 {{(pid=67169) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 675.687130] env[67169]: DEBUG nova.network.neutron [None req-f4bfbd84-193f-4b41-bf8e-c81aff254f89 tempest-VolumesAssistedSnapshotsTest-1676378715 tempest-VolumesAssistedSnapshotsTest-1676378715-project-member] [instance: 28552f70-695d-40cc-8dfa-bf40d6113220] Successfully updated port: 1ff7911b-67be-4679-b95d-34c16239e684 {{(pid=67169) _update_port /opt/stack/nova/nova/network/neutron.py:586}}
[ 675.696399] env[67169]: DEBUG oslo_concurrency.lockutils [None req-f4bfbd84-193f-4b41-bf8e-c81aff254f89 tempest-VolumesAssistedSnapshotsTest-1676378715 tempest-VolumesAssistedSnapshotsTest-1676378715-project-member] Acquiring lock "refresh_cache-28552f70-695d-40cc-8dfa-bf40d6113220" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 675.696788] env[67169]: DEBUG oslo_concurrency.lockutils [None req-f4bfbd84-193f-4b41-bf8e-c81aff254f89 tempest-VolumesAssistedSnapshotsTest-1676378715 tempest-VolumesAssistedSnapshotsTest-1676378715-project-member] Acquired lock "refresh_cache-28552f70-695d-40cc-8dfa-bf40d6113220" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 675.696788] env[67169]: DEBUG nova.network.neutron [None req-f4bfbd84-193f-4b41-bf8e-c81aff254f89 tempest-VolumesAssistedSnapshotsTest-1676378715 tempest-VolumesAssistedSnapshotsTest-1676378715-project-member] [instance: 28552f70-695d-40cc-8dfa-bf40d6113220] Building network info cache for instance {{(pid=67169) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 675.767533] env[67169]: DEBUG nova.network.neutron [None req-f4bfbd84-193f-4b41-bf8e-c81aff254f89 tempest-VolumesAssistedSnapshotsTest-1676378715 tempest-VolumesAssistedSnapshotsTest-1676378715-project-member] [instance: 28552f70-695d-40cc-8dfa-bf40d6113220] Instance cache missing network info. {{(pid=67169) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 675.966016] env[67169]: DEBUG nova.compute.manager [req-4c44cd99-772d-4257-adc1-250257f10859 req-1b6d837b-0eb8-4850-9fbc-86c847786400 service nova] [instance: 28552f70-695d-40cc-8dfa-bf40d6113220] Received event network-vif-plugged-1ff7911b-67be-4679-b95d-34c16239e684 {{(pid=67169) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}}
[ 675.966307] env[67169]: DEBUG oslo_concurrency.lockutils [req-4c44cd99-772d-4257-adc1-250257f10859 req-1b6d837b-0eb8-4850-9fbc-86c847786400 service nova] Acquiring lock "28552f70-695d-40cc-8dfa-bf40d6113220-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 675.966520] env[67169]: DEBUG oslo_concurrency.lockutils [req-4c44cd99-772d-4257-adc1-250257f10859 req-1b6d837b-0eb8-4850-9fbc-86c847786400 service nova] Lock "28552f70-695d-40cc-8dfa-bf40d6113220-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 675.966696] env[67169]: DEBUG oslo_concurrency.lockutils [req-4c44cd99-772d-4257-adc1-250257f10859 req-1b6d837b-0eb8-4850-9fbc-86c847786400 service nova] Lock "28552f70-695d-40cc-8dfa-bf40d6113220-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 675.966867] env[67169]: DEBUG nova.compute.manager [req-4c44cd99-772d-4257-adc1-250257f10859 req-1b6d837b-0eb8-4850-9fbc-86c847786400 service nova] [instance: 28552f70-695d-40cc-8dfa-bf40d6113220] No waiting events found dispatching network-vif-plugged-1ff7911b-67be-4679-b95d-34c16239e684 {{(pid=67169) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}}
[ 675.967042] env[67169]: WARNING nova.compute.manager [req-4c44cd99-772d-4257-adc1-250257f10859 req-1b6d837b-0eb8-4850-9fbc-86c847786400 service nova] [instance: 28552f70-695d-40cc-8dfa-bf40d6113220] Received unexpected event network-vif-plugged-1ff7911b-67be-4679-b95d-34c16239e684 for instance with vm_state building and task_state spawning.
[ 676.033641] env[67169]: DEBUG nova.network.neutron [None req-f4bfbd84-193f-4b41-bf8e-c81aff254f89 tempest-VolumesAssistedSnapshotsTest-1676378715 tempest-VolumesAssistedSnapshotsTest-1676378715-project-member] [instance: 28552f70-695d-40cc-8dfa-bf40d6113220] Updating instance_info_cache with network_info: [{"id": "1ff7911b-67be-4679-b95d-34c16239e684", "address": "fa:16:3e:c0:77:20", "network": {"id": "3432cf0a-ba76-49de-9aa8-d9fd5ca9d05f", "bridge": "br-int", "label": "tempest-VolumesAssistedSnapshotsTest-1617616486-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "78e2544636b346d692a70b7776f49dd4", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "c42bb08a-77b4-4bba-8166-702cbb1b5f1e", "external-id": "nsx-vlan-transportzone-137", "segmentation_id": 137, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1ff7911b-67", "ovs_interfaceid": "1ff7911b-67be-4679-b95d-34c16239e684", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 676.044840] env[67169]: DEBUG oslo_concurrency.lockutils [None req-f4bfbd84-193f-4b41-bf8e-c81aff254f89 tempest-VolumesAssistedSnapshotsTest-1676378715 tempest-VolumesAssistedSnapshotsTest-1676378715-project-member] Releasing lock "refresh_cache-28552f70-695d-40cc-8dfa-bf40d6113220" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 676.045194] env[67169]: DEBUG nova.compute.manager [None req-f4bfbd84-193f-4b41-bf8e-c81aff254f89 tempest-VolumesAssistedSnapshotsTest-1676378715 tempest-VolumesAssistedSnapshotsTest-1676378715-project-member] [instance: 28552f70-695d-40cc-8dfa-bf40d6113220] Instance network_info: |[{"id": "1ff7911b-67be-4679-b95d-34c16239e684", "address": "fa:16:3e:c0:77:20", "network": {"id": "3432cf0a-ba76-49de-9aa8-d9fd5ca9d05f", "bridge": "br-int", "label": "tempest-VolumesAssistedSnapshotsTest-1617616486-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "78e2544636b346d692a70b7776f49dd4", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "c42bb08a-77b4-4bba-8166-702cbb1b5f1e", "external-id": "nsx-vlan-transportzone-137", "segmentation_id": 137, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1ff7911b-67", "ovs_interfaceid": "1ff7911b-67be-4679-b95d-34c16239e684", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67169) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}}
[ 676.045620] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-f4bfbd84-193f-4b41-bf8e-c81aff254f89 tempest-VolumesAssistedSnapshotsTest-1676378715 tempest-VolumesAssistedSnapshotsTest-1676378715-project-member] [instance: 28552f70-695d-40cc-8dfa-bf40d6113220] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:c0:77:20', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'c42bb08a-77b4-4bba-8166-702cbb1b5f1e', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '1ff7911b-67be-4679-b95d-34c16239e684', 'vif_model': 'vmxnet3'}] {{(pid=67169) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}}
[ 676.053345] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [None req-f4bfbd84-193f-4b41-bf8e-c81aff254f89 tempest-VolumesAssistedSnapshotsTest-1676378715 tempest-VolumesAssistedSnapshotsTest-1676378715-project-member] Creating folder: Project (78e2544636b346d692a70b7776f49dd4). Parent ref: group-v566843. {{(pid=67169) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 676.053894] env[67169]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-9abbf4ba-56b6-4acf-9372-0ebc1a1882d8 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 676.071023] env[67169]: INFO nova.virt.vmwareapi.vm_util [None req-f4bfbd84-193f-4b41-bf8e-c81aff254f89 tempest-VolumesAssistedSnapshotsTest-1676378715 tempest-VolumesAssistedSnapshotsTest-1676378715-project-member] Created folder: Project (78e2544636b346d692a70b7776f49dd4) in parent group-v566843.
[ 676.071128] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [None req-f4bfbd84-193f-4b41-bf8e-c81aff254f89 tempest-VolumesAssistedSnapshotsTest-1676378715 tempest-VolumesAssistedSnapshotsTest-1676378715-project-member] Creating folder: Instances. Parent ref: group-v566881. {{(pid=67169) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 676.071365] env[67169]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-5df6150a-a1e6-4f94-b2aa-75cba41fbeda {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 676.080269] env[67169]: INFO nova.virt.vmwareapi.vm_util [None req-f4bfbd84-193f-4b41-bf8e-c81aff254f89 tempest-VolumesAssistedSnapshotsTest-1676378715 tempest-VolumesAssistedSnapshotsTest-1676378715-project-member] Created folder: Instances in parent group-v566881.
[ 676.080570] env[67169]: DEBUG oslo.service.loopingcall [None req-f4bfbd84-193f-4b41-bf8e-c81aff254f89 tempest-VolumesAssistedSnapshotsTest-1676378715 tempest-VolumesAssistedSnapshotsTest-1676378715-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67169) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 676.080759] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 28552f70-695d-40cc-8dfa-bf40d6113220] Creating VM on the ESX host {{(pid=67169) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}}
[ 676.080962] env[67169]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-e6d04032-a3ce-45cd-9275-f4c13bd72270 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 676.099600] env[67169]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){
[ 676.099600] env[67169]: value = "task-2819098"
[ 676.099600] env[67169]: _type = "Task"
[ 676.099600] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 676.106888] env[67169]: DEBUG oslo_vmware.api [-] Task: {'id': task-2819098, 'name': CreateVM_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 676.610180] env[67169]: DEBUG oslo_vmware.api [-] Task: {'id': task-2819098, 'name': CreateVM_Task, 'duration_secs': 0.328621} completed successfully. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 676.610357] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 28552f70-695d-40cc-8dfa-bf40d6113220] Created VM on the ESX host {{(pid=67169) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}}
[ 676.611111] env[67169]: DEBUG oslo_concurrency.lockutils [None req-f4bfbd84-193f-4b41-bf8e-c81aff254f89 tempest-VolumesAssistedSnapshotsTest-1676378715 tempest-VolumesAssistedSnapshotsTest-1676378715-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 676.611210] env[67169]: DEBUG oslo_concurrency.lockutils [None req-f4bfbd84-193f-4b41-bf8e-c81aff254f89 tempest-VolumesAssistedSnapshotsTest-1676378715 tempest-VolumesAssistedSnapshotsTest-1676378715-project-member] Acquired lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 676.611535] env[67169]: DEBUG oslo_concurrency.lockutils [None req-f4bfbd84-193f-4b41-bf8e-c81aff254f89 tempest-VolumesAssistedSnapshotsTest-1676378715 tempest-VolumesAssistedSnapshotsTest-1676378715-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}}
[ 676.611796] env[67169]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-246c2514-b009-4cd9-8f9e-f24b46f32b96 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 676.616442] env[67169]: DEBUG oslo_vmware.api [None req-f4bfbd84-193f-4b41-bf8e-c81aff254f89 tempest-VolumesAssistedSnapshotsTest-1676378715 tempest-VolumesAssistedSnapshotsTest-1676378715-project-member] Waiting for the task: (returnval){
[ 676.616442] env[67169]: value = "session[52518c09-e5f2-035d-4dc1-3e29ddf94015]52dfd97b-51cc-ea2b-fac0-6f69f31719e6"
[ 676.616442] env[67169]: _type = "Task"
[ 676.616442] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 676.624298] env[67169]: DEBUG oslo_vmware.api [None req-f4bfbd84-193f-4b41-bf8e-c81aff254f89 tempest-VolumesAssistedSnapshotsTest-1676378715 tempest-VolumesAssistedSnapshotsTest-1676378715-project-member] Task: {'id': session[52518c09-e5f2-035d-4dc1-3e29ddf94015]52dfd97b-51cc-ea2b-fac0-6f69f31719e6, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 677.126864] env[67169]: DEBUG oslo_concurrency.lockutils [None req-f4bfbd84-193f-4b41-bf8e-c81aff254f89 tempest-VolumesAssistedSnapshotsTest-1676378715 tempest-VolumesAssistedSnapshotsTest-1676378715-project-member] Releasing lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 677.127087] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-f4bfbd84-193f-4b41-bf8e-c81aff254f89 tempest-VolumesAssistedSnapshotsTest-1676378715 tempest-VolumesAssistedSnapshotsTest-1676378715-project-member] [instance: 28552f70-695d-40cc-8dfa-bf40d6113220] Processing image 285931c9-8b83-4997-8c4d-6a79005e36ba {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}}
[ 677.127193] env[67169]: DEBUG oslo_concurrency.lockutils [None req-f4bfbd84-193f-4b41-bf8e-c81aff254f89 tempest-VolumesAssistedSnapshotsTest-1676378715 tempest-VolumesAssistedSnapshotsTest-1676378715-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 678.307738] env[67169]: DEBUG nova.compute.manager [req-cf48799b-f89c-4e49-a7af-4f8a83053ae1 req-e149fe4c-5fb0-47f2-93d2-c0ddd9e45ee8 service nova] [instance: 28552f70-695d-40cc-8dfa-bf40d6113220] Received event network-changed-1ff7911b-67be-4679-b95d-34c16239e684 {{(pid=67169) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}}
[ 678.307957] env[67169]: DEBUG nova.compute.manager [req-cf48799b-f89c-4e49-a7af-4f8a83053ae1 req-e149fe4c-5fb0-47f2-93d2-c0ddd9e45ee8 service nova] [instance: 28552f70-695d-40cc-8dfa-bf40d6113220] Refreshing instance network info cache due to event network-changed-1ff7911b-67be-4679-b95d-34c16239e684. {{(pid=67169) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}}
[ 678.308179] env[67169]: DEBUG oslo_concurrency.lockutils [req-cf48799b-f89c-4e49-a7af-4f8a83053ae1 req-e149fe4c-5fb0-47f2-93d2-c0ddd9e45ee8 service nova] Acquiring lock "refresh_cache-28552f70-695d-40cc-8dfa-bf40d6113220" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 678.308324] env[67169]: DEBUG oslo_concurrency.lockutils [req-cf48799b-f89c-4e49-a7af-4f8a83053ae1 req-e149fe4c-5fb0-47f2-93d2-c0ddd9e45ee8 service nova] Acquired lock "refresh_cache-28552f70-695d-40cc-8dfa-bf40d6113220" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 678.308668] env[67169]: DEBUG nova.network.neutron [req-cf48799b-f89c-4e49-a7af-4f8a83053ae1 req-e149fe4c-5fb0-47f2-93d2-c0ddd9e45ee8 service nova] [instance: 28552f70-695d-40cc-8dfa-bf40d6113220] Refreshing network info cache for port 1ff7911b-67be-4679-b95d-34c16239e684 {{(pid=67169) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}}
[ 678.731566] env[67169]: DEBUG nova.network.neutron [req-cf48799b-f89c-4e49-a7af-4f8a83053ae1 req-e149fe4c-5fb0-47f2-93d2-c0ddd9e45ee8 service nova] [instance: 28552f70-695d-40cc-8dfa-bf40d6113220] Updated VIF entry in instance network info cache for port 1ff7911b-67be-4679-b95d-34c16239e684.
{{(pid=67169) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 678.731939] env[67169]: DEBUG nova.network.neutron [req-cf48799b-f89c-4e49-a7af-4f8a83053ae1 req-e149fe4c-5fb0-47f2-93d2-c0ddd9e45ee8 service nova] [instance: 28552f70-695d-40cc-8dfa-bf40d6113220] Updating instance_info_cache with network_info: [{"id": "1ff7911b-67be-4679-b95d-34c16239e684", "address": "fa:16:3e:c0:77:20", "network": {"id": "3432cf0a-ba76-49de-9aa8-d9fd5ca9d05f", "bridge": "br-int", "label": "tempest-VolumesAssistedSnapshotsTest-1617616486-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "78e2544636b346d692a70b7776f49dd4", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "c42bb08a-77b4-4bba-8166-702cbb1b5f1e", "external-id": "nsx-vlan-transportzone-137", "segmentation_id": 137, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1ff7911b-67", "ovs_interfaceid": "1ff7911b-67be-4679-b95d-34c16239e684", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 678.741268] env[67169]: DEBUG oslo_concurrency.lockutils [req-cf48799b-f89c-4e49-a7af-4f8a83053ae1 req-e149fe4c-5fb0-47f2-93d2-c0ddd9e45ee8 service nova] Releasing lock "refresh_cache-28552f70-695d-40cc-8dfa-bf40d6113220" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 683.046317] env[67169]: DEBUG 
oslo_concurrency.lockutils [None req-bc42a747-6b10-42c3-8e2b-0674634135bb tempest-ServerPasswordTestJSON-586101290 tempest-ServerPasswordTestJSON-586101290-project-member] Acquiring lock "47ffcce9-3afc-41be-b38e-dacfeb535a2c" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 683.046652] env[67169]: DEBUG oslo_concurrency.lockutils [None req-bc42a747-6b10-42c3-8e2b-0674634135bb tempest-ServerPasswordTestJSON-586101290 tempest-ServerPasswordTestJSON-586101290-project-member] Lock "47ffcce9-3afc-41be-b38e-dacfeb535a2c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 703.666274] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 703.666582] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Starting heal instance info cache {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 703.666695] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Rebuilding the list of instances to heal {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 703.690263] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 68cc2368-ac6a-4003-9c19-5f2a4e9b0e03] Skipping network cache update for instance because it is Building. 
{{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 703.690263] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: ca7439e3-dbd5-4775-97e8-9927b325766a] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 703.690263] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: f5558a78-c91f-4c36-bb22-f94b1bd8cdbc] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 703.690263] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 11e90c91-26ca-4397-81a4-975a1d714d19] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 703.690263] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 36781827-5846-49a4-8913-d98676af0b74] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 703.690487] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 835bf8da-8d8f-4dfd-b0a9-fab02796f39e] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 703.690487] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 85978a3b-052a-4a05-84e6-75c723d49bd8] Skipping network cache update for instance because it is Building. 
{{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 703.690487] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: e2e52693-153a-43dd-b786-dd0758caabe2] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 703.690487] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 703.690487] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 28552f70-695d-40cc-8dfa-bf40d6113220] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 703.690618] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Didn't find any instances for network info cache update. 
{{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 703.690618] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 703.690618] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 703.690825] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 704.659200] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 704.659472] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 704.659621] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=67169) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 704.659771] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 704.671756] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 704.671992] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 704.672151] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 704.672311] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67169) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 704.673407] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-101fbb10-846f-4ead-9df7-a91c6ee37090 {{(pid=67169) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 704.682326] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-878b8f5a-190f-4edb-aa91-2a5c11f8b197 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 704.696276] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-08777e03-0a8b-4fa0-b44a-d206c6e49e7a {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 704.703056] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d566e056-9b6a-461c-8cd7-f51c17b9d41c {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 704.731686] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181035MB free_disk=171GB free_vcpus=48 pci_devices=None {{(pid=67169) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 704.731843] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 704.732050] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 704.804801] env[67169]: DEBUG 
nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 68cc2368-ac6a-4003-9c19-5f2a4e9b0e03 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 704.804972] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance ca7439e3-dbd5-4775-97e8-9927b325766a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 704.805119] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance f5558a78-c91f-4c36-bb22-f94b1bd8cdbc actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 704.805243] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 11e90c91-26ca-4397-81a4-975a1d714d19 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 704.805360] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 36781827-5846-49a4-8913-d98676af0b74 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 704.805502] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 835bf8da-8d8f-4dfd-b0a9-fab02796f39e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 704.805600] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 85978a3b-052a-4a05-84e6-75c723d49bd8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 704.805694] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance e2e52693-153a-43dd-b786-dd0758caabe2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 704.805805] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 704.805918] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 28552f70-695d-40cc-8dfa-bf40d6113220 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 704.817188] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 7a42aeb9-0518-448d-a3a6-8e68d6497922 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 704.828319] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 43b73a7c-eda8-4239-885f-d4fb8fa6f28a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 704.837918] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 3ac7cc70-7667-43b0-a3b8-0c791ef7ccd2 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 704.847790] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 485cf92f-cf20-4f94-8a18-1a82501a829f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 704.857627] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance a6427d1b-e915-4e3a-a4dd-6758fde2bc56 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 704.868398] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance bee054d2-ac3c-47cc-a946-90bebf23f925 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 704.877739] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance e7195659-b834-49e4-a9bd-6b2b7c7d4a20 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 704.887795] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance f001b21d-59fb-4a9e-9c28-7c15892facfa has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 704.896533] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance a7165020-4b1d-44e5-83d9-53eafbef74e7 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 704.906143] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 45ce5bfe-4a85-4c26-914e-b85478fc45a4 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 704.915091] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 529ac98d-1e5c-4bcd-bb3d-7a7158e952cb has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 704.924213] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 774358a1-c887-497e-b2d8-59a7c10e2329 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 704.933121] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance a7434fa2-563e-4f77-ba3e-40ad6bab0de3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 704.942368] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance fd4c9c56-1608-4390-8b41-736b3aa590ed has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 704.952129] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 704.962702] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 6b21ae30-8734-4c38-a8ae-c3fe03b6c36a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 704.972421] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance a578f813-807b-46bc-987f-5c9e9368c04b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 704.983741] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance f384f4a3-d34d-4d45-b063-79b25ea3c66c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 704.991919] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 0aa23be1-af16-4c0b-bfd0-4db5e927cfc4 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 705.001318] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance a58a8711-b060-4036-ac43-897017f68d21 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 705.011592] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 79d35dc5-e515-4f6f-9160-534d84f534bd has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 705.021081] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance db96dc14-cb08-4302-8aa3-81cf0c47fc73 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 705.030890] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 04d7477c-c3ff-42c7-9107-f54327c2f4b2 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 705.041228] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 1fe4f2aa-0784-4356-aa4c-593666f22971 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 705.050812] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 47ffcce9-3afc-41be-b38e-dacfeb535a2c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 705.051072] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67169) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}}
[ 705.051224] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67169) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}}
[ 705.433275] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cc1d5581-ef56-4ff3-8fa1-5bb21c47117b {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 705.441178] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-985f8396-91dd-465e-ad68-fd2f35a37276 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 705.472187] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-590062c2-a258-4777-a1e8-78bd6ab97282 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 705.478188] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5b2cddde-a0ef-4613-af28-60cccaca0ebf {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 705.491134] env[67169]: DEBUG nova.compute.provider_tree [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Inventory has not changed in ProviderTree for provider: 6570906a-ac37-4859-b1f2-4bbacc48d3f3 {{(pid=67169) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 705.499473] env[67169]: DEBUG nova.scheduler.client.report [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Inventory has not changed for provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 171, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67169) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 705.515257] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67169) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}}
[ 705.515439] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.783s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 706.515510] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 706.515782] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 721.632944] env[67169]: WARNING oslo_vmware.rw_handles [None req-e29f758d-dc96-4350-90de-a6a527eb3483 tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response
[ 721.632944] env[67169]: ERROR oslo_vmware.rw_handles Traceback (most recent call last):
[ 721.632944] env[67169]: ERROR oslo_vmware.rw_handles   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close
[ 721.632944] env[67169]: ERROR oslo_vmware.rw_handles     self._conn.getresponse()
[ 721.632944] env[67169]: ERROR oslo_vmware.rw_handles   File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse
[ 721.632944] env[67169]: ERROR oslo_vmware.rw_handles     response.begin()
[ 721.632944] env[67169]: ERROR oslo_vmware.rw_handles   File "/usr/lib/python3.10/http/client.py", line 318, in begin
[ 721.632944] env[67169]: ERROR oslo_vmware.rw_handles     version, status, reason = self._read_status()
[ 721.632944] env[67169]: ERROR oslo_vmware.rw_handles   File "/usr/lib/python3.10/http/client.py", line 287, in _read_status
[ 721.632944] env[67169]: ERROR oslo_vmware.rw_handles     raise RemoteDisconnected("Remote end closed connection without"
[ 721.632944] env[67169]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response
[ 721.632944] env[67169]: ERROR oslo_vmware.rw_handles
[ 721.633460] env[67169]: DEBUG nova.virt.vmwareapi.images [None req-e29f758d-dc96-4350-90de-a6a527eb3483 tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] [instance: ca7439e3-dbd5-4775-97e8-9927b325766a] Downloaded image file data 285931c9-8b83-4997-8c4d-6a79005e36ba to vmware_temp/da9602a4-bc3c-4535-b299-596d3863899b/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk on the data store datastore2 {{(pid=67169) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}}
[ 721.634949] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-e29f758d-dc96-4350-90de-a6a527eb3483 tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] [instance: ca7439e3-dbd5-4775-97e8-9927b325766a] Caching image {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}}
[ 721.635344] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [None req-e29f758d-dc96-4350-90de-a6a527eb3483 tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] Copying Virtual Disk [datastore2] vmware_temp/da9602a4-bc3c-4535-b299-596d3863899b/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk to [datastore2] vmware_temp/da9602a4-bc3c-4535-b299-596d3863899b/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk {{(pid=67169) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}}
[ 721.635668] env[67169]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-f0ab352a-31e1-45c9-b838-02f8a7950e20 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 721.643759] env[67169]: DEBUG oslo_vmware.api [None req-e29f758d-dc96-4350-90de-a6a527eb3483 tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] Waiting for the task: (returnval){
[ 721.643759] env[67169]:     value = "task-2819099"
[ 721.643759] env[67169]:     _type = "Task"
[ 721.643759] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 721.651995] env[67169]: DEBUG oslo_vmware.api [None req-e29f758d-dc96-4350-90de-a6a527eb3483 tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] Task: {'id': task-2819099, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 722.153305] env[67169]: DEBUG oslo_vmware.exceptions [None req-e29f758d-dc96-4350-90de-a6a527eb3483 tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] Fault InvalidArgument not matched. {{(pid=67169) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}}
[ 722.153600] env[67169]: DEBUG oslo_concurrency.lockutils [None req-e29f758d-dc96-4350-90de-a6a527eb3483 tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] Releasing lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 722.154168] env[67169]: ERROR nova.compute.manager [None req-e29f758d-dc96-4350-90de-a6a527eb3483 tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] [instance: ca7439e3-dbd5-4775-97e8-9927b325766a] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 722.154168] env[67169]: Faults: ['InvalidArgument']
[ 722.154168] env[67169]: ERROR nova.compute.manager [instance: ca7439e3-dbd5-4775-97e8-9927b325766a] Traceback (most recent call last):
[ 722.154168] env[67169]: ERROR nova.compute.manager [instance: ca7439e3-dbd5-4775-97e8-9927b325766a]   File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources
[ 722.154168] env[67169]: ERROR nova.compute.manager [instance: ca7439e3-dbd5-4775-97e8-9927b325766a]     yield resources
[ 722.154168] env[67169]: ERROR nova.compute.manager [instance: ca7439e3-dbd5-4775-97e8-9927b325766a]   File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 722.154168] env[67169]: ERROR nova.compute.manager [instance: ca7439e3-dbd5-4775-97e8-9927b325766a]     self.driver.spawn(context, instance, image_meta,
[ 722.154168] env[67169]: ERROR nova.compute.manager [instance: ca7439e3-dbd5-4775-97e8-9927b325766a]   File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 722.154168] env[67169]: ERROR nova.compute.manager [instance: ca7439e3-dbd5-4775-97e8-9927b325766a]     self._vmops.spawn(context, instance, image_meta, injected_files,
[ 722.154168] env[67169]: ERROR nova.compute.manager [instance: ca7439e3-dbd5-4775-97e8-9927b325766a]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 722.154168] env[67169]: ERROR nova.compute.manager [instance: ca7439e3-dbd5-4775-97e8-9927b325766a]     self._fetch_image_if_missing(context, vi)
[ 722.154168] env[67169]: ERROR nova.compute.manager [instance: ca7439e3-dbd5-4775-97e8-9927b325766a]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 722.154521] env[67169]: ERROR nova.compute.manager [instance: ca7439e3-dbd5-4775-97e8-9927b325766a]     image_cache(vi, tmp_image_ds_loc)
[ 722.154521] env[67169]: ERROR nova.compute.manager [instance: ca7439e3-dbd5-4775-97e8-9927b325766a]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 722.154521] env[67169]: ERROR nova.compute.manager [instance: ca7439e3-dbd5-4775-97e8-9927b325766a]     vm_util.copy_virtual_disk(
[ 722.154521] env[67169]: ERROR nova.compute.manager [instance: ca7439e3-dbd5-4775-97e8-9927b325766a]   File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 722.154521] env[67169]: ERROR nova.compute.manager [instance: ca7439e3-dbd5-4775-97e8-9927b325766a]     session._wait_for_task(vmdk_copy_task)
[ 722.154521] env[67169]: ERROR nova.compute.manager [instance: ca7439e3-dbd5-4775-97e8-9927b325766a]   File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 722.154521] env[67169]: ERROR nova.compute.manager [instance: ca7439e3-dbd5-4775-97e8-9927b325766a]     return self.wait_for_task(task_ref)
[ 722.154521] env[67169]: ERROR nova.compute.manager [instance: ca7439e3-dbd5-4775-97e8-9927b325766a]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 722.154521] env[67169]: ERROR nova.compute.manager [instance: ca7439e3-dbd5-4775-97e8-9927b325766a]     return evt.wait()
[ 722.154521] env[67169]: ERROR nova.compute.manager [instance: ca7439e3-dbd5-4775-97e8-9927b325766a]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 722.154521] env[67169]: ERROR nova.compute.manager [instance: ca7439e3-dbd5-4775-97e8-9927b325766a]     result = hub.switch()
[ 722.154521] env[67169]: ERROR nova.compute.manager [instance: ca7439e3-dbd5-4775-97e8-9927b325766a]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 722.154521] env[67169]: ERROR nova.compute.manager [instance: ca7439e3-dbd5-4775-97e8-9927b325766a]     return self.greenlet.switch()
[ 722.154902] env[67169]: ERROR nova.compute.manager [instance: ca7439e3-dbd5-4775-97e8-9927b325766a]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 722.154902] env[67169]: ERROR nova.compute.manager [instance: ca7439e3-dbd5-4775-97e8-9927b325766a]     self.f(*self.args, **self.kw)
[ 722.154902] env[67169]: ERROR nova.compute.manager [instance: ca7439e3-dbd5-4775-97e8-9927b325766a]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 722.154902] env[67169]: ERROR nova.compute.manager [instance: ca7439e3-dbd5-4775-97e8-9927b325766a]     raise exceptions.translate_fault(task_info.error)
[ 722.154902] env[67169]: ERROR nova.compute.manager [instance: ca7439e3-dbd5-4775-97e8-9927b325766a] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 722.154902] env[67169]: ERROR nova.compute.manager [instance: ca7439e3-dbd5-4775-97e8-9927b325766a] Faults: ['InvalidArgument']
[ 722.154902] env[67169]: ERROR nova.compute.manager [instance: ca7439e3-dbd5-4775-97e8-9927b325766a]
[ 722.154902] env[67169]: INFO nova.compute.manager [None req-e29f758d-dc96-4350-90de-a6a527eb3483 tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] [instance: ca7439e3-dbd5-4775-97e8-9927b325766a] Terminating instance
[ 722.155962] env[67169]: DEBUG oslo_concurrency.lockutils [None req-f1a10853-5422-40c7-bd45-9fe494c76ecf tempest-FloatingIPsAssociationTestJSON-908037617 tempest-FloatingIPsAssociationTestJSON-908037617-project-member] Acquired lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 722.156184] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-f1a10853-5422-40c7-bd45-9fe494c76ecf tempest-FloatingIPsAssociationTestJSON-908037617 tempest-FloatingIPsAssociationTestJSON-908037617-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 722.156418] env[67169]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-fb64c3e8-f22d-46e2-b32c-147950b6f28e {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 722.159381] env[67169]: DEBUG nova.compute.manager [None req-e29f758d-dc96-4350-90de-a6a527eb3483 tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] [instance: ca7439e3-dbd5-4775-97e8-9927b325766a] Start destroying the instance on the hypervisor. {{(pid=67169) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}}
[ 722.159586] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-e29f758d-dc96-4350-90de-a6a527eb3483 tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] [instance: ca7439e3-dbd5-4775-97e8-9927b325766a] Destroying instance {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 722.160360] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ee45e6ec-6524-4c8e-ae9a-402203b2a8b1 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 722.167929] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-e29f758d-dc96-4350-90de-a6a527eb3483 tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] [instance: ca7439e3-dbd5-4775-97e8-9927b325766a] Unregistering the VM {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}}
[ 722.168966] env[67169]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-304b16bc-5ee7-4869-b5b0-3cb63227420c {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 722.170463] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-f1a10853-5422-40c7-bd45-9fe494c76ecf tempest-FloatingIPsAssociationTestJSON-908037617 tempest-FloatingIPsAssociationTestJSON-908037617-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 722.170638] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-f1a10853-5422-40c7-bd45-9fe494c76ecf tempest-FloatingIPsAssociationTestJSON-908037617 tempest-FloatingIPsAssociationTestJSON-908037617-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=67169) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}}
[ 722.171303] env[67169]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-bd4d6bca-232f-4184-be87-a6802405c382 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 722.176798] env[67169]: DEBUG oslo_vmware.api [None req-f1a10853-5422-40c7-bd45-9fe494c76ecf tempest-FloatingIPsAssociationTestJSON-908037617 tempest-FloatingIPsAssociationTestJSON-908037617-project-member] Waiting for the task: (returnval){
[ 722.176798] env[67169]:     value = "session[52518c09-e5f2-035d-4dc1-3e29ddf94015]52fdffba-f3c0-7abf-1be6-b0e1eae21d75"
[ 722.176798] env[67169]:     _type = "Task"
[ 722.176798] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 722.186755] env[67169]: DEBUG oslo_vmware.api [None req-f1a10853-5422-40c7-bd45-9fe494c76ecf tempest-FloatingIPsAssociationTestJSON-908037617 tempest-FloatingIPsAssociationTestJSON-908037617-project-member] Task: {'id': session[52518c09-e5f2-035d-4dc1-3e29ddf94015]52fdffba-f3c0-7abf-1be6-b0e1eae21d75, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 722.236850] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-e29f758d-dc96-4350-90de-a6a527eb3483 tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] [instance: ca7439e3-dbd5-4775-97e8-9927b325766a] Unregistered the VM {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}}
[ 722.238083] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-e29f758d-dc96-4350-90de-a6a527eb3483 tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] [instance: ca7439e3-dbd5-4775-97e8-9927b325766a] Deleting contents of the VM from datastore datastore2 {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}}
[ 722.238083] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-e29f758d-dc96-4350-90de-a6a527eb3483 tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] Deleting the datastore file [datastore2] ca7439e3-dbd5-4775-97e8-9927b325766a {{(pid=67169) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}}
[ 722.238083] env[67169]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-9bf69e2e-51eb-4e1e-bd08-0b88e3d0efd2 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 722.244170] env[67169]: DEBUG oslo_vmware.api [None req-e29f758d-dc96-4350-90de-a6a527eb3483 tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] Waiting for the task: (returnval){
[ 722.244170] env[67169]:     value = "task-2819101"
[ 722.244170] env[67169]:     _type = "Task"
[ 722.244170] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 722.251973] env[67169]: DEBUG oslo_vmware.api [None req-e29f758d-dc96-4350-90de-a6a527eb3483 tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] Task: {'id': task-2819101, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 722.686813] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-f1a10853-5422-40c7-bd45-9fe494c76ecf tempest-FloatingIPsAssociationTestJSON-908037617 tempest-FloatingIPsAssociationTestJSON-908037617-project-member] [instance: 68cc2368-ac6a-4003-9c19-5f2a4e9b0e03] Preparing fetch location {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}}
[ 722.688183] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-f1a10853-5422-40c7-bd45-9fe494c76ecf tempest-FloatingIPsAssociationTestJSON-908037617 tempest-FloatingIPsAssociationTestJSON-908037617-project-member] Creating directory with path [datastore2] vmware_temp/0079c678-12d5-4cee-9c44-fd98559443ad/285931c9-8b83-4997-8c4d-6a79005e36ba {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 722.688476] env[67169]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-016a84a4-32b4-4b67-b352-5f2ca5cf9746 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 722.715162] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-f1a10853-5422-40c7-bd45-9fe494c76ecf tempest-FloatingIPsAssociationTestJSON-908037617 tempest-FloatingIPsAssociationTestJSON-908037617-project-member] Created directory with path [datastore2] vmware_temp/0079c678-12d5-4cee-9c44-fd98559443ad/285931c9-8b83-4997-8c4d-6a79005e36ba {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 722.715396] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-f1a10853-5422-40c7-bd45-9fe494c76ecf tempest-FloatingIPsAssociationTestJSON-908037617 tempest-FloatingIPsAssociationTestJSON-908037617-project-member] [instance: 68cc2368-ac6a-4003-9c19-5f2a4e9b0e03] Fetch image to [datastore2] vmware_temp/0079c678-12d5-4cee-9c44-fd98559443ad/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}}
[ 722.715594] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-f1a10853-5422-40c7-bd45-9fe494c76ecf tempest-FloatingIPsAssociationTestJSON-908037617 tempest-FloatingIPsAssociationTestJSON-908037617-project-member] [instance: 68cc2368-ac6a-4003-9c19-5f2a4e9b0e03] Downloading image file data 285931c9-8b83-4997-8c4d-6a79005e36ba to [datastore2] vmware_temp/0079c678-12d5-4cee-9c44-fd98559443ad/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk on the data store datastore2 {{(pid=67169) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}}
[ 722.717306] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-00811771-76f6-4747-9e90-94cd5ee90373 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 722.724172] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8eb1ba62-c5d7-478a-82bc-85de3bae20e5 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 722.733425] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4ace0c90-26ce-44d1-9b7f-3bac7d123e7b {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 722.770679] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-75989926-4c32-43e8-962e-dd9ea70dd3c0 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 722.780811] env[67169]: DEBUG oslo_vmware.api [None req-e29f758d-dc96-4350-90de-a6a527eb3483 tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] Task: {'id': task-2819101, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.074653} completed successfully. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 722.780811] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-e29f758d-dc96-4350-90de-a6a527eb3483 tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] Deleted the datastore file {{(pid=67169) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}}
[ 722.780811] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-e29f758d-dc96-4350-90de-a6a527eb3483 tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] [instance: ca7439e3-dbd5-4775-97e8-9927b325766a] Deleted contents of the VM from datastore datastore2 {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}}
[ 722.780811] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-e29f758d-dc96-4350-90de-a6a527eb3483 tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] [instance: ca7439e3-dbd5-4775-97e8-9927b325766a] Instance destroyed {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 722.780811] env[67169]: INFO nova.compute.manager [None req-e29f758d-dc96-4350-90de-a6a527eb3483 tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] [instance: ca7439e3-dbd5-4775-97e8-9927b325766a] Took 0.62 seconds to destroy the instance on the hypervisor.
[ 722.782699] env[67169]: DEBUG nova.compute.claims [None req-e29f758d-dc96-4350-90de-a6a527eb3483 tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] [instance: ca7439e3-dbd5-4775-97e8-9927b325766a] Aborting claim: {{(pid=67169) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 722.782699] env[67169]: DEBUG oslo_concurrency.lockutils [None req-e29f758d-dc96-4350-90de-a6a527eb3483 tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 722.782817] env[67169]: DEBUG oslo_concurrency.lockutils [None req-e29f758d-dc96-4350-90de-a6a527eb3483 tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 722.786058] env[67169]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-e596af31-2ce2-4c74-9fcb-bc510f04f988 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 722.807255] env[67169]: DEBUG nova.virt.vmwareapi.images [None req-f1a10853-5422-40c7-bd45-9fe494c76ecf tempest-FloatingIPsAssociationTestJSON-908037617 tempest-FloatingIPsAssociationTestJSON-908037617-project-member] [instance: 68cc2368-ac6a-4003-9c19-5f2a4e9b0e03] Downloading image file data 285931c9-8b83-4997-8c4d-6a79005e36ba to the data store datastore2 {{(pid=67169) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}}
[ 722.865494] env[67169]: DEBUG oslo_vmware.rw_handles [None req-f1a10853-5422-40c7-bd45-9fe494c76ecf tempest-FloatingIPsAssociationTestJSON-908037617 tempest-FloatingIPsAssociationTestJSON-908037617-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/0079c678-12d5-4cee-9c44-fd98559443ad/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=67169) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}}
[ 722.929071] env[67169]: DEBUG oslo_vmware.rw_handles [None req-f1a10853-5422-40c7-bd45-9fe494c76ecf tempest-FloatingIPsAssociationTestJSON-908037617 tempest-FloatingIPsAssociationTestJSON-908037617-project-member] Completed reading data from the image iterator. {{(pid=67169) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}}
[ 722.929535] env[67169]: DEBUG oslo_vmware.rw_handles [None req-f1a10853-5422-40c7-bd45-9fe494c76ecf tempest-FloatingIPsAssociationTestJSON-908037617 tempest-FloatingIPsAssociationTestJSON-908037617-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/0079c678-12d5-4cee-9c44-fd98559443ad/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=67169) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}}
[ 723.257664] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0265e554-91f8-4f3e-888e-6bf4cc8c6b07 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 723.265784] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-db5bac73-a8c8-4c05-beea-a79e7bbd1eaf {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 723.296440] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9eae46a6-52e9-4930-a0cc-b0e73c3d1fc0 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 723.303773] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5e39b3c8-ea1c-44ef-aa26-86f8cbbc3e8b {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 723.317350] env[67169]: DEBUG nova.compute.provider_tree [None req-e29f758d-dc96-4350-90de-a6a527eb3483 tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] Inventory has not changed in ProviderTree for provider: 6570906a-ac37-4859-b1f2-4bbacc48d3f3 {{(pid=67169) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 723.327869] env[67169]: DEBUG nova.scheduler.client.report [None req-e29f758d-dc96-4350-90de-a6a527eb3483 tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] Inventory has not changed for provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 171, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67169) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 723.341528] env[67169]: DEBUG oslo_concurrency.lockutils [None req-e29f758d-dc96-4350-90de-a6a527eb3483 tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.559s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 723.342096] env[67169]: ERROR nova.compute.manager [None req-e29f758d-dc96-4350-90de-a6a527eb3483 tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] [instance: ca7439e3-dbd5-4775-97e8-9927b325766a] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 723.342096] env[67169]: Faults: ['InvalidArgument']
[ 723.342096] env[67169]: ERROR nova.compute.manager [instance: ca7439e3-dbd5-4775-97e8-9927b325766a] Traceback (most recent call last):
[ 723.342096] env[67169]: ERROR nova.compute.manager [instance: ca7439e3-dbd5-4775-97e8-9927b325766a]   File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 723.342096] env[67169]: ERROR nova.compute.manager [instance: ca7439e3-dbd5-4775-97e8-9927b325766a]     self.driver.spawn(context, instance, image_meta,
[ 723.342096] env[67169]: ERROR nova.compute.manager [instance: ca7439e3-dbd5-4775-97e8-9927b325766a]   File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 723.342096] env[67169]: ERROR nova.compute.manager [instance: ca7439e3-dbd5-4775-97e8-9927b325766a]     self._vmops.spawn(context, instance, image_meta, injected_files,
[ 723.342096] env[67169]: ERROR nova.compute.manager [instance: ca7439e3-dbd5-4775-97e8-9927b325766a]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 723.342096] env[67169]: ERROR nova.compute.manager [instance: ca7439e3-dbd5-4775-97e8-9927b325766a]     self._fetch_image_if_missing(context, vi)
[ 723.342096] env[67169]: ERROR nova.compute.manager [instance: ca7439e3-dbd5-4775-97e8-9927b325766a]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 723.342096] env[67169]: ERROR nova.compute.manager [instance: ca7439e3-dbd5-4775-97e8-9927b325766a]     image_cache(vi, tmp_image_ds_loc)
[ 723.342096] env[67169]: ERROR nova.compute.manager [instance: ca7439e3-dbd5-4775-97e8-9927b325766a]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 723.342475] env[67169]: ERROR nova.compute.manager [instance: ca7439e3-dbd5-4775-97e8-9927b325766a]     vm_util.copy_virtual_disk(
[ 723.342475] env[67169]: ERROR nova.compute.manager [instance: ca7439e3-dbd5-4775-97e8-9927b325766a]   File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 723.342475] env[67169]: ERROR nova.compute.manager [instance: ca7439e3-dbd5-4775-97e8-9927b325766a]     session._wait_for_task(vmdk_copy_task)
[ 723.342475] env[67169]: ERROR nova.compute.manager [instance: ca7439e3-dbd5-4775-97e8-9927b325766a]   File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 723.342475] env[67169]: ERROR nova.compute.manager [instance: ca7439e3-dbd5-4775-97e8-9927b325766a]     return self.wait_for_task(task_ref)
[ 723.342475] env[67169]: ERROR nova.compute.manager [instance: ca7439e3-dbd5-4775-97e8-9927b325766a]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 723.342475] env[67169]: ERROR nova.compute.manager [instance: ca7439e3-dbd5-4775-97e8-9927b325766a]     return evt.wait()
[ 723.342475]
env[67169]: ERROR nova.compute.manager [instance: ca7439e3-dbd5-4775-97e8-9927b325766a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 723.342475] env[67169]: ERROR nova.compute.manager [instance: ca7439e3-dbd5-4775-97e8-9927b325766a] result = hub.switch() [ 723.342475] env[67169]: ERROR nova.compute.manager [instance: ca7439e3-dbd5-4775-97e8-9927b325766a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 723.342475] env[67169]: ERROR nova.compute.manager [instance: ca7439e3-dbd5-4775-97e8-9927b325766a] return self.greenlet.switch() [ 723.342475] env[67169]: ERROR nova.compute.manager [instance: ca7439e3-dbd5-4775-97e8-9927b325766a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 723.342475] env[67169]: ERROR nova.compute.manager [instance: ca7439e3-dbd5-4775-97e8-9927b325766a] self.f(*self.args, **self.kw) [ 723.342795] env[67169]: ERROR nova.compute.manager [instance: ca7439e3-dbd5-4775-97e8-9927b325766a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 723.342795] env[67169]: ERROR nova.compute.manager [instance: ca7439e3-dbd5-4775-97e8-9927b325766a] raise exceptions.translate_fault(task_info.error) [ 723.342795] env[67169]: ERROR nova.compute.manager [instance: ca7439e3-dbd5-4775-97e8-9927b325766a] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 723.342795] env[67169]: ERROR nova.compute.manager [instance: ca7439e3-dbd5-4775-97e8-9927b325766a] Faults: ['InvalidArgument'] [ 723.342795] env[67169]: ERROR nova.compute.manager [instance: ca7439e3-dbd5-4775-97e8-9927b325766a] [ 723.342947] env[67169]: DEBUG nova.compute.utils [None req-e29f758d-dc96-4350-90de-a6a527eb3483 tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] [instance: 
ca7439e3-dbd5-4775-97e8-9927b325766a] VimFaultException {{(pid=67169) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 723.345472] env[67169]: DEBUG nova.compute.manager [None req-e29f758d-dc96-4350-90de-a6a527eb3483 tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] [instance: ca7439e3-dbd5-4775-97e8-9927b325766a] Build of instance ca7439e3-dbd5-4775-97e8-9927b325766a was re-scheduled: A specified parameter was not correct: fileType [ 723.345472] env[67169]: Faults: ['InvalidArgument'] {{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 723.345884] env[67169]: DEBUG nova.compute.manager [None req-e29f758d-dc96-4350-90de-a6a527eb3483 tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] [instance: ca7439e3-dbd5-4775-97e8-9927b325766a] Unplugging VIFs for instance {{(pid=67169) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 723.346074] env[67169]: DEBUG nova.compute.manager [None req-e29f758d-dc96-4350-90de-a6a527eb3483 tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67169) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 723.346248] env[67169]: DEBUG nova.compute.manager [None req-e29f758d-dc96-4350-90de-a6a527eb3483 tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] [instance: ca7439e3-dbd5-4775-97e8-9927b325766a] Deallocating network for instance {{(pid=67169) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 723.346445] env[67169]: DEBUG nova.network.neutron [None req-e29f758d-dc96-4350-90de-a6a527eb3483 tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] [instance: ca7439e3-dbd5-4775-97e8-9927b325766a] deallocate_for_instance() {{(pid=67169) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 723.823600] env[67169]: DEBUG nova.network.neutron [None req-e29f758d-dc96-4350-90de-a6a527eb3483 tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] [instance: ca7439e3-dbd5-4775-97e8-9927b325766a] Updating instance_info_cache with network_info: [] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 723.833507] env[67169]: INFO nova.compute.manager [None req-e29f758d-dc96-4350-90de-a6a527eb3483 tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] [instance: ca7439e3-dbd5-4775-97e8-9927b325766a] Took 0.49 seconds to deallocate network for instance. 
[ 723.944537] env[67169]: INFO nova.scheduler.client.report [None req-e29f758d-dc96-4350-90de-a6a527eb3483 tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] Deleted allocations for instance ca7439e3-dbd5-4775-97e8-9927b325766a [ 723.979237] env[67169]: DEBUG oslo_concurrency.lockutils [None req-e29f758d-dc96-4350-90de-a6a527eb3483 tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] Lock "ca7439e3-dbd5-4775-97e8-9927b325766a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 150.575s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 723.982527] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "ca7439e3-dbd5-4775-97e8-9927b325766a" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 138.294s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 723.982527] env[67169]: INFO nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: ca7439e3-dbd5-4775-97e8-9927b325766a] During sync_power_state the instance has a pending task (spawning). Skip. 
[ 723.982527] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "ca7439e3-dbd5-4775-97e8-9927b325766a" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 723.988079] env[67169]: DEBUG nova.compute.manager [None req-97f5b735-959c-46f5-9133-f471ad4178d8 tempest-MigrationsAdminTest-2069277652 tempest-MigrationsAdminTest-2069277652-project-member] [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] Starting instance... {{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 724.042432] env[67169]: DEBUG oslo_concurrency.lockutils [None req-97f5b735-959c-46f5-9133-f471ad4178d8 tempest-MigrationsAdminTest-2069277652 tempest-MigrationsAdminTest-2069277652-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 724.042699] env[67169]: DEBUG oslo_concurrency.lockutils [None req-97f5b735-959c-46f5-9133-f471ad4178d8 tempest-MigrationsAdminTest-2069277652 tempest-MigrationsAdminTest-2069277652-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 724.044185] env[67169]: INFO nova.compute.claims [None req-97f5b735-959c-46f5-9133-f471ad4178d8 tempest-MigrationsAdminTest-2069277652 tempest-MigrationsAdminTest-2069277652-project-member] [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 724.477129] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-b15331bc-16b2-447e-8865-2a24ca5311ec {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 724.485113] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e628a458-1562-436e-877f-69ca0f84c28a {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 724.518128] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a97b38e5-56af-4036-955d-603747d93ce3 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 724.525461] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-04f44b99-41c6-4016-927d-d7ea4400b446 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 724.538826] env[67169]: DEBUG nova.compute.provider_tree [None req-97f5b735-959c-46f5-9133-f471ad4178d8 tempest-MigrationsAdminTest-2069277652 tempest-MigrationsAdminTest-2069277652-project-member] Inventory has not changed in ProviderTree for provider: 6570906a-ac37-4859-b1f2-4bbacc48d3f3 {{(pid=67169) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 724.549049] env[67169]: DEBUG nova.scheduler.client.report [None req-97f5b735-959c-46f5-9133-f471ad4178d8 tempest-MigrationsAdminTest-2069277652 tempest-MigrationsAdminTest-2069277652-project-member] Inventory has not changed for provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 171, 'step_size': 1, 
'allocation_ratio': 1.0}} {{(pid=67169) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 724.564973] env[67169]: DEBUG oslo_concurrency.lockutils [None req-97f5b735-959c-46f5-9133-f471ad4178d8 tempest-MigrationsAdminTest-2069277652 tempest-MigrationsAdminTest-2069277652-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.522s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 724.565800] env[67169]: DEBUG nova.compute.manager [None req-97f5b735-959c-46f5-9133-f471ad4178d8 tempest-MigrationsAdminTest-2069277652 tempest-MigrationsAdminTest-2069277652-project-member] [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] Start building networks asynchronously for instance. {{(pid=67169) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 724.602108] env[67169]: DEBUG nova.compute.utils [None req-97f5b735-959c-46f5-9133-f471ad4178d8 tempest-MigrationsAdminTest-2069277652 tempest-MigrationsAdminTest-2069277652-project-member] Using /dev/sd instead of None {{(pid=67169) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 724.603337] env[67169]: DEBUG nova.compute.manager [None req-97f5b735-959c-46f5-9133-f471ad4178d8 tempest-MigrationsAdminTest-2069277652 tempest-MigrationsAdminTest-2069277652-project-member] [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] Allocating IP information in the background. 
{{(pid=67169) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 724.603506] env[67169]: DEBUG nova.network.neutron [None req-97f5b735-959c-46f5-9133-f471ad4178d8 tempest-MigrationsAdminTest-2069277652 tempest-MigrationsAdminTest-2069277652-project-member] [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] allocate_for_instance() {{(pid=67169) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 724.618959] env[67169]: DEBUG nova.compute.manager [None req-97f5b735-959c-46f5-9133-f471ad4178d8 tempest-MigrationsAdminTest-2069277652 tempest-MigrationsAdminTest-2069277652-project-member] [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] Start building block device mappings for instance. {{(pid=67169) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 724.700110] env[67169]: DEBUG nova.compute.manager [None req-97f5b735-959c-46f5-9133-f471ad4178d8 tempest-MigrationsAdminTest-2069277652 tempest-MigrationsAdminTest-2069277652-project-member] [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] Start spawning the instance on the hypervisor. 
{{(pid=67169) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 724.713376] env[67169]: DEBUG nova.policy [None req-97f5b735-959c-46f5-9133-f471ad4178d8 tempest-MigrationsAdminTest-2069277652 tempest-MigrationsAdminTest-2069277652-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '3d7eca63214542a9b150147973d73e83', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b1b72ff0e2f243318ffd986becce62fb', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67169) authorize /opt/stack/nova/nova/policy.py:203}} [ 724.725372] env[67169]: DEBUG nova.virt.hardware [None req-97f5b735-959c-46f5-9133-f471ad4178d8 tempest-MigrationsAdminTest-2069277652 tempest-MigrationsAdminTest-2069277652-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-03-06T19:55:10Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-03-06T19:54:55Z,direct_url=,disk_format='vmdk',id=285931c9-8b83-4997-8c4d-6a79005e36ba,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c31f6504bb73492890b262ff43fdf9bc',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-03-06T19:54:55Z,virtual_size=,visibility=), allow threads: False {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 724.726063] env[67169]: DEBUG nova.virt.hardware [None req-97f5b735-959c-46f5-9133-f471ad4178d8 
tempest-MigrationsAdminTest-2069277652 tempest-MigrationsAdminTest-2069277652-project-member] Flavor limits 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 724.726063] env[67169]: DEBUG nova.virt.hardware [None req-97f5b735-959c-46f5-9133-f471ad4178d8 tempest-MigrationsAdminTest-2069277652 tempest-MigrationsAdminTest-2069277652-project-member] Image limits 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 724.726063] env[67169]: DEBUG nova.virt.hardware [None req-97f5b735-959c-46f5-9133-f471ad4178d8 tempest-MigrationsAdminTest-2069277652 tempest-MigrationsAdminTest-2069277652-project-member] Flavor pref 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 724.726251] env[67169]: DEBUG nova.virt.hardware [None req-97f5b735-959c-46f5-9133-f471ad4178d8 tempest-MigrationsAdminTest-2069277652 tempest-MigrationsAdminTest-2069277652-project-member] Image pref 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 724.726251] env[67169]: DEBUG nova.virt.hardware [None req-97f5b735-959c-46f5-9133-f471ad4178d8 tempest-MigrationsAdminTest-2069277652 tempest-MigrationsAdminTest-2069277652-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 724.726448] env[67169]: DEBUG nova.virt.hardware [None req-97f5b735-959c-46f5-9133-f471ad4178d8 tempest-MigrationsAdminTest-2069277652 tempest-MigrationsAdminTest-2069277652-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 724.726608] env[67169]: DEBUG nova.virt.hardware [None req-97f5b735-959c-46f5-9133-f471ad4178d8 
tempest-MigrationsAdminTest-2069277652 tempest-MigrationsAdminTest-2069277652-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67169) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 724.726773] env[67169]: DEBUG nova.virt.hardware [None req-97f5b735-959c-46f5-9133-f471ad4178d8 tempest-MigrationsAdminTest-2069277652 tempest-MigrationsAdminTest-2069277652-project-member] Got 1 possible topologies {{(pid=67169) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 724.726937] env[67169]: DEBUG nova.virt.hardware [None req-97f5b735-959c-46f5-9133-f471ad4178d8 tempest-MigrationsAdminTest-2069277652 tempest-MigrationsAdminTest-2069277652-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 724.727125] env[67169]: DEBUG nova.virt.hardware [None req-97f5b735-959c-46f5-9133-f471ad4178d8 tempest-MigrationsAdminTest-2069277652 tempest-MigrationsAdminTest-2069277652-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 724.728024] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a7986d56-8342-417d-8368-a91997dfe3dc {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 724.736361] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e3aced9d-54e1-4963-a39e-de32f45bca7c {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 725.484606] env[67169]: DEBUG nova.network.neutron [None req-97f5b735-959c-46f5-9133-f471ad4178d8 tempest-MigrationsAdminTest-2069277652 tempest-MigrationsAdminTest-2069277652-project-member] [instance: 
7a42aeb9-0518-448d-a3a6-8e68d6497922] Successfully created port: 7a5b0499-aaee-4f8f-924f-0ef9893c2ff7 {{(pid=67169) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 726.499361] env[67169]: DEBUG oslo_concurrency.lockutils [None req-3f08fb17-986a-4906-8db3-9cba171f3daf tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] Acquiring lock "1f0f1960-0c77-4e72-86ee-807819e75d2a" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 726.499361] env[67169]: DEBUG oslo_concurrency.lockutils [None req-3f08fb17-986a-4906-8db3-9cba171f3daf tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] Lock "1f0f1960-0c77-4e72-86ee-807819e75d2a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 726.570925] env[67169]: DEBUG nova.network.neutron [None req-97f5b735-959c-46f5-9133-f471ad4178d8 tempest-MigrationsAdminTest-2069277652 tempest-MigrationsAdminTest-2069277652-project-member] [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] Successfully updated port: 7a5b0499-aaee-4f8f-924f-0ef9893c2ff7 {{(pid=67169) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 726.585312] env[67169]: DEBUG oslo_concurrency.lockutils [None req-97f5b735-959c-46f5-9133-f471ad4178d8 tempest-MigrationsAdminTest-2069277652 tempest-MigrationsAdminTest-2069277652-project-member] Acquiring lock "refresh_cache-7a42aeb9-0518-448d-a3a6-8e68d6497922" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 726.585635] env[67169]: DEBUG oslo_concurrency.lockutils [None req-97f5b735-959c-46f5-9133-f471ad4178d8 
tempest-MigrationsAdminTest-2069277652 tempest-MigrationsAdminTest-2069277652-project-member] Acquired lock "refresh_cache-7a42aeb9-0518-448d-a3a6-8e68d6497922" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 726.585794] env[67169]: DEBUG nova.network.neutron [None req-97f5b735-959c-46f5-9133-f471ad4178d8 tempest-MigrationsAdminTest-2069277652 tempest-MigrationsAdminTest-2069277652-project-member] [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] Building network info cache for instance {{(pid=67169) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 726.672596] env[67169]: DEBUG nova.network.neutron [None req-97f5b735-959c-46f5-9133-f471ad4178d8 tempest-MigrationsAdminTest-2069277652 tempest-MigrationsAdminTest-2069277652-project-member] [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] Instance cache missing network info. {{(pid=67169) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 726.887170] env[67169]: DEBUG nova.compute.manager [req-c3685e2d-08f9-4c06-a0ee-f28bc21ae395 req-c902f10f-f9dc-4694-87e9-0bd0819cc91b service nova] [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] Received event network-vif-plugged-7a5b0499-aaee-4f8f-924f-0ef9893c2ff7 {{(pid=67169) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 726.887390] env[67169]: DEBUG oslo_concurrency.lockutils [req-c3685e2d-08f9-4c06-a0ee-f28bc21ae395 req-c902f10f-f9dc-4694-87e9-0bd0819cc91b service nova] Acquiring lock "7a42aeb9-0518-448d-a3a6-8e68d6497922-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 726.887589] env[67169]: DEBUG oslo_concurrency.lockutils [req-c3685e2d-08f9-4c06-a0ee-f28bc21ae395 req-c902f10f-f9dc-4694-87e9-0bd0819cc91b service nova] Lock "7a42aeb9-0518-448d-a3a6-8e68d6497922-events" acquired by 
"nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 726.887755] env[67169]: DEBUG oslo_concurrency.lockutils [req-c3685e2d-08f9-4c06-a0ee-f28bc21ae395 req-c902f10f-f9dc-4694-87e9-0bd0819cc91b service nova] Lock "7a42aeb9-0518-448d-a3a6-8e68d6497922-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 726.888030] env[67169]: DEBUG nova.compute.manager [req-c3685e2d-08f9-4c06-a0ee-f28bc21ae395 req-c902f10f-f9dc-4694-87e9-0bd0819cc91b service nova] [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] No waiting events found dispatching network-vif-plugged-7a5b0499-aaee-4f8f-924f-0ef9893c2ff7 {{(pid=67169) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 726.888147] env[67169]: WARNING nova.compute.manager [req-c3685e2d-08f9-4c06-a0ee-f28bc21ae395 req-c902f10f-f9dc-4694-87e9-0bd0819cc91b service nova] [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] Received unexpected event network-vif-plugged-7a5b0499-aaee-4f8f-924f-0ef9893c2ff7 for instance with vm_state building and task_state spawning. 
[ 726.961067] env[67169]: DEBUG nova.network.neutron [None req-97f5b735-959c-46f5-9133-f471ad4178d8 tempest-MigrationsAdminTest-2069277652 tempest-MigrationsAdminTest-2069277652-project-member] [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] Updating instance_info_cache with network_info: [{"id": "7a5b0499-aaee-4f8f-924f-0ef9893c2ff7", "address": "fa:16:3e:ee:a5:d8", "network": {"id": "617508ba-3567-4508-96b5-a01447ece634", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.207", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "c31f6504bb73492890b262ff43fdf9bc", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "c9bc2632-36f9-4912-8782-8bbb789f909d", "external-id": "nsx-vlan-transportzone-897", "segmentation_id": 897, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap7a5b0499-aa", "ovs_interfaceid": "7a5b0499-aaee-4f8f-924f-0ef9893c2ff7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 726.976660] env[67169]: DEBUG oslo_concurrency.lockutils [None req-97f5b735-959c-46f5-9133-f471ad4178d8 tempest-MigrationsAdminTest-2069277652 tempest-MigrationsAdminTest-2069277652-project-member] Releasing lock "refresh_cache-7a42aeb9-0518-448d-a3a6-8e68d6497922" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 726.976956] env[67169]: DEBUG nova.compute.manager [None req-97f5b735-959c-46f5-9133-f471ad4178d8 
tempest-MigrationsAdminTest-2069277652 tempest-MigrationsAdminTest-2069277652-project-member] [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] Instance network_info: |[{"id": "7a5b0499-aaee-4f8f-924f-0ef9893c2ff7", "address": "fa:16:3e:ee:a5:d8", "network": {"id": "617508ba-3567-4508-96b5-a01447ece634", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.207", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "c31f6504bb73492890b262ff43fdf9bc", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "c9bc2632-36f9-4912-8782-8bbb789f909d", "external-id": "nsx-vlan-transportzone-897", "segmentation_id": 897, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap7a5b0499-aa", "ovs_interfaceid": "7a5b0499-aaee-4f8f-924f-0ef9893c2ff7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67169) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 726.977369] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-97f5b735-959c-46f5-9133-f471ad4178d8 tempest-MigrationsAdminTest-2069277652 tempest-MigrationsAdminTest-2069277652-project-member] [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:ee:a5:d8', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'c9bc2632-36f9-4912-8782-8bbb789f909d', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '7a5b0499-aaee-4f8f-924f-0ef9893c2ff7', 'vif_model': 'vmxnet3'}] {{(pid=67169) build_virtual_machine 
/opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 726.985066] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [None req-97f5b735-959c-46f5-9133-f471ad4178d8 tempest-MigrationsAdminTest-2069277652 tempest-MigrationsAdminTest-2069277652-project-member] Creating folder: Project (b1b72ff0e2f243318ffd986becce62fb). Parent ref: group-v566843. {{(pid=67169) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 726.985574] env[67169]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-4005a7ea-57c5-44bb-8bb0-f84bb6e96c0b {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 726.997324] env[67169]: INFO nova.virt.vmwareapi.vm_util [None req-97f5b735-959c-46f5-9133-f471ad4178d8 tempest-MigrationsAdminTest-2069277652 tempest-MigrationsAdminTest-2069277652-project-member] Created folder: Project (b1b72ff0e2f243318ffd986becce62fb) in parent group-v566843. [ 726.997524] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [None req-97f5b735-959c-46f5-9133-f471ad4178d8 tempest-MigrationsAdminTest-2069277652 tempest-MigrationsAdminTest-2069277652-project-member] Creating folder: Instances. Parent ref: group-v566884. {{(pid=67169) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 726.997739] env[67169]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-41c64026-c5e0-4f54-94ec-40e69a347542 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 727.006243] env[67169]: INFO nova.virt.vmwareapi.vm_util [None req-97f5b735-959c-46f5-9133-f471ad4178d8 tempest-MigrationsAdminTest-2069277652 tempest-MigrationsAdminTest-2069277652-project-member] Created folder: Instances in parent group-v566884. 
[ 727.006487] env[67169]: DEBUG oslo.service.loopingcall [None req-97f5b735-959c-46f5-9133-f471ad4178d8 tempest-MigrationsAdminTest-2069277652 tempest-MigrationsAdminTest-2069277652-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67169) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 727.006668] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] Creating VM on the ESX host {{(pid=67169) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 727.006864] env[67169]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-b7d29258-7efb-4180-ab33-7265324cc0f4 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 727.026259] env[67169]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 727.026259] env[67169]: value = "task-2819104" [ 727.026259] env[67169]: _type = "Task" [ 727.026259] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 727.033851] env[67169]: DEBUG oslo_vmware.api [-] Task: {'id': task-2819104, 'name': CreateVM_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 727.536484] env[67169]: DEBUG oslo_vmware.api [-] Task: {'id': task-2819104, 'name': CreateVM_Task, 'duration_secs': 0.279709} completed successfully. 
{{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 727.536839] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] Created VM on the ESX host {{(pid=67169) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 727.537330] env[67169]: DEBUG oslo_concurrency.lockutils [None req-97f5b735-959c-46f5-9133-f471ad4178d8 tempest-MigrationsAdminTest-2069277652 tempest-MigrationsAdminTest-2069277652-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 727.537493] env[67169]: DEBUG oslo_concurrency.lockutils [None req-97f5b735-959c-46f5-9133-f471ad4178d8 tempest-MigrationsAdminTest-2069277652 tempest-MigrationsAdminTest-2069277652-project-member] Acquired lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 727.537962] env[67169]: DEBUG oslo_concurrency.lockutils [None req-97f5b735-959c-46f5-9133-f471ad4178d8 tempest-MigrationsAdminTest-2069277652 tempest-MigrationsAdminTest-2069277652-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 727.538352] env[67169]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-4410d29d-3de9-42cf-a947-357aad0a450b {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 727.542766] env[67169]: DEBUG oslo_vmware.api [None req-97f5b735-959c-46f5-9133-f471ad4178d8 tempest-MigrationsAdminTest-2069277652 tempest-MigrationsAdminTest-2069277652-project-member] Waiting for 
the task: (returnval){ [ 727.542766] env[67169]: value = "session[52518c09-e5f2-035d-4dc1-3e29ddf94015]525fe9be-16e0-c0d4-8667-28ea43b93a35" [ 727.542766] env[67169]: _type = "Task" [ 727.542766] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 727.552929] env[67169]: DEBUG oslo_vmware.api [None req-97f5b735-959c-46f5-9133-f471ad4178d8 tempest-MigrationsAdminTest-2069277652 tempest-MigrationsAdminTest-2069277652-project-member] Task: {'id': session[52518c09-e5f2-035d-4dc1-3e29ddf94015]525fe9be-16e0-c0d4-8667-28ea43b93a35, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 728.056756] env[67169]: DEBUG oslo_concurrency.lockutils [None req-97f5b735-959c-46f5-9133-f471ad4178d8 tempest-MigrationsAdminTest-2069277652 tempest-MigrationsAdminTest-2069277652-project-member] Releasing lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 728.058121] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-97f5b735-959c-46f5-9133-f471ad4178d8 tempest-MigrationsAdminTest-2069277652 tempest-MigrationsAdminTest-2069277652-project-member] [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] Processing image 285931c9-8b83-4997-8c4d-6a79005e36ba {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 728.058121] env[67169]: DEBUG oslo_concurrency.lockutils [None req-97f5b735-959c-46f5-9133-f471ad4178d8 tempest-MigrationsAdminTest-2069277652 tempest-MigrationsAdminTest-2069277652-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 728.950844] 
env[67169]: DEBUG nova.compute.manager [req-c6b42fce-9dc6-4e3d-a1ab-e214f003e503 req-4e32e0ab-4b76-47d8-a117-782bc3088782 service nova] [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] Received event network-changed-7a5b0499-aaee-4f8f-924f-0ef9893c2ff7 {{(pid=67169) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 728.951121] env[67169]: DEBUG nova.compute.manager [req-c6b42fce-9dc6-4e3d-a1ab-e214f003e503 req-4e32e0ab-4b76-47d8-a117-782bc3088782 service nova] [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] Refreshing instance network info cache due to event network-changed-7a5b0499-aaee-4f8f-924f-0ef9893c2ff7. {{(pid=67169) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 728.951346] env[67169]: DEBUG oslo_concurrency.lockutils [req-c6b42fce-9dc6-4e3d-a1ab-e214f003e503 req-4e32e0ab-4b76-47d8-a117-782bc3088782 service nova] Acquiring lock "refresh_cache-7a42aeb9-0518-448d-a3a6-8e68d6497922" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 728.951498] env[67169]: DEBUG oslo_concurrency.lockutils [req-c6b42fce-9dc6-4e3d-a1ab-e214f003e503 req-4e32e0ab-4b76-47d8-a117-782bc3088782 service nova] Acquired lock "refresh_cache-7a42aeb9-0518-448d-a3a6-8e68d6497922" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 728.951660] env[67169]: DEBUG nova.network.neutron [req-c6b42fce-9dc6-4e3d-a1ab-e214f003e503 req-4e32e0ab-4b76-47d8-a117-782bc3088782 service nova] [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] Refreshing network info cache for port 7a5b0499-aaee-4f8f-924f-0ef9893c2ff7 {{(pid=67169) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 729.383053] env[67169]: DEBUG nova.network.neutron [req-c6b42fce-9dc6-4e3d-a1ab-e214f003e503 req-4e32e0ab-4b76-47d8-a117-782bc3088782 service nova] [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] Updated VIF entry in instance network info cache for 
port 7a5b0499-aaee-4f8f-924f-0ef9893c2ff7. {{(pid=67169) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 729.383420] env[67169]: DEBUG nova.network.neutron [req-c6b42fce-9dc6-4e3d-a1ab-e214f003e503 req-4e32e0ab-4b76-47d8-a117-782bc3088782 service nova] [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] Updating instance_info_cache with network_info: [{"id": "7a5b0499-aaee-4f8f-924f-0ef9893c2ff7", "address": "fa:16:3e:ee:a5:d8", "network": {"id": "617508ba-3567-4508-96b5-a01447ece634", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.207", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "c31f6504bb73492890b262ff43fdf9bc", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "c9bc2632-36f9-4912-8782-8bbb789f909d", "external-id": "nsx-vlan-transportzone-897", "segmentation_id": 897, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap7a5b0499-aa", "ovs_interfaceid": "7a5b0499-aaee-4f8f-924f-0ef9893c2ff7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 729.392860] env[67169]: DEBUG oslo_concurrency.lockutils [req-c6b42fce-9dc6-4e3d-a1ab-e214f003e503 req-4e32e0ab-4b76-47d8-a117-782bc3088782 service nova] Releasing lock "refresh_cache-7a42aeb9-0518-448d-a3a6-8e68d6497922" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 762.659065] env[67169]: DEBUG 
oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 764.654133] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 764.658764] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 764.659420] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Starting heal instance info cache {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 764.659420] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Rebuilding the list of instances to heal {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 764.679757] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 68cc2368-ac6a-4003-9c19-5f2a4e9b0e03] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 764.679909] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: f5558a78-c91f-4c36-bb22-f94b1bd8cdbc] Skipping network cache update for instance because it is Building. 
{{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 764.680087] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 11e90c91-26ca-4397-81a4-975a1d714d19] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 764.680266] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 36781827-5846-49a4-8913-d98676af0b74] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 764.680404] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 835bf8da-8d8f-4dfd-b0a9-fab02796f39e] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 764.680548] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 85978a3b-052a-4a05-84e6-75c723d49bd8] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 764.680674] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: e2e52693-153a-43dd-b786-dd0758caabe2] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 764.680797] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4] Skipping network cache update for instance because it is Building. 
{{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 764.680916] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 28552f70-695d-40cc-8dfa-bf40d6113220] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 764.681044] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 764.681335] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Didn't find any instances for network info cache update. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 764.681610] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 764.681852] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 764.692640] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 764.692855] env[67169]: DEBUG oslo_concurrency.lockutils [None 
req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 764.693035] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 764.693185] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67169) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 764.695369] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0969968f-1db8-4872-8888-a15fcc40db18 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 764.705384] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fcf4bf70-acdb-4b04-a1e3-a325132efd30 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 764.719281] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9a2d47ee-4878-4e17-86ea-98ae192a38ab {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 764.725653] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d3665105-008c-42a6-891a-f662509efc1d {{(pid=67169) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 764.754655] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181028MB free_disk=171GB free_vcpus=48 pci_devices=None {{(pid=67169) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 764.754834] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 764.755049] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 764.832387] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 68cc2368-ac6a-4003-9c19-5f2a4e9b0e03 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 764.832620] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance f5558a78-c91f-4c36-bb22-f94b1bd8cdbc actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 764.832761] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 11e90c91-26ca-4397-81a4-975a1d714d19 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 764.832887] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 36781827-5846-49a4-8913-d98676af0b74 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 764.833016] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 835bf8da-8d8f-4dfd-b0a9-fab02796f39e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 764.833141] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 85978a3b-052a-4a05-84e6-75c723d49bd8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 764.833259] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance e2e52693-153a-43dd-b786-dd0758caabe2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 764.833375] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 764.833489] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 28552f70-695d-40cc-8dfa-bf40d6113220 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 764.833601] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 7a42aeb9-0518-448d-a3a6-8e68d6497922 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 764.844545] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 43b73a7c-eda8-4239-885f-d4fb8fa6f28a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 764.855759] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 3ac7cc70-7667-43b0-a3b8-0c791ef7ccd2 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 764.866308] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 485cf92f-cf20-4f94-8a18-1a82501a829f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 764.876408] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance a6427d1b-e915-4e3a-a4dd-6758fde2bc56 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 764.885941] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance bee054d2-ac3c-47cc-a946-90bebf23f925 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 764.895662] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance e7195659-b834-49e4-a9bd-6b2b7c7d4a20 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 764.907177] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance f001b21d-59fb-4a9e-9c28-7c15892facfa has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 764.917772] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance a7165020-4b1d-44e5-83d9-53eafbef74e7 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 764.928370] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 45ce5bfe-4a85-4c26-914e-b85478fc45a4 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 764.938165] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 529ac98d-1e5c-4bcd-bb3d-7a7158e952cb has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 764.948342] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 774358a1-c887-497e-b2d8-59a7c10e2329 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 764.959246] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance a7434fa2-563e-4f77-ba3e-40ad6bab0de3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 764.971463] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance fd4c9c56-1608-4390-8b41-736b3aa590ed has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 764.981931] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 764.992096] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 6b21ae30-8734-4c38-a8ae-c3fe03b6c36a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 765.001710] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance a578f813-807b-46bc-987f-5c9e9368c04b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 765.010694] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance f384f4a3-d34d-4d45-b063-79b25ea3c66c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 765.020183] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 0aa23be1-af16-4c0b-bfd0-4db5e927cfc4 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 765.029850] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance a58a8711-b060-4036-ac43-897017f68d21 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 765.040380] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 79d35dc5-e515-4f6f-9160-534d84f534bd has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 765.050117] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance db96dc14-cb08-4302-8aa3-81cf0c47fc73 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 765.059605] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 04d7477c-c3ff-42c7-9107-f54327c2f4b2 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 765.069571] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 1fe4f2aa-0784-4356-aa4c-593666f22971 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 765.079074] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 47ffcce9-3afc-41be-b38e-dacfeb535a2c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 765.088452] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 1f0f1960-0c77-4e72-86ee-807819e75d2a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 765.088686] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67169) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 765.088831] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67169) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 765.477782] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5ce0f570-9eb2-4884-b93a-30b34fb09c1d {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 765.485686] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-22eda58e-b0c5-467c-b015-f63570585703 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 765.515333] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1334f1c6-7e40-442f-96ef-b0e859edfa22 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 765.522308] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cf897294-636f-44af-be4c-1aff2a50e333 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 765.535742] env[67169]: DEBUG nova.compute.provider_tree [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Inventory has not 
changed in ProviderTree for provider: 6570906a-ac37-4859-b1f2-4bbacc48d3f3 {{(pid=67169) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 765.544735] env[67169]: DEBUG nova.scheduler.client.report [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Inventory has not changed for provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 171, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67169) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 765.560902] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67169) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 765.562027] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.806s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 766.538629] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 766.538906] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67169) run_periodic_tasks 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 766.539018] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=67169) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 766.659817] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 767.653803] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 767.675360] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 770.303093] env[67169]: DEBUG oslo_concurrency.lockutils [None req-288abad3-213e-4a91-a6e3-09f27715b7a1 tempest-FloatingIPsAssociationTestJSON-908037617 tempest-FloatingIPsAssociationTestJSON-908037617-project-member] Acquiring lock "68cc2368-ac6a-4003-9c19-5f2a4e9b0e03" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 771.790031] env[67169]: WARNING oslo_vmware.rw_handles [None req-f1a10853-5422-40c7-bd45-9fe494c76ecf tempest-FloatingIPsAssociationTestJSON-908037617 tempest-FloatingIPsAssociationTestJSON-908037617-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote 
end closed connection without response [ 771.790031] env[67169]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 771.790031] env[67169]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 771.790031] env[67169]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 771.790031] env[67169]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 771.790031] env[67169]: ERROR oslo_vmware.rw_handles response.begin() [ 771.790031] env[67169]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 771.790031] env[67169]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 771.790031] env[67169]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 771.790031] env[67169]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 771.790031] env[67169]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 771.790031] env[67169]: ERROR oslo_vmware.rw_handles [ 771.790693] env[67169]: DEBUG nova.virt.vmwareapi.images [None req-f1a10853-5422-40c7-bd45-9fe494c76ecf tempest-FloatingIPsAssociationTestJSON-908037617 tempest-FloatingIPsAssociationTestJSON-908037617-project-member] [instance: 68cc2368-ac6a-4003-9c19-5f2a4e9b0e03] Downloaded image file data 285931c9-8b83-4997-8c4d-6a79005e36ba to vmware_temp/0079c678-12d5-4cee-9c44-fd98559443ad/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk on the data store datastore2 {{(pid=67169) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 771.791992] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-f1a10853-5422-40c7-bd45-9fe494c76ecf tempest-FloatingIPsAssociationTestJSON-908037617 tempest-FloatingIPsAssociationTestJSON-908037617-project-member] [instance: 
68cc2368-ac6a-4003-9c19-5f2a4e9b0e03] Caching image {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 771.792262] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [None req-f1a10853-5422-40c7-bd45-9fe494c76ecf tempest-FloatingIPsAssociationTestJSON-908037617 tempest-FloatingIPsAssociationTestJSON-908037617-project-member] Copying Virtual Disk [datastore2] vmware_temp/0079c678-12d5-4cee-9c44-fd98559443ad/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk to [datastore2] vmware_temp/0079c678-12d5-4cee-9c44-fd98559443ad/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk {{(pid=67169) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 771.792542] env[67169]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-764ac5bd-5cf1-4e32-99c5-60a0714b3476 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 771.801073] env[67169]: DEBUG oslo_vmware.api [None req-f1a10853-5422-40c7-bd45-9fe494c76ecf tempest-FloatingIPsAssociationTestJSON-908037617 tempest-FloatingIPsAssociationTestJSON-908037617-project-member] Waiting for the task: (returnval){ [ 771.801073] env[67169]: value = "task-2819105" [ 771.801073] env[67169]: _type = "Task" [ 771.801073] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 771.810549] env[67169]: DEBUG oslo_vmware.api [None req-f1a10853-5422-40c7-bd45-9fe494c76ecf tempest-FloatingIPsAssociationTestJSON-908037617 tempest-FloatingIPsAssociationTestJSON-908037617-project-member] Task: {'id': task-2819105, 'name': CopyVirtualDisk_Task} progress is 0%. 
{{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 772.311749] env[67169]: DEBUG oslo_vmware.exceptions [None req-f1a10853-5422-40c7-bd45-9fe494c76ecf tempest-FloatingIPsAssociationTestJSON-908037617 tempest-FloatingIPsAssociationTestJSON-908037617-project-member] Fault InvalidArgument not matched. {{(pid=67169) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 772.312043] env[67169]: DEBUG oslo_concurrency.lockutils [None req-f1a10853-5422-40c7-bd45-9fe494c76ecf tempest-FloatingIPsAssociationTestJSON-908037617 tempest-FloatingIPsAssociationTestJSON-908037617-project-member] Releasing lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 772.312594] env[67169]: ERROR nova.compute.manager [None req-f1a10853-5422-40c7-bd45-9fe494c76ecf tempest-FloatingIPsAssociationTestJSON-908037617 tempest-FloatingIPsAssociationTestJSON-908037617-project-member] [instance: 68cc2368-ac6a-4003-9c19-5f2a4e9b0e03] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 772.312594] env[67169]: Faults: ['InvalidArgument'] [ 772.312594] env[67169]: ERROR nova.compute.manager [instance: 68cc2368-ac6a-4003-9c19-5f2a4e9b0e03] Traceback (most recent call last): [ 772.312594] env[67169]: ERROR nova.compute.manager [instance: 68cc2368-ac6a-4003-9c19-5f2a4e9b0e03] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 772.312594] env[67169]: ERROR nova.compute.manager [instance: 68cc2368-ac6a-4003-9c19-5f2a4e9b0e03] yield resources [ 772.312594] env[67169]: ERROR nova.compute.manager [instance: 68cc2368-ac6a-4003-9c19-5f2a4e9b0e03] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 772.312594] 
env[67169]: ERROR nova.compute.manager [instance: 68cc2368-ac6a-4003-9c19-5f2a4e9b0e03] self.driver.spawn(context, instance, image_meta, [ 772.312594] env[67169]: ERROR nova.compute.manager [instance: 68cc2368-ac6a-4003-9c19-5f2a4e9b0e03] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 772.312594] env[67169]: ERROR nova.compute.manager [instance: 68cc2368-ac6a-4003-9c19-5f2a4e9b0e03] self._vmops.spawn(context, instance, image_meta, injected_files, [ 772.312594] env[67169]: ERROR nova.compute.manager [instance: 68cc2368-ac6a-4003-9c19-5f2a4e9b0e03] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 772.312594] env[67169]: ERROR nova.compute.manager [instance: 68cc2368-ac6a-4003-9c19-5f2a4e9b0e03] self._fetch_image_if_missing(context, vi) [ 772.312594] env[67169]: ERROR nova.compute.manager [instance: 68cc2368-ac6a-4003-9c19-5f2a4e9b0e03] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 772.312918] env[67169]: ERROR nova.compute.manager [instance: 68cc2368-ac6a-4003-9c19-5f2a4e9b0e03] image_cache(vi, tmp_image_ds_loc) [ 772.312918] env[67169]: ERROR nova.compute.manager [instance: 68cc2368-ac6a-4003-9c19-5f2a4e9b0e03] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 772.312918] env[67169]: ERROR nova.compute.manager [instance: 68cc2368-ac6a-4003-9c19-5f2a4e9b0e03] vm_util.copy_virtual_disk( [ 772.312918] env[67169]: ERROR nova.compute.manager [instance: 68cc2368-ac6a-4003-9c19-5f2a4e9b0e03] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 772.312918] env[67169]: ERROR nova.compute.manager [instance: 68cc2368-ac6a-4003-9c19-5f2a4e9b0e03] session._wait_for_task(vmdk_copy_task) [ 772.312918] env[67169]: ERROR nova.compute.manager [instance: 68cc2368-ac6a-4003-9c19-5f2a4e9b0e03] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 772.312918] env[67169]: ERROR 
nova.compute.manager [instance: 68cc2368-ac6a-4003-9c19-5f2a4e9b0e03] return self.wait_for_task(task_ref) [ 772.312918] env[67169]: ERROR nova.compute.manager [instance: 68cc2368-ac6a-4003-9c19-5f2a4e9b0e03] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 772.312918] env[67169]: ERROR nova.compute.manager [instance: 68cc2368-ac6a-4003-9c19-5f2a4e9b0e03] return evt.wait() [ 772.312918] env[67169]: ERROR nova.compute.manager [instance: 68cc2368-ac6a-4003-9c19-5f2a4e9b0e03] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 772.312918] env[67169]: ERROR nova.compute.manager [instance: 68cc2368-ac6a-4003-9c19-5f2a4e9b0e03] result = hub.switch() [ 772.312918] env[67169]: ERROR nova.compute.manager [instance: 68cc2368-ac6a-4003-9c19-5f2a4e9b0e03] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 772.312918] env[67169]: ERROR nova.compute.manager [instance: 68cc2368-ac6a-4003-9c19-5f2a4e9b0e03] return self.greenlet.switch() [ 772.313445] env[67169]: ERROR nova.compute.manager [instance: 68cc2368-ac6a-4003-9c19-5f2a4e9b0e03] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 772.313445] env[67169]: ERROR nova.compute.manager [instance: 68cc2368-ac6a-4003-9c19-5f2a4e9b0e03] self.f(*self.args, **self.kw) [ 772.313445] env[67169]: ERROR nova.compute.manager [instance: 68cc2368-ac6a-4003-9c19-5f2a4e9b0e03] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 772.313445] env[67169]: ERROR nova.compute.manager [instance: 68cc2368-ac6a-4003-9c19-5f2a4e9b0e03] raise exceptions.translate_fault(task_info.error) [ 772.313445] env[67169]: ERROR nova.compute.manager [instance: 68cc2368-ac6a-4003-9c19-5f2a4e9b0e03] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 772.313445] 
env[67169]: ERROR nova.compute.manager [instance: 68cc2368-ac6a-4003-9c19-5f2a4e9b0e03] Faults: ['InvalidArgument'] [ 772.313445] env[67169]: ERROR nova.compute.manager [instance: 68cc2368-ac6a-4003-9c19-5f2a4e9b0e03] [ 772.313445] env[67169]: INFO nova.compute.manager [None req-f1a10853-5422-40c7-bd45-9fe494c76ecf tempest-FloatingIPsAssociationTestJSON-908037617 tempest-FloatingIPsAssociationTestJSON-908037617-project-member] [instance: 68cc2368-ac6a-4003-9c19-5f2a4e9b0e03] Terminating instance [ 772.314417] env[67169]: DEBUG oslo_concurrency.lockutils [None req-9f30fd4a-9921-457d-a045-00b3f5f0612e tempest-ImagesNegativeTestJSON-7099264 tempest-ImagesNegativeTestJSON-7099264-project-member] Acquired lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 772.314622] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-9f30fd4a-9921-457d-a045-00b3f5f0612e tempest-ImagesNegativeTestJSON-7099264 tempest-ImagesNegativeTestJSON-7099264-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 772.315234] env[67169]: DEBUG nova.compute.manager [None req-f1a10853-5422-40c7-bd45-9fe494c76ecf tempest-FloatingIPsAssociationTestJSON-908037617 tempest-FloatingIPsAssociationTestJSON-908037617-project-member] [instance: 68cc2368-ac6a-4003-9c19-5f2a4e9b0e03] Start destroying the instance on the hypervisor. 
{{(pid=67169) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 772.315421] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-f1a10853-5422-40c7-bd45-9fe494c76ecf tempest-FloatingIPsAssociationTestJSON-908037617 tempest-FloatingIPsAssociationTestJSON-908037617-project-member] [instance: 68cc2368-ac6a-4003-9c19-5f2a4e9b0e03] Destroying instance {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 772.315657] env[67169]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-53c70b6d-15b0-4b8e-8338-ccb7c6d30025 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 772.318128] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9dfcbee8-7a64-49a1-8b78-ec2cc7232231 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 772.324664] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-f1a10853-5422-40c7-bd45-9fe494c76ecf tempest-FloatingIPsAssociationTestJSON-908037617 tempest-FloatingIPsAssociationTestJSON-908037617-project-member] [instance: 68cc2368-ac6a-4003-9c19-5f2a4e9b0e03] Unregistering the VM {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 772.325019] env[67169]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-d3387b49-1109-4504-b9d6-17a7104ed0de {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 772.327454] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-9f30fd4a-9921-457d-a045-00b3f5f0612e tempest-ImagesNegativeTestJSON-7099264 tempest-ImagesNegativeTestJSON-7099264-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 772.327627] env[67169]: DEBUG 
nova.virt.vmwareapi.vmops [None req-9f30fd4a-9921-457d-a045-00b3f5f0612e tempest-ImagesNegativeTestJSON-7099264 tempest-ImagesNegativeTestJSON-7099264-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=67169) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 772.328608] env[67169]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-7d5c7bc0-49d7-402c-b39e-d97f041d0c7a {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 772.334091] env[67169]: DEBUG oslo_vmware.api [None req-9f30fd4a-9921-457d-a045-00b3f5f0612e tempest-ImagesNegativeTestJSON-7099264 tempest-ImagesNegativeTestJSON-7099264-project-member] Waiting for the task: (returnval){ [ 772.334091] env[67169]: value = "session[52518c09-e5f2-035d-4dc1-3e29ddf94015]52106d5d-f6b5-2473-6e8d-7dd0b7294e44" [ 772.334091] env[67169]: _type = "Task" [ 772.334091] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 772.340833] env[67169]: DEBUG oslo_vmware.api [None req-9f30fd4a-9921-457d-a045-00b3f5f0612e tempest-ImagesNegativeTestJSON-7099264 tempest-ImagesNegativeTestJSON-7099264-project-member] Task: {'id': session[52518c09-e5f2-035d-4dc1-3e29ddf94015]52106d5d-f6b5-2473-6e8d-7dd0b7294e44, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 772.395021] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-f1a10853-5422-40c7-bd45-9fe494c76ecf tempest-FloatingIPsAssociationTestJSON-908037617 tempest-FloatingIPsAssociationTestJSON-908037617-project-member] [instance: 68cc2368-ac6a-4003-9c19-5f2a4e9b0e03] Unregistered the VM {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 772.395021] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-f1a10853-5422-40c7-bd45-9fe494c76ecf tempest-FloatingIPsAssociationTestJSON-908037617 tempest-FloatingIPsAssociationTestJSON-908037617-project-member] [instance: 68cc2368-ac6a-4003-9c19-5f2a4e9b0e03] Deleting contents of the VM from datastore datastore2 {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 772.395021] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-f1a10853-5422-40c7-bd45-9fe494c76ecf tempest-FloatingIPsAssociationTestJSON-908037617 tempest-FloatingIPsAssociationTestJSON-908037617-project-member] Deleting the datastore file [datastore2] 68cc2368-ac6a-4003-9c19-5f2a4e9b0e03 {{(pid=67169) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 772.395021] env[67169]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-06afcaa7-6613-49fd-bab6-c3075e94d8b1 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 772.401836] env[67169]: DEBUG oslo_vmware.api [None req-f1a10853-5422-40c7-bd45-9fe494c76ecf tempest-FloatingIPsAssociationTestJSON-908037617 tempest-FloatingIPsAssociationTestJSON-908037617-project-member] Waiting for the task: (returnval){ [ 772.401836] env[67169]: value = "task-2819107" [ 772.401836] env[67169]: _type = "Task" [ 772.401836] env[67169]: } to complete. 
{{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 772.409743] env[67169]: DEBUG oslo_vmware.api [None req-f1a10853-5422-40c7-bd45-9fe494c76ecf tempest-FloatingIPsAssociationTestJSON-908037617 tempest-FloatingIPsAssociationTestJSON-908037617-project-member] Task: {'id': task-2819107, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 772.845166] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-9f30fd4a-9921-457d-a045-00b3f5f0612e tempest-ImagesNegativeTestJSON-7099264 tempest-ImagesNegativeTestJSON-7099264-project-member] [instance: f5558a78-c91f-4c36-bb22-f94b1bd8cdbc] Preparing fetch location {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 772.845470] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-9f30fd4a-9921-457d-a045-00b3f5f0612e tempest-ImagesNegativeTestJSON-7099264 tempest-ImagesNegativeTestJSON-7099264-project-member] Creating directory with path [datastore2] vmware_temp/261bfea3-4c62-4f23-90f2-88a59351dfa6/285931c9-8b83-4997-8c4d-6a79005e36ba {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 772.845661] env[67169]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-a1a95124-6528-4ccb-b280-ed0c156e5726 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 772.857493] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-9f30fd4a-9921-457d-a045-00b3f5f0612e tempest-ImagesNegativeTestJSON-7099264 tempest-ImagesNegativeTestJSON-7099264-project-member] Created directory with path [datastore2] vmware_temp/261bfea3-4c62-4f23-90f2-88a59351dfa6/285931c9-8b83-4997-8c4d-6a79005e36ba {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 772.857493] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None 
req-9f30fd4a-9921-457d-a045-00b3f5f0612e tempest-ImagesNegativeTestJSON-7099264 tempest-ImagesNegativeTestJSON-7099264-project-member] [instance: f5558a78-c91f-4c36-bb22-f94b1bd8cdbc] Fetch image to [datastore2] vmware_temp/261bfea3-4c62-4f23-90f2-88a59351dfa6/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 772.857493] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-9f30fd4a-9921-457d-a045-00b3f5f0612e tempest-ImagesNegativeTestJSON-7099264 tempest-ImagesNegativeTestJSON-7099264-project-member] [instance: f5558a78-c91f-4c36-bb22-f94b1bd8cdbc] Downloading image file data 285931c9-8b83-4997-8c4d-6a79005e36ba to [datastore2] vmware_temp/261bfea3-4c62-4f23-90f2-88a59351dfa6/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk on the data store datastore2 {{(pid=67169) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 772.857767] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3dce2708-71a8-4a7f-8486-69d3c9b941c0 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 772.864593] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a0c4eaad-a37f-4a9c-9b8e-16e8ee2f1335 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 772.873909] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-34494fe4-7903-4ef3-86fe-07f62c607d66 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 772.908843] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-744aec9a-3958-4ff0-8f7e-609c241de375 {{(pid=67169) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 772.918487] env[67169]: DEBUG oslo_vmware.api [None req-f1a10853-5422-40c7-bd45-9fe494c76ecf tempest-FloatingIPsAssociationTestJSON-908037617 tempest-FloatingIPsAssociationTestJSON-908037617-project-member] Task: {'id': task-2819107, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.081071} completed successfully. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 772.919850] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-f1a10853-5422-40c7-bd45-9fe494c76ecf tempest-FloatingIPsAssociationTestJSON-908037617 tempest-FloatingIPsAssociationTestJSON-908037617-project-member] Deleted the datastore file {{(pid=67169) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 772.920047] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-f1a10853-5422-40c7-bd45-9fe494c76ecf tempest-FloatingIPsAssociationTestJSON-908037617 tempest-FloatingIPsAssociationTestJSON-908037617-project-member] [instance: 68cc2368-ac6a-4003-9c19-5f2a4e9b0e03] Deleted contents of the VM from datastore datastore2 {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 772.920223] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-f1a10853-5422-40c7-bd45-9fe494c76ecf tempest-FloatingIPsAssociationTestJSON-908037617 tempest-FloatingIPsAssociationTestJSON-908037617-project-member] [instance: 68cc2368-ac6a-4003-9c19-5f2a4e9b0e03] Instance destroyed {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 772.920392] env[67169]: INFO nova.compute.manager [None req-f1a10853-5422-40c7-bd45-9fe494c76ecf tempest-FloatingIPsAssociationTestJSON-908037617 tempest-FloatingIPsAssociationTestJSON-908037617-project-member] [instance: 68cc2368-ac6a-4003-9c19-5f2a4e9b0e03] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 772.922489] env[67169]: DEBUG nova.compute.claims [None req-f1a10853-5422-40c7-bd45-9fe494c76ecf tempest-FloatingIPsAssociationTestJSON-908037617 tempest-FloatingIPsAssociationTestJSON-908037617-project-member] [instance: 68cc2368-ac6a-4003-9c19-5f2a4e9b0e03] Aborting claim: {{(pid=67169) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 772.922615] env[67169]: DEBUG oslo_concurrency.lockutils [None req-f1a10853-5422-40c7-bd45-9fe494c76ecf tempest-FloatingIPsAssociationTestJSON-908037617 tempest-FloatingIPsAssociationTestJSON-908037617-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 772.922801] env[67169]: DEBUG oslo_concurrency.lockutils [None req-f1a10853-5422-40c7-bd45-9fe494c76ecf tempest-FloatingIPsAssociationTestJSON-908037617 tempest-FloatingIPsAssociationTestJSON-908037617-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 772.925199] env[67169]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-5d0bebdb-42f1-412a-aa7a-480a6f5182d4 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 772.947672] env[67169]: DEBUG nova.virt.vmwareapi.images [None req-9f30fd4a-9921-457d-a045-00b3f5f0612e tempest-ImagesNegativeTestJSON-7099264 tempest-ImagesNegativeTestJSON-7099264-project-member] [instance: f5558a78-c91f-4c36-bb22-f94b1bd8cdbc] Downloading image file data 285931c9-8b83-4997-8c4d-6a79005e36ba to the data store datastore2 {{(pid=67169) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 773.001582] env[67169]: DEBUG oslo_vmware.rw_handles [None 
req-9f30fd4a-9921-457d-a045-00b3f5f0612e tempest-ImagesNegativeTestJSON-7099264 tempest-ImagesNegativeTestJSON-7099264-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/261bfea3-4c62-4f23-90f2-88a59351dfa6/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=67169) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 773.059854] env[67169]: DEBUG oslo_vmware.rw_handles [None req-9f30fd4a-9921-457d-a045-00b3f5f0612e tempest-ImagesNegativeTestJSON-7099264 tempest-ImagesNegativeTestJSON-7099264-project-member] Completed reading data from the image iterator. {{(pid=67169) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 773.060059] env[67169]: DEBUG oslo_vmware.rw_handles [None req-9f30fd4a-9921-457d-a045-00b3f5f0612e tempest-ImagesNegativeTestJSON-7099264 tempest-ImagesNegativeTestJSON-7099264-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/261bfea3-4c62-4f23-90f2-88a59351dfa6/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=67169) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 773.446087] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a008c8f1-bda3-4fe9-a49c-9b7fae755378 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 773.453171] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3457d429-179b-4de7-8610-fd7fdadb70e2 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 773.483813] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6e14a5d2-d84c-4431-9d8b-c24355c09672 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 773.490916] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5f922217-ce86-48d3-a6d8-8abdb1aa5dcd {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 773.503616] env[67169]: DEBUG nova.compute.provider_tree [None req-f1a10853-5422-40c7-bd45-9fe494c76ecf tempest-FloatingIPsAssociationTestJSON-908037617 tempest-FloatingIPsAssociationTestJSON-908037617-project-member] Inventory has not changed in ProviderTree for provider: 6570906a-ac37-4859-b1f2-4bbacc48d3f3 {{(pid=67169) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 773.512940] env[67169]: DEBUG nova.scheduler.client.report [None req-f1a10853-5422-40c7-bd45-9fe494c76ecf tempest-FloatingIPsAssociationTestJSON-908037617 tempest-FloatingIPsAssociationTestJSON-908037617-project-member] Inventory has not changed for provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 
'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 171, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67169) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 773.526746] env[67169]: DEBUG oslo_concurrency.lockutils [None req-f1a10853-5422-40c7-bd45-9fe494c76ecf tempest-FloatingIPsAssociationTestJSON-908037617 tempest-FloatingIPsAssociationTestJSON-908037617-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.604s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 773.527315] env[67169]: ERROR nova.compute.manager [None req-f1a10853-5422-40c7-bd45-9fe494c76ecf tempest-FloatingIPsAssociationTestJSON-908037617 tempest-FloatingIPsAssociationTestJSON-908037617-project-member] [instance: 68cc2368-ac6a-4003-9c19-5f2a4e9b0e03] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 773.527315] env[67169]: Faults: ['InvalidArgument'] [ 773.527315] env[67169]: ERROR nova.compute.manager [instance: 68cc2368-ac6a-4003-9c19-5f2a4e9b0e03] Traceback (most recent call last): [ 773.527315] env[67169]: ERROR nova.compute.manager [instance: 68cc2368-ac6a-4003-9c19-5f2a4e9b0e03] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 773.527315] env[67169]: ERROR nova.compute.manager [instance: 68cc2368-ac6a-4003-9c19-5f2a4e9b0e03] self.driver.spawn(context, instance, image_meta, [ 773.527315] env[67169]: ERROR nova.compute.manager [instance: 68cc2368-ac6a-4003-9c19-5f2a4e9b0e03] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 773.527315] env[67169]: ERROR nova.compute.manager [instance: 
68cc2368-ac6a-4003-9c19-5f2a4e9b0e03] self._vmops.spawn(context, instance, image_meta, injected_files, [ 773.527315] env[67169]: ERROR nova.compute.manager [instance: 68cc2368-ac6a-4003-9c19-5f2a4e9b0e03] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 773.527315] env[67169]: ERROR nova.compute.manager [instance: 68cc2368-ac6a-4003-9c19-5f2a4e9b0e03] self._fetch_image_if_missing(context, vi) [ 773.527315] env[67169]: ERROR nova.compute.manager [instance: 68cc2368-ac6a-4003-9c19-5f2a4e9b0e03] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 773.527315] env[67169]: ERROR nova.compute.manager [instance: 68cc2368-ac6a-4003-9c19-5f2a4e9b0e03] image_cache(vi, tmp_image_ds_loc) [ 773.527315] env[67169]: ERROR nova.compute.manager [instance: 68cc2368-ac6a-4003-9c19-5f2a4e9b0e03] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 773.527703] env[67169]: ERROR nova.compute.manager [instance: 68cc2368-ac6a-4003-9c19-5f2a4e9b0e03] vm_util.copy_virtual_disk( [ 773.527703] env[67169]: ERROR nova.compute.manager [instance: 68cc2368-ac6a-4003-9c19-5f2a4e9b0e03] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 773.527703] env[67169]: ERROR nova.compute.manager [instance: 68cc2368-ac6a-4003-9c19-5f2a4e9b0e03] session._wait_for_task(vmdk_copy_task) [ 773.527703] env[67169]: ERROR nova.compute.manager [instance: 68cc2368-ac6a-4003-9c19-5f2a4e9b0e03] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 773.527703] env[67169]: ERROR nova.compute.manager [instance: 68cc2368-ac6a-4003-9c19-5f2a4e9b0e03] return self.wait_for_task(task_ref) [ 773.527703] env[67169]: ERROR nova.compute.manager [instance: 68cc2368-ac6a-4003-9c19-5f2a4e9b0e03] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 773.527703] env[67169]: ERROR nova.compute.manager [instance: 
68cc2368-ac6a-4003-9c19-5f2a4e9b0e03] return evt.wait() [ 773.527703] env[67169]: ERROR nova.compute.manager [instance: 68cc2368-ac6a-4003-9c19-5f2a4e9b0e03] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 773.527703] env[67169]: ERROR nova.compute.manager [instance: 68cc2368-ac6a-4003-9c19-5f2a4e9b0e03] result = hub.switch() [ 773.527703] env[67169]: ERROR nova.compute.manager [instance: 68cc2368-ac6a-4003-9c19-5f2a4e9b0e03] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 773.527703] env[67169]: ERROR nova.compute.manager [instance: 68cc2368-ac6a-4003-9c19-5f2a4e9b0e03] return self.greenlet.switch() [ 773.527703] env[67169]: ERROR nova.compute.manager [instance: 68cc2368-ac6a-4003-9c19-5f2a4e9b0e03] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 773.527703] env[67169]: ERROR nova.compute.manager [instance: 68cc2368-ac6a-4003-9c19-5f2a4e9b0e03] self.f(*self.args, **self.kw) [ 773.528091] env[67169]: ERROR nova.compute.manager [instance: 68cc2368-ac6a-4003-9c19-5f2a4e9b0e03] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 773.528091] env[67169]: ERROR nova.compute.manager [instance: 68cc2368-ac6a-4003-9c19-5f2a4e9b0e03] raise exceptions.translate_fault(task_info.error) [ 773.528091] env[67169]: ERROR nova.compute.manager [instance: 68cc2368-ac6a-4003-9c19-5f2a4e9b0e03] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 773.528091] env[67169]: ERROR nova.compute.manager [instance: 68cc2368-ac6a-4003-9c19-5f2a4e9b0e03] Faults: ['InvalidArgument'] [ 773.528091] env[67169]: ERROR nova.compute.manager [instance: 68cc2368-ac6a-4003-9c19-5f2a4e9b0e03] [ 773.528091] env[67169]: DEBUG nova.compute.utils [None req-f1a10853-5422-40c7-bd45-9fe494c76ecf tempest-FloatingIPsAssociationTestJSON-908037617 
tempest-FloatingIPsAssociationTestJSON-908037617-project-member] [instance: 68cc2368-ac6a-4003-9c19-5f2a4e9b0e03] VimFaultException {{(pid=67169) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 773.529488] env[67169]: DEBUG nova.compute.manager [None req-f1a10853-5422-40c7-bd45-9fe494c76ecf tempest-FloatingIPsAssociationTestJSON-908037617 tempest-FloatingIPsAssociationTestJSON-908037617-project-member] [instance: 68cc2368-ac6a-4003-9c19-5f2a4e9b0e03] Build of instance 68cc2368-ac6a-4003-9c19-5f2a4e9b0e03 was re-scheduled: A specified parameter was not correct: fileType [ 773.529488] env[67169]: Faults: ['InvalidArgument'] {{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 773.530180] env[67169]: DEBUG nova.compute.manager [None req-f1a10853-5422-40c7-bd45-9fe494c76ecf tempest-FloatingIPsAssociationTestJSON-908037617 tempest-FloatingIPsAssociationTestJSON-908037617-project-member] [instance: 68cc2368-ac6a-4003-9c19-5f2a4e9b0e03] Unplugging VIFs for instance {{(pid=67169) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 773.530180] env[67169]: DEBUG nova.compute.manager [None req-f1a10853-5422-40c7-bd45-9fe494c76ecf tempest-FloatingIPsAssociationTestJSON-908037617 tempest-FloatingIPsAssociationTestJSON-908037617-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67169) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 773.530180] env[67169]: DEBUG nova.compute.manager [None req-f1a10853-5422-40c7-bd45-9fe494c76ecf tempest-FloatingIPsAssociationTestJSON-908037617 tempest-FloatingIPsAssociationTestJSON-908037617-project-member] [instance: 68cc2368-ac6a-4003-9c19-5f2a4e9b0e03] Deallocating network for instance {{(pid=67169) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 773.530336] env[67169]: DEBUG nova.network.neutron [None req-f1a10853-5422-40c7-bd45-9fe494c76ecf tempest-FloatingIPsAssociationTestJSON-908037617 tempest-FloatingIPsAssociationTestJSON-908037617-project-member] [instance: 68cc2368-ac6a-4003-9c19-5f2a4e9b0e03] deallocate_for_instance() {{(pid=67169) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 774.242610] env[67169]: DEBUG nova.network.neutron [None req-f1a10853-5422-40c7-bd45-9fe494c76ecf tempest-FloatingIPsAssociationTestJSON-908037617 tempest-FloatingIPsAssociationTestJSON-908037617-project-member] [instance: 68cc2368-ac6a-4003-9c19-5f2a4e9b0e03] Updating instance_info_cache with network_info: [] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 774.252980] env[67169]: INFO nova.compute.manager [None req-f1a10853-5422-40c7-bd45-9fe494c76ecf tempest-FloatingIPsAssociationTestJSON-908037617 tempest-FloatingIPsAssociationTestJSON-908037617-project-member] [instance: 68cc2368-ac6a-4003-9c19-5f2a4e9b0e03] Took 0.72 seconds to deallocate network for instance. 
[ 774.354130] env[67169]: INFO nova.scheduler.client.report [None req-f1a10853-5422-40c7-bd45-9fe494c76ecf tempest-FloatingIPsAssociationTestJSON-908037617 tempest-FloatingIPsAssociationTestJSON-908037617-project-member] Deleted allocations for instance 68cc2368-ac6a-4003-9c19-5f2a4e9b0e03 [ 774.375162] env[67169]: DEBUG oslo_concurrency.lockutils [None req-f1a10853-5422-40c7-bd45-9fe494c76ecf tempest-FloatingIPsAssociationTestJSON-908037617 tempest-FloatingIPsAssociationTestJSON-908037617-project-member] Lock "68cc2368-ac6a-4003-9c19-5f2a4e9b0e03" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 203.500s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 774.376208] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "68cc2368-ac6a-4003-9c19-5f2a4e9b0e03" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 188.691s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 774.376427] env[67169]: INFO nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 68cc2368-ac6a-4003-9c19-5f2a4e9b0e03] During sync_power_state the instance has a pending task (spawning). Skip. 
[ 774.376633] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "68cc2368-ac6a-4003-9c19-5f2a4e9b0e03" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.001s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 774.377267] env[67169]: DEBUG oslo_concurrency.lockutils [None req-288abad3-213e-4a91-a6e3-09f27715b7a1 tempest-FloatingIPsAssociationTestJSON-908037617 tempest-FloatingIPsAssociationTestJSON-908037617-project-member] Lock "68cc2368-ac6a-4003-9c19-5f2a4e9b0e03" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 4.074s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 774.377734] env[67169]: DEBUG oslo_concurrency.lockutils [None req-288abad3-213e-4a91-a6e3-09f27715b7a1 tempest-FloatingIPsAssociationTestJSON-908037617 tempest-FloatingIPsAssociationTestJSON-908037617-project-member] Acquiring lock "68cc2368-ac6a-4003-9c19-5f2a4e9b0e03-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 774.377958] env[67169]: DEBUG oslo_concurrency.lockutils [None req-288abad3-213e-4a91-a6e3-09f27715b7a1 tempest-FloatingIPsAssociationTestJSON-908037617 tempest-FloatingIPsAssociationTestJSON-908037617-project-member] Lock "68cc2368-ac6a-4003-9c19-5f2a4e9b0e03-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 774.378156] env[67169]: DEBUG oslo_concurrency.lockutils [None req-288abad3-213e-4a91-a6e3-09f27715b7a1 tempest-FloatingIPsAssociationTestJSON-908037617 
tempest-FloatingIPsAssociationTestJSON-908037617-project-member] Lock "68cc2368-ac6a-4003-9c19-5f2a4e9b0e03-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 774.380321] env[67169]: INFO nova.compute.manager [None req-288abad3-213e-4a91-a6e3-09f27715b7a1 tempest-FloatingIPsAssociationTestJSON-908037617 tempest-FloatingIPsAssociationTestJSON-908037617-project-member] [instance: 68cc2368-ac6a-4003-9c19-5f2a4e9b0e03] Terminating instance [ 774.383224] env[67169]: DEBUG nova.compute.manager [None req-288abad3-213e-4a91-a6e3-09f27715b7a1 tempest-FloatingIPsAssociationTestJSON-908037617 tempest-FloatingIPsAssociationTestJSON-908037617-project-member] [instance: 68cc2368-ac6a-4003-9c19-5f2a4e9b0e03] Start destroying the instance on the hypervisor. {{(pid=67169) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 774.387096] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-288abad3-213e-4a91-a6e3-09f27715b7a1 tempest-FloatingIPsAssociationTestJSON-908037617 tempest-FloatingIPsAssociationTestJSON-908037617-project-member] [instance: 68cc2368-ac6a-4003-9c19-5f2a4e9b0e03] Destroying instance {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 774.387096] env[67169]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-be6bef0d-77f2-4156-8ce8-d11b78c99109 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 774.393984] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f6e3ffa4-9c51-4a6e-9562-5c55d4430f9f {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 774.407121] env[67169]: DEBUG nova.compute.manager [None req-8ae57ef2-fe50-47ae-b868-f7b19963d43c 
tempest-ServersWithSpecificFlavorTestJSON-890966063 tempest-ServersWithSpecificFlavorTestJSON-890966063-project-member] [instance: 43b73a7c-eda8-4239-885f-d4fb8fa6f28a] Starting instance... {{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 774.426082] env[67169]: WARNING nova.virt.vmwareapi.vmops [None req-288abad3-213e-4a91-a6e3-09f27715b7a1 tempest-FloatingIPsAssociationTestJSON-908037617 tempest-FloatingIPsAssociationTestJSON-908037617-project-member] [instance: 68cc2368-ac6a-4003-9c19-5f2a4e9b0e03] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 68cc2368-ac6a-4003-9c19-5f2a4e9b0e03 could not be found. [ 774.426303] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-288abad3-213e-4a91-a6e3-09f27715b7a1 tempest-FloatingIPsAssociationTestJSON-908037617 tempest-FloatingIPsAssociationTestJSON-908037617-project-member] [instance: 68cc2368-ac6a-4003-9c19-5f2a4e9b0e03] Instance destroyed {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 774.426682] env[67169]: INFO nova.compute.manager [None req-288abad3-213e-4a91-a6e3-09f27715b7a1 tempest-FloatingIPsAssociationTestJSON-908037617 tempest-FloatingIPsAssociationTestJSON-908037617-project-member] [instance: 68cc2368-ac6a-4003-9c19-5f2a4e9b0e03] Took 0.04 seconds to destroy the instance on the hypervisor. [ 774.426798] env[67169]: DEBUG oslo.service.loopingcall [None req-288abad3-213e-4a91-a6e3-09f27715b7a1 tempest-FloatingIPsAssociationTestJSON-908037617 tempest-FloatingIPsAssociationTestJSON-908037617-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=67169) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 774.426926] env[67169]: DEBUG nova.compute.manager [-] [instance: 68cc2368-ac6a-4003-9c19-5f2a4e9b0e03] Deallocating network for instance {{(pid=67169) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 774.427049] env[67169]: DEBUG nova.network.neutron [-] [instance: 68cc2368-ac6a-4003-9c19-5f2a4e9b0e03] deallocate_for_instance() {{(pid=67169) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 774.460472] env[67169]: DEBUG oslo_concurrency.lockutils [None req-8ae57ef2-fe50-47ae-b868-f7b19963d43c tempest-ServersWithSpecificFlavorTestJSON-890966063 tempest-ServersWithSpecificFlavorTestJSON-890966063-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 774.461163] env[67169]: DEBUG oslo_concurrency.lockutils [None req-8ae57ef2-fe50-47ae-b868-f7b19963d43c tempest-ServersWithSpecificFlavorTestJSON-890966063 tempest-ServersWithSpecificFlavorTestJSON-890966063-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 774.462207] env[67169]: INFO nova.compute.claims [None req-8ae57ef2-fe50-47ae-b868-f7b19963d43c tempest-ServersWithSpecificFlavorTestJSON-890966063 tempest-ServersWithSpecificFlavorTestJSON-890966063-project-member] [instance: 43b73a7c-eda8-4239-885f-d4fb8fa6f28a] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 774.465631] env[67169]: DEBUG nova.network.neutron [-] [instance: 68cc2368-ac6a-4003-9c19-5f2a4e9b0e03] Updating instance_info_cache with network_info: [] {{(pid=67169) update_instance_cache_with_nw_info 
/opt/stack/nova/nova/network/neutron.py:116}} [ 774.478701] env[67169]: INFO nova.compute.manager [-] [instance: 68cc2368-ac6a-4003-9c19-5f2a4e9b0e03] Took 0.05 seconds to deallocate network for instance. [ 774.610074] env[67169]: DEBUG oslo_concurrency.lockutils [None req-288abad3-213e-4a91-a6e3-09f27715b7a1 tempest-FloatingIPsAssociationTestJSON-908037617 tempest-FloatingIPsAssociationTestJSON-908037617-project-member] Lock "68cc2368-ac6a-4003-9c19-5f2a4e9b0e03" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.233s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 774.955917] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-417461b1-d64a-45d3-b70f-6c54c9fcf21a {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 774.964016] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f3a0b99e-c646-4588-8954-5c23c7d0199f {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 774.993880] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-13ae0b9c-e75e-4b4d-932f-0aff24ca5e0b {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 775.001444] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9ca64dc6-1925-4e7a-9f3e-052bb6bee9aa {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 775.015349] env[67169]: DEBUG nova.compute.provider_tree [None req-8ae57ef2-fe50-47ae-b868-f7b19963d43c tempest-ServersWithSpecificFlavorTestJSON-890966063 tempest-ServersWithSpecificFlavorTestJSON-890966063-project-member] 
Inventory has not changed in ProviderTree for provider: 6570906a-ac37-4859-b1f2-4bbacc48d3f3 {{(pid=67169) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 775.023789] env[67169]: DEBUG nova.scheduler.client.report [None req-8ae57ef2-fe50-47ae-b868-f7b19963d43c tempest-ServersWithSpecificFlavorTestJSON-890966063 tempest-ServersWithSpecificFlavorTestJSON-890966063-project-member] Inventory has not changed for provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 171, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67169) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 775.038095] env[67169]: DEBUG oslo_concurrency.lockutils [None req-8ae57ef2-fe50-47ae-b868-f7b19963d43c tempest-ServersWithSpecificFlavorTestJSON-890966063 tempest-ServersWithSpecificFlavorTestJSON-890966063-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.577s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 775.038612] env[67169]: DEBUG nova.compute.manager [None req-8ae57ef2-fe50-47ae-b868-f7b19963d43c tempest-ServersWithSpecificFlavorTestJSON-890966063 tempest-ServersWithSpecificFlavorTestJSON-890966063-project-member] [instance: 43b73a7c-eda8-4239-885f-d4fb8fa6f28a] Start building networks asynchronously for instance. 
{{(pid=67169) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 775.079837] env[67169]: DEBUG nova.compute.utils [None req-8ae57ef2-fe50-47ae-b868-f7b19963d43c tempest-ServersWithSpecificFlavorTestJSON-890966063 tempest-ServersWithSpecificFlavorTestJSON-890966063-project-member] Using /dev/sd instead of None {{(pid=67169) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 775.085017] env[67169]: DEBUG nova.compute.manager [None req-8ae57ef2-fe50-47ae-b868-f7b19963d43c tempest-ServersWithSpecificFlavorTestJSON-890966063 tempest-ServersWithSpecificFlavorTestJSON-890966063-project-member] [instance: 43b73a7c-eda8-4239-885f-d4fb8fa6f28a] Allocating IP information in the background. {{(pid=67169) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 775.085017] env[67169]: DEBUG nova.network.neutron [None req-8ae57ef2-fe50-47ae-b868-f7b19963d43c tempest-ServersWithSpecificFlavorTestJSON-890966063 tempest-ServersWithSpecificFlavorTestJSON-890966063-project-member] [instance: 43b73a7c-eda8-4239-885f-d4fb8fa6f28a] allocate_for_instance() {{(pid=67169) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 775.089888] env[67169]: DEBUG nova.compute.manager [None req-8ae57ef2-fe50-47ae-b868-f7b19963d43c tempest-ServersWithSpecificFlavorTestJSON-890966063 tempest-ServersWithSpecificFlavorTestJSON-890966063-project-member] [instance: 43b73a7c-eda8-4239-885f-d4fb8fa6f28a] Start building block device mappings for instance. {{(pid=67169) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 775.161733] env[67169]: DEBUG nova.compute.manager [None req-8ae57ef2-fe50-47ae-b868-f7b19963d43c tempest-ServersWithSpecificFlavorTestJSON-890966063 tempest-ServersWithSpecificFlavorTestJSON-890966063-project-member] [instance: 43b73a7c-eda8-4239-885f-d4fb8fa6f28a] Start spawning the instance on the hypervisor. 
{{(pid=67169) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 775.189332] env[67169]: DEBUG nova.virt.hardware [None req-8ae57ef2-fe50-47ae-b868-f7b19963d43c tempest-ServersWithSpecificFlavorTestJSON-890966063 tempest-ServersWithSpecificFlavorTestJSON-890966063-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-03-06T19:57:11Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='286678736',id=22,is_public=True,memory_mb=128,name='tempest-flavor_with_ephemeral_0-1859963827',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-03-06T19:54:55Z,direct_url=,disk_format='vmdk',id=285931c9-8b83-4997-8c4d-6a79005e36ba,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c31f6504bb73492890b262ff43fdf9bc',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-03-06T19:54:55Z,virtual_size=,visibility=), allow threads: False {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 775.189578] env[67169]: DEBUG nova.virt.hardware [None req-8ae57ef2-fe50-47ae-b868-f7b19963d43c tempest-ServersWithSpecificFlavorTestJSON-890966063 tempest-ServersWithSpecificFlavorTestJSON-890966063-project-member] Flavor limits 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 775.189743] env[67169]: DEBUG nova.virt.hardware [None req-8ae57ef2-fe50-47ae-b868-f7b19963d43c tempest-ServersWithSpecificFlavorTestJSON-890966063 tempest-ServersWithSpecificFlavorTestJSON-890966063-project-member] Image limits 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 775.189928] env[67169]: DEBUG nova.virt.hardware [None req-8ae57ef2-fe50-47ae-b868-f7b19963d43c 
tempest-ServersWithSpecificFlavorTestJSON-890966063 tempest-ServersWithSpecificFlavorTestJSON-890966063-project-member] Flavor pref 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 775.190093] env[67169]: DEBUG nova.virt.hardware [None req-8ae57ef2-fe50-47ae-b868-f7b19963d43c tempest-ServersWithSpecificFlavorTestJSON-890966063 tempest-ServersWithSpecificFlavorTestJSON-890966063-project-member] Image pref 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 775.190248] env[67169]: DEBUG nova.virt.hardware [None req-8ae57ef2-fe50-47ae-b868-f7b19963d43c tempest-ServersWithSpecificFlavorTestJSON-890966063 tempest-ServersWithSpecificFlavorTestJSON-890966063-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 775.190454] env[67169]: DEBUG nova.virt.hardware [None req-8ae57ef2-fe50-47ae-b868-f7b19963d43c tempest-ServersWithSpecificFlavorTestJSON-890966063 tempest-ServersWithSpecificFlavorTestJSON-890966063-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 775.190614] env[67169]: DEBUG nova.virt.hardware [None req-8ae57ef2-fe50-47ae-b868-f7b19963d43c tempest-ServersWithSpecificFlavorTestJSON-890966063 tempest-ServersWithSpecificFlavorTestJSON-890966063-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67169) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 775.190858] env[67169]: DEBUG nova.virt.hardware [None req-8ae57ef2-fe50-47ae-b868-f7b19963d43c tempest-ServersWithSpecificFlavorTestJSON-890966063 tempest-ServersWithSpecificFlavorTestJSON-890966063-project-member] Got 1 possible topologies {{(pid=67169) 
_get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 775.191031] env[67169]: DEBUG nova.virt.hardware [None req-8ae57ef2-fe50-47ae-b868-f7b19963d43c tempest-ServersWithSpecificFlavorTestJSON-890966063 tempest-ServersWithSpecificFlavorTestJSON-890966063-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 775.191207] env[67169]: DEBUG nova.virt.hardware [None req-8ae57ef2-fe50-47ae-b868-f7b19963d43c tempest-ServersWithSpecificFlavorTestJSON-890966063 tempest-ServersWithSpecificFlavorTestJSON-890966063-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 775.192084] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-775f0f90-2d8f-4a58-a92b-9b462775ba00 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 775.201094] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-58c70754-c29a-4a3f-8d9d-48701a7aeee0 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 775.239957] env[67169]: DEBUG nova.policy [None req-8ae57ef2-fe50-47ae-b868-f7b19963d43c tempest-ServersWithSpecificFlavorTestJSON-890966063 tempest-ServersWithSpecificFlavorTestJSON-890966063-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '2b36cc99568a473fbc3baf038bb2d566', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '95a5d9d9d3ce46fb9bd77ea2088f9996', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 
'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67169) authorize /opt/stack/nova/nova/policy.py:203}} [ 776.314986] env[67169]: DEBUG nova.network.neutron [None req-8ae57ef2-fe50-47ae-b868-f7b19963d43c tempest-ServersWithSpecificFlavorTestJSON-890966063 tempest-ServersWithSpecificFlavorTestJSON-890966063-project-member] [instance: 43b73a7c-eda8-4239-885f-d4fb8fa6f28a] Successfully created port: 71d81983-9dc3-48d3-b8ff-f04616edf8d3 {{(pid=67169) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 777.819171] env[67169]: DEBUG nova.network.neutron [None req-8ae57ef2-fe50-47ae-b868-f7b19963d43c tempest-ServersWithSpecificFlavorTestJSON-890966063 tempest-ServersWithSpecificFlavorTestJSON-890966063-project-member] [instance: 43b73a7c-eda8-4239-885f-d4fb8fa6f28a] Successfully updated port: 71d81983-9dc3-48d3-b8ff-f04616edf8d3 {{(pid=67169) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 777.839043] env[67169]: DEBUG oslo_concurrency.lockutils [None req-8ae57ef2-fe50-47ae-b868-f7b19963d43c tempest-ServersWithSpecificFlavorTestJSON-890966063 tempest-ServersWithSpecificFlavorTestJSON-890966063-project-member] Acquiring lock "refresh_cache-43b73a7c-eda8-4239-885f-d4fb8fa6f28a" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 777.840751] env[67169]: DEBUG oslo_concurrency.lockutils [None req-8ae57ef2-fe50-47ae-b868-f7b19963d43c tempest-ServersWithSpecificFlavorTestJSON-890966063 tempest-ServersWithSpecificFlavorTestJSON-890966063-project-member] Acquired lock "refresh_cache-43b73a7c-eda8-4239-885f-d4fb8fa6f28a" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 777.840911] env[67169]: DEBUG nova.network.neutron [None req-8ae57ef2-fe50-47ae-b868-f7b19963d43c tempest-ServersWithSpecificFlavorTestJSON-890966063 tempest-ServersWithSpecificFlavorTestJSON-890966063-project-member] [instance: 
43b73a7c-eda8-4239-885f-d4fb8fa6f28a] Building network info cache for instance {{(pid=67169) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 777.925208] env[67169]: DEBUG nova.network.neutron [None req-8ae57ef2-fe50-47ae-b868-f7b19963d43c tempest-ServersWithSpecificFlavorTestJSON-890966063 tempest-ServersWithSpecificFlavorTestJSON-890966063-project-member] [instance: 43b73a7c-eda8-4239-885f-d4fb8fa6f28a] Instance cache missing network info. {{(pid=67169) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 778.267440] env[67169]: DEBUG nova.network.neutron [None req-8ae57ef2-fe50-47ae-b868-f7b19963d43c tempest-ServersWithSpecificFlavorTestJSON-890966063 tempest-ServersWithSpecificFlavorTestJSON-890966063-project-member] [instance: 43b73a7c-eda8-4239-885f-d4fb8fa6f28a] Updating instance_info_cache with network_info: [{"id": "71d81983-9dc3-48d3-b8ff-f04616edf8d3", "address": "fa:16:3e:30:78:10", "network": {"id": "df6ab664-39bf-44c1-891e-2d9d196af87c", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1591976157-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "95a5d9d9d3ce46fb9bd77ea2088f9996", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ab93899c-92b2-4d84-95a6-192234add28c", "external-id": "nsx-vlan-transportzone-697", "segmentation_id": 697, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap71d81983-9d", "ovs_interfaceid": "71d81983-9dc3-48d3-b8ff-f04616edf8d3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, 
"preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 778.286199] env[67169]: DEBUG oslo_concurrency.lockutils [None req-8ae57ef2-fe50-47ae-b868-f7b19963d43c tempest-ServersWithSpecificFlavorTestJSON-890966063 tempest-ServersWithSpecificFlavorTestJSON-890966063-project-member] Releasing lock "refresh_cache-43b73a7c-eda8-4239-885f-d4fb8fa6f28a" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 778.286275] env[67169]: DEBUG nova.compute.manager [None req-8ae57ef2-fe50-47ae-b868-f7b19963d43c tempest-ServersWithSpecificFlavorTestJSON-890966063 tempest-ServersWithSpecificFlavorTestJSON-890966063-project-member] [instance: 43b73a7c-eda8-4239-885f-d4fb8fa6f28a] Instance network_info: |[{"id": "71d81983-9dc3-48d3-b8ff-f04616edf8d3", "address": "fa:16:3e:30:78:10", "network": {"id": "df6ab664-39bf-44c1-891e-2d9d196af87c", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1591976157-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "95a5d9d9d3ce46fb9bd77ea2088f9996", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ab93899c-92b2-4d84-95a6-192234add28c", "external-id": "nsx-vlan-transportzone-697", "segmentation_id": 697, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap71d81983-9d", "ovs_interfaceid": "71d81983-9dc3-48d3-b8ff-f04616edf8d3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, 
"preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67169) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 778.286275] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-8ae57ef2-fe50-47ae-b868-f7b19963d43c tempest-ServersWithSpecificFlavorTestJSON-890966063 tempest-ServersWithSpecificFlavorTestJSON-890966063-project-member] [instance: 43b73a7c-eda8-4239-885f-d4fb8fa6f28a] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:30:78:10', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'ab93899c-92b2-4d84-95a6-192234add28c', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '71d81983-9dc3-48d3-b8ff-f04616edf8d3', 'vif_model': 'vmxnet3'}] {{(pid=67169) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 778.293329] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [None req-8ae57ef2-fe50-47ae-b868-f7b19963d43c tempest-ServersWithSpecificFlavorTestJSON-890966063 tempest-ServersWithSpecificFlavorTestJSON-890966063-project-member] Creating folder: Project (95a5d9d9d3ce46fb9bd77ea2088f9996). Parent ref: group-v566843. {{(pid=67169) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 778.294326] env[67169]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-3956d3b0-7974-46f3-9481-5aaccf8639b7 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 778.305761] env[67169]: INFO nova.virt.vmwareapi.vm_util [None req-8ae57ef2-fe50-47ae-b868-f7b19963d43c tempest-ServersWithSpecificFlavorTestJSON-890966063 tempest-ServersWithSpecificFlavorTestJSON-890966063-project-member] Created folder: Project (95a5d9d9d3ce46fb9bd77ea2088f9996) in parent group-v566843. 
[ 778.306459] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [None req-8ae57ef2-fe50-47ae-b868-f7b19963d43c tempest-ServersWithSpecificFlavorTestJSON-890966063 tempest-ServersWithSpecificFlavorTestJSON-890966063-project-member] Creating folder: Instances. Parent ref: group-v566887. {{(pid=67169) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 778.306459] env[67169]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-9ad2bd04-207a-4e32-bcf0-9bb47a1c47bd {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 778.315653] env[67169]: INFO nova.virt.vmwareapi.vm_util [None req-8ae57ef2-fe50-47ae-b868-f7b19963d43c tempest-ServersWithSpecificFlavorTestJSON-890966063 tempest-ServersWithSpecificFlavorTestJSON-890966063-project-member] Created folder: Instances in parent group-v566887. [ 778.315970] env[67169]: DEBUG oslo.service.loopingcall [None req-8ae57ef2-fe50-47ae-b868-f7b19963d43c tempest-ServersWithSpecificFlavorTestJSON-890966063 tempest-ServersWithSpecificFlavorTestJSON-890966063-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67169) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 778.316217] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 43b73a7c-eda8-4239-885f-d4fb8fa6f28a] Creating VM on the ESX host {{(pid=67169) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 778.316473] env[67169]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-0c1b4f20-0ba5-4e2f-ba4d-8958f5aa8bae {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 778.338625] env[67169]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 778.338625] env[67169]: value = "task-2819110" [ 778.338625] env[67169]: _type = "Task" [ 778.338625] env[67169]: } to complete. 
{{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 778.346457] env[67169]: DEBUG oslo_vmware.api [-] Task: {'id': task-2819110, 'name': CreateVM_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 778.406763] env[67169]: DEBUG nova.compute.manager [req-a4ae93c1-5c08-4bd0-9e3d-ec260ab71fae req-b61219e5-4497-4404-9244-608c10cd7ac2 service nova] [instance: 43b73a7c-eda8-4239-885f-d4fb8fa6f28a] Received event network-vif-plugged-71d81983-9dc3-48d3-b8ff-f04616edf8d3 {{(pid=67169) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 778.406954] env[67169]: DEBUG oslo_concurrency.lockutils [req-a4ae93c1-5c08-4bd0-9e3d-ec260ab71fae req-b61219e5-4497-4404-9244-608c10cd7ac2 service nova] Acquiring lock "43b73a7c-eda8-4239-885f-d4fb8fa6f28a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 778.407906] env[67169]: DEBUG oslo_concurrency.lockutils [req-a4ae93c1-5c08-4bd0-9e3d-ec260ab71fae req-b61219e5-4497-4404-9244-608c10cd7ac2 service nova] Lock "43b73a7c-eda8-4239-885f-d4fb8fa6f28a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 778.407906] env[67169]: DEBUG oslo_concurrency.lockutils [req-a4ae93c1-5c08-4bd0-9e3d-ec260ab71fae req-b61219e5-4497-4404-9244-608c10cd7ac2 service nova] Lock "43b73a7c-eda8-4239-885f-d4fb8fa6f28a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 778.407906] env[67169]: DEBUG nova.compute.manager [req-a4ae93c1-5c08-4bd0-9e3d-ec260ab71fae 
req-b61219e5-4497-4404-9244-608c10cd7ac2 service nova] [instance: 43b73a7c-eda8-4239-885f-d4fb8fa6f28a] No waiting events found dispatching network-vif-plugged-71d81983-9dc3-48d3-b8ff-f04616edf8d3 {{(pid=67169) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 778.407906] env[67169]: WARNING nova.compute.manager [req-a4ae93c1-5c08-4bd0-9e3d-ec260ab71fae req-b61219e5-4497-4404-9244-608c10cd7ac2 service nova] [instance: 43b73a7c-eda8-4239-885f-d4fb8fa6f28a] Received unexpected event network-vif-plugged-71d81983-9dc3-48d3-b8ff-f04616edf8d3 for instance with vm_state building and task_state spawning. [ 778.634427] env[67169]: DEBUG oslo_concurrency.lockutils [None req-d29e94ab-34c3-4daf-9971-4e81f5ddb110 tempest-ImagesNegativeTestJSON-7099264 tempest-ImagesNegativeTestJSON-7099264-project-member] Acquiring lock "f5558a78-c91f-4c36-bb22-f94b1bd8cdbc" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 778.848614] env[67169]: DEBUG oslo_vmware.api [-] Task: {'id': task-2819110, 'name': CreateVM_Task, 'duration_secs': 0.28402} completed successfully. 
{{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 778.848896] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 43b73a7c-eda8-4239-885f-d4fb8fa6f28a] Created VM on the ESX host {{(pid=67169) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 778.849428] env[67169]: DEBUG oslo_concurrency.lockutils [None req-8ae57ef2-fe50-47ae-b868-f7b19963d43c tempest-ServersWithSpecificFlavorTestJSON-890966063 tempest-ServersWithSpecificFlavorTestJSON-890966063-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 778.849591] env[67169]: DEBUG oslo_concurrency.lockutils [None req-8ae57ef2-fe50-47ae-b868-f7b19963d43c tempest-ServersWithSpecificFlavorTestJSON-890966063 tempest-ServersWithSpecificFlavorTestJSON-890966063-project-member] Acquired lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 778.850010] env[67169]: DEBUG oslo_concurrency.lockutils [None req-8ae57ef2-fe50-47ae-b868-f7b19963d43c tempest-ServersWithSpecificFlavorTestJSON-890966063 tempest-ServersWithSpecificFlavorTestJSON-890966063-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 778.850272] env[67169]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-ad453f06-1fe8-4a7d-b93d-387513791802 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 778.854941] env[67169]: DEBUG oslo_vmware.api [None req-8ae57ef2-fe50-47ae-b868-f7b19963d43c 
tempest-ServersWithSpecificFlavorTestJSON-890966063 tempest-ServersWithSpecificFlavorTestJSON-890966063-project-member] Waiting for the task: (returnval){ [ 778.854941] env[67169]: value = "session[52518c09-e5f2-035d-4dc1-3e29ddf94015]52ba5b62-dbcf-0e10-3cf0-17fd27cbcd91" [ 778.854941] env[67169]: _type = "Task" [ 778.854941] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 778.862422] env[67169]: DEBUG oslo_vmware.api [None req-8ae57ef2-fe50-47ae-b868-f7b19963d43c tempest-ServersWithSpecificFlavorTestJSON-890966063 tempest-ServersWithSpecificFlavorTestJSON-890966063-project-member] Task: {'id': session[52518c09-e5f2-035d-4dc1-3e29ddf94015]52ba5b62-dbcf-0e10-3cf0-17fd27cbcd91, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 779.367566] env[67169]: DEBUG oslo_concurrency.lockutils [None req-8ae57ef2-fe50-47ae-b868-f7b19963d43c tempest-ServersWithSpecificFlavorTestJSON-890966063 tempest-ServersWithSpecificFlavorTestJSON-890966063-project-member] Releasing lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 779.368107] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-8ae57ef2-fe50-47ae-b868-f7b19963d43c tempest-ServersWithSpecificFlavorTestJSON-890966063 tempest-ServersWithSpecificFlavorTestJSON-890966063-project-member] [instance: 43b73a7c-eda8-4239-885f-d4fb8fa6f28a] Processing image 285931c9-8b83-4997-8c4d-6a79005e36ba {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 779.368574] env[67169]: DEBUG oslo_concurrency.lockutils [None req-8ae57ef2-fe50-47ae-b868-f7b19963d43c tempest-ServersWithSpecificFlavorTestJSON-890966063 tempest-ServersWithSpecificFlavorTestJSON-890966063-project-member] Acquiring lock 
"[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 781.139541] env[67169]: DEBUG nova.compute.manager [req-4d56c41f-8f6e-41ec-a552-63959c74bcc2 req-1246497f-03ad-4f0b-bfe4-4794be46f6a1 service nova] [instance: 43b73a7c-eda8-4239-885f-d4fb8fa6f28a] Received event network-changed-71d81983-9dc3-48d3-b8ff-f04616edf8d3 {{(pid=67169) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 781.139811] env[67169]: DEBUG nova.compute.manager [req-4d56c41f-8f6e-41ec-a552-63959c74bcc2 req-1246497f-03ad-4f0b-bfe4-4794be46f6a1 service nova] [instance: 43b73a7c-eda8-4239-885f-d4fb8fa6f28a] Refreshing instance network info cache due to event network-changed-71d81983-9dc3-48d3-b8ff-f04616edf8d3. {{(pid=67169) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 781.140043] env[67169]: DEBUG oslo_concurrency.lockutils [req-4d56c41f-8f6e-41ec-a552-63959c74bcc2 req-1246497f-03ad-4f0b-bfe4-4794be46f6a1 service nova] Acquiring lock "refresh_cache-43b73a7c-eda8-4239-885f-d4fb8fa6f28a" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 781.140115] env[67169]: DEBUG oslo_concurrency.lockutils [req-4d56c41f-8f6e-41ec-a552-63959c74bcc2 req-1246497f-03ad-4f0b-bfe4-4794be46f6a1 service nova] Acquired lock "refresh_cache-43b73a7c-eda8-4239-885f-d4fb8fa6f28a" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 781.140300] env[67169]: DEBUG nova.network.neutron [req-4d56c41f-8f6e-41ec-a552-63959c74bcc2 req-1246497f-03ad-4f0b-bfe4-4794be46f6a1 service nova] [instance: 43b73a7c-eda8-4239-885f-d4fb8fa6f28a] Refreshing network info cache for port 71d81983-9dc3-48d3-b8ff-f04616edf8d3 {{(pid=67169) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 781.396312] 
env[67169]: DEBUG oslo_concurrency.lockutils [None req-fd272883-7a6e-4ce0-a77d-b8b9b5567e0a tempest-ServersTestFqdnHostnames-427175016 tempest-ServersTestFqdnHostnames-427175016-project-member] Acquiring lock "11e90c91-26ca-4397-81a4-975a1d714d19" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 781.790950] env[67169]: DEBUG nova.network.neutron [req-4d56c41f-8f6e-41ec-a552-63959c74bcc2 req-1246497f-03ad-4f0b-bfe4-4794be46f6a1 service nova] [instance: 43b73a7c-eda8-4239-885f-d4fb8fa6f28a] Updated VIF entry in instance network info cache for port 71d81983-9dc3-48d3-b8ff-f04616edf8d3. {{(pid=67169) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 781.791321] env[67169]: DEBUG nova.network.neutron [req-4d56c41f-8f6e-41ec-a552-63959c74bcc2 req-1246497f-03ad-4f0b-bfe4-4794be46f6a1 service nova] [instance: 43b73a7c-eda8-4239-885f-d4fb8fa6f28a] Updating instance_info_cache with network_info: [{"id": "71d81983-9dc3-48d3-b8ff-f04616edf8d3", "address": "fa:16:3e:30:78:10", "network": {"id": "df6ab664-39bf-44c1-891e-2d9d196af87c", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1591976157-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "95a5d9d9d3ce46fb9bd77ea2088f9996", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ab93899c-92b2-4d84-95a6-192234add28c", "external-id": "nsx-vlan-transportzone-697", "segmentation_id": 697, "bound_drivers": {"0": 
"nsxv3"}}, "devname": "tap71d81983-9d", "ovs_interfaceid": "71d81983-9dc3-48d3-b8ff-f04616edf8d3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 781.801423] env[67169]: DEBUG oslo_concurrency.lockutils [req-4d56c41f-8f6e-41ec-a552-63959c74bcc2 req-1246497f-03ad-4f0b-bfe4-4794be46f6a1 service nova] Releasing lock "refresh_cache-43b73a7c-eda8-4239-885f-d4fb8fa6f28a" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 782.723409] env[67169]: DEBUG oslo_concurrency.lockutils [None req-685f9846-3608-45da-8f06-ff8ab7599bfe tempest-AttachInterfacesUnderV243Test-137593631 tempest-AttachInterfacesUnderV243Test-137593631-project-member] Acquiring lock "36781827-5846-49a4-8913-d98676af0b74" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 782.766490] env[67169]: DEBUG oslo_concurrency.lockutils [None req-48dea7cb-59e8-484a-8988-68cd72405f44 tempest-FloatingIPsAssociationNegativeTestJSON-216368034 tempest-FloatingIPsAssociationNegativeTestJSON-216368034-project-member] Acquiring lock "835bf8da-8d8f-4dfd-b0a9-fab02796f39e" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 783.875806] env[67169]: DEBUG oslo_concurrency.lockutils [None req-21f766a3-9d39-453b-aab0-47af759d94fc tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Acquiring lock "85978a3b-052a-4a05-84e6-75c723d49bd8" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67169) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 785.828401] env[67169]: DEBUG oslo_concurrency.lockutils [None req-ce2d0787-afc0-4f58-9a98-4b3cd03f7ccd tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Acquiring lock "ceec0dd3-097b-4ab4-8e16-420d40bbe3d5" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 785.828826] env[67169]: DEBUG oslo_concurrency.lockutils [None req-ce2d0787-afc0-4f58-9a98-4b3cd03f7ccd tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Lock "ceec0dd3-097b-4ab4-8e16-420d40bbe3d5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 793.361982] env[67169]: DEBUG oslo_concurrency.lockutils [None req-969df851-d50e-45d8-a0e2-4316328b6a67 tempest-ServerDiagnosticsV248Test-1196521581 tempest-ServerDiagnosticsV248Test-1196521581-project-member] Acquiring lock "e2e52693-153a-43dd-b786-dd0758caabe2" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 797.376484] env[67169]: DEBUG oslo_concurrency.lockutils [None req-3156c49f-9684-4f28-902b-e6a3eae3b1f5 tempest-VolumesAssistedSnapshotsTest-1676378715 tempest-VolumesAssistedSnapshotsTest-1676378715-project-member] Acquiring lock "28552f70-695d-40cc-8dfa-bf40d6113220" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 797.832130] env[67169]: DEBUG 
oslo_concurrency.lockutils [None req-ffbf85f6-8c28-467c-87fa-6e8c39cd384e tempest-MigrationsAdminTest-2069277652 tempest-MigrationsAdminTest-2069277652-project-member] Acquiring lock "7a42aeb9-0518-448d-a3a6-8e68d6497922" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 797.984205] env[67169]: DEBUG oslo_concurrency.lockutils [None req-f9efb269-f338-4e01-954f-eaef00d66828 tempest-ServersWithSpecificFlavorTestJSON-890966063 tempest-ServersWithSpecificFlavorTestJSON-890966063-project-member] Acquiring lock "43b73a7c-eda8-4239-885f-d4fb8fa6f28a" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 798.894717] env[67169]: DEBUG oslo_concurrency.lockutils [None req-107e28cc-83b9-4df1-8569-844ba6d1e041 tempest-ServersAdminTestJSON-608063042 tempest-ServersAdminTestJSON-608063042-project-member] Acquiring lock "84577fe1-6a7f-4f1e-a262-0ea7c0576cc4" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 806.950966] env[67169]: DEBUG oslo_concurrency.lockutils [None req-460b6927-69b3-4804-a4cd-b807c665f771 tempest-TenantUsagesTestJSON-1528703859 tempest-TenantUsagesTestJSON-1528703859-project-member] Acquiring lock "41ebddaf-e07d-4925-b9da-758b8e83f545" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 806.951426] env[67169]: DEBUG oslo_concurrency.lockutils [None req-460b6927-69b3-4804-a4cd-b807c665f771 tempest-TenantUsagesTestJSON-1528703859 tempest-TenantUsagesTestJSON-1528703859-project-member] Lock 
"41ebddaf-e07d-4925-b9da-758b8e83f545" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 809.685515] env[67169]: DEBUG oslo_concurrency.lockutils [None req-2a59dd09-c1a1-4dc5-8341-628a27b9a7ff tempest-ServerMetadataTestJSON-895650985 tempest-ServerMetadataTestJSON-895650985-project-member] Acquiring lock "3c5bc03a-acc3-4601-8155-2cab101be865" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 809.685829] env[67169]: DEBUG oslo_concurrency.lockutils [None req-2a59dd09-c1a1-4dc5-8341-628a27b9a7ff tempest-ServerMetadataTestJSON-895650985 tempest-ServerMetadataTestJSON-895650985-project-member] Lock "3c5bc03a-acc3-4601-8155-2cab101be865" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 810.387821] env[67169]: DEBUG oslo_concurrency.lockutils [None req-f501c074-65dc-493f-a6db-8ace51085315 tempest-ServerMetadataNegativeTestJSON-1861165825 tempest-ServerMetadataNegativeTestJSON-1861165825-project-member] Acquiring lock "3ea7c620-5903-410d-8ca0-68789a5e5194" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 810.388080] env[67169]: DEBUG oslo_concurrency.lockutils [None req-f501c074-65dc-493f-a6db-8ace51085315 tempest-ServerMetadataNegativeTestJSON-1861165825 tempest-ServerMetadataNegativeTestJSON-1861165825-project-member] Lock "3ea7c620-5903-410d-8ca0-68789a5e5194" acquired by 
"nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 810.486952] env[67169]: DEBUG oslo_concurrency.lockutils [None req-c5227c3b-bf91-43b9-b202-c55d927f7c0d tempest-ServerShowV254Test-307983608 tempest-ServerShowV254Test-307983608-project-member] Acquiring lock "d0a7fdac-3e41-4539-bef4-0442bc5ad674" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 810.487220] env[67169]: DEBUG oslo_concurrency.lockutils [None req-c5227c3b-bf91-43b9-b202-c55d927f7c0d tempest-ServerShowV254Test-307983608 tempest-ServerShowV254Test-307983608-project-member] Lock "d0a7fdac-3e41-4539-bef4-0442bc5ad674" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 815.641868] env[67169]: DEBUG oslo_concurrency.lockutils [None req-444db087-1355-4d7a-9f19-a97c8aa302b0 tempest-AttachVolumeShelveTestJSON-1479191573 tempest-AttachVolumeShelveTestJSON-1479191573-project-member] Acquiring lock "2aa47f31-4da8-4ef6-b28c-fe2a03bf8906" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 815.642259] env[67169]: DEBUG oslo_concurrency.lockutils [None req-444db087-1355-4d7a-9f19-a97c8aa302b0 tempest-AttachVolumeShelveTestJSON-1479191573 tempest-AttachVolumeShelveTestJSON-1479191573-project-member] Lock "2aa47f31-4da8-4ef6-b28c-fe2a03bf8906" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 
0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 821.669569] env[67169]: WARNING oslo_vmware.rw_handles [None req-9f30fd4a-9921-457d-a045-00b3f5f0612e tempest-ImagesNegativeTestJSON-7099264 tempest-ImagesNegativeTestJSON-7099264-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 821.669569] env[67169]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 821.669569] env[67169]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 821.669569] env[67169]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 821.669569] env[67169]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 821.669569] env[67169]: ERROR oslo_vmware.rw_handles response.begin() [ 821.669569] env[67169]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 821.669569] env[67169]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 821.669569] env[67169]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 821.669569] env[67169]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 821.669569] env[67169]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 821.669569] env[67169]: ERROR oslo_vmware.rw_handles [ 821.670162] env[67169]: DEBUG nova.virt.vmwareapi.images [None req-9f30fd4a-9921-457d-a045-00b3f5f0612e tempest-ImagesNegativeTestJSON-7099264 tempest-ImagesNegativeTestJSON-7099264-project-member] [instance: f5558a78-c91f-4c36-bb22-f94b1bd8cdbc] Downloaded image file data 285931c9-8b83-4997-8c4d-6a79005e36ba to 
vmware_temp/261bfea3-4c62-4f23-90f2-88a59351dfa6/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk on the data store datastore2 {{(pid=67169) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 821.671766] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-9f30fd4a-9921-457d-a045-00b3f5f0612e tempest-ImagesNegativeTestJSON-7099264 tempest-ImagesNegativeTestJSON-7099264-project-member] [instance: f5558a78-c91f-4c36-bb22-f94b1bd8cdbc] Caching image {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 821.672034] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [None req-9f30fd4a-9921-457d-a045-00b3f5f0612e tempest-ImagesNegativeTestJSON-7099264 tempest-ImagesNegativeTestJSON-7099264-project-member] Copying Virtual Disk [datastore2] vmware_temp/261bfea3-4c62-4f23-90f2-88a59351dfa6/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk to [datastore2] vmware_temp/261bfea3-4c62-4f23-90f2-88a59351dfa6/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk {{(pid=67169) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 821.673224] env[67169]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-ec30cb4a-289e-4840-a4b0-483891d6cc43 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 821.680468] env[67169]: DEBUG oslo_vmware.api [None req-9f30fd4a-9921-457d-a045-00b3f5f0612e tempest-ImagesNegativeTestJSON-7099264 tempest-ImagesNegativeTestJSON-7099264-project-member] Waiting for the task: (returnval){ [ 821.680468] env[67169]: value = "task-2819111" [ 821.680468] env[67169]: _type = "Task" [ 821.680468] env[67169]: } to complete. 
{{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 821.689994] env[67169]: DEBUG oslo_vmware.api [None req-9f30fd4a-9921-457d-a045-00b3f5f0612e tempest-ImagesNegativeTestJSON-7099264 tempest-ImagesNegativeTestJSON-7099264-project-member] Task: {'id': task-2819111, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 822.190879] env[67169]: DEBUG oslo_vmware.exceptions [None req-9f30fd4a-9921-457d-a045-00b3f5f0612e tempest-ImagesNegativeTestJSON-7099264 tempest-ImagesNegativeTestJSON-7099264-project-member] Fault InvalidArgument not matched. {{(pid=67169) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 822.191224] env[67169]: DEBUG oslo_concurrency.lockutils [None req-9f30fd4a-9921-457d-a045-00b3f5f0612e tempest-ImagesNegativeTestJSON-7099264 tempest-ImagesNegativeTestJSON-7099264-project-member] Releasing lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 822.191830] env[67169]: ERROR nova.compute.manager [None req-9f30fd4a-9921-457d-a045-00b3f5f0612e tempest-ImagesNegativeTestJSON-7099264 tempest-ImagesNegativeTestJSON-7099264-project-member] [instance: f5558a78-c91f-4c36-bb22-f94b1bd8cdbc] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 822.191830] env[67169]: Faults: ['InvalidArgument'] [ 822.191830] env[67169]: ERROR nova.compute.manager [instance: f5558a78-c91f-4c36-bb22-f94b1bd8cdbc] Traceback (most recent call last): [ 822.191830] env[67169]: ERROR nova.compute.manager [instance: f5558a78-c91f-4c36-bb22-f94b1bd8cdbc] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 822.191830] 
env[67169]: ERROR nova.compute.manager [instance: f5558a78-c91f-4c36-bb22-f94b1bd8cdbc] yield resources [ 822.191830] env[67169]: ERROR nova.compute.manager [instance: f5558a78-c91f-4c36-bb22-f94b1bd8cdbc] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 822.191830] env[67169]: ERROR nova.compute.manager [instance: f5558a78-c91f-4c36-bb22-f94b1bd8cdbc] self.driver.spawn(context, instance, image_meta, [ 822.191830] env[67169]: ERROR nova.compute.manager [instance: f5558a78-c91f-4c36-bb22-f94b1bd8cdbc] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 822.191830] env[67169]: ERROR nova.compute.manager [instance: f5558a78-c91f-4c36-bb22-f94b1bd8cdbc] self._vmops.spawn(context, instance, image_meta, injected_files, [ 822.191830] env[67169]: ERROR nova.compute.manager [instance: f5558a78-c91f-4c36-bb22-f94b1bd8cdbc] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 822.191830] env[67169]: ERROR nova.compute.manager [instance: f5558a78-c91f-4c36-bb22-f94b1bd8cdbc] self._fetch_image_if_missing(context, vi) [ 822.191830] env[67169]: ERROR nova.compute.manager [instance: f5558a78-c91f-4c36-bb22-f94b1bd8cdbc] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 822.191830] env[67169]: ERROR nova.compute.manager [instance: f5558a78-c91f-4c36-bb22-f94b1bd8cdbc] image_cache(vi, tmp_image_ds_loc) [ 822.191830] env[67169]: ERROR nova.compute.manager [instance: f5558a78-c91f-4c36-bb22-f94b1bd8cdbc] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 822.191830] env[67169]: ERROR nova.compute.manager [instance: f5558a78-c91f-4c36-bb22-f94b1bd8cdbc] vm_util.copy_virtual_disk( [ 822.191830] env[67169]: ERROR nova.compute.manager [instance: f5558a78-c91f-4c36-bb22-f94b1bd8cdbc] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 822.191830] env[67169]: ERROR nova.compute.manager 
[instance: f5558a78-c91f-4c36-bb22-f94b1bd8cdbc] session._wait_for_task(vmdk_copy_task) [ 822.191830] env[67169]: ERROR nova.compute.manager [instance: f5558a78-c91f-4c36-bb22-f94b1bd8cdbc] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 822.191830] env[67169]: ERROR nova.compute.manager [instance: f5558a78-c91f-4c36-bb22-f94b1bd8cdbc] return self.wait_for_task(task_ref) [ 822.191830] env[67169]: ERROR nova.compute.manager [instance: f5558a78-c91f-4c36-bb22-f94b1bd8cdbc] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 822.191830] env[67169]: ERROR nova.compute.manager [instance: f5558a78-c91f-4c36-bb22-f94b1bd8cdbc] return evt.wait() [ 822.191830] env[67169]: ERROR nova.compute.manager [instance: f5558a78-c91f-4c36-bb22-f94b1bd8cdbc] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 822.191830] env[67169]: ERROR nova.compute.manager [instance: f5558a78-c91f-4c36-bb22-f94b1bd8cdbc] result = hub.switch() [ 822.191830] env[67169]: ERROR nova.compute.manager [instance: f5558a78-c91f-4c36-bb22-f94b1bd8cdbc] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 822.191830] env[67169]: ERROR nova.compute.manager [instance: f5558a78-c91f-4c36-bb22-f94b1bd8cdbc] return self.greenlet.switch() [ 822.191830] env[67169]: ERROR nova.compute.manager [instance: f5558a78-c91f-4c36-bb22-f94b1bd8cdbc] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 822.191830] env[67169]: ERROR nova.compute.manager [instance: f5558a78-c91f-4c36-bb22-f94b1bd8cdbc] self.f(*self.args, **self.kw) [ 822.191830] env[67169]: ERROR nova.compute.manager [instance: f5558a78-c91f-4c36-bb22-f94b1bd8cdbc] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 822.191830] env[67169]: ERROR nova.compute.manager 
[instance: f5558a78-c91f-4c36-bb22-f94b1bd8cdbc] raise exceptions.translate_fault(task_info.error) [ 822.191830] env[67169]: ERROR nova.compute.manager [instance: f5558a78-c91f-4c36-bb22-f94b1bd8cdbc] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 822.191830] env[67169]: ERROR nova.compute.manager [instance: f5558a78-c91f-4c36-bb22-f94b1bd8cdbc] Faults: ['InvalidArgument'] [ 822.191830] env[67169]: ERROR nova.compute.manager [instance: f5558a78-c91f-4c36-bb22-f94b1bd8cdbc] [ 822.192730] env[67169]: INFO nova.compute.manager [None req-9f30fd4a-9921-457d-a045-00b3f5f0612e tempest-ImagesNegativeTestJSON-7099264 tempest-ImagesNegativeTestJSON-7099264-project-member] [instance: f5558a78-c91f-4c36-bb22-f94b1bd8cdbc] Terminating instance [ 822.194367] env[67169]: DEBUG oslo_concurrency.lockutils [None req-368ce919-6530-42e6-9ef4-1d52f0d0483a tempest-AttachInterfacesUnderV243Test-137593631 tempest-AttachInterfacesUnderV243Test-137593631-project-member] Acquired lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 822.194507] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-368ce919-6530-42e6-9ef4-1d52f0d0483a tempest-AttachInterfacesUnderV243Test-137593631 tempest-AttachInterfacesUnderV243Test-137593631-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 822.195177] env[67169]: DEBUG nova.compute.manager [None req-9f30fd4a-9921-457d-a045-00b3f5f0612e tempest-ImagesNegativeTestJSON-7099264 tempest-ImagesNegativeTestJSON-7099264-project-member] [instance: f5558a78-c91f-4c36-bb22-f94b1bd8cdbc] Start destroying the instance on the hypervisor. 
{{(pid=67169) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 822.195409] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-9f30fd4a-9921-457d-a045-00b3f5f0612e tempest-ImagesNegativeTestJSON-7099264 tempest-ImagesNegativeTestJSON-7099264-project-member] [instance: f5558a78-c91f-4c36-bb22-f94b1bd8cdbc] Destroying instance {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 822.195669] env[67169]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-f5446ce0-3dd5-4b6c-b54a-414dfa264c39 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 822.198062] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d5d5f57d-c38e-4871-80f0-55db93f9be89 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 822.205288] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-9f30fd4a-9921-457d-a045-00b3f5f0612e tempest-ImagesNegativeTestJSON-7099264 tempest-ImagesNegativeTestJSON-7099264-project-member] [instance: f5558a78-c91f-4c36-bb22-f94b1bd8cdbc] Unregistering the VM {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 822.205570] env[67169]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-41659ac9-bcea-4ac8-9a7d-68a9d8177ba6 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 822.207746] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-368ce919-6530-42e6-9ef4-1d52f0d0483a tempest-AttachInterfacesUnderV243Test-137593631 tempest-AttachInterfacesUnderV243Test-137593631-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 822.207968] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None 
req-368ce919-6530-42e6-9ef4-1d52f0d0483a tempest-AttachInterfacesUnderV243Test-137593631 tempest-AttachInterfacesUnderV243Test-137593631-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=67169) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 822.208952] env[67169]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-e59edf52-75fd-475a-9d84-babd425b88db {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 822.213776] env[67169]: DEBUG oslo_vmware.api [None req-368ce919-6530-42e6-9ef4-1d52f0d0483a tempest-AttachInterfacesUnderV243Test-137593631 tempest-AttachInterfacesUnderV243Test-137593631-project-member] Waiting for the task: (returnval){ [ 822.213776] env[67169]: value = "session[52518c09-e5f2-035d-4dc1-3e29ddf94015]523bba6a-0fb3-e32f-7f66-03d0b62a26e8" [ 822.213776] env[67169]: _type = "Task" [ 822.213776] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 822.221225] env[67169]: DEBUG oslo_vmware.api [None req-368ce919-6530-42e6-9ef4-1d52f0d0483a tempest-AttachInterfacesUnderV243Test-137593631 tempest-AttachInterfacesUnderV243Test-137593631-project-member] Task: {'id': session[52518c09-e5f2-035d-4dc1-3e29ddf94015]523bba6a-0fb3-e32f-7f66-03d0b62a26e8, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 822.280266] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-9f30fd4a-9921-457d-a045-00b3f5f0612e tempest-ImagesNegativeTestJSON-7099264 tempest-ImagesNegativeTestJSON-7099264-project-member] [instance: f5558a78-c91f-4c36-bb22-f94b1bd8cdbc] Unregistered the VM {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 822.280492] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-9f30fd4a-9921-457d-a045-00b3f5f0612e tempest-ImagesNegativeTestJSON-7099264 tempest-ImagesNegativeTestJSON-7099264-project-member] [instance: f5558a78-c91f-4c36-bb22-f94b1bd8cdbc] Deleting contents of the VM from datastore datastore2 {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 822.280740] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-9f30fd4a-9921-457d-a045-00b3f5f0612e tempest-ImagesNegativeTestJSON-7099264 tempest-ImagesNegativeTestJSON-7099264-project-member] Deleting the datastore file [datastore2] f5558a78-c91f-4c36-bb22-f94b1bd8cdbc {{(pid=67169) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 822.280925] env[67169]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-2ddc4e54-9f08-4694-9e34-d9dcf8bff6aa {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 822.286610] env[67169]: DEBUG oslo_vmware.api [None req-9f30fd4a-9921-457d-a045-00b3f5f0612e tempest-ImagesNegativeTestJSON-7099264 tempest-ImagesNegativeTestJSON-7099264-project-member] Waiting for the task: (returnval){ [ 822.286610] env[67169]: value = "task-2819113" [ 822.286610] env[67169]: _type = "Task" [ 822.286610] env[67169]: } to complete. 
{{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 822.294306] env[67169]: DEBUG oslo_vmware.api [None req-9f30fd4a-9921-457d-a045-00b3f5f0612e tempest-ImagesNegativeTestJSON-7099264 tempest-ImagesNegativeTestJSON-7099264-project-member] Task: {'id': task-2819113, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 822.659387] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 822.659561] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Cleaning up deleted instances {{(pid=67169) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11198}} [ 822.672148] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] There are 0 instances to clean {{(pid=67169) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11207}} [ 822.672432] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 822.672522] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Cleaning up deleted instances with incomplete migration {{(pid=67169) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11236}} [ 822.683633] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=67169) run_periodic_tasks 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 822.724169] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-368ce919-6530-42e6-9ef4-1d52f0d0483a tempest-AttachInterfacesUnderV243Test-137593631 tempest-AttachInterfacesUnderV243Test-137593631-project-member] [instance: 36781827-5846-49a4-8913-d98676af0b74] Preparing fetch location {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 822.724447] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-368ce919-6530-42e6-9ef4-1d52f0d0483a tempest-AttachInterfacesUnderV243Test-137593631 tempest-AttachInterfacesUnderV243Test-137593631-project-member] Creating directory with path [datastore2] vmware_temp/6ca0d245-f6dc-4906-8827-b28a0466ad1f/285931c9-8b83-4997-8c4d-6a79005e36ba {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 822.724688] env[67169]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-e5aac586-8bbe-4a71-bb7f-d333ee692f82 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 822.736415] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-368ce919-6530-42e6-9ef4-1d52f0d0483a tempest-AttachInterfacesUnderV243Test-137593631 tempest-AttachInterfacesUnderV243Test-137593631-project-member] Created directory with path [datastore2] vmware_temp/6ca0d245-f6dc-4906-8827-b28a0466ad1f/285931c9-8b83-4997-8c4d-6a79005e36ba {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 822.736603] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-368ce919-6530-42e6-9ef4-1d52f0d0483a tempest-AttachInterfacesUnderV243Test-137593631 tempest-AttachInterfacesUnderV243Test-137593631-project-member] [instance: 36781827-5846-49a4-8913-d98676af0b74] Fetch image to [datastore2] vmware_temp/6ca0d245-f6dc-4906-8827-b28a0466ad1f/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk {{(pid=67169) _fetch_image_if_missing 
/opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 822.736773] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-368ce919-6530-42e6-9ef4-1d52f0d0483a tempest-AttachInterfacesUnderV243Test-137593631 tempest-AttachInterfacesUnderV243Test-137593631-project-member] [instance: 36781827-5846-49a4-8913-d98676af0b74] Downloading image file data 285931c9-8b83-4997-8c4d-6a79005e36ba to [datastore2] vmware_temp/6ca0d245-f6dc-4906-8827-b28a0466ad1f/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk on the data store datastore2 {{(pid=67169) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 822.737563] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-037de43a-620d-41ba-91bd-4f5637ef326d {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 822.744268] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dfe44269-dba9-4b47-b96e-1734f93f623a {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 822.753459] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6ff10fd3-cb80-455f-9f80-4d0caf964d7b {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 822.787419] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2259fc48-3cd7-4258-b01d-ad37eb03566b {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 822.799167] env[67169]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-8698fda4-fbf4-4307-b7b4-4837aaca2a23 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 822.801108] 
env[67169]: DEBUG oslo_vmware.api [None req-9f30fd4a-9921-457d-a045-00b3f5f0612e tempest-ImagesNegativeTestJSON-7099264 tempest-ImagesNegativeTestJSON-7099264-project-member] Task: {'id': task-2819113, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.082973} completed successfully. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 822.801369] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-9f30fd4a-9921-457d-a045-00b3f5f0612e tempest-ImagesNegativeTestJSON-7099264 tempest-ImagesNegativeTestJSON-7099264-project-member] Deleted the datastore file {{(pid=67169) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 822.801560] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-9f30fd4a-9921-457d-a045-00b3f5f0612e tempest-ImagesNegativeTestJSON-7099264 tempest-ImagesNegativeTestJSON-7099264-project-member] [instance: f5558a78-c91f-4c36-bb22-f94b1bd8cdbc] Deleted contents of the VM from datastore datastore2 {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 822.801722] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-9f30fd4a-9921-457d-a045-00b3f5f0612e tempest-ImagesNegativeTestJSON-7099264 tempest-ImagesNegativeTestJSON-7099264-project-member] [instance: f5558a78-c91f-4c36-bb22-f94b1bd8cdbc] Instance destroyed {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 822.801888] env[67169]: INFO nova.compute.manager [None req-9f30fd4a-9921-457d-a045-00b3f5f0612e tempest-ImagesNegativeTestJSON-7099264 tempest-ImagesNegativeTestJSON-7099264-project-member] [instance: f5558a78-c91f-4c36-bb22-f94b1bd8cdbc] Took 0.61 seconds to destroy the instance on the hypervisor. 
[ 822.807016] env[67169]: DEBUG nova.compute.claims [None req-9f30fd4a-9921-457d-a045-00b3f5f0612e tempest-ImagesNegativeTestJSON-7099264 tempest-ImagesNegativeTestJSON-7099264-project-member] [instance: f5558a78-c91f-4c36-bb22-f94b1bd8cdbc] Aborting claim: {{(pid=67169) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 822.807016] env[67169]: DEBUG oslo_concurrency.lockutils [None req-9f30fd4a-9921-457d-a045-00b3f5f0612e tempest-ImagesNegativeTestJSON-7099264 tempest-ImagesNegativeTestJSON-7099264-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 822.807016] env[67169]: DEBUG oslo_concurrency.lockutils [None req-9f30fd4a-9921-457d-a045-00b3f5f0612e tempest-ImagesNegativeTestJSON-7099264 tempest-ImagesNegativeTestJSON-7099264-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 822.834042] env[67169]: DEBUG nova.virt.vmwareapi.images [None req-368ce919-6530-42e6-9ef4-1d52f0d0483a tempest-AttachInterfacesUnderV243Test-137593631 tempest-AttachInterfacesUnderV243Test-137593631-project-member] [instance: 36781827-5846-49a4-8913-d98676af0b74] Downloading image file data 285931c9-8b83-4997-8c4d-6a79005e36ba to the data store datastore2 {{(pid=67169) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 822.899439] env[67169]: DEBUG oslo_vmware.rw_handles [None req-368ce919-6530-42e6-9ef4-1d52f0d0483a tempest-AttachInterfacesUnderV243Test-137593631 tempest-AttachInterfacesUnderV243Test-137593631-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = 
https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/6ca0d245-f6dc-4906-8827-b28a0466ad1f/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=67169) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 822.961351] env[67169]: DEBUG oslo_vmware.rw_handles [None req-368ce919-6530-42e6-9ef4-1d52f0d0483a tempest-AttachInterfacesUnderV243Test-137593631 tempest-AttachInterfacesUnderV243Test-137593631-project-member] Completed reading data from the image iterator. {{(pid=67169) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 822.961535] env[67169]: DEBUG oslo_vmware.rw_handles [None req-368ce919-6530-42e6-9ef4-1d52f0d0483a tempest-AttachInterfacesUnderV243Test-137593631 tempest-AttachInterfacesUnderV243Test-137593631-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/6ca0d245-f6dc-4906-8827-b28a0466ad1f/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=67169) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 823.250173] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9991f400-4d86-48ad-a5fa-f22d6dea5c57 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 823.260076] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-17ce184a-7231-428d-b530-604743d466ea {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 823.291772] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-980a360f-c3b0-4d0a-86fe-462c76e61d6d {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 823.302348] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-80412bc4-c68c-478c-8bf5-177ee5ce645d {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 823.317910] env[67169]: DEBUG nova.compute.provider_tree [None req-9f30fd4a-9921-457d-a045-00b3f5f0612e tempest-ImagesNegativeTestJSON-7099264 tempest-ImagesNegativeTestJSON-7099264-project-member] Inventory has not changed in ProviderTree for provider: 6570906a-ac37-4859-b1f2-4bbacc48d3f3 {{(pid=67169) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 823.327916] env[67169]: DEBUG nova.scheduler.client.report [None req-9f30fd4a-9921-457d-a045-00b3f5f0612e tempest-ImagesNegativeTestJSON-7099264 tempest-ImagesNegativeTestJSON-7099264-project-member] Inventory has not changed for provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': 
{'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 171, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67169) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 823.343104] env[67169]: DEBUG oslo_concurrency.lockutils [None req-9f30fd4a-9921-457d-a045-00b3f5f0612e tempest-ImagesNegativeTestJSON-7099264 tempest-ImagesNegativeTestJSON-7099264-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.538s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 823.343769] env[67169]: ERROR nova.compute.manager [None req-9f30fd4a-9921-457d-a045-00b3f5f0612e tempest-ImagesNegativeTestJSON-7099264 tempest-ImagesNegativeTestJSON-7099264-project-member] [instance: f5558a78-c91f-4c36-bb22-f94b1bd8cdbc] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 823.343769] env[67169]: Faults: ['InvalidArgument'] [ 823.343769] env[67169]: ERROR nova.compute.manager [instance: f5558a78-c91f-4c36-bb22-f94b1bd8cdbc] Traceback (most recent call last): [ 823.343769] env[67169]: ERROR nova.compute.manager [instance: f5558a78-c91f-4c36-bb22-f94b1bd8cdbc] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 823.343769] env[67169]: ERROR nova.compute.manager [instance: f5558a78-c91f-4c36-bb22-f94b1bd8cdbc] self.driver.spawn(context, instance, image_meta, [ 823.343769] env[67169]: ERROR nova.compute.manager [instance: f5558a78-c91f-4c36-bb22-f94b1bd8cdbc] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 823.343769] env[67169]: ERROR nova.compute.manager [instance: f5558a78-c91f-4c36-bb22-f94b1bd8cdbc] self._vmops.spawn(context, instance, image_meta, injected_files, [ 
823.343769] env[67169]: ERROR nova.compute.manager [instance: f5558a78-c91f-4c36-bb22-f94b1bd8cdbc] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 823.343769] env[67169]: ERROR nova.compute.manager [instance: f5558a78-c91f-4c36-bb22-f94b1bd8cdbc] self._fetch_image_if_missing(context, vi) [ 823.343769] env[67169]: ERROR nova.compute.manager [instance: f5558a78-c91f-4c36-bb22-f94b1bd8cdbc] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 823.343769] env[67169]: ERROR nova.compute.manager [instance: f5558a78-c91f-4c36-bb22-f94b1bd8cdbc] image_cache(vi, tmp_image_ds_loc) [ 823.343769] env[67169]: ERROR nova.compute.manager [instance: f5558a78-c91f-4c36-bb22-f94b1bd8cdbc] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 823.343769] env[67169]: ERROR nova.compute.manager [instance: f5558a78-c91f-4c36-bb22-f94b1bd8cdbc] vm_util.copy_virtual_disk( [ 823.343769] env[67169]: ERROR nova.compute.manager [instance: f5558a78-c91f-4c36-bb22-f94b1bd8cdbc] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 823.343769] env[67169]: ERROR nova.compute.manager [instance: f5558a78-c91f-4c36-bb22-f94b1bd8cdbc] session._wait_for_task(vmdk_copy_task) [ 823.343769] env[67169]: ERROR nova.compute.manager [instance: f5558a78-c91f-4c36-bb22-f94b1bd8cdbc] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 823.343769] env[67169]: ERROR nova.compute.manager [instance: f5558a78-c91f-4c36-bb22-f94b1bd8cdbc] return self.wait_for_task(task_ref) [ 823.343769] env[67169]: ERROR nova.compute.manager [instance: f5558a78-c91f-4c36-bb22-f94b1bd8cdbc] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 823.343769] env[67169]: ERROR nova.compute.manager [instance: f5558a78-c91f-4c36-bb22-f94b1bd8cdbc] return evt.wait() [ 823.343769] env[67169]: ERROR nova.compute.manager 
[instance: f5558a78-c91f-4c36-bb22-f94b1bd8cdbc] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 823.343769] env[67169]: ERROR nova.compute.manager [instance: f5558a78-c91f-4c36-bb22-f94b1bd8cdbc] result = hub.switch() [ 823.343769] env[67169]: ERROR nova.compute.manager [instance: f5558a78-c91f-4c36-bb22-f94b1bd8cdbc] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 823.343769] env[67169]: ERROR nova.compute.manager [instance: f5558a78-c91f-4c36-bb22-f94b1bd8cdbc] return self.greenlet.switch() [ 823.343769] env[67169]: ERROR nova.compute.manager [instance: f5558a78-c91f-4c36-bb22-f94b1bd8cdbc] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 823.343769] env[67169]: ERROR nova.compute.manager [instance: f5558a78-c91f-4c36-bb22-f94b1bd8cdbc] self.f(*self.args, **self.kw) [ 823.343769] env[67169]: ERROR nova.compute.manager [instance: f5558a78-c91f-4c36-bb22-f94b1bd8cdbc] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 823.343769] env[67169]: ERROR nova.compute.manager [instance: f5558a78-c91f-4c36-bb22-f94b1bd8cdbc] raise exceptions.translate_fault(task_info.error) [ 823.343769] env[67169]: ERROR nova.compute.manager [instance: f5558a78-c91f-4c36-bb22-f94b1bd8cdbc] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 823.343769] env[67169]: ERROR nova.compute.manager [instance: f5558a78-c91f-4c36-bb22-f94b1bd8cdbc] Faults: ['InvalidArgument'] [ 823.343769] env[67169]: ERROR nova.compute.manager [instance: f5558a78-c91f-4c36-bb22-f94b1bd8cdbc] [ 823.344770] env[67169]: DEBUG nova.compute.utils [None req-9f30fd4a-9921-457d-a045-00b3f5f0612e tempest-ImagesNegativeTestJSON-7099264 tempest-ImagesNegativeTestJSON-7099264-project-member] [instance: f5558a78-c91f-4c36-bb22-f94b1bd8cdbc] VimFaultException {{(pid=67169) 
notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 823.345994] env[67169]: DEBUG nova.compute.manager [None req-9f30fd4a-9921-457d-a045-00b3f5f0612e tempest-ImagesNegativeTestJSON-7099264 tempest-ImagesNegativeTestJSON-7099264-project-member] [instance: f5558a78-c91f-4c36-bb22-f94b1bd8cdbc] Build of instance f5558a78-c91f-4c36-bb22-f94b1bd8cdbc was re-scheduled: A specified parameter was not correct: fileType [ 823.345994] env[67169]: Faults: ['InvalidArgument'] {{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 823.346389] env[67169]: DEBUG nova.compute.manager [None req-9f30fd4a-9921-457d-a045-00b3f5f0612e tempest-ImagesNegativeTestJSON-7099264 tempest-ImagesNegativeTestJSON-7099264-project-member] [instance: f5558a78-c91f-4c36-bb22-f94b1bd8cdbc] Unplugging VIFs for instance {{(pid=67169) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 823.346564] env[67169]: DEBUG nova.compute.manager [None req-9f30fd4a-9921-457d-a045-00b3f5f0612e tempest-ImagesNegativeTestJSON-7099264 tempest-ImagesNegativeTestJSON-7099264-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67169) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 823.346743] env[67169]: DEBUG nova.compute.manager [None req-9f30fd4a-9921-457d-a045-00b3f5f0612e tempest-ImagesNegativeTestJSON-7099264 tempest-ImagesNegativeTestJSON-7099264-project-member] [instance: f5558a78-c91f-4c36-bb22-f94b1bd8cdbc] Deallocating network for instance {{(pid=67169) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 823.346939] env[67169]: DEBUG nova.network.neutron [None req-9f30fd4a-9921-457d-a045-00b3f5f0612e tempest-ImagesNegativeTestJSON-7099264 tempest-ImagesNegativeTestJSON-7099264-project-member] [instance: f5558a78-c91f-4c36-bb22-f94b1bd8cdbc] deallocate_for_instance() {{(pid=67169) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 823.690180] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 823.778477] env[67169]: DEBUG nova.network.neutron [None req-9f30fd4a-9921-457d-a045-00b3f5f0612e tempest-ImagesNegativeTestJSON-7099264 tempest-ImagesNegativeTestJSON-7099264-project-member] [instance: f5558a78-c91f-4c36-bb22-f94b1bd8cdbc] Updating instance_info_cache with network_info: [] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 823.790784] env[67169]: INFO nova.compute.manager [None req-9f30fd4a-9921-457d-a045-00b3f5f0612e tempest-ImagesNegativeTestJSON-7099264 tempest-ImagesNegativeTestJSON-7099264-project-member] [instance: f5558a78-c91f-4c36-bb22-f94b1bd8cdbc] Took 0.44 seconds to deallocate network for instance. 
[ 823.904068] env[67169]: INFO nova.scheduler.client.report [None req-9f30fd4a-9921-457d-a045-00b3f5f0612e tempest-ImagesNegativeTestJSON-7099264 tempest-ImagesNegativeTestJSON-7099264-project-member] Deleted allocations for instance f5558a78-c91f-4c36-bb22-f94b1bd8cdbc [ 823.925193] env[67169]: DEBUG oslo_concurrency.lockutils [None req-9f30fd4a-9921-457d-a045-00b3f5f0612e tempest-ImagesNegativeTestJSON-7099264 tempest-ImagesNegativeTestJSON-7099264-project-member] Lock "f5558a78-c91f-4c36-bb22-f94b1bd8cdbc" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 247.284s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 823.925517] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "f5558a78-c91f-4c36-bb22-f94b1bd8cdbc" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 238.240s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 823.925699] env[67169]: INFO nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: f5558a78-c91f-4c36-bb22-f94b1bd8cdbc] During sync_power_state the instance has a pending task (spawning). Skip. 
[ 823.926173] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "f5558a78-c91f-4c36-bb22-f94b1bd8cdbc" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 823.926816] env[67169]: DEBUG oslo_concurrency.lockutils [None req-d29e94ab-34c3-4daf-9971-4e81f5ddb110 tempest-ImagesNegativeTestJSON-7099264 tempest-ImagesNegativeTestJSON-7099264-project-member] Lock "f5558a78-c91f-4c36-bb22-f94b1bd8cdbc" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 45.293s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 823.927044] env[67169]: DEBUG oslo_concurrency.lockutils [None req-d29e94ab-34c3-4daf-9971-4e81f5ddb110 tempest-ImagesNegativeTestJSON-7099264 tempest-ImagesNegativeTestJSON-7099264-project-member] Acquiring lock "f5558a78-c91f-4c36-bb22-f94b1bd8cdbc-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 823.927252] env[67169]: DEBUG oslo_concurrency.lockutils [None req-d29e94ab-34c3-4daf-9971-4e81f5ddb110 tempest-ImagesNegativeTestJSON-7099264 tempest-ImagesNegativeTestJSON-7099264-project-member] Lock "f5558a78-c91f-4c36-bb22-f94b1bd8cdbc-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 823.927638] env[67169]: DEBUG oslo_concurrency.lockutils [None req-d29e94ab-34c3-4daf-9971-4e81f5ddb110 tempest-ImagesNegativeTestJSON-7099264 tempest-ImagesNegativeTestJSON-7099264-project-member] Lock 
"f5558a78-c91f-4c36-bb22-f94b1bd8cdbc-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 823.930030] env[67169]: INFO nova.compute.manager [None req-d29e94ab-34c3-4daf-9971-4e81f5ddb110 tempest-ImagesNegativeTestJSON-7099264 tempest-ImagesNegativeTestJSON-7099264-project-member] [instance: f5558a78-c91f-4c36-bb22-f94b1bd8cdbc] Terminating instance [ 823.931548] env[67169]: DEBUG nova.compute.manager [None req-d29e94ab-34c3-4daf-9971-4e81f5ddb110 tempest-ImagesNegativeTestJSON-7099264 tempest-ImagesNegativeTestJSON-7099264-project-member] [instance: f5558a78-c91f-4c36-bb22-f94b1bd8cdbc] Start destroying the instance on the hypervisor. {{(pid=67169) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 823.935013] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-d29e94ab-34c3-4daf-9971-4e81f5ddb110 tempest-ImagesNegativeTestJSON-7099264 tempest-ImagesNegativeTestJSON-7099264-project-member] [instance: f5558a78-c91f-4c36-bb22-f94b1bd8cdbc] Destroying instance {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 823.935013] env[67169]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-f28bdc8e-2f6f-4774-b427-6f4771c16e3c {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 823.943466] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8678c24d-8d12-4577-b043-8b67fdd498df {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 823.952538] env[67169]: DEBUG nova.compute.manager [None req-421b8c1a-27a4-4ab8-86b0-47b05f78ea77 tempest-ServersAdminTestJSON-608063042 tempest-ServersAdminTestJSON-608063042-project-member] [instance: 
3ac7cc70-7667-43b0-a3b8-0c791ef7ccd2] Starting instance... {{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 823.973647] env[67169]: WARNING nova.virt.vmwareapi.vmops [None req-d29e94ab-34c3-4daf-9971-4e81f5ddb110 tempest-ImagesNegativeTestJSON-7099264 tempest-ImagesNegativeTestJSON-7099264-project-member] [instance: f5558a78-c91f-4c36-bb22-f94b1bd8cdbc] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance f5558a78-c91f-4c36-bb22-f94b1bd8cdbc could not be found. [ 823.973854] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-d29e94ab-34c3-4daf-9971-4e81f5ddb110 tempest-ImagesNegativeTestJSON-7099264 tempest-ImagesNegativeTestJSON-7099264-project-member] [instance: f5558a78-c91f-4c36-bb22-f94b1bd8cdbc] Instance destroyed {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 823.974045] env[67169]: INFO nova.compute.manager [None req-d29e94ab-34c3-4daf-9971-4e81f5ddb110 tempest-ImagesNegativeTestJSON-7099264 tempest-ImagesNegativeTestJSON-7099264-project-member] [instance: f5558a78-c91f-4c36-bb22-f94b1bd8cdbc] Took 0.04 seconds to destroy the instance on the hypervisor. [ 823.974293] env[67169]: DEBUG oslo.service.loopingcall [None req-d29e94ab-34c3-4daf-9971-4e81f5ddb110 tempest-ImagesNegativeTestJSON-7099264 tempest-ImagesNegativeTestJSON-7099264-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=67169) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 823.974517] env[67169]: DEBUG nova.compute.manager [-] [instance: f5558a78-c91f-4c36-bb22-f94b1bd8cdbc] Deallocating network for instance {{(pid=67169) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 823.974613] env[67169]: DEBUG nova.network.neutron [-] [instance: f5558a78-c91f-4c36-bb22-f94b1bd8cdbc] deallocate_for_instance() {{(pid=67169) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 823.981204] env[67169]: DEBUG nova.compute.manager [None req-421b8c1a-27a4-4ab8-86b0-47b05f78ea77 tempest-ServersAdminTestJSON-608063042 tempest-ServersAdminTestJSON-608063042-project-member] [instance: 3ac7cc70-7667-43b0-a3b8-0c791ef7ccd2] Instance disappeared before build. {{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 824.001571] env[67169]: DEBUG oslo_concurrency.lockutils [None req-421b8c1a-27a4-4ab8-86b0-47b05f78ea77 tempest-ServersAdminTestJSON-608063042 tempest-ServersAdminTestJSON-608063042-project-member] Lock "3ac7cc70-7667-43b0-a3b8-0c791ef7ccd2" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 223.651s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 824.005918] env[67169]: DEBUG nova.network.neutron [-] [instance: f5558a78-c91f-4c36-bb22-f94b1bd8cdbc] Updating instance_info_cache with network_info: [] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 824.011186] env[67169]: DEBUG nova.compute.manager [None req-b97c6bbc-2563-48b6-bf6d-a435a705c13f tempest-ServerActionsTestJSON-1763696561 tempest-ServerActionsTestJSON-1763696561-project-member] [instance: 485cf92f-cf20-4f94-8a18-1a82501a829f] Starting instance... 
{{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 824.014770] env[67169]: INFO nova.compute.manager [-] [instance: f5558a78-c91f-4c36-bb22-f94b1bd8cdbc] Took 0.04 seconds to deallocate network for instance. [ 824.038386] env[67169]: DEBUG nova.compute.manager [None req-b97c6bbc-2563-48b6-bf6d-a435a705c13f tempest-ServerActionsTestJSON-1763696561 tempest-ServerActionsTestJSON-1763696561-project-member] [instance: 485cf92f-cf20-4f94-8a18-1a82501a829f] Instance disappeared before build. {{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 824.062081] env[67169]: DEBUG oslo_concurrency.lockutils [None req-b97c6bbc-2563-48b6-bf6d-a435a705c13f tempest-ServerActionsTestJSON-1763696561 tempest-ServerActionsTestJSON-1763696561-project-member] Lock "485cf92f-cf20-4f94-8a18-1a82501a829f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 223.130s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 824.074273] env[67169]: DEBUG nova.compute.manager [None req-5838ee6b-fce2-451c-88ae-bf4e0492c429 tempest-InstanceActionsV221TestJSON-96340564 tempest-InstanceActionsV221TestJSON-96340564-project-member] [instance: a6427d1b-e915-4e3a-a4dd-6758fde2bc56] Starting instance... {{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 824.097434] env[67169]: DEBUG nova.compute.manager [None req-5838ee6b-fce2-451c-88ae-bf4e0492c429 tempest-InstanceActionsV221TestJSON-96340564 tempest-InstanceActionsV221TestJSON-96340564-project-member] [instance: a6427d1b-e915-4e3a-a4dd-6758fde2bc56] Instance disappeared before build. 
{{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 824.155266] env[67169]: DEBUG oslo_concurrency.lockutils [None req-d29e94ab-34c3-4daf-9971-4e81f5ddb110 tempest-ImagesNegativeTestJSON-7099264 tempest-ImagesNegativeTestJSON-7099264-project-member] Lock "f5558a78-c91f-4c36-bb22-f94b1bd8cdbc" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.228s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 824.158726] env[67169]: DEBUG oslo_concurrency.lockutils [None req-5838ee6b-fce2-451c-88ae-bf4e0492c429 tempest-InstanceActionsV221TestJSON-96340564 tempest-InstanceActionsV221TestJSON-96340564-project-member] Lock "a6427d1b-e915-4e3a-a4dd-6758fde2bc56" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 221.030s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 824.168158] env[67169]: DEBUG nova.compute.manager [None req-ca250af0-c6e9-4d98-90c3-c0b84601a3ce tempest-ServerActionsTestOtherA-909278241 tempest-ServerActionsTestOtherA-909278241-project-member] [instance: bee054d2-ac3c-47cc-a946-90bebf23f925] Starting instance... {{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 824.191445] env[67169]: DEBUG nova.compute.manager [None req-ca250af0-c6e9-4d98-90c3-c0b84601a3ce tempest-ServerActionsTestOtherA-909278241 tempest-ServerActionsTestOtherA-909278241-project-member] [instance: bee054d2-ac3c-47cc-a946-90bebf23f925] Instance disappeared before build. 
{{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 824.210814] env[67169]: DEBUG oslo_concurrency.lockutils [None req-ca250af0-c6e9-4d98-90c3-c0b84601a3ce tempest-ServerActionsTestOtherA-909278241 tempest-ServerActionsTestOtherA-909278241-project-member] Lock "bee054d2-ac3c-47cc-a946-90bebf23f925" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 218.708s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 824.220836] env[67169]: DEBUG nova.compute.manager [None req-2f6d228f-1abd-4d97-91fe-a0b7fbfa7070 tempest-ServerDiagnosticsTest-1296817303 tempest-ServerDiagnosticsTest-1296817303-project-member] [instance: e7195659-b834-49e4-a9bd-6b2b7c7d4a20] Starting instance... {{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 824.243023] env[67169]: DEBUG nova.compute.manager [None req-2f6d228f-1abd-4d97-91fe-a0b7fbfa7070 tempest-ServerDiagnosticsTest-1296817303 tempest-ServerDiagnosticsTest-1296817303-project-member] [instance: e7195659-b834-49e4-a9bd-6b2b7c7d4a20] Instance disappeared before build. 
{{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 824.263779] env[67169]: DEBUG oslo_concurrency.lockutils [None req-2f6d228f-1abd-4d97-91fe-a0b7fbfa7070 tempest-ServerDiagnosticsTest-1296817303 tempest-ServerDiagnosticsTest-1296817303-project-member] Lock "e7195659-b834-49e4-a9bd-6b2b7c7d4a20" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 217.817s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 824.272116] env[67169]: DEBUG nova.compute.manager [None req-b1b25f70-abf0-437d-a7db-a2d66ddc73cb tempest-ServerShowV247Test-1484429168 tempest-ServerShowV247Test-1484429168-project-member] [instance: f001b21d-59fb-4a9e-9c28-7c15892facfa] Starting instance... {{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 824.297106] env[67169]: DEBUG nova.compute.manager [None req-b1b25f70-abf0-437d-a7db-a2d66ddc73cb tempest-ServerShowV247Test-1484429168 tempest-ServerShowV247Test-1484429168-project-member] [instance: f001b21d-59fb-4a9e-9c28-7c15892facfa] Instance disappeared before build. 
{{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 824.320294] env[67169]: DEBUG oslo_concurrency.lockutils [None req-b1b25f70-abf0-437d-a7db-a2d66ddc73cb tempest-ServerShowV247Test-1484429168 tempest-ServerShowV247Test-1484429168-project-member] Lock "f001b21d-59fb-4a9e-9c28-7c15892facfa" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 217.558s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 824.333787] env[67169]: DEBUG nova.compute.manager [None req-18fa1f7c-17e9-4b29-85ee-a4ba563bb86f tempest-ServerDiagnosticsNegativeTest-1101645295 tempest-ServerDiagnosticsNegativeTest-1101645295-project-member] [instance: a7165020-4b1d-44e5-83d9-53eafbef74e7] Starting instance... {{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 824.358936] env[67169]: DEBUG nova.compute.manager [None req-18fa1f7c-17e9-4b29-85ee-a4ba563bb86f tempest-ServerDiagnosticsNegativeTest-1101645295 tempest-ServerDiagnosticsNegativeTest-1101645295-project-member] [instance: a7165020-4b1d-44e5-83d9-53eafbef74e7] Instance disappeared before build. 
{{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 824.384718] env[67169]: DEBUG oslo_concurrency.lockutils [None req-18fa1f7c-17e9-4b29-85ee-a4ba563bb86f tempest-ServerDiagnosticsNegativeTest-1101645295 tempest-ServerDiagnosticsNegativeTest-1101645295-project-member] Lock "a7165020-4b1d-44e5-83d9-53eafbef74e7" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 217.002s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 824.396100] env[67169]: DEBUG nova.compute.manager [None req-6086110c-8031-47b1-b002-03e1428679d4 tempest-ServerShowV247Test-1484429168 tempest-ServerShowV247Test-1484429168-project-member] [instance: 45ce5bfe-4a85-4c26-914e-b85478fc45a4] Starting instance... {{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 824.419788] env[67169]: DEBUG nova.compute.manager [None req-6086110c-8031-47b1-b002-03e1428679d4 tempest-ServerShowV247Test-1484429168 tempest-ServerShowV247Test-1484429168-project-member] [instance: 45ce5bfe-4a85-4c26-914e-b85478fc45a4] Instance disappeared before build. 
{{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 824.446693] env[67169]: DEBUG oslo_concurrency.lockutils [None req-6086110c-8031-47b1-b002-03e1428679d4 tempest-ServerShowV247Test-1484429168 tempest-ServerShowV247Test-1484429168-project-member] Lock "45ce5bfe-4a85-4c26-914e-b85478fc45a4" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 215.990s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 824.458033] env[67169]: DEBUG nova.compute.manager [None req-9f3f0ffd-b8a1-48f6-873e-a0c5292157ed tempest-AttachVolumeShelveTestJSON-1479191573 tempest-AttachVolumeShelveTestJSON-1479191573-project-member] [instance: 529ac98d-1e5c-4bcd-bb3d-7a7158e952cb] Starting instance... {{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 824.482439] env[67169]: DEBUG nova.compute.manager [None req-9f3f0ffd-b8a1-48f6-873e-a0c5292157ed tempest-AttachVolumeShelveTestJSON-1479191573 tempest-AttachVolumeShelveTestJSON-1479191573-project-member] [instance: 529ac98d-1e5c-4bcd-bb3d-7a7158e952cb] Instance disappeared before build. 
{{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 824.502718] env[67169]: DEBUG oslo_concurrency.lockutils [None req-9f3f0ffd-b8a1-48f6-873e-a0c5292157ed tempest-AttachVolumeShelveTestJSON-1479191573 tempest-AttachVolumeShelveTestJSON-1479191573-project-member] Lock "529ac98d-1e5c-4bcd-bb3d-7a7158e952cb" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 208.223s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 824.511688] env[67169]: DEBUG nova.compute.manager [None req-4c32cf04-40f3-4f28-9587-80c7302bd097 tempest-ImagesOneServerTestJSON-1174559401 tempest-ImagesOneServerTestJSON-1174559401-project-member] [instance: 774358a1-c887-497e-b2d8-59a7c10e2329] Starting instance... {{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 824.534704] env[67169]: DEBUG nova.compute.manager [None req-4c32cf04-40f3-4f28-9587-80c7302bd097 tempest-ImagesOneServerTestJSON-1174559401 tempest-ImagesOneServerTestJSON-1174559401-project-member] [instance: 774358a1-c887-497e-b2d8-59a7c10e2329] Instance disappeared before build. 
{{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 824.555108] env[67169]: DEBUG oslo_concurrency.lockutils [None req-4c32cf04-40f3-4f28-9587-80c7302bd097 tempest-ImagesOneServerTestJSON-1174559401 tempest-ImagesOneServerTestJSON-1174559401-project-member] Lock "774358a1-c887-497e-b2d8-59a7c10e2329" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 206.534s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 824.565593] env[67169]: DEBUG nova.compute.manager [None req-bb146bb8-6c16-497f-804c-ba73c7d144b7 tempest-ListImageFiltersTestJSON-1430329864 tempest-ListImageFiltersTestJSON-1430329864-project-member] [instance: a7434fa2-563e-4f77-ba3e-40ad6bab0de3] Starting instance... {{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 824.593250] env[67169]: DEBUG nova.compute.manager [None req-bb146bb8-6c16-497f-804c-ba73c7d144b7 tempest-ListImageFiltersTestJSON-1430329864 tempest-ListImageFiltersTestJSON-1430329864-project-member] [instance: a7434fa2-563e-4f77-ba3e-40ad6bab0de3] Instance disappeared before build. 
{{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 824.613801] env[67169]: DEBUG oslo_concurrency.lockutils [None req-bb146bb8-6c16-497f-804c-ba73c7d144b7 tempest-ListImageFiltersTestJSON-1430329864 tempest-ListImageFiltersTestJSON-1430329864-project-member] Lock "a7434fa2-563e-4f77-ba3e-40ad6bab0de3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 205.199s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 824.627241] env[67169]: DEBUG nova.compute.manager [None req-e69cba99-3a03-4f76-b886-d9e0ce370610 tempest-ListImageFiltersTestJSON-1430329864 tempest-ListImageFiltersTestJSON-1430329864-project-member] [instance: fd4c9c56-1608-4390-8b41-736b3aa590ed] Starting instance... {{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 824.650855] env[67169]: DEBUG nova.compute.manager [None req-e69cba99-3a03-4f76-b886-d9e0ce370610 tempest-ListImageFiltersTestJSON-1430329864 tempest-ListImageFiltersTestJSON-1430329864-project-member] [instance: fd4c9c56-1608-4390-8b41-736b3aa590ed] Instance disappeared before build. 
{{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 824.659206] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 824.668939] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 824.669172] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 824.669338] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 824.669490] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67169) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 824.670583] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-20966f30-6bf4-4054-8a59-ade1dd9c663e {{(pid=67169) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 824.674332] env[67169]: DEBUG oslo_concurrency.lockutils [None req-e69cba99-3a03-4f76-b886-d9e0ce370610 tempest-ListImageFiltersTestJSON-1430329864 tempest-ListImageFiltersTestJSON-1430329864-project-member] Lock "fd4c9c56-1608-4390-8b41-736b3aa590ed" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 203.379s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 824.681760] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2a0ce1dd-bbeb-49a8-9ef8-26dc24261bc2 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 824.686843] env[67169]: DEBUG nova.compute.manager [None req-ec8ff093-ec56-4835-a968-0cce4238d83b tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b] Starting instance... 
{{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 824.697478] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d084fce4-cc36-4e20-8c83-83c2e6edf36e {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 824.704570] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0e1b4e8f-640e-41a9-83d2-4ccf655f129e {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 824.735060] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181046MB free_disk=171GB free_vcpus=48 pci_devices=None {{(pid=67169) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 824.735198] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 824.735386] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 824.762515] env[67169]: DEBUG oslo_concurrency.lockutils [None req-ec8ff093-ec56-4835-a968-0cce4238d83b tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67169) 
inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 824.806061] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 11e90c91-26ca-4397-81a4-975a1d714d19 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 824.806236] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 36781827-5846-49a4-8913-d98676af0b74 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 824.806363] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 835bf8da-8d8f-4dfd-b0a9-fab02796f39e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 824.806486] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 85978a3b-052a-4a05-84e6-75c723d49bd8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 824.806603] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance e2e52693-153a-43dd-b786-dd0758caabe2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 824.806719] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 824.806873] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 28552f70-695d-40cc-8dfa-bf40d6113220 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 824.807029] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 7a42aeb9-0518-448d-a3a6-8e68d6497922 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 824.807153] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 43b73a7c-eda8-4239-885f-d4fb8fa6f28a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 824.822563] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 824.835662] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 6b21ae30-8734-4c38-a8ae-c3fe03b6c36a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 824.847373] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance a578f813-807b-46bc-987f-5c9e9368c04b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 824.859430] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance f384f4a3-d34d-4d45-b063-79b25ea3c66c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 824.869064] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 0aa23be1-af16-4c0b-bfd0-4db5e927cfc4 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 824.879295] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance a58a8711-b060-4036-ac43-897017f68d21 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 824.889647] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 79d35dc5-e515-4f6f-9160-534d84f534bd has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 824.903545] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance db96dc14-cb08-4302-8aa3-81cf0c47fc73 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 824.913655] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 04d7477c-c3ff-42c7-9107-f54327c2f4b2 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 824.924268] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 1fe4f2aa-0784-4356-aa4c-593666f22971 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 824.936587] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 47ffcce9-3afc-41be-b38e-dacfeb535a2c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 824.947680] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 1f0f1960-0c77-4e72-86ee-807819e75d2a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 824.960730] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance ceec0dd3-097b-4ab4-8e16-420d40bbe3d5 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 824.971254] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 41ebddaf-e07d-4925-b9da-758b8e83f545 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 824.981849] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 3c5bc03a-acc3-4601-8155-2cab101be865 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 824.991911] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 3ea7c620-5903-410d-8ca0-68789a5e5194 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 825.004550] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance d0a7fdac-3e41-4539-bef4-0442bc5ad674 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 825.015528] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 2aa47f31-4da8-4ef6-b28c-fe2a03bf8906 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 825.015786] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Total usable vcpus: 48, total allocated vcpus: 9 {{(pid=67169) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 825.015999] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1664MB phys_disk=200GB used_disk=9GB total_vcpus=48 used_vcpus=9 pci_stats=[] {{(pid=67169) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 825.385490] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-af530e4f-eb2b-4cd0-98bf-50045f88cbdd {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 825.391414] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-59b917d1-ce3d-41c4-9aa8-444943724add {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 825.421216] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-48bb8871-319f-4bec-99dd-af6444e8c016 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 825.428716] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b7d93e96-7ae9-42eb-8ccf-c193aef72ae3 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 825.442641] env[67169]: DEBUG 
nova.compute.provider_tree [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Inventory has not changed in ProviderTree for provider: 6570906a-ac37-4859-b1f2-4bbacc48d3f3 {{(pid=67169) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 825.452416] env[67169]: DEBUG nova.scheduler.client.report [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Inventory has not changed for provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 171, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67169) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 825.470206] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67169) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 825.470400] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.735s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 825.470661] env[67169]: DEBUG oslo_concurrency.lockutils [None req-ec8ff093-ec56-4835-a968-0cce4238d83b tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.708s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 
825.472270] env[67169]: INFO nova.compute.claims [None req-ec8ff093-ec56-4835-a968-0cce4238d83b tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 825.904141] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d3a39d82-04e1-42f9-955f-1b06b6255756 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 825.912954] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-17b2a783-7f2c-4a80-93bf-08610225944b {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 825.948706] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2353aa45-6861-4ff6-8cf1-cc1dfbf801a7 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 825.958944] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e711a678-d709-4539-86ab-c1ebca3cfecc {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 825.972604] env[67169]: DEBUG nova.compute.provider_tree [None req-ec8ff093-ec56-4835-a968-0cce4238d83b tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Inventory has not changed in ProviderTree for provider: 6570906a-ac37-4859-b1f2-4bbacc48d3f3 {{(pid=67169) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 825.980847] env[67169]: DEBUG nova.scheduler.client.report [None req-ec8ff093-ec56-4835-a968-0cce4238d83b tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Inventory has not changed for provider 
6570906a-ac37-4859-b1f2-4bbacc48d3f3 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 171, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67169) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 826.000641] env[67169]: DEBUG oslo_concurrency.lockutils [None req-ec8ff093-ec56-4835-a968-0cce4238d83b tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.530s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 826.001510] env[67169]: DEBUG nova.compute.manager [None req-ec8ff093-ec56-4835-a968-0cce4238d83b tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b] Start building networks asynchronously for instance. {{(pid=67169) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 826.048834] env[67169]: DEBUG nova.compute.utils [None req-ec8ff093-ec56-4835-a968-0cce4238d83b tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Using /dev/sd instead of None {{(pid=67169) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 826.054342] env[67169]: DEBUG nova.compute.manager [None req-ec8ff093-ec56-4835-a968-0cce4238d83b tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b] Allocating IP information in the background. 
{{(pid=67169) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 826.054542] env[67169]: DEBUG nova.network.neutron [None req-ec8ff093-ec56-4835-a968-0cce4238d83b tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b] allocate_for_instance() {{(pid=67169) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 826.060822] env[67169]: DEBUG nova.compute.manager [None req-ec8ff093-ec56-4835-a968-0cce4238d83b tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b] Start building block device mappings for instance. {{(pid=67169) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 826.133023] env[67169]: DEBUG nova.compute.manager [None req-ec8ff093-ec56-4835-a968-0cce4238d83b tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b] Start spawning the instance on the hypervisor. 
{{(pid=67169) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 826.135139] env[67169]: DEBUG nova.policy [None req-ec8ff093-ec56-4835-a968-0cce4238d83b tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4d789ec14c2b4d62be952753fb47f0f7', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '00d358bc61014b5cb3ddcdab7785e7e8', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67169) authorize /opt/stack/nova/nova/policy.py:203}} [ 826.157571] env[67169]: DEBUG nova.virt.hardware [None req-ec8ff093-ec56-4835-a968-0cce4238d83b tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-03-06T19:55:10Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-03-06T19:54:55Z,direct_url=,disk_format='vmdk',id=285931c9-8b83-4997-8c4d-6a79005e36ba,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c31f6504bb73492890b262ff43fdf9bc',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-03-06T19:54:55Z,virtual_size=,visibility=), allow threads: False {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 826.157571] env[67169]: DEBUG nova.virt.hardware [None req-ec8ff093-ec56-4835-a968-0cce4238d83b 
tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Flavor limits 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 826.157571] env[67169]: DEBUG nova.virt.hardware [None req-ec8ff093-ec56-4835-a968-0cce4238d83b tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Image limits 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 826.157571] env[67169]: DEBUG nova.virt.hardware [None req-ec8ff093-ec56-4835-a968-0cce4238d83b tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Flavor pref 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 826.157571] env[67169]: DEBUG nova.virt.hardware [None req-ec8ff093-ec56-4835-a968-0cce4238d83b tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Image pref 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 826.157571] env[67169]: DEBUG nova.virt.hardware [None req-ec8ff093-ec56-4835-a968-0cce4238d83b tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 826.157571] env[67169]: DEBUG nova.virt.hardware [None req-ec8ff093-ec56-4835-a968-0cce4238d83b tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 826.157571] env[67169]: DEBUG nova.virt.hardware [None req-ec8ff093-ec56-4835-a968-0cce4238d83b tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Build 
topologies for 1 vcpu(s) 1:1:1 {{(pid=67169) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 826.157983] env[67169]: DEBUG nova.virt.hardware [None req-ec8ff093-ec56-4835-a968-0cce4238d83b tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Got 1 possible topologies {{(pid=67169) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 826.157983] env[67169]: DEBUG nova.virt.hardware [None req-ec8ff093-ec56-4835-a968-0cce4238d83b tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 826.157983] env[67169]: DEBUG nova.virt.hardware [None req-ec8ff093-ec56-4835-a968-0cce4238d83b tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 826.158844] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0b8c9df1-08b3-4fd5-841b-99ff43a0a7f2 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 826.172021] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-042c2809-099c-4e32-a39c-6b1c9ed95a0e {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 826.472138] env[67169]: DEBUG oslo_concurrency.lockutils [None req-070646a8-5a1a-445f-82cd-3048606b4760 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Acquiring lock "1e43c263-c527-4349-8e9c-3f4a3ffc9d8b" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67169) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 826.474624] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 826.474862] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Starting heal instance info cache {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 826.475379] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Rebuilding the list of instances to heal {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 826.503106] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 11e90c91-26ca-4397-81a4-975a1d714d19] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 826.503418] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 36781827-5846-49a4-8913-d98676af0b74] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 826.503539] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 835bf8da-8d8f-4dfd-b0a9-fab02796f39e] Skipping network cache update for instance because it is Building. 
{{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 826.503753] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 85978a3b-052a-4a05-84e6-75c723d49bd8] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 826.503912] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: e2e52693-153a-43dd-b786-dd0758caabe2] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 826.504100] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 826.504282] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 28552f70-695d-40cc-8dfa-bf40d6113220] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 826.504510] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 826.504620] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 43b73a7c-eda8-4239-885f-d4fb8fa6f28a] Skipping network cache update for instance because it is Building. 
{{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 826.504815] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 826.504985] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Didn't find any instances for network info cache update. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 826.658874] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 826.658874] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 826.658874] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 826.658874] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=67169) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 826.892990] env[67169]: DEBUG nova.network.neutron [None req-ec8ff093-ec56-4835-a968-0cce4238d83b tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b] Successfully created port: 96c925e6-dab9-45f5-842f-4fa0b829c192 {{(pid=67169) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 827.659176] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 827.969713] env[67169]: DEBUG nova.network.neutron [None req-ec8ff093-ec56-4835-a968-0cce4238d83b tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b] Successfully updated port: 96c925e6-dab9-45f5-842f-4fa0b829c192 {{(pid=67169) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 827.990809] env[67169]: DEBUG oslo_concurrency.lockutils [None req-ec8ff093-ec56-4835-a968-0cce4238d83b tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Acquiring lock "refresh_cache-1e43c263-c527-4349-8e9c-3f4a3ffc9d8b" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 827.990809] env[67169]: DEBUG oslo_concurrency.lockutils [None req-ec8ff093-ec56-4835-a968-0cce4238d83b tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Acquired lock "refresh_cache-1e43c263-c527-4349-8e9c-3f4a3ffc9d8b" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 827.990953] env[67169]: DEBUG nova.network.neutron [None req-ec8ff093-ec56-4835-a968-0cce4238d83b tempest-ImagesTestJSON-1590952821 
tempest-ImagesTestJSON-1590952821-project-member] [instance: 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b] Building network info cache for instance {{(pid=67169) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 828.054832] env[67169]: DEBUG nova.network.neutron [None req-ec8ff093-ec56-4835-a968-0cce4238d83b tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b] Instance cache missing network info. {{(pid=67169) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 828.148230] env[67169]: DEBUG nova.compute.manager [req-4e3b439c-ffe3-4b36-a031-6b150d454de3 req-a7268372-4d3c-4cce-a232-e1d2e4b61113 service nova] [instance: 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b] Received event network-vif-plugged-96c925e6-dab9-45f5-842f-4fa0b829c192 {{(pid=67169) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 828.148358] env[67169]: DEBUG oslo_concurrency.lockutils [req-4e3b439c-ffe3-4b36-a031-6b150d454de3 req-a7268372-4d3c-4cce-a232-e1d2e4b61113 service nova] Acquiring lock "1e43c263-c527-4349-8e9c-3f4a3ffc9d8b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 828.148562] env[67169]: DEBUG oslo_concurrency.lockutils [req-4e3b439c-ffe3-4b36-a031-6b150d454de3 req-a7268372-4d3c-4cce-a232-e1d2e4b61113 service nova] Lock "1e43c263-c527-4349-8e9c-3f4a3ffc9d8b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 828.148727] env[67169]: DEBUG oslo_concurrency.lockutils [req-4e3b439c-ffe3-4b36-a031-6b150d454de3 req-a7268372-4d3c-4cce-a232-e1d2e4b61113 service nova] Lock "1e43c263-c527-4349-8e9c-3f4a3ffc9d8b-events" "released" by 
"nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 828.148914] env[67169]: DEBUG nova.compute.manager [req-4e3b439c-ffe3-4b36-a031-6b150d454de3 req-a7268372-4d3c-4cce-a232-e1d2e4b61113 service nova] [instance: 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b] No waiting events found dispatching network-vif-plugged-96c925e6-dab9-45f5-842f-4fa0b829c192 {{(pid=67169) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 828.149751] env[67169]: WARNING nova.compute.manager [req-4e3b439c-ffe3-4b36-a031-6b150d454de3 req-a7268372-4d3c-4cce-a232-e1d2e4b61113 service nova] [instance: 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b] Received unexpected event network-vif-plugged-96c925e6-dab9-45f5-842f-4fa0b829c192 for instance with vm_state building and task_state deleting. [ 828.534287] env[67169]: DEBUG nova.network.neutron [None req-ec8ff093-ec56-4835-a968-0cce4238d83b tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b] Updating instance_info_cache with network_info: [{"id": "96c925e6-dab9-45f5-842f-4fa0b829c192", "address": "fa:16:3e:cf:57:1c", "network": {"id": "ee7bdc29-2aab-4fc5-9b52-cee22ee0f249", "bridge": "br-int", "label": "tempest-ImagesTestJSON-634733000-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "00d358bc61014b5cb3ddcdab7785e7e8", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": 
"20e3f794-c7a3-4696-9488-ecf34c570ef9", "external-id": "nsx-vlan-transportzone-509", "segmentation_id": 509, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap96c925e6-da", "ovs_interfaceid": "96c925e6-dab9-45f5-842f-4fa0b829c192", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 828.548424] env[67169]: DEBUG oslo_concurrency.lockutils [None req-ec8ff093-ec56-4835-a968-0cce4238d83b tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Releasing lock "refresh_cache-1e43c263-c527-4349-8e9c-3f4a3ffc9d8b" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 828.548736] env[67169]: DEBUG nova.compute.manager [None req-ec8ff093-ec56-4835-a968-0cce4238d83b tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b] Instance network_info: |[{"id": "96c925e6-dab9-45f5-842f-4fa0b829c192", "address": "fa:16:3e:cf:57:1c", "network": {"id": "ee7bdc29-2aab-4fc5-9b52-cee22ee0f249", "bridge": "br-int", "label": "tempest-ImagesTestJSON-634733000-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "00d358bc61014b5cb3ddcdab7785e7e8", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "20e3f794-c7a3-4696-9488-ecf34c570ef9", "external-id": "nsx-vlan-transportzone-509", "segmentation_id": 509, 
"bound_drivers": {"0": "nsxv3"}}, "devname": "tap96c925e6-da", "ovs_interfaceid": "96c925e6-dab9-45f5-842f-4fa0b829c192", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67169) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 828.549206] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-ec8ff093-ec56-4835-a968-0cce4238d83b tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:cf:57:1c', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '20e3f794-c7a3-4696-9488-ecf34c570ef9', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '96c925e6-dab9-45f5-842f-4fa0b829c192', 'vif_model': 'vmxnet3'}] {{(pid=67169) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 828.557051] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [None req-ec8ff093-ec56-4835-a968-0cce4238d83b tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Creating folder: Project (00d358bc61014b5cb3ddcdab7785e7e8). Parent ref: group-v566843. {{(pid=67169) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 828.557661] env[67169]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-e67b48b3-e46d-41da-8e2f-e302da419a7c {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 828.569577] env[67169]: INFO nova.virt.vmwareapi.vm_util [None req-ec8ff093-ec56-4835-a968-0cce4238d83b tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Created folder: Project (00d358bc61014b5cb3ddcdab7785e7e8) in parent group-v566843. 
[ 828.569577] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [None req-ec8ff093-ec56-4835-a968-0cce4238d83b tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Creating folder: Instances. Parent ref: group-v566890. {{(pid=67169) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 828.569577] env[67169]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-19f03d35-beb3-444e-8cad-891d366a96b5 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 828.577693] env[67169]: INFO nova.virt.vmwareapi.vm_util [None req-ec8ff093-ec56-4835-a968-0cce4238d83b tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Created folder: Instances in parent group-v566890. [ 828.578170] env[67169]: DEBUG oslo.service.loopingcall [None req-ec8ff093-ec56-4835-a968-0cce4238d83b tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67169) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 828.578374] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b] Creating VM on the ESX host {{(pid=67169) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 828.578580] env[67169]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-49690fcc-bb46-4984-87a5-b6766c0e0a1d {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 828.599659] env[67169]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 828.599659] env[67169]: value = "task-2819116" [ 828.599659] env[67169]: _type = "Task" [ 828.599659] env[67169]: } to complete. 
{{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 828.610158] env[67169]: DEBUG oslo_vmware.api [-] Task: {'id': task-2819116, 'name': CreateVM_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 828.659468] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 828.660028] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 829.110689] env[67169]: DEBUG oslo_vmware.api [-] Task: {'id': task-2819116, 'name': CreateVM_Task, 'duration_secs': 0.300551} completed successfully. 
{{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 829.110950] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b] Created VM on the ESX host {{(pid=67169) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 829.111754] env[67169]: DEBUG oslo_concurrency.lockutils [None req-ec8ff093-ec56-4835-a968-0cce4238d83b tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 829.111992] env[67169]: DEBUG oslo_concurrency.lockutils [None req-ec8ff093-ec56-4835-a968-0cce4238d83b tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Acquired lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 829.112496] env[67169]: DEBUG oslo_concurrency.lockutils [None req-ec8ff093-ec56-4835-a968-0cce4238d83b tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 829.112829] env[67169]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-2abbe4a6-0331-4d97-ab13-e571280881c3 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 829.118570] env[67169]: DEBUG oslo_vmware.api [None req-ec8ff093-ec56-4835-a968-0cce4238d83b tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Waiting for the task: (returnval){ [ 829.118570] 
env[67169]: value = "session[52518c09-e5f2-035d-4dc1-3e29ddf94015]52a0dd60-04e3-fdf5-0f1c-6d4624ff4bf1" [ 829.118570] env[67169]: _type = "Task" [ 829.118570] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 829.128924] env[67169]: DEBUG oslo_vmware.api [None req-ec8ff093-ec56-4835-a968-0cce4238d83b tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Task: {'id': session[52518c09-e5f2-035d-4dc1-3e29ddf94015]52a0dd60-04e3-fdf5-0f1c-6d4624ff4bf1, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 829.631508] env[67169]: DEBUG oslo_concurrency.lockutils [None req-ec8ff093-ec56-4835-a968-0cce4238d83b tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Releasing lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 829.631823] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-ec8ff093-ec56-4835-a968-0cce4238d83b tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b] Processing image 285931c9-8b83-4997-8c4d-6a79005e36ba {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 829.632084] env[67169]: DEBUG oslo_concurrency.lockutils [None req-ec8ff093-ec56-4835-a968-0cce4238d83b tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 830.501972] env[67169]: DEBUG oslo_concurrency.lockutils [None 
req-fbe3d5d6-785f-47de-98ab-b4e9aae11143 tempest-InstanceActionsTestJSON-34793226 tempest-InstanceActionsTestJSON-34793226-project-member] Acquiring lock "310ae1ce-4717-4807-901c-5674677682c3" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 830.502305] env[67169]: DEBUG oslo_concurrency.lockutils [None req-fbe3d5d6-785f-47de-98ab-b4e9aae11143 tempest-InstanceActionsTestJSON-34793226 tempest-InstanceActionsTestJSON-34793226-project-member] Lock "310ae1ce-4717-4807-901c-5674677682c3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 830.821999] env[67169]: DEBUG nova.compute.manager [req-50648b09-1142-4abc-8857-7fda41588ae3 req-cc6779b8-4c23-4a35-9a70-6398e0515088 service nova] [instance: 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b] Received event network-changed-96c925e6-dab9-45f5-842f-4fa0b829c192 {{(pid=67169) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 830.822231] env[67169]: DEBUG nova.compute.manager [req-50648b09-1142-4abc-8857-7fda41588ae3 req-cc6779b8-4c23-4a35-9a70-6398e0515088 service nova] [instance: 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b] Refreshing instance network info cache due to event network-changed-96c925e6-dab9-45f5-842f-4fa0b829c192. 
{{(pid=67169) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 830.822452] env[67169]: DEBUG oslo_concurrency.lockutils [req-50648b09-1142-4abc-8857-7fda41588ae3 req-cc6779b8-4c23-4a35-9a70-6398e0515088 service nova] Acquiring lock "refresh_cache-1e43c263-c527-4349-8e9c-3f4a3ffc9d8b" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 830.822599] env[67169]: DEBUG oslo_concurrency.lockutils [req-50648b09-1142-4abc-8857-7fda41588ae3 req-cc6779b8-4c23-4a35-9a70-6398e0515088 service nova] Acquired lock "refresh_cache-1e43c263-c527-4349-8e9c-3f4a3ffc9d8b" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 830.822758] env[67169]: DEBUG nova.network.neutron [req-50648b09-1142-4abc-8857-7fda41588ae3 req-cc6779b8-4c23-4a35-9a70-6398e0515088 service nova] [instance: 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b] Refreshing network info cache for port 96c925e6-dab9-45f5-842f-4fa0b829c192 {{(pid=67169) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 831.408288] env[67169]: DEBUG nova.network.neutron [req-50648b09-1142-4abc-8857-7fda41588ae3 req-cc6779b8-4c23-4a35-9a70-6398e0515088 service nova] [instance: 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b] Updated VIF entry in instance network info cache for port 96c925e6-dab9-45f5-842f-4fa0b829c192. 
{{(pid=67169) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 831.408629] env[67169]: DEBUG nova.network.neutron [req-50648b09-1142-4abc-8857-7fda41588ae3 req-cc6779b8-4c23-4a35-9a70-6398e0515088 service nova] [instance: 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b] Updating instance_info_cache with network_info: [{"id": "96c925e6-dab9-45f5-842f-4fa0b829c192", "address": "fa:16:3e:cf:57:1c", "network": {"id": "ee7bdc29-2aab-4fc5-9b52-cee22ee0f249", "bridge": "br-int", "label": "tempest-ImagesTestJSON-634733000-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "00d358bc61014b5cb3ddcdab7785e7e8", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "20e3f794-c7a3-4696-9488-ecf34c570ef9", "external-id": "nsx-vlan-transportzone-509", "segmentation_id": 509, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap96c925e6-da", "ovs_interfaceid": "96c925e6-dab9-45f5-842f-4fa0b829c192", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 831.418628] env[67169]: DEBUG oslo_concurrency.lockutils [req-50648b09-1142-4abc-8857-7fda41588ae3 req-cc6779b8-4c23-4a35-9a70-6398e0515088 service nova] Releasing lock "refresh_cache-1e43c263-c527-4349-8e9c-3f4a3ffc9d8b" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 837.490164] env[67169]: DEBUG oslo_concurrency.lockutils [None 
req-0dc9c73e-95b5-4ebc-a938-23f4adfd1752 tempest-SecurityGroupsTestJSON-292577190 tempest-SecurityGroupsTestJSON-292577190-project-member] Acquiring lock "32412c58-a231-40f7-a248-3e46fad5f5b2" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 837.490519] env[67169]: DEBUG oslo_concurrency.lockutils [None req-0dc9c73e-95b5-4ebc-a938-23f4adfd1752 tempest-SecurityGroupsTestJSON-292577190 tempest-SecurityGroupsTestJSON-292577190-project-member] Lock "32412c58-a231-40f7-a248-3e46fad5f5b2" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 842.134766] env[67169]: DEBUG oslo_concurrency.lockutils [None req-9f97b7a6-ba6f-461c-81b3-2c7be85e3cf6 tempest-ServerAddressesTestJSON-1751036838 tempest-ServerAddressesTestJSON-1751036838-project-member] Acquiring lock "86b2381b-676f-46fc-9317-81c0fd272069" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 842.135064] env[67169]: DEBUG oslo_concurrency.lockutils [None req-9f97b7a6-ba6f-461c-81b3-2c7be85e3cf6 tempest-ServerAddressesTestJSON-1751036838 tempest-ServerAddressesTestJSON-1751036838-project-member] Lock "86b2381b-676f-46fc-9317-81c0fd272069" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 846.453039] env[67169]: DEBUG oslo_concurrency.lockutils [None req-a4f964da-83fd-4c4e-9a9c-0a3b29f5dc8a tempest-VolumesAdminNegativeTest-1217570915 
tempest-VolumesAdminNegativeTest-1217570915-project-member] Acquiring lock "dac3617f-32fd-43c5-b8b5-fddf42d94f88" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 846.453412] env[67169]: DEBUG oslo_concurrency.lockutils [None req-a4f964da-83fd-4c4e-9a9c-0a3b29f5dc8a tempest-VolumesAdminNegativeTest-1217570915 tempest-VolumesAdminNegativeTest-1217570915-project-member] Lock "dac3617f-32fd-43c5-b8b5-fddf42d94f88" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 850.814346] env[67169]: DEBUG oslo_concurrency.lockutils [None req-7011d1c8-bd90-44bc-9571-5dca86d4a021 tempest-ServerTagsTestJSON-1585154186 tempest-ServerTagsTestJSON-1585154186-project-member] Acquiring lock "fc12247e-bcca-4635-ba27-be1c9aeaa368" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 850.814652] env[67169]: DEBUG oslo_concurrency.lockutils [None req-7011d1c8-bd90-44bc-9571-5dca86d4a021 tempest-ServerTagsTestJSON-1585154186 tempest-ServerTagsTestJSON-1585154186-project-member] Lock "fc12247e-bcca-4635-ba27-be1c9aeaa368" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 869.128039] env[67169]: WARNING oslo_vmware.rw_handles [None req-368ce919-6530-42e6-9ef4-1d52f0d0483a tempest-AttachInterfacesUnderV243Test-137593631 tempest-AttachInterfacesUnderV243Test-137593631-project-member] Error occurred while reading the HTTP response.: 
http.client.RemoteDisconnected: Remote end closed connection without response [ 869.128039] env[67169]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 869.128039] env[67169]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 869.128039] env[67169]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 869.128039] env[67169]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 869.128039] env[67169]: ERROR oslo_vmware.rw_handles response.begin() [ 869.128039] env[67169]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 869.128039] env[67169]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 869.128039] env[67169]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 869.128039] env[67169]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 869.128039] env[67169]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 869.128039] env[67169]: ERROR oslo_vmware.rw_handles [ 869.128039] env[67169]: DEBUG nova.virt.vmwareapi.images [None req-368ce919-6530-42e6-9ef4-1d52f0d0483a tempest-AttachInterfacesUnderV243Test-137593631 tempest-AttachInterfacesUnderV243Test-137593631-project-member] [instance: 36781827-5846-49a4-8913-d98676af0b74] Downloaded image file data 285931c9-8b83-4997-8c4d-6a79005e36ba to vmware_temp/6ca0d245-f6dc-4906-8827-b28a0466ad1f/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk on the data store datastore2 {{(pid=67169) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 869.129752] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-368ce919-6530-42e6-9ef4-1d52f0d0483a tempest-AttachInterfacesUnderV243Test-137593631 
tempest-AttachInterfacesUnderV243Test-137593631-project-member] [instance: 36781827-5846-49a4-8913-d98676af0b74] Caching image {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 869.129994] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [None req-368ce919-6530-42e6-9ef4-1d52f0d0483a tempest-AttachInterfacesUnderV243Test-137593631 tempest-AttachInterfacesUnderV243Test-137593631-project-member] Copying Virtual Disk [datastore2] vmware_temp/6ca0d245-f6dc-4906-8827-b28a0466ad1f/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk to [datastore2] vmware_temp/6ca0d245-f6dc-4906-8827-b28a0466ad1f/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk {{(pid=67169) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 869.130312] env[67169]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-6c59b7d4-e4ea-4c4c-9bab-68f70ca5d969 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 869.137857] env[67169]: DEBUG oslo_vmware.api [None req-368ce919-6530-42e6-9ef4-1d52f0d0483a tempest-AttachInterfacesUnderV243Test-137593631 tempest-AttachInterfacesUnderV243Test-137593631-project-member] Waiting for the task: (returnval){ [ 869.137857] env[67169]: value = "task-2819117" [ 869.137857] env[67169]: _type = "Task" [ 869.137857] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 869.146219] env[67169]: DEBUG oslo_vmware.api [None req-368ce919-6530-42e6-9ef4-1d52f0d0483a tempest-AttachInterfacesUnderV243Test-137593631 tempest-AttachInterfacesUnderV243Test-137593631-project-member] Task: {'id': task-2819117, 'name': CopyVirtualDisk_Task} progress is 0%. 
{{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 869.647976] env[67169]: DEBUG oslo_vmware.exceptions [None req-368ce919-6530-42e6-9ef4-1d52f0d0483a tempest-AttachInterfacesUnderV243Test-137593631 tempest-AttachInterfacesUnderV243Test-137593631-project-member] Fault InvalidArgument not matched. {{(pid=67169) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 869.648334] env[67169]: DEBUG oslo_concurrency.lockutils [None req-368ce919-6530-42e6-9ef4-1d52f0d0483a tempest-AttachInterfacesUnderV243Test-137593631 tempest-AttachInterfacesUnderV243Test-137593631-project-member] Releasing lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 869.648922] env[67169]: ERROR nova.compute.manager [None req-368ce919-6530-42e6-9ef4-1d52f0d0483a tempest-AttachInterfacesUnderV243Test-137593631 tempest-AttachInterfacesUnderV243Test-137593631-project-member] [instance: 36781827-5846-49a4-8913-d98676af0b74] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 869.648922] env[67169]: Faults: ['InvalidArgument'] [ 869.648922] env[67169]: ERROR nova.compute.manager [instance: 36781827-5846-49a4-8913-d98676af0b74] Traceback (most recent call last): [ 869.648922] env[67169]: ERROR nova.compute.manager [instance: 36781827-5846-49a4-8913-d98676af0b74] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 869.648922] env[67169]: ERROR nova.compute.manager [instance: 36781827-5846-49a4-8913-d98676af0b74] yield resources [ 869.648922] env[67169]: ERROR nova.compute.manager [instance: 36781827-5846-49a4-8913-d98676af0b74] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 869.648922] env[67169]: 
ERROR nova.compute.manager [instance: 36781827-5846-49a4-8913-d98676af0b74] self.driver.spawn(context, instance, image_meta,
[ 869.648922] env[67169]: ERROR nova.compute.manager [instance: 36781827-5846-49a4-8913-d98676af0b74] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 869.648922] env[67169]: ERROR nova.compute.manager [instance: 36781827-5846-49a4-8913-d98676af0b74] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 869.648922] env[67169]: ERROR nova.compute.manager [instance: 36781827-5846-49a4-8913-d98676af0b74] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 869.648922] env[67169]: ERROR nova.compute.manager [instance: 36781827-5846-49a4-8913-d98676af0b74] self._fetch_image_if_missing(context, vi)
[ 869.648922] env[67169]: ERROR nova.compute.manager [instance: 36781827-5846-49a4-8913-d98676af0b74] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 869.648922] env[67169]: ERROR nova.compute.manager [instance: 36781827-5846-49a4-8913-d98676af0b74] image_cache(vi, tmp_image_ds_loc)
[ 869.648922] env[67169]: ERROR nova.compute.manager [instance: 36781827-5846-49a4-8913-d98676af0b74] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 869.648922] env[67169]: ERROR nova.compute.manager [instance: 36781827-5846-49a4-8913-d98676af0b74] vm_util.copy_virtual_disk(
[ 869.648922] env[67169]: ERROR nova.compute.manager [instance: 36781827-5846-49a4-8913-d98676af0b74] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 869.648922] env[67169]: ERROR nova.compute.manager [instance: 36781827-5846-49a4-8913-d98676af0b74] session._wait_for_task(vmdk_copy_task)
[ 869.648922] env[67169]: ERROR nova.compute.manager [instance: 36781827-5846-49a4-8913-d98676af0b74] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 869.648922] env[67169]: ERROR nova.compute.manager [instance: 36781827-5846-49a4-8913-d98676af0b74] return self.wait_for_task(task_ref)
[ 869.648922] env[67169]: ERROR nova.compute.manager [instance: 36781827-5846-49a4-8913-d98676af0b74] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 869.648922] env[67169]: ERROR nova.compute.manager [instance: 36781827-5846-49a4-8913-d98676af0b74] return evt.wait()
[ 869.648922] env[67169]: ERROR nova.compute.manager [instance: 36781827-5846-49a4-8913-d98676af0b74] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 869.648922] env[67169]: ERROR nova.compute.manager [instance: 36781827-5846-49a4-8913-d98676af0b74] result = hub.switch()
[ 869.648922] env[67169]: ERROR nova.compute.manager [instance: 36781827-5846-49a4-8913-d98676af0b74] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 869.648922] env[67169]: ERROR nova.compute.manager [instance: 36781827-5846-49a4-8913-d98676af0b74] return self.greenlet.switch()
[ 869.648922] env[67169]: ERROR nova.compute.manager [instance: 36781827-5846-49a4-8913-d98676af0b74] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 869.648922] env[67169]: ERROR nova.compute.manager [instance: 36781827-5846-49a4-8913-d98676af0b74] self.f(*self.args, **self.kw)
[ 869.648922] env[67169]: ERROR nova.compute.manager [instance: 36781827-5846-49a4-8913-d98676af0b74] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 869.648922] env[67169]: ERROR nova.compute.manager [instance: 36781827-5846-49a4-8913-d98676af0b74] raise exceptions.translate_fault(task_info.error)
[ 869.648922] env[67169]: ERROR nova.compute.manager [instance: 36781827-5846-49a4-8913-d98676af0b74] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 869.648922] env[67169]: ERROR nova.compute.manager [instance: 36781827-5846-49a4-8913-d98676af0b74] Faults: ['InvalidArgument']
[ 869.648922] env[67169]: ERROR nova.compute.manager [instance: 36781827-5846-49a4-8913-d98676af0b74]
[ 869.650128] env[67169]: INFO nova.compute.manager [None req-368ce919-6530-42e6-9ef4-1d52f0d0483a tempest-AttachInterfacesUnderV243Test-137593631 tempest-AttachInterfacesUnderV243Test-137593631-project-member] [instance: 36781827-5846-49a4-8913-d98676af0b74] Terminating instance
[ 869.650855] env[67169]: DEBUG oslo_concurrency.lockutils [None req-3c83de1c-bff9-4913-ac4c-cf997776c0d4 tempest-ServersTestFqdnHostnames-427175016 tempest-ServersTestFqdnHostnames-427175016-project-member] Acquired lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 869.651074] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-3c83de1c-bff9-4913-ac4c-cf997776c0d4 tempest-ServersTestFqdnHostnames-427175016 tempest-ServersTestFqdnHostnames-427175016-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 869.651695] env[67169]: DEBUG nova.compute.manager [None req-368ce919-6530-42e6-9ef4-1d52f0d0483a tempest-AttachInterfacesUnderV243Test-137593631 tempest-AttachInterfacesUnderV243Test-137593631-project-member] [instance: 36781827-5846-49a4-8913-d98676af0b74] Start destroying the instance on the hypervisor. {{(pid=67169) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}}
[ 869.651887] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-368ce919-6530-42e6-9ef4-1d52f0d0483a tempest-AttachInterfacesUnderV243Test-137593631 tempest-AttachInterfacesUnderV243Test-137593631-project-member] [instance: 36781827-5846-49a4-8913-d98676af0b74] Destroying instance {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 869.652127] env[67169]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-d0c3e626-7929-414a-9a79-89808b05a448 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 869.654303] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1bee1dd2-044d-48dc-ac72-f4782cf7bbc7 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 869.661189] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-368ce919-6530-42e6-9ef4-1d52f0d0483a tempest-AttachInterfacesUnderV243Test-137593631 tempest-AttachInterfacesUnderV243Test-137593631-project-member] [instance: 36781827-5846-49a4-8913-d98676af0b74] Unregistering the VM {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}}
[ 869.661424] env[67169]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-58eab524-60d0-49f6-8581-759e998fc98b {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 869.663539] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-3c83de1c-bff9-4913-ac4c-cf997776c0d4 tempest-ServersTestFqdnHostnames-427175016 tempest-ServersTestFqdnHostnames-427175016-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 869.663737] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-3c83de1c-bff9-4913-ac4c-cf997776c0d4 tempest-ServersTestFqdnHostnames-427175016 tempest-ServersTestFqdnHostnames-427175016-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=67169) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}}
[ 869.664632] env[67169]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-f762a4dd-5dc3-45d1-afd4-515cb6f727fb {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 869.669052] env[67169]: DEBUG oslo_vmware.api [None req-3c83de1c-bff9-4913-ac4c-cf997776c0d4 tempest-ServersTestFqdnHostnames-427175016 tempest-ServersTestFqdnHostnames-427175016-project-member] Waiting for the task: (returnval){
[ 869.669052] env[67169]: value = "session[52518c09-e5f2-035d-4dc1-3e29ddf94015]52fe3e92-54ad-5d67-1d8b-34b181879e95"
[ 869.669052] env[67169]: _type = "Task"
[ 869.669052] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 869.676248] env[67169]: DEBUG oslo_vmware.api [None req-3c83de1c-bff9-4913-ac4c-cf997776c0d4 tempest-ServersTestFqdnHostnames-427175016 tempest-ServersTestFqdnHostnames-427175016-project-member] Task: {'id': session[52518c09-e5f2-035d-4dc1-3e29ddf94015]52fe3e92-54ad-5d67-1d8b-34b181879e95, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 869.734025] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-368ce919-6530-42e6-9ef4-1d52f0d0483a tempest-AttachInterfacesUnderV243Test-137593631 tempest-AttachInterfacesUnderV243Test-137593631-project-member] [instance: 36781827-5846-49a4-8913-d98676af0b74] Unregistered the VM {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}}
[ 869.734025] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-368ce919-6530-42e6-9ef4-1d52f0d0483a tempest-AttachInterfacesUnderV243Test-137593631 tempest-AttachInterfacesUnderV243Test-137593631-project-member] [instance: 36781827-5846-49a4-8913-d98676af0b74] Deleting contents of the VM from datastore datastore2 {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}}
[ 869.734025] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-368ce919-6530-42e6-9ef4-1d52f0d0483a tempest-AttachInterfacesUnderV243Test-137593631 tempest-AttachInterfacesUnderV243Test-137593631-project-member] Deleting the datastore file [datastore2] 36781827-5846-49a4-8913-d98676af0b74 {{(pid=67169) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}}
[ 869.734025] env[67169]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-f8d9e4f5-a462-4858-981f-6a255bc77566 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 869.739718] env[67169]: DEBUG oslo_vmware.api [None req-368ce919-6530-42e6-9ef4-1d52f0d0483a tempest-AttachInterfacesUnderV243Test-137593631 tempest-AttachInterfacesUnderV243Test-137593631-project-member] Waiting for the task: (returnval){
[ 869.739718] env[67169]: value = "task-2819119"
[ 869.739718] env[67169]: _type = "Task"
[ 869.739718] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 869.747928] env[67169]: DEBUG oslo_vmware.api [None req-368ce919-6530-42e6-9ef4-1d52f0d0483a tempest-AttachInterfacesUnderV243Test-137593631 tempest-AttachInterfacesUnderV243Test-137593631-project-member] Task: {'id': task-2819119, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 870.179737] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-3c83de1c-bff9-4913-ac4c-cf997776c0d4 tempest-ServersTestFqdnHostnames-427175016 tempest-ServersTestFqdnHostnames-427175016-project-member] [instance: 11e90c91-26ca-4397-81a4-975a1d714d19] Preparing fetch location {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}}
[ 870.179989] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-3c83de1c-bff9-4913-ac4c-cf997776c0d4 tempest-ServersTestFqdnHostnames-427175016 tempest-ServersTestFqdnHostnames-427175016-project-member] Creating directory with path [datastore2] vmware_temp/c1a79569-e5a4-4829-a1e3-6aab5be34d9d/285931c9-8b83-4997-8c4d-6a79005e36ba {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 870.180595] env[67169]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-a124b6e1-e304-4ad8-a108-446208eb46a6 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 870.191993] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-3c83de1c-bff9-4913-ac4c-cf997776c0d4 tempest-ServersTestFqdnHostnames-427175016 tempest-ServersTestFqdnHostnames-427175016-project-member] Created directory with path [datastore2] vmware_temp/c1a79569-e5a4-4829-a1e3-6aab5be34d9d/285931c9-8b83-4997-8c4d-6a79005e36ba {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 870.192303] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-3c83de1c-bff9-4913-ac4c-cf997776c0d4 tempest-ServersTestFqdnHostnames-427175016 tempest-ServersTestFqdnHostnames-427175016-project-member] [instance: 11e90c91-26ca-4397-81a4-975a1d714d19] Fetch image to [datastore2] vmware_temp/c1a79569-e5a4-4829-a1e3-6aab5be34d9d/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}}
[ 870.192511] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-3c83de1c-bff9-4913-ac4c-cf997776c0d4 tempest-ServersTestFqdnHostnames-427175016 tempest-ServersTestFqdnHostnames-427175016-project-member] [instance: 11e90c91-26ca-4397-81a4-975a1d714d19] Downloading image file data 285931c9-8b83-4997-8c4d-6a79005e36ba to [datastore2] vmware_temp/c1a79569-e5a4-4829-a1e3-6aab5be34d9d/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk on the data store datastore2 {{(pid=67169) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}}
[ 870.194532] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-242e662e-fe3b-4a63-8ea2-bf2d3a913218 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 870.201820] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6d68eff0-901f-45bb-8ff1-320cef03e63e {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 870.210978] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-befe23f5-907e-4440-9218-ff0ae122ee9b {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 870.244928] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e6a5dd32-b109-4fb8-968b-8855ccb6e48c {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 870.251799] env[67169]: DEBUG oslo_vmware.api [None req-368ce919-6530-42e6-9ef4-1d52f0d0483a tempest-AttachInterfacesUnderV243Test-137593631 tempest-AttachInterfacesUnderV243Test-137593631-project-member] Task: {'id': task-2819119, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.08163} completed successfully. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 870.253298] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-368ce919-6530-42e6-9ef4-1d52f0d0483a tempest-AttachInterfacesUnderV243Test-137593631 tempest-AttachInterfacesUnderV243Test-137593631-project-member] Deleted the datastore file {{(pid=67169) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}}
[ 870.253527] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-368ce919-6530-42e6-9ef4-1d52f0d0483a tempest-AttachInterfacesUnderV243Test-137593631 tempest-AttachInterfacesUnderV243Test-137593631-project-member] [instance: 36781827-5846-49a4-8913-d98676af0b74] Deleted contents of the VM from datastore datastore2 {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}}
[ 870.253654] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-368ce919-6530-42e6-9ef4-1d52f0d0483a tempest-AttachInterfacesUnderV243Test-137593631 tempest-AttachInterfacesUnderV243Test-137593631-project-member] [instance: 36781827-5846-49a4-8913-d98676af0b74] Instance destroyed {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 870.253826] env[67169]: INFO nova.compute.manager [None req-368ce919-6530-42e6-9ef4-1d52f0d0483a tempest-AttachInterfacesUnderV243Test-137593631 tempest-AttachInterfacesUnderV243Test-137593631-project-member] [instance: 36781827-5846-49a4-8913-d98676af0b74] Took 0.60 seconds to destroy the instance on the hypervisor.
[ 870.255550] env[67169]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-cbe5141d-cc09-4ae9-b480-d56a0bcd97c2 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 870.257435] env[67169]: DEBUG nova.compute.claims [None req-368ce919-6530-42e6-9ef4-1d52f0d0483a tempest-AttachInterfacesUnderV243Test-137593631 tempest-AttachInterfacesUnderV243Test-137593631-project-member] [instance: 36781827-5846-49a4-8913-d98676af0b74] Aborting claim: {{(pid=67169) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 870.257607] env[67169]: DEBUG oslo_concurrency.lockutils [None req-368ce919-6530-42e6-9ef4-1d52f0d0483a tempest-AttachInterfacesUnderV243Test-137593631 tempest-AttachInterfacesUnderV243Test-137593631-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 870.257942] env[67169]: DEBUG oslo_concurrency.lockutils [None req-368ce919-6530-42e6-9ef4-1d52f0d0483a tempest-AttachInterfacesUnderV243Test-137593631 tempest-AttachInterfacesUnderV243Test-137593631-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 870.278972] env[67169]: DEBUG nova.virt.vmwareapi.images [None req-3c83de1c-bff9-4913-ac4c-cf997776c0d4 tempest-ServersTestFqdnHostnames-427175016 tempest-ServersTestFqdnHostnames-427175016-project-member] [instance: 11e90c91-26ca-4397-81a4-975a1d714d19] Downloading image file data 285931c9-8b83-4997-8c4d-6a79005e36ba to the data store datastore2 {{(pid=67169) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}}
[ 870.332783] env[67169]: DEBUG oslo_vmware.rw_handles [None req-3c83de1c-bff9-4913-ac4c-cf997776c0d4 tempest-ServersTestFqdnHostnames-427175016 tempest-ServersTestFqdnHostnames-427175016-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/c1a79569-e5a4-4829-a1e3-6aab5be34d9d/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=67169) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}}
[ 870.390896] env[67169]: DEBUG nova.scheduler.client.report [None req-368ce919-6530-42e6-9ef4-1d52f0d0483a tempest-AttachInterfacesUnderV243Test-137593631 tempest-AttachInterfacesUnderV243Test-137593631-project-member] Refreshing inventories for resource provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 {{(pid=67169) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}}
[ 870.393741] env[67169]: DEBUG oslo_vmware.rw_handles [None req-3c83de1c-bff9-4913-ac4c-cf997776c0d4 tempest-ServersTestFqdnHostnames-427175016 tempest-ServersTestFqdnHostnames-427175016-project-member] Completed reading data from the image iterator. {{(pid=67169) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}}
[ 870.393741] env[67169]: DEBUG oslo_vmware.rw_handles [None req-3c83de1c-bff9-4913-ac4c-cf997776c0d4 tempest-ServersTestFqdnHostnames-427175016 tempest-ServersTestFqdnHostnames-427175016-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/c1a79569-e5a4-4829-a1e3-6aab5be34d9d/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=67169) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}}
[ 870.407395] env[67169]: DEBUG nova.scheduler.client.report [None req-368ce919-6530-42e6-9ef4-1d52f0d0483a tempest-AttachInterfacesUnderV243Test-137593631 tempest-AttachInterfacesUnderV243Test-137593631-project-member] Updating ProviderTree inventory for provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 171, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67169) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}}
[ 870.407612] env[67169]: DEBUG nova.compute.provider_tree [None req-368ce919-6530-42e6-9ef4-1d52f0d0483a tempest-AttachInterfacesUnderV243Test-137593631 tempest-AttachInterfacesUnderV243Test-137593631-project-member] Updating inventory in ProviderTree for provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 171, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67169) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}}
[ 870.419385] env[67169]: DEBUG nova.scheduler.client.report [None req-368ce919-6530-42e6-9ef4-1d52f0d0483a tempest-AttachInterfacesUnderV243Test-137593631 tempest-AttachInterfacesUnderV243Test-137593631-project-member] Refreshing aggregate associations for resource provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3, aggregates: None {{(pid=67169) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}}
[ 870.438750] env[67169]: DEBUG nova.scheduler.client.report [None req-368ce919-6530-42e6-9ef4-1d52f0d0483a tempest-AttachInterfacesUnderV243Test-137593631 tempest-AttachInterfacesUnderV243Test-137593631-project-member] Refreshing trait associations for resource provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3, traits: COMPUTE_NODE,COMPUTE_SAME_HOST_COLD_MIGRATE,COMPUTE_IMAGE_TYPE_VMDK,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_ISO {{(pid=67169) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}}
[ 870.702531] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-01725b5a-47db-4177-85f0-d5bf22899b66 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 870.710704] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eeb04bbd-0483-4811-b764-507246fd0061 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 870.740775] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-346e07a9-bd9f-47ba-ab7b-3108c9378562 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 870.748151] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-18bf3c95-2cbf-416e-9f11-57199a32c4c6 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 870.762278] env[67169]: DEBUG nova.compute.provider_tree [None req-368ce919-6530-42e6-9ef4-1d52f0d0483a tempest-AttachInterfacesUnderV243Test-137593631 tempest-AttachInterfacesUnderV243Test-137593631-project-member] Inventory has not changed in ProviderTree for provider: 6570906a-ac37-4859-b1f2-4bbacc48d3f3 {{(pid=67169) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 870.770364] env[67169]: DEBUG nova.scheduler.client.report [None req-368ce919-6530-42e6-9ef4-1d52f0d0483a tempest-AttachInterfacesUnderV243Test-137593631 tempest-AttachInterfacesUnderV243Test-137593631-project-member] Inventory has not changed for provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 171, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67169) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 870.784636] env[67169]: DEBUG oslo_concurrency.lockutils [None req-368ce919-6530-42e6-9ef4-1d52f0d0483a tempest-AttachInterfacesUnderV243Test-137593631 tempest-AttachInterfacesUnderV243Test-137593631-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.526s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 870.785127] env[67169]: ERROR nova.compute.manager [None req-368ce919-6530-42e6-9ef4-1d52f0d0483a tempest-AttachInterfacesUnderV243Test-137593631 tempest-AttachInterfacesUnderV243Test-137593631-project-member] [instance: 36781827-5846-49a4-8913-d98676af0b74] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 870.785127] env[67169]: Faults: ['InvalidArgument']
[ 870.785127] env[67169]: ERROR nova.compute.manager [instance: 36781827-5846-49a4-8913-d98676af0b74] Traceback (most recent call last):
[ 870.785127] env[67169]: ERROR nova.compute.manager [instance: 36781827-5846-49a4-8913-d98676af0b74] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 870.785127] env[67169]: ERROR nova.compute.manager [instance: 36781827-5846-49a4-8913-d98676af0b74] self.driver.spawn(context, instance, image_meta,
[ 870.785127] env[67169]: ERROR nova.compute.manager [instance: 36781827-5846-49a4-8913-d98676af0b74] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 870.785127] env[67169]: ERROR nova.compute.manager [instance: 36781827-5846-49a4-8913-d98676af0b74] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 870.785127] env[67169]: ERROR nova.compute.manager [instance: 36781827-5846-49a4-8913-d98676af0b74] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 870.785127] env[67169]: ERROR nova.compute.manager [instance: 36781827-5846-49a4-8913-d98676af0b74] self._fetch_image_if_missing(context, vi)
[ 870.785127] env[67169]: ERROR nova.compute.manager [instance: 36781827-5846-49a4-8913-d98676af0b74] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 870.785127] env[67169]: ERROR nova.compute.manager [instance: 36781827-5846-49a4-8913-d98676af0b74] image_cache(vi, tmp_image_ds_loc)
[ 870.785127] env[67169]: ERROR nova.compute.manager [instance: 36781827-5846-49a4-8913-d98676af0b74] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 870.785127] env[67169]: ERROR nova.compute.manager [instance: 36781827-5846-49a4-8913-d98676af0b74] vm_util.copy_virtual_disk(
[ 870.785127] env[67169]: ERROR nova.compute.manager [instance: 36781827-5846-49a4-8913-d98676af0b74] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 870.785127] env[67169]: ERROR nova.compute.manager [instance: 36781827-5846-49a4-8913-d98676af0b74] session._wait_for_task(vmdk_copy_task)
[ 870.785127] env[67169]: ERROR nova.compute.manager [instance: 36781827-5846-49a4-8913-d98676af0b74] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 870.785127] env[67169]: ERROR nova.compute.manager [instance: 36781827-5846-49a4-8913-d98676af0b74] return self.wait_for_task(task_ref)
[ 870.785127] env[67169]: ERROR nova.compute.manager [instance: 36781827-5846-49a4-8913-d98676af0b74] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 870.785127] env[67169]: ERROR nova.compute.manager [instance: 36781827-5846-49a4-8913-d98676af0b74] return evt.wait()
[ 870.785127] env[67169]: ERROR nova.compute.manager [instance: 36781827-5846-49a4-8913-d98676af0b74] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 870.785127] env[67169]: ERROR nova.compute.manager [instance: 36781827-5846-49a4-8913-d98676af0b74] result = hub.switch()
[ 870.785127] env[67169]: ERROR nova.compute.manager [instance: 36781827-5846-49a4-8913-d98676af0b74] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 870.785127] env[67169]: ERROR nova.compute.manager [instance: 36781827-5846-49a4-8913-d98676af0b74] return self.greenlet.switch()
[ 870.785127] env[67169]: ERROR nova.compute.manager [instance: 36781827-5846-49a4-8913-d98676af0b74] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 870.785127] env[67169]: ERROR nova.compute.manager [instance: 36781827-5846-49a4-8913-d98676af0b74] self.f(*self.args, **self.kw)
[ 870.785127] env[67169]: ERROR nova.compute.manager [instance: 36781827-5846-49a4-8913-d98676af0b74] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 870.785127] env[67169]: ERROR nova.compute.manager [instance: 36781827-5846-49a4-8913-d98676af0b74] raise exceptions.translate_fault(task_info.error)
[ 870.785127] env[67169]: ERROR nova.compute.manager [instance: 36781827-5846-49a4-8913-d98676af0b74] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 870.785127] env[67169]: ERROR nova.compute.manager [instance: 36781827-5846-49a4-8913-d98676af0b74] Faults: ['InvalidArgument']
[ 870.785127] env[67169]: ERROR nova.compute.manager [instance: 36781827-5846-49a4-8913-d98676af0b74]
[ 870.786117] env[67169]: DEBUG nova.compute.utils [None req-368ce919-6530-42e6-9ef4-1d52f0d0483a tempest-AttachInterfacesUnderV243Test-137593631 tempest-AttachInterfacesUnderV243Test-137593631-project-member] [instance: 36781827-5846-49a4-8913-d98676af0b74] VimFaultException {{(pid=67169) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 870.787336] env[67169]: DEBUG nova.compute.manager [None req-368ce919-6530-42e6-9ef4-1d52f0d0483a tempest-AttachInterfacesUnderV243Test-137593631 tempest-AttachInterfacesUnderV243Test-137593631-project-member] [instance: 36781827-5846-49a4-8913-d98676af0b74] Build of instance 36781827-5846-49a4-8913-d98676af0b74 was re-scheduled: A specified parameter was not correct: fileType
[ 870.787336] env[67169]: Faults: ['InvalidArgument'] {{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}}
[ 870.787706] env[67169]: DEBUG nova.compute.manager [None req-368ce919-6530-42e6-9ef4-1d52f0d0483a tempest-AttachInterfacesUnderV243Test-137593631 tempest-AttachInterfacesUnderV243Test-137593631-project-member] [instance: 36781827-5846-49a4-8913-d98676af0b74] Unplugging VIFs for instance {{(pid=67169) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}}
[ 870.787876] env[67169]: DEBUG nova.compute.manager [None req-368ce919-6530-42e6-9ef4-1d52f0d0483a tempest-AttachInterfacesUnderV243Test-137593631 tempest-AttachInterfacesUnderV243Test-137593631-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=67169) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}}
[ 870.788056] env[67169]: DEBUG nova.compute.manager [None req-368ce919-6530-42e6-9ef4-1d52f0d0483a tempest-AttachInterfacesUnderV243Test-137593631 tempest-AttachInterfacesUnderV243Test-137593631-project-member] [instance: 36781827-5846-49a4-8913-d98676af0b74] Deallocating network for instance {{(pid=67169) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}}
[ 870.788219] env[67169]: DEBUG nova.network.neutron [None req-368ce919-6530-42e6-9ef4-1d52f0d0483a tempest-AttachInterfacesUnderV243Test-137593631 tempest-AttachInterfacesUnderV243Test-137593631-project-member] [instance: 36781827-5846-49a4-8913-d98676af0b74] deallocate_for_instance() {{(pid=67169) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 871.158403] env[67169]: DEBUG nova.network.neutron [None req-368ce919-6530-42e6-9ef4-1d52f0d0483a tempest-AttachInterfacesUnderV243Test-137593631 tempest-AttachInterfacesUnderV243Test-137593631-project-member] [instance: 36781827-5846-49a4-8913-d98676af0b74] Updating instance_info_cache with network_info: [] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 871.171122] env[67169]: INFO nova.compute.manager [None req-368ce919-6530-42e6-9ef4-1d52f0d0483a tempest-AttachInterfacesUnderV243Test-137593631 tempest-AttachInterfacesUnderV243Test-137593631-project-member] [instance: 36781827-5846-49a4-8913-d98676af0b74] Took 0.38 seconds to deallocate network for instance.
[ 871.275990] env[67169]: INFO nova.scheduler.client.report [None req-368ce919-6530-42e6-9ef4-1d52f0d0483a tempest-AttachInterfacesUnderV243Test-137593631 tempest-AttachInterfacesUnderV243Test-137593631-project-member] Deleted allocations for instance 36781827-5846-49a4-8913-d98676af0b74
[ 871.305344] env[67169]: DEBUG oslo_concurrency.lockutils [None req-368ce919-6530-42e6-9ef4-1d52f0d0483a tempest-AttachInterfacesUnderV243Test-137593631 tempest-AttachInterfacesUnderV243Test-137593631-project-member] Lock "36781827-5846-49a4-8913-d98676af0b74" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 289.600s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 871.306587] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "36781827-5846-49a4-8913-d98676af0b74" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 285.620s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 871.306775] env[67169]: INFO nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 36781827-5846-49a4-8913-d98676af0b74] During sync_power_state the instance has a pending task (spawning). Skip.
[ 871.306952] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "36781827-5846-49a4-8913-d98676af0b74" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 871.307615] env[67169]: DEBUG oslo_concurrency.lockutils [None req-685f9846-3608-45da-8f06-ff8ab7599bfe tempest-AttachInterfacesUnderV243Test-137593631 tempest-AttachInterfacesUnderV243Test-137593631-project-member] Lock "36781827-5846-49a4-8913-d98676af0b74" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 88.585s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 871.307978] env[67169]: DEBUG oslo_concurrency.lockutils [None req-685f9846-3608-45da-8f06-ff8ab7599bfe tempest-AttachInterfacesUnderV243Test-137593631 tempest-AttachInterfacesUnderV243Test-137593631-project-member] Acquiring lock "36781827-5846-49a4-8913-d98676af0b74-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 871.308146] env[67169]: DEBUG oslo_concurrency.lockutils [None req-685f9846-3608-45da-8f06-ff8ab7599bfe tempest-AttachInterfacesUnderV243Test-137593631 tempest-AttachInterfacesUnderV243Test-137593631-project-member] Lock "36781827-5846-49a4-8913-d98676af0b74-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 871.308212] env[67169]: DEBUG oslo_concurrency.lockutils [None req-685f9846-3608-45da-8f06-ff8ab7599bfe tempest-AttachInterfacesUnderV243Test-137593631 tempest-AttachInterfacesUnderV243Test-137593631-project-member] Lock "36781827-5846-49a4-8913-d98676af0b74-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 871.310512] env[67169]: INFO nova.compute.manager [None req-685f9846-3608-45da-8f06-ff8ab7599bfe tempest-AttachInterfacesUnderV243Test-137593631 tempest-AttachInterfacesUnderV243Test-137593631-project-member] [instance: 36781827-5846-49a4-8913-d98676af0b74] Terminating instance
[ 871.312448] env[67169]: DEBUG nova.compute.manager [None req-685f9846-3608-45da-8f06-ff8ab7599bfe tempest-AttachInterfacesUnderV243Test-137593631 tempest-AttachInterfacesUnderV243Test-137593631-project-member] [instance: 36781827-5846-49a4-8913-d98676af0b74] Start destroying the instance on the hypervisor. {{(pid=67169) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}}
[ 871.312656] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-685f9846-3608-45da-8f06-ff8ab7599bfe tempest-AttachInterfacesUnderV243Test-137593631 tempest-AttachInterfacesUnderV243Test-137593631-project-member] [instance: 36781827-5846-49a4-8913-d98676af0b74] Destroying instance {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 871.312918] env[67169]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-f1cf79c0-c269-4168-96d2-e7c13e1c6027 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 871.322009] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-df10fcef-1fe6-4760-abad-3fd10a792f5b {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 871.333304] env[67169]: DEBUG nova.compute.manager [None req-1866c171-8b5b-476e-9836-38ffb422c083 tempest-ServersTestJSON-1279073989 tempest-ServersTestJSON-1279073989-project-member] [instance: 6b21ae30-8734-4c38-a8ae-c3fe03b6c36a] Starting instance... {{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}}
[ 871.354969] env[67169]: WARNING nova.virt.vmwareapi.vmops [None req-685f9846-3608-45da-8f06-ff8ab7599bfe tempest-AttachInterfacesUnderV243Test-137593631 tempest-AttachInterfacesUnderV243Test-137593631-project-member] [instance: 36781827-5846-49a4-8913-d98676af0b74] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 36781827-5846-49a4-8913-d98676af0b74 could not be found.
[ 871.355782] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-685f9846-3608-45da-8f06-ff8ab7599bfe tempest-AttachInterfacesUnderV243Test-137593631 tempest-AttachInterfacesUnderV243Test-137593631-project-member] [instance: 36781827-5846-49a4-8913-d98676af0b74] Instance destroyed {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 871.356293] env[67169]: INFO nova.compute.manager [None req-685f9846-3608-45da-8f06-ff8ab7599bfe tempest-AttachInterfacesUnderV243Test-137593631 tempest-AttachInterfacesUnderV243Test-137593631-project-member] [instance: 36781827-5846-49a4-8913-d98676af0b74] Took 0.04 seconds to destroy the instance on the hypervisor.
[ 871.356633] env[67169]: DEBUG oslo.service.loopingcall [None req-685f9846-3608-45da-8f06-ff8ab7599bfe tempest-AttachInterfacesUnderV243Test-137593631 tempest-AttachInterfacesUnderV243Test-137593631-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return.
{{(pid=67169) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 871.356819] env[67169]: DEBUG nova.compute.manager [-] [instance: 36781827-5846-49a4-8913-d98676af0b74] Deallocating network for instance {{(pid=67169) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 871.356918] env[67169]: DEBUG nova.network.neutron [-] [instance: 36781827-5846-49a4-8913-d98676af0b74] deallocate_for_instance() {{(pid=67169) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 871.365609] env[67169]: DEBUG nova.compute.manager [None req-1866c171-8b5b-476e-9836-38ffb422c083 tempest-ServersTestJSON-1279073989 tempest-ServersTestJSON-1279073989-project-member] [instance: 6b21ae30-8734-4c38-a8ae-c3fe03b6c36a] Instance disappeared before build. {{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 871.385258] env[67169]: DEBUG oslo_concurrency.lockutils [None req-1866c171-8b5b-476e-9836-38ffb422c083 tempest-ServersTestJSON-1279073989 tempest-ServersTestJSON-1279073989-project-member] Lock "6b21ae30-8734-4c38-a8ae-c3fe03b6c36a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 239.831s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 871.397023] env[67169]: DEBUG nova.compute.manager [None req-e145ebc9-92d9-4148-9de6-53e5b64ff837 tempest-SecurityGroupsTestJSON-292577190 tempest-SecurityGroupsTestJSON-292577190-project-member] [instance: a578f813-807b-46bc-987f-5c9e9368c04b] Starting instance... 
{{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 871.399512] env[67169]: DEBUG nova.network.neutron [-] [instance: 36781827-5846-49a4-8913-d98676af0b74] Updating instance_info_cache with network_info: [] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 871.409711] env[67169]: INFO nova.compute.manager [-] [instance: 36781827-5846-49a4-8913-d98676af0b74] Took 0.05 seconds to deallocate network for instance. [ 871.418408] env[67169]: DEBUG nova.compute.manager [None req-e145ebc9-92d9-4148-9de6-53e5b64ff837 tempest-SecurityGroupsTestJSON-292577190 tempest-SecurityGroupsTestJSON-292577190-project-member] [instance: a578f813-807b-46bc-987f-5c9e9368c04b] Instance disappeared before build. {{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 871.442535] env[67169]: DEBUG oslo_concurrency.lockutils [None req-e145ebc9-92d9-4148-9de6-53e5b64ff837 tempest-SecurityGroupsTestJSON-292577190 tempest-SecurityGroupsTestJSON-292577190-project-member] Lock "a578f813-807b-46bc-987f-5c9e9368c04b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 235.203s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 871.455262] env[67169]: DEBUG nova.compute.manager [None req-2bdba451-044d-4432-8f1d-46c6fa3dd51d tempest-ServersTestManualDisk-573119586 tempest-ServersTestManualDisk-573119586-project-member] [instance: f384f4a3-d34d-4d45-b063-79b25ea3c66c] Starting instance... {{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 871.482749] env[67169]: DEBUG nova.compute.manager [None req-2bdba451-044d-4432-8f1d-46c6fa3dd51d tempest-ServersTestManualDisk-573119586 tempest-ServersTestManualDisk-573119586-project-member] [instance: f384f4a3-d34d-4d45-b063-79b25ea3c66c] Instance disappeared before build. 
{{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 871.511287] env[67169]: DEBUG oslo_concurrency.lockutils [None req-2bdba451-044d-4432-8f1d-46c6fa3dd51d tempest-ServersTestManualDisk-573119586 tempest-ServersTestManualDisk-573119586-project-member] Lock "f384f4a3-d34d-4d45-b063-79b25ea3c66c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 232.967s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 871.516480] env[67169]: DEBUG oslo_concurrency.lockutils [None req-685f9846-3608-45da-8f06-ff8ab7599bfe tempest-AttachInterfacesUnderV243Test-137593631 tempest-AttachInterfacesUnderV243Test-137593631-project-member] Lock "36781827-5846-49a4-8913-d98676af0b74" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.209s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 871.523009] env[67169]: DEBUG nova.compute.manager [None req-6aeff5e1-4ca3-4e66-9b16-082d97563389 tempest-ServerActionsV293TestJSON-1935842596 tempest-ServerActionsV293TestJSON-1935842596-project-member] [instance: 0aa23be1-af16-4c0b-bfd0-4db5e927cfc4] Starting instance... {{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 871.546830] env[67169]: DEBUG nova.compute.manager [None req-6aeff5e1-4ca3-4e66-9b16-082d97563389 tempest-ServerActionsV293TestJSON-1935842596 tempest-ServerActionsV293TestJSON-1935842596-project-member] [instance: 0aa23be1-af16-4c0b-bfd0-4db5e927cfc4] Instance disappeared before build. 
{{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 871.568308] env[67169]: DEBUG oslo_concurrency.lockutils [None req-6aeff5e1-4ca3-4e66-9b16-082d97563389 tempest-ServerActionsV293TestJSON-1935842596 tempest-ServerActionsV293TestJSON-1935842596-project-member] Lock "0aa23be1-af16-4c0b-bfd0-4db5e927cfc4" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 224.181s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 871.578115] env[67169]: DEBUG nova.compute.manager [None req-179a1fab-b15e-400e-ab43-2abae1ed8c7d tempest-VolumesAdminNegativeTest-1217570915 tempest-VolumesAdminNegativeTest-1217570915-project-member] [instance: a58a8711-b060-4036-ac43-897017f68d21] Starting instance... {{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 871.603471] env[67169]: DEBUG nova.compute.manager [None req-179a1fab-b15e-400e-ab43-2abae1ed8c7d tempest-VolumesAdminNegativeTest-1217570915 tempest-VolumesAdminNegativeTest-1217570915-project-member] [instance: a58a8711-b060-4036-ac43-897017f68d21] Instance disappeared before build. 
{{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 871.628021] env[67169]: DEBUG oslo_concurrency.lockutils [None req-179a1fab-b15e-400e-ab43-2abae1ed8c7d tempest-VolumesAdminNegativeTest-1217570915 tempest-VolumesAdminNegativeTest-1217570915-project-member] Lock "a58a8711-b060-4036-ac43-897017f68d21" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 224.039s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 871.638150] env[67169]: DEBUG nova.compute.manager [None req-5efb4fdc-d030-4885-b530-ccb865c8d016 tempest-ServersAaction247Test-1711594299 tempest-ServersAaction247Test-1711594299-project-member] [instance: 79d35dc5-e515-4f6f-9160-534d84f534bd] Starting instance... {{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 871.662134] env[67169]: DEBUG nova.compute.manager [None req-5efb4fdc-d030-4885-b530-ccb865c8d016 tempest-ServersAaction247Test-1711594299 tempest-ServersAaction247Test-1711594299-project-member] [instance: 79d35dc5-e515-4f6f-9160-534d84f534bd] Instance disappeared before build. 
{{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 871.684797] env[67169]: DEBUG oslo_concurrency.lockutils [None req-5efb4fdc-d030-4885-b530-ccb865c8d016 tempest-ServersAaction247Test-1711594299 tempest-ServersAaction247Test-1711594299-project-member] Lock "79d35dc5-e515-4f6f-9160-534d84f534bd" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 222.621s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 871.694209] env[67169]: DEBUG nova.compute.manager [None req-f06012ed-5139-43e2-a09d-91608d74a15b tempest-ListServersNegativeTestJSON-1270486368 tempest-ListServersNegativeTestJSON-1270486368-project-member] [instance: db96dc14-cb08-4302-8aa3-81cf0c47fc73] Starting instance... {{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 871.723381] env[67169]: DEBUG nova.compute.manager [None req-f06012ed-5139-43e2-a09d-91608d74a15b tempest-ListServersNegativeTestJSON-1270486368 tempest-ListServersNegativeTestJSON-1270486368-project-member] [instance: db96dc14-cb08-4302-8aa3-81cf0c47fc73] Instance disappeared before build. 
{{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 871.746038] env[67169]: DEBUG oslo_concurrency.lockutils [None req-f06012ed-5139-43e2-a09d-91608d74a15b tempest-ListServersNegativeTestJSON-1270486368 tempest-ListServersNegativeTestJSON-1270486368-project-member] Lock "db96dc14-cb08-4302-8aa3-81cf0c47fc73" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 214.220s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 871.757549] env[67169]: DEBUG nova.compute.manager [None req-f06012ed-5139-43e2-a09d-91608d74a15b tempest-ListServersNegativeTestJSON-1270486368 tempest-ListServersNegativeTestJSON-1270486368-project-member] [instance: 04d7477c-c3ff-42c7-9107-f54327c2f4b2] Starting instance... {{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 871.780830] env[67169]: DEBUG nova.compute.manager [None req-f06012ed-5139-43e2-a09d-91608d74a15b tempest-ListServersNegativeTestJSON-1270486368 tempest-ListServersNegativeTestJSON-1270486368-project-member] [instance: 04d7477c-c3ff-42c7-9107-f54327c2f4b2] Instance disappeared before build. 
{{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 871.801777] env[67169]: DEBUG oslo_concurrency.lockutils [None req-f06012ed-5139-43e2-a09d-91608d74a15b tempest-ListServersNegativeTestJSON-1270486368 tempest-ListServersNegativeTestJSON-1270486368-project-member] Lock "04d7477c-c3ff-42c7-9107-f54327c2f4b2" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 214.250s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 871.810248] env[67169]: DEBUG nova.compute.manager [None req-f06012ed-5139-43e2-a09d-91608d74a15b tempest-ListServersNegativeTestJSON-1270486368 tempest-ListServersNegativeTestJSON-1270486368-project-member] [instance: 1fe4f2aa-0784-4356-aa4c-593666f22971] Starting instance... {{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 871.834029] env[67169]: DEBUG nova.compute.manager [None req-f06012ed-5139-43e2-a09d-91608d74a15b tempest-ListServersNegativeTestJSON-1270486368 tempest-ListServersNegativeTestJSON-1270486368-project-member] [instance: 1fe4f2aa-0784-4356-aa4c-593666f22971] Instance disappeared before build. 
{{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 871.856919] env[67169]: DEBUG oslo_concurrency.lockutils [None req-f06012ed-5139-43e2-a09d-91608d74a15b tempest-ListServersNegativeTestJSON-1270486368 tempest-ListServersNegativeTestJSON-1270486368-project-member] Lock "1fe4f2aa-0784-4356-aa4c-593666f22971" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 214.278s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 871.869353] env[67169]: DEBUG nova.compute.manager [None req-bc42a747-6b10-42c3-8e2b-0674634135bb tempest-ServerPasswordTestJSON-586101290 tempest-ServerPasswordTestJSON-586101290-project-member] [instance: 47ffcce9-3afc-41be-b38e-dacfeb535a2c] Starting instance... {{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 871.926018] env[67169]: DEBUG oslo_concurrency.lockutils [None req-bc42a747-6b10-42c3-8e2b-0674634135bb tempest-ServerPasswordTestJSON-586101290 tempest-ServerPasswordTestJSON-586101290-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 871.926018] env[67169]: DEBUG oslo_concurrency.lockutils [None req-bc42a747-6b10-42c3-8e2b-0674634135bb tempest-ServerPasswordTestJSON-586101290 tempest-ServerPasswordTestJSON-586101290-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 871.926018] env[67169]: INFO nova.compute.claims [None req-bc42a747-6b10-42c3-8e2b-0674634135bb tempest-ServerPasswordTestJSON-586101290 tempest-ServerPasswordTestJSON-586101290-project-member] [instance: 
47ffcce9-3afc-41be-b38e-dacfeb535a2c] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 872.202175] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c1c90bec-28b0-4792-b8b8-4946c38925a8 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 872.211319] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f27e10bf-691e-48c7-ba86-78f5293cd717 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 872.241051] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-91fc512f-0948-4103-819e-fd3b5b210a33 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 872.248185] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9c2ac770-1fdd-46c4-987c-2fa4d0e56851 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 872.261285] env[67169]: DEBUG nova.compute.provider_tree [None req-bc42a747-6b10-42c3-8e2b-0674634135bb tempest-ServerPasswordTestJSON-586101290 tempest-ServerPasswordTestJSON-586101290-project-member] Inventory has not changed in ProviderTree for provider: 6570906a-ac37-4859-b1f2-4bbacc48d3f3 {{(pid=67169) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 872.270545] env[67169]: DEBUG nova.scheduler.client.report [None req-bc42a747-6b10-42c3-8e2b-0674634135bb tempest-ServerPasswordTestJSON-586101290 tempest-ServerPasswordTestJSON-586101290-project-member] Inventory has not changed for provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 
'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 171, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67169) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 872.286537] env[67169]: DEBUG oslo_concurrency.lockutils [None req-bc42a747-6b10-42c3-8e2b-0674634135bb tempest-ServerPasswordTestJSON-586101290 tempest-ServerPasswordTestJSON-586101290-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.362s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 872.287140] env[67169]: DEBUG nova.compute.manager [None req-bc42a747-6b10-42c3-8e2b-0674634135bb tempest-ServerPasswordTestJSON-586101290 tempest-ServerPasswordTestJSON-586101290-project-member] [instance: 47ffcce9-3afc-41be-b38e-dacfeb535a2c] Start building networks asynchronously for instance. {{(pid=67169) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 872.321730] env[67169]: DEBUG nova.compute.utils [None req-bc42a747-6b10-42c3-8e2b-0674634135bb tempest-ServerPasswordTestJSON-586101290 tempest-ServerPasswordTestJSON-586101290-project-member] Using /dev/sd instead of None {{(pid=67169) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 872.323841] env[67169]: DEBUG nova.compute.manager [None req-bc42a747-6b10-42c3-8e2b-0674634135bb tempest-ServerPasswordTestJSON-586101290 tempest-ServerPasswordTestJSON-586101290-project-member] [instance: 47ffcce9-3afc-41be-b38e-dacfeb535a2c] Allocating IP information in the background. 
{{(pid=67169) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 872.324115] env[67169]: DEBUG nova.network.neutron [None req-bc42a747-6b10-42c3-8e2b-0674634135bb tempest-ServerPasswordTestJSON-586101290 tempest-ServerPasswordTestJSON-586101290-project-member] [instance: 47ffcce9-3afc-41be-b38e-dacfeb535a2c] allocate_for_instance() {{(pid=67169) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 872.332579] env[67169]: DEBUG nova.compute.manager [None req-bc42a747-6b10-42c3-8e2b-0674634135bb tempest-ServerPasswordTestJSON-586101290 tempest-ServerPasswordTestJSON-586101290-project-member] [instance: 47ffcce9-3afc-41be-b38e-dacfeb535a2c] Start building block device mappings for instance. {{(pid=67169) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 872.399886] env[67169]: DEBUG nova.compute.manager [None req-bc42a747-6b10-42c3-8e2b-0674634135bb tempest-ServerPasswordTestJSON-586101290 tempest-ServerPasswordTestJSON-586101290-project-member] [instance: 47ffcce9-3afc-41be-b38e-dacfeb535a2c] Start spawning the instance on the hypervisor. 
{{(pid=67169) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 872.403217] env[67169]: DEBUG nova.policy [None req-bc42a747-6b10-42c3-8e2b-0674634135bb tempest-ServerPasswordTestJSON-586101290 tempest-ServerPasswordTestJSON-586101290-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '3be235b9668e4a97b50ee275f177c210', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'f4120332e24d456cabca8ee50ceaf0d7', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67169) authorize /opt/stack/nova/nova/policy.py:203}} [ 872.427909] env[67169]: DEBUG nova.virt.hardware [None req-bc42a747-6b10-42c3-8e2b-0674634135bb tempest-ServerPasswordTestJSON-586101290 tempest-ServerPasswordTestJSON-586101290-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-03-06T19:55:10Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-03-06T19:54:55Z,direct_url=,disk_format='vmdk',id=285931c9-8b83-4997-8c4d-6a79005e36ba,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c31f6504bb73492890b262ff43fdf9bc',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-03-06T19:54:55Z,virtual_size=,visibility=), allow threads: False {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 872.428158] env[67169]: DEBUG nova.virt.hardware [None 
req-bc42a747-6b10-42c3-8e2b-0674634135bb tempest-ServerPasswordTestJSON-586101290 tempest-ServerPasswordTestJSON-586101290-project-member] Flavor limits 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 872.428317] env[67169]: DEBUG nova.virt.hardware [None req-bc42a747-6b10-42c3-8e2b-0674634135bb tempest-ServerPasswordTestJSON-586101290 tempest-ServerPasswordTestJSON-586101290-project-member] Image limits 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 872.428497] env[67169]: DEBUG nova.virt.hardware [None req-bc42a747-6b10-42c3-8e2b-0674634135bb tempest-ServerPasswordTestJSON-586101290 tempest-ServerPasswordTestJSON-586101290-project-member] Flavor pref 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 872.428642] env[67169]: DEBUG nova.virt.hardware [None req-bc42a747-6b10-42c3-8e2b-0674634135bb tempest-ServerPasswordTestJSON-586101290 tempest-ServerPasswordTestJSON-586101290-project-member] Image pref 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 872.428786] env[67169]: DEBUG nova.virt.hardware [None req-bc42a747-6b10-42c3-8e2b-0674634135bb tempest-ServerPasswordTestJSON-586101290 tempest-ServerPasswordTestJSON-586101290-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 872.428988] env[67169]: DEBUG nova.virt.hardware [None req-bc42a747-6b10-42c3-8e2b-0674634135bb tempest-ServerPasswordTestJSON-586101290 tempest-ServerPasswordTestJSON-586101290-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 872.429190] env[67169]: DEBUG nova.virt.hardware [None 
req-bc42a747-6b10-42c3-8e2b-0674634135bb tempest-ServerPasswordTestJSON-586101290 tempest-ServerPasswordTestJSON-586101290-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67169) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 872.429363] env[67169]: DEBUG nova.virt.hardware [None req-bc42a747-6b10-42c3-8e2b-0674634135bb tempest-ServerPasswordTestJSON-586101290 tempest-ServerPasswordTestJSON-586101290-project-member] Got 1 possible topologies {{(pid=67169) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 872.429534] env[67169]: DEBUG nova.virt.hardware [None req-bc42a747-6b10-42c3-8e2b-0674634135bb tempest-ServerPasswordTestJSON-586101290 tempest-ServerPasswordTestJSON-586101290-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 872.429743] env[67169]: DEBUG nova.virt.hardware [None req-bc42a747-6b10-42c3-8e2b-0674634135bb tempest-ServerPasswordTestJSON-586101290 tempest-ServerPasswordTestJSON-586101290-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 872.430658] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-34f9ef48-6739-48ef-a1be-5a6402ad15b9 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 872.438205] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-552a4a39-7157-40f6-8e03-1b96fc2a8b5a {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 872.905921] env[67169]: DEBUG nova.network.neutron [None req-bc42a747-6b10-42c3-8e2b-0674634135bb tempest-ServerPasswordTestJSON-586101290 
tempest-ServerPasswordTestJSON-586101290-project-member] [instance: 47ffcce9-3afc-41be-b38e-dacfeb535a2c] Successfully created port: 486680ca-6e09-4fb7-b37f-20f7a3c2ebb5 {{(pid=67169) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 874.265881] env[67169]: DEBUG nova.network.neutron [None req-bc42a747-6b10-42c3-8e2b-0674634135bb tempest-ServerPasswordTestJSON-586101290 tempest-ServerPasswordTestJSON-586101290-project-member] [instance: 47ffcce9-3afc-41be-b38e-dacfeb535a2c] Successfully updated port: 486680ca-6e09-4fb7-b37f-20f7a3c2ebb5 {{(pid=67169) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 874.278405] env[67169]: DEBUG oslo_concurrency.lockutils [None req-bc42a747-6b10-42c3-8e2b-0674634135bb tempest-ServerPasswordTestJSON-586101290 tempest-ServerPasswordTestJSON-586101290-project-member] Acquiring lock "refresh_cache-47ffcce9-3afc-41be-b38e-dacfeb535a2c" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 874.278559] env[67169]: DEBUG oslo_concurrency.lockutils [None req-bc42a747-6b10-42c3-8e2b-0674634135bb tempest-ServerPasswordTestJSON-586101290 tempest-ServerPasswordTestJSON-586101290-project-member] Acquired lock "refresh_cache-47ffcce9-3afc-41be-b38e-dacfeb535a2c" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 874.278704] env[67169]: DEBUG nova.network.neutron [None req-bc42a747-6b10-42c3-8e2b-0674634135bb tempest-ServerPasswordTestJSON-586101290 tempest-ServerPasswordTestJSON-586101290-project-member] [instance: 47ffcce9-3afc-41be-b38e-dacfeb535a2c] Building network info cache for instance {{(pid=67169) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 874.367368] env[67169]: DEBUG nova.network.neutron [None req-bc42a747-6b10-42c3-8e2b-0674634135bb tempest-ServerPasswordTestJSON-586101290 tempest-ServerPasswordTestJSON-586101290-project-member] [instance: 
47ffcce9-3afc-41be-b38e-dacfeb535a2c] Instance cache missing network info. {{(pid=67169) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 874.643548] env[67169]: DEBUG nova.network.neutron [None req-bc42a747-6b10-42c3-8e2b-0674634135bb tempest-ServerPasswordTestJSON-586101290 tempest-ServerPasswordTestJSON-586101290-project-member] [instance: 47ffcce9-3afc-41be-b38e-dacfeb535a2c] Updating instance_info_cache with network_info: [{"id": "486680ca-6e09-4fb7-b37f-20f7a3c2ebb5", "address": "fa:16:3e:2c:80:20", "network": {"id": "ed8a4d36-d0af-4797-9c09-4a9c97ca25e4", "bridge": "br-int", "label": "tempest-ServerPasswordTestJSON-175679285-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "f4120332e24d456cabca8ee50ceaf0d7", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "27681cba-790d-451e-9d12-d179871f375a", "external-id": "cl2-zone-147", "segmentation_id": 147, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap486680ca-6e", "ovs_interfaceid": "486680ca-6e09-4fb7-b37f-20f7a3c2ebb5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 874.657408] env[67169]: DEBUG oslo_concurrency.lockutils [None req-bc42a747-6b10-42c3-8e2b-0674634135bb tempest-ServerPasswordTestJSON-586101290 tempest-ServerPasswordTestJSON-586101290-project-member] Releasing lock "refresh_cache-47ffcce9-3afc-41be-b38e-dacfeb535a2c" 
{{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 874.657715] env[67169]: DEBUG nova.compute.manager [None req-bc42a747-6b10-42c3-8e2b-0674634135bb tempest-ServerPasswordTestJSON-586101290 tempest-ServerPasswordTestJSON-586101290-project-member] [instance: 47ffcce9-3afc-41be-b38e-dacfeb535a2c] Instance network_info: |[{"id": "486680ca-6e09-4fb7-b37f-20f7a3c2ebb5", "address": "fa:16:3e:2c:80:20", "network": {"id": "ed8a4d36-d0af-4797-9c09-4a9c97ca25e4", "bridge": "br-int", "label": "tempest-ServerPasswordTestJSON-175679285-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "f4120332e24d456cabca8ee50ceaf0d7", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "27681cba-790d-451e-9d12-d179871f375a", "external-id": "cl2-zone-147", "segmentation_id": 147, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap486680ca-6e", "ovs_interfaceid": "486680ca-6e09-4fb7-b37f-20f7a3c2ebb5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67169) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 874.658462] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-bc42a747-6b10-42c3-8e2b-0674634135bb tempest-ServerPasswordTestJSON-586101290 tempest-ServerPasswordTestJSON-586101290-project-member] [instance: 47ffcce9-3afc-41be-b38e-dacfeb535a2c] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:2c:80:20', 'network_ref': {'type': 
'OpaqueNetwork', 'network-id': '27681cba-790d-451e-9d12-d179871f375a', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '486680ca-6e09-4fb7-b37f-20f7a3c2ebb5', 'vif_model': 'vmxnet3'}] {{(pid=67169) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 874.665871] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [None req-bc42a747-6b10-42c3-8e2b-0674634135bb tempest-ServerPasswordTestJSON-586101290 tempest-ServerPasswordTestJSON-586101290-project-member] Creating folder: Project (f4120332e24d456cabca8ee50ceaf0d7). Parent ref: group-v566843. {{(pid=67169) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 874.666748] env[67169]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-af7de9cc-2c40-4ab9-a774-35a1da61e620 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 874.677051] env[67169]: INFO nova.virt.vmwareapi.vm_util [None req-bc42a747-6b10-42c3-8e2b-0674634135bb tempest-ServerPasswordTestJSON-586101290 tempest-ServerPasswordTestJSON-586101290-project-member] Created folder: Project (f4120332e24d456cabca8ee50ceaf0d7) in parent group-v566843. [ 874.677051] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [None req-bc42a747-6b10-42c3-8e2b-0674634135bb tempest-ServerPasswordTestJSON-586101290 tempest-ServerPasswordTestJSON-586101290-project-member] Creating folder: Instances. Parent ref: group-v566893. 
{{(pid=67169) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 874.677268] env[67169]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-096bc5d7-95de-414a-9698-3e5994de0204 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 874.686604] env[67169]: INFO nova.virt.vmwareapi.vm_util [None req-bc42a747-6b10-42c3-8e2b-0674634135bb tempest-ServerPasswordTestJSON-586101290 tempest-ServerPasswordTestJSON-586101290-project-member] Created folder: Instances in parent group-v566893. [ 874.686911] env[67169]: DEBUG oslo.service.loopingcall [None req-bc42a747-6b10-42c3-8e2b-0674634135bb tempest-ServerPasswordTestJSON-586101290 tempest-ServerPasswordTestJSON-586101290-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67169) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 874.687424] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 47ffcce9-3afc-41be-b38e-dacfeb535a2c] Creating VM on the ESX host {{(pid=67169) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 874.688108] env[67169]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-32878785-45c6-4972-88fd-c0619ef980c0 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 874.708973] env[67169]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 874.708973] env[67169]: value = "task-2819122" [ 874.708973] env[67169]: _type = "Task" [ 874.708973] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 874.717044] env[67169]: DEBUG oslo_vmware.api [-] Task: {'id': task-2819122, 'name': CreateVM_Task} progress is 0%. 
{{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 874.823044] env[67169]: DEBUG nova.compute.manager [req-bae3f6bf-06ea-4415-a9a5-ae1030f22ec0 req-8d0a1b84-a78a-4fda-b8b9-d8aded96e9c8 service nova] [instance: 47ffcce9-3afc-41be-b38e-dacfeb535a2c] Received event network-vif-plugged-486680ca-6e09-4fb7-b37f-20f7a3c2ebb5 {{(pid=67169) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 874.823225] env[67169]: DEBUG oslo_concurrency.lockutils [req-bae3f6bf-06ea-4415-a9a5-ae1030f22ec0 req-8d0a1b84-a78a-4fda-b8b9-d8aded96e9c8 service nova] Acquiring lock "47ffcce9-3afc-41be-b38e-dacfeb535a2c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 874.823421] env[67169]: DEBUG oslo_concurrency.lockutils [req-bae3f6bf-06ea-4415-a9a5-ae1030f22ec0 req-8d0a1b84-a78a-4fda-b8b9-d8aded96e9c8 service nova] Lock "47ffcce9-3afc-41be-b38e-dacfeb535a2c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 874.823591] env[67169]: DEBUG oslo_concurrency.lockutils [req-bae3f6bf-06ea-4415-a9a5-ae1030f22ec0 req-8d0a1b84-a78a-4fda-b8b9-d8aded96e9c8 service nova] Lock "47ffcce9-3afc-41be-b38e-dacfeb535a2c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 874.823868] env[67169]: DEBUG nova.compute.manager [req-bae3f6bf-06ea-4415-a9a5-ae1030f22ec0 req-8d0a1b84-a78a-4fda-b8b9-d8aded96e9c8 service nova] [instance: 47ffcce9-3afc-41be-b38e-dacfeb535a2c] No waiting events found dispatching network-vif-plugged-486680ca-6e09-4fb7-b37f-20f7a3c2ebb5 {{(pid=67169) pop_instance_event 
/opt/stack/nova/nova/compute/manager.py:320}} [ 874.823917] env[67169]: WARNING nova.compute.manager [req-bae3f6bf-06ea-4415-a9a5-ae1030f22ec0 req-8d0a1b84-a78a-4fda-b8b9-d8aded96e9c8 service nova] [instance: 47ffcce9-3afc-41be-b38e-dacfeb535a2c] Received unexpected event network-vif-plugged-486680ca-6e09-4fb7-b37f-20f7a3c2ebb5 for instance with vm_state building and task_state spawning. [ 875.219559] env[67169]: DEBUG oslo_vmware.api [-] Task: {'id': task-2819122, 'name': CreateVM_Task} progress is 99%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 875.719476] env[67169]: DEBUG oslo_vmware.api [-] Task: {'id': task-2819122, 'name': CreateVM_Task} progress is 99%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 876.225954] env[67169]: DEBUG oslo_vmware.api [-] Task: {'id': task-2819122, 'name': CreateVM_Task, 'duration_secs': 1.296424} completed successfully. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 876.225954] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 47ffcce9-3afc-41be-b38e-dacfeb535a2c] Created VM on the ESX host {{(pid=67169) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 876.225954] env[67169]: DEBUG oslo_concurrency.lockutils [None req-bc42a747-6b10-42c3-8e2b-0674634135bb tempest-ServerPasswordTestJSON-586101290 tempest-ServerPasswordTestJSON-586101290-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 876.225954] env[67169]: DEBUG oslo_concurrency.lockutils [None req-bc42a747-6b10-42c3-8e2b-0674634135bb tempest-ServerPasswordTestJSON-586101290 tempest-ServerPasswordTestJSON-586101290-project-member] Acquired lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" 
{{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 876.225954] env[67169]: DEBUG oslo_concurrency.lockutils [None req-bc42a747-6b10-42c3-8e2b-0674634135bb tempest-ServerPasswordTestJSON-586101290 tempest-ServerPasswordTestJSON-586101290-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 876.225954] env[67169]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-72c631f6-38db-40b6-b8e7-0139b9c56c79 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 876.230282] env[67169]: DEBUG oslo_vmware.api [None req-bc42a747-6b10-42c3-8e2b-0674634135bb tempest-ServerPasswordTestJSON-586101290 tempest-ServerPasswordTestJSON-586101290-project-member] Waiting for the task: (returnval){ [ 876.230282] env[67169]: value = "session[52518c09-e5f2-035d-4dc1-3e29ddf94015]52abef95-e6f9-7d6f-3973-3c5f9bc1598a" [ 876.230282] env[67169]: _type = "Task" [ 876.230282] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 876.238869] env[67169]: DEBUG oslo_vmware.api [None req-bc42a747-6b10-42c3-8e2b-0674634135bb tempest-ServerPasswordTestJSON-586101290 tempest-ServerPasswordTestJSON-586101290-project-member] Task: {'id': session[52518c09-e5f2-035d-4dc1-3e29ddf94015]52abef95-e6f9-7d6f-3973-3c5f9bc1598a, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 876.691183] env[67169]: DEBUG oslo_concurrency.lockutils [None req-d696ce3a-62d6-4cfe-ab03-acc4f9294f1b tempest-ServerShowV257Test-257580876 tempest-ServerShowV257Test-257580876-project-member] Acquiring lock "c86c3850-39bb-4a08-8dbf-f69bd8ca21c9" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 876.691470] env[67169]: DEBUG oslo_concurrency.lockutils [None req-d696ce3a-62d6-4cfe-ab03-acc4f9294f1b tempest-ServerShowV257Test-257580876 tempest-ServerShowV257Test-257580876-project-member] Lock "c86c3850-39bb-4a08-8dbf-f69bd8ca21c9" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 876.739645] env[67169]: DEBUG oslo_concurrency.lockutils [None req-bc42a747-6b10-42c3-8e2b-0674634135bb tempest-ServerPasswordTestJSON-586101290 tempest-ServerPasswordTestJSON-586101290-project-member] Releasing lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 876.739942] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-bc42a747-6b10-42c3-8e2b-0674634135bb tempest-ServerPasswordTestJSON-586101290 tempest-ServerPasswordTestJSON-586101290-project-member] [instance: 47ffcce9-3afc-41be-b38e-dacfeb535a2c] Processing image 285931c9-8b83-4997-8c4d-6a79005e36ba {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 876.740145] env[67169]: DEBUG oslo_concurrency.lockutils [None req-bc42a747-6b10-42c3-8e2b-0674634135bb tempest-ServerPasswordTestJSON-586101290 
tempest-ServerPasswordTestJSON-586101290-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 876.969140] env[67169]: DEBUG nova.compute.manager [req-5f977654-1bd6-4af5-b75e-30e543508a21 req-20d772e2-c4c4-4128-ab21-6573b64df255 service nova] [instance: 47ffcce9-3afc-41be-b38e-dacfeb535a2c] Received event network-changed-486680ca-6e09-4fb7-b37f-20f7a3c2ebb5 {{(pid=67169) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 876.969475] env[67169]: DEBUG nova.compute.manager [req-5f977654-1bd6-4af5-b75e-30e543508a21 req-20d772e2-c4c4-4128-ab21-6573b64df255 service nova] [instance: 47ffcce9-3afc-41be-b38e-dacfeb535a2c] Refreshing instance network info cache due to event network-changed-486680ca-6e09-4fb7-b37f-20f7a3c2ebb5. {{(pid=67169) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 876.969757] env[67169]: DEBUG oslo_concurrency.lockutils [req-5f977654-1bd6-4af5-b75e-30e543508a21 req-20d772e2-c4c4-4128-ab21-6573b64df255 service nova] Acquiring lock "refresh_cache-47ffcce9-3afc-41be-b38e-dacfeb535a2c" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 876.969944] env[67169]: DEBUG oslo_concurrency.lockutils [req-5f977654-1bd6-4af5-b75e-30e543508a21 req-20d772e2-c4c4-4128-ab21-6573b64df255 service nova] Acquired lock "refresh_cache-47ffcce9-3afc-41be-b38e-dacfeb535a2c" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 876.970338] env[67169]: DEBUG nova.network.neutron [req-5f977654-1bd6-4af5-b75e-30e543508a21 req-20d772e2-c4c4-4128-ab21-6573b64df255 service nova] [instance: 47ffcce9-3afc-41be-b38e-dacfeb535a2c] Refreshing network info cache for port 486680ca-6e09-4fb7-b37f-20f7a3c2ebb5 {{(pid=67169) 
_get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 877.602620] env[67169]: DEBUG nova.network.neutron [req-5f977654-1bd6-4af5-b75e-30e543508a21 req-20d772e2-c4c4-4128-ab21-6573b64df255 service nova] [instance: 47ffcce9-3afc-41be-b38e-dacfeb535a2c] Updated VIF entry in instance network info cache for port 486680ca-6e09-4fb7-b37f-20f7a3c2ebb5. {{(pid=67169) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 877.602984] env[67169]: DEBUG nova.network.neutron [req-5f977654-1bd6-4af5-b75e-30e543508a21 req-20d772e2-c4c4-4128-ab21-6573b64df255 service nova] [instance: 47ffcce9-3afc-41be-b38e-dacfeb535a2c] Updating instance_info_cache with network_info: [{"id": "486680ca-6e09-4fb7-b37f-20f7a3c2ebb5", "address": "fa:16:3e:2c:80:20", "network": {"id": "ed8a4d36-d0af-4797-9c09-4a9c97ca25e4", "bridge": "br-int", "label": "tempest-ServerPasswordTestJSON-175679285-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "f4120332e24d456cabca8ee50ceaf0d7", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "27681cba-790d-451e-9d12-d179871f375a", "external-id": "cl2-zone-147", "segmentation_id": 147, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap486680ca-6e", "ovs_interfaceid": "486680ca-6e09-4fb7-b37f-20f7a3c2ebb5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 877.613169] env[67169]: DEBUG 
oslo_concurrency.lockutils [req-5f977654-1bd6-4af5-b75e-30e543508a21 req-20d772e2-c4c4-4128-ab21-6573b64df255 service nova] Releasing lock "refresh_cache-47ffcce9-3afc-41be-b38e-dacfeb535a2c" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 879.831275] env[67169]: DEBUG oslo_concurrency.lockutils [None req-977b05a3-2344-4cd2-8da9-6153c93e0a28 tempest-ServerPasswordTestJSON-586101290 tempest-ServerPasswordTestJSON-586101290-project-member] Acquiring lock "47ffcce9-3afc-41be-b38e-dacfeb535a2c" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 883.659560] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 885.658916] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 885.659184] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Starting heal instance info cache {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 885.659311] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Rebuilding the list of instances to heal {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 885.679940] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 11e90c91-26ca-4397-81a4-975a1d714d19] Skipping network cache update 
for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 885.680124] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 835bf8da-8d8f-4dfd-b0a9-fab02796f39e] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 885.680261] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 85978a3b-052a-4a05-84e6-75c723d49bd8] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 885.680417] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: e2e52693-153a-43dd-b786-dd0758caabe2] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 885.680555] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 885.680681] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 28552f70-695d-40cc-8dfa-bf40d6113220] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 885.680805] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] Skipping network cache update for instance because it is Building. 
{{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 885.680924] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 43b73a7c-eda8-4239-885f-d4fb8fa6f28a] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 885.681087] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 885.681178] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 47ffcce9-3afc-41be-b38e-dacfeb535a2c] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 885.681299] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Didn't find any instances for network info cache update. 
{{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 885.681807] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 885.692381] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 885.692611] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 885.692781] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 885.692931] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67169) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 885.693989] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b35c14f3-5a5c-4ed8-bf24-06d5a837949f {{(pid=67169) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 885.703044] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d60be5b2-2207-4e70-8525-a7aedafab94f {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 885.717099] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8912c5b3-9fa6-4075-a299-e5d4be9eadc7 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 885.723774] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c9125350-7cac-49c1-bea7-db7c900eb147 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 885.754064] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181020MB free_disk=171GB free_vcpus=48 pci_devices=None {{(pid=67169) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 885.754214] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 885.754401] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 885.830695] env[67169]: DEBUG 
nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 11e90c91-26ca-4397-81a4-975a1d714d19 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 885.830862] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 835bf8da-8d8f-4dfd-b0a9-fab02796f39e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 885.830989] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 85978a3b-052a-4a05-84e6-75c723d49bd8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 885.831125] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance e2e52693-153a-43dd-b786-dd0758caabe2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 885.831249] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 885.831368] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 28552f70-695d-40cc-8dfa-bf40d6113220 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 885.831481] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 7a42aeb9-0518-448d-a3a6-8e68d6497922 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 885.831601] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 43b73a7c-eda8-4239-885f-d4fb8fa6f28a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 885.831861] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 885.832065] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 47ffcce9-3afc-41be-b38e-dacfeb535a2c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 885.843456] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 1f0f1960-0c77-4e72-86ee-807819e75d2a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 885.853808] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance ceec0dd3-097b-4ab4-8e16-420d40bbe3d5 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 885.864185] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 41ebddaf-e07d-4925-b9da-758b8e83f545 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 885.873642] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 3c5bc03a-acc3-4601-8155-2cab101be865 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 885.883624] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 3ea7c620-5903-410d-8ca0-68789a5e5194 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 885.893661] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance d0a7fdac-3e41-4539-bef4-0442bc5ad674 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 885.903056] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 2aa47f31-4da8-4ef6-b28c-fe2a03bf8906 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 885.912145] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 310ae1ce-4717-4807-901c-5674677682c3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 885.924368] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 32412c58-a231-40f7-a248-3e46fad5f5b2 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 885.933323] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 86b2381b-676f-46fc-9317-81c0fd272069 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 885.952314] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance dac3617f-32fd-43c5-b8b5-fddf42d94f88 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 885.963187] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance fc12247e-bcca-4635-ba27-be1c9aeaa368 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 885.973589] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance c86c3850-39bb-4a08-8dbf-f69bd8ca21c9 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 885.973835] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67169) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 885.973990] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67169) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 886.257170] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-11e85ebd-4bcc-4cab-96e1-ff03df51976e {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 886.264894] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-68ce8c67-3b7e-4d1c-a2bb-6507654e8e8c {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 886.298360] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4397d4da-0398-4329-b261-084285928e67 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 886.308606] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b7ced86d-526f-4a2b-9f13-de68a9e563f9 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 886.329385] env[67169]: DEBUG nova.compute.provider_tree [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Inventory has not 
changed in ProviderTree for provider: 6570906a-ac37-4859-b1f2-4bbacc48d3f3 {{(pid=67169) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 886.339808] env[67169]: DEBUG nova.scheduler.client.report [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Inventory has not changed for provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 171, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67169) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 886.354391] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67169) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 886.354709] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.600s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 887.332952] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 887.332952] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=67169) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 887.654266] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 888.659077] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 888.659077] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 888.659077] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 888.659688] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 889.659623] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 916.709097] env[67169]: WARNING oslo_vmware.rw_handles [None req-3c83de1c-bff9-4913-ac4c-cf997776c0d4 
tempest-ServersTestFqdnHostnames-427175016 tempest-ServersTestFqdnHostnames-427175016-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 916.709097] env[67169]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 916.709097] env[67169]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 916.709097] env[67169]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 916.709097] env[67169]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 916.709097] env[67169]: ERROR oslo_vmware.rw_handles response.begin() [ 916.709097] env[67169]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 916.709097] env[67169]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 916.709097] env[67169]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 916.709097] env[67169]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 916.709097] env[67169]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 916.709097] env[67169]: ERROR oslo_vmware.rw_handles [ 916.709097] env[67169]: DEBUG nova.virt.vmwareapi.images [None req-3c83de1c-bff9-4913-ac4c-cf997776c0d4 tempest-ServersTestFqdnHostnames-427175016 tempest-ServersTestFqdnHostnames-427175016-project-member] [instance: 11e90c91-26ca-4397-81a4-975a1d714d19] Downloaded image file data 285931c9-8b83-4997-8c4d-6a79005e36ba to vmware_temp/c1a79569-e5a4-4829-a1e3-6aab5be34d9d/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk on the data store datastore2 {{(pid=67169) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 916.710880] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None 
req-3c83de1c-bff9-4913-ac4c-cf997776c0d4 tempest-ServersTestFqdnHostnames-427175016 tempest-ServersTestFqdnHostnames-427175016-project-member] [instance: 11e90c91-26ca-4397-81a4-975a1d714d19] Caching image {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 916.711216] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [None req-3c83de1c-bff9-4913-ac4c-cf997776c0d4 tempest-ServersTestFqdnHostnames-427175016 tempest-ServersTestFqdnHostnames-427175016-project-member] Copying Virtual Disk [datastore2] vmware_temp/c1a79569-e5a4-4829-a1e3-6aab5be34d9d/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk to [datastore2] vmware_temp/c1a79569-e5a4-4829-a1e3-6aab5be34d9d/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk {{(pid=67169) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 916.711560] env[67169]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-4c52eeef-ddf4-4217-ae17-3a9105962bc1 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 916.719933] env[67169]: DEBUG oslo_vmware.api [None req-3c83de1c-bff9-4913-ac4c-cf997776c0d4 tempest-ServersTestFqdnHostnames-427175016 tempest-ServersTestFqdnHostnames-427175016-project-member] Waiting for the task: (returnval){ [ 916.719933] env[67169]: value = "task-2819123" [ 916.719933] env[67169]: _type = "Task" [ 916.719933] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 916.728043] env[67169]: DEBUG oslo_vmware.api [None req-3c83de1c-bff9-4913-ac4c-cf997776c0d4 tempest-ServersTestFqdnHostnames-427175016 tempest-ServersTestFqdnHostnames-427175016-project-member] Task: {'id': task-2819123, 'name': CopyVirtualDisk_Task} progress is 0%. 
{{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 917.230605] env[67169]: DEBUG oslo_vmware.exceptions [None req-3c83de1c-bff9-4913-ac4c-cf997776c0d4 tempest-ServersTestFqdnHostnames-427175016 tempest-ServersTestFqdnHostnames-427175016-project-member] Fault InvalidArgument not matched. {{(pid=67169) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 917.230915] env[67169]: DEBUG oslo_concurrency.lockutils [None req-3c83de1c-bff9-4913-ac4c-cf997776c0d4 tempest-ServersTestFqdnHostnames-427175016 tempest-ServersTestFqdnHostnames-427175016-project-member] Releasing lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 917.231511] env[67169]: ERROR nova.compute.manager [None req-3c83de1c-bff9-4913-ac4c-cf997776c0d4 tempest-ServersTestFqdnHostnames-427175016 tempest-ServersTestFqdnHostnames-427175016-project-member] [instance: 11e90c91-26ca-4397-81a4-975a1d714d19] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 917.231511] env[67169]: Faults: ['InvalidArgument'] [ 917.231511] env[67169]: ERROR nova.compute.manager [instance: 11e90c91-26ca-4397-81a4-975a1d714d19] Traceback (most recent call last): [ 917.231511] env[67169]: ERROR nova.compute.manager [instance: 11e90c91-26ca-4397-81a4-975a1d714d19] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 917.231511] env[67169]: ERROR nova.compute.manager [instance: 11e90c91-26ca-4397-81a4-975a1d714d19] yield resources [ 917.231511] env[67169]: ERROR nova.compute.manager [instance: 11e90c91-26ca-4397-81a4-975a1d714d19] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 917.231511] env[67169]: ERROR nova.compute.manager 
[instance: 11e90c91-26ca-4397-81a4-975a1d714d19] self.driver.spawn(context, instance, image_meta, [ 917.231511] env[67169]: ERROR nova.compute.manager [instance: 11e90c91-26ca-4397-81a4-975a1d714d19] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 917.231511] env[67169]: ERROR nova.compute.manager [instance: 11e90c91-26ca-4397-81a4-975a1d714d19] self._vmops.spawn(context, instance, image_meta, injected_files, [ 917.231511] env[67169]: ERROR nova.compute.manager [instance: 11e90c91-26ca-4397-81a4-975a1d714d19] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 917.231511] env[67169]: ERROR nova.compute.manager [instance: 11e90c91-26ca-4397-81a4-975a1d714d19] self._fetch_image_if_missing(context, vi) [ 917.231511] env[67169]: ERROR nova.compute.manager [instance: 11e90c91-26ca-4397-81a4-975a1d714d19] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 917.231511] env[67169]: ERROR nova.compute.manager [instance: 11e90c91-26ca-4397-81a4-975a1d714d19] image_cache(vi, tmp_image_ds_loc) [ 917.231511] env[67169]: ERROR nova.compute.manager [instance: 11e90c91-26ca-4397-81a4-975a1d714d19] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 917.231511] env[67169]: ERROR nova.compute.manager [instance: 11e90c91-26ca-4397-81a4-975a1d714d19] vm_util.copy_virtual_disk( [ 917.231511] env[67169]: ERROR nova.compute.manager [instance: 11e90c91-26ca-4397-81a4-975a1d714d19] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 917.231511] env[67169]: ERROR nova.compute.manager [instance: 11e90c91-26ca-4397-81a4-975a1d714d19] session._wait_for_task(vmdk_copy_task) [ 917.231511] env[67169]: ERROR nova.compute.manager [instance: 11e90c91-26ca-4397-81a4-975a1d714d19] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 917.231511] env[67169]: ERROR nova.compute.manager [instance: 
11e90c91-26ca-4397-81a4-975a1d714d19] return self.wait_for_task(task_ref) [ 917.231511] env[67169]: ERROR nova.compute.manager [instance: 11e90c91-26ca-4397-81a4-975a1d714d19] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 917.231511] env[67169]: ERROR nova.compute.manager [instance: 11e90c91-26ca-4397-81a4-975a1d714d19] return evt.wait() [ 917.231511] env[67169]: ERROR nova.compute.manager [instance: 11e90c91-26ca-4397-81a4-975a1d714d19] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 917.231511] env[67169]: ERROR nova.compute.manager [instance: 11e90c91-26ca-4397-81a4-975a1d714d19] result = hub.switch() [ 917.231511] env[67169]: ERROR nova.compute.manager [instance: 11e90c91-26ca-4397-81a4-975a1d714d19] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 917.231511] env[67169]: ERROR nova.compute.manager [instance: 11e90c91-26ca-4397-81a4-975a1d714d19] return self.greenlet.switch() [ 917.231511] env[67169]: ERROR nova.compute.manager [instance: 11e90c91-26ca-4397-81a4-975a1d714d19] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 917.231511] env[67169]: ERROR nova.compute.manager [instance: 11e90c91-26ca-4397-81a4-975a1d714d19] self.f(*self.args, **self.kw) [ 917.231511] env[67169]: ERROR nova.compute.manager [instance: 11e90c91-26ca-4397-81a4-975a1d714d19] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 917.231511] env[67169]: ERROR nova.compute.manager [instance: 11e90c91-26ca-4397-81a4-975a1d714d19] raise exceptions.translate_fault(task_info.error) [ 917.231511] env[67169]: ERROR nova.compute.manager [instance: 11e90c91-26ca-4397-81a4-975a1d714d19] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 917.231511] env[67169]: ERROR nova.compute.manager 
[instance: 11e90c91-26ca-4397-81a4-975a1d714d19] Faults: ['InvalidArgument'] [ 917.231511] env[67169]: ERROR nova.compute.manager [instance: 11e90c91-26ca-4397-81a4-975a1d714d19] [ 917.232647] env[67169]: INFO nova.compute.manager [None req-3c83de1c-bff9-4913-ac4c-cf997776c0d4 tempest-ServersTestFqdnHostnames-427175016 tempest-ServersTestFqdnHostnames-427175016-project-member] [instance: 11e90c91-26ca-4397-81a4-975a1d714d19] Terminating instance [ 917.233449] env[67169]: DEBUG oslo_concurrency.lockutils [None req-96da241f-f17f-4952-914f-45ae126f7058 tempest-FloatingIPsAssociationNegativeTestJSON-216368034 tempest-FloatingIPsAssociationNegativeTestJSON-216368034-project-member] Acquired lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 917.233685] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-96da241f-f17f-4952-914f-45ae126f7058 tempest-FloatingIPsAssociationNegativeTestJSON-216368034 tempest-FloatingIPsAssociationNegativeTestJSON-216368034-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 917.233955] env[67169]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-02526ae7-4219-4511-adfc-9bdeb874a971 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 917.236432] env[67169]: DEBUG nova.compute.manager [None req-3c83de1c-bff9-4913-ac4c-cf997776c0d4 tempest-ServersTestFqdnHostnames-427175016 tempest-ServersTestFqdnHostnames-427175016-project-member] [instance: 11e90c91-26ca-4397-81a4-975a1d714d19] Start destroying the instance on the hypervisor. 
{{(pid=67169) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 917.236622] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-3c83de1c-bff9-4913-ac4c-cf997776c0d4 tempest-ServersTestFqdnHostnames-427175016 tempest-ServersTestFqdnHostnames-427175016-project-member] [instance: 11e90c91-26ca-4397-81a4-975a1d714d19] Destroying instance {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 917.237407] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d8300ef3-e5e7-43d5-9811-f87b0084d9c0 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 917.244807] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-3c83de1c-bff9-4913-ac4c-cf997776c0d4 tempest-ServersTestFqdnHostnames-427175016 tempest-ServersTestFqdnHostnames-427175016-project-member] [instance: 11e90c91-26ca-4397-81a4-975a1d714d19] Unregistering the VM {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 917.245048] env[67169]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-71e3aaf2-c667-4996-9c84-711cfb97f503 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 917.247348] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-96da241f-f17f-4952-914f-45ae126f7058 tempest-FloatingIPsAssociationNegativeTestJSON-216368034 tempest-FloatingIPsAssociationNegativeTestJSON-216368034-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 917.247483] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-96da241f-f17f-4952-914f-45ae126f7058 tempest-FloatingIPsAssociationNegativeTestJSON-216368034 tempest-FloatingIPsAssociationNegativeTestJSON-216368034-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=67169) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 917.248525] env[67169]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-e83cac06-4df3-465e-8564-36f18a2899f6 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 917.253429] env[67169]: DEBUG oslo_vmware.api [None req-96da241f-f17f-4952-914f-45ae126f7058 tempest-FloatingIPsAssociationNegativeTestJSON-216368034 tempest-FloatingIPsAssociationNegativeTestJSON-216368034-project-member] Waiting for the task: (returnval){ [ 917.253429] env[67169]: value = "session[52518c09-e5f2-035d-4dc1-3e29ddf94015]521ee2ff-7f10-84a9-98b1-86bcb8a2793d" [ 917.253429] env[67169]: _type = "Task" [ 917.253429] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 917.260370] env[67169]: DEBUG oslo_vmware.api [None req-96da241f-f17f-4952-914f-45ae126f7058 tempest-FloatingIPsAssociationNegativeTestJSON-216368034 tempest-FloatingIPsAssociationNegativeTestJSON-216368034-project-member] Task: {'id': session[52518c09-e5f2-035d-4dc1-3e29ddf94015]521ee2ff-7f10-84a9-98b1-86bcb8a2793d, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 917.314330] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-3c83de1c-bff9-4913-ac4c-cf997776c0d4 tempest-ServersTestFqdnHostnames-427175016 tempest-ServersTestFqdnHostnames-427175016-project-member] [instance: 11e90c91-26ca-4397-81a4-975a1d714d19] Unregistered the VM {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 917.314550] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-3c83de1c-bff9-4913-ac4c-cf997776c0d4 tempest-ServersTestFqdnHostnames-427175016 tempest-ServersTestFqdnHostnames-427175016-project-member] [instance: 11e90c91-26ca-4397-81a4-975a1d714d19] Deleting contents of the VM from datastore datastore2 {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 917.314728] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-3c83de1c-bff9-4913-ac4c-cf997776c0d4 tempest-ServersTestFqdnHostnames-427175016 tempest-ServersTestFqdnHostnames-427175016-project-member] Deleting the datastore file [datastore2] 11e90c91-26ca-4397-81a4-975a1d714d19 {{(pid=67169) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 917.314998] env[67169]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-dca3b518-8a6b-4f79-af96-beede04fe30e {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 917.321758] env[67169]: DEBUG oslo_vmware.api [None req-3c83de1c-bff9-4913-ac4c-cf997776c0d4 tempest-ServersTestFqdnHostnames-427175016 tempest-ServersTestFqdnHostnames-427175016-project-member] Waiting for the task: (returnval){ [ 917.321758] env[67169]: value = "task-2819125" [ 917.321758] env[67169]: _type = "Task" [ 917.321758] env[67169]: } to complete. 
{{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 917.329299] env[67169]: DEBUG oslo_vmware.api [None req-3c83de1c-bff9-4913-ac4c-cf997776c0d4 tempest-ServersTestFqdnHostnames-427175016 tempest-ServersTestFqdnHostnames-427175016-project-member] Task: {'id': task-2819125, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 917.765236] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-96da241f-f17f-4952-914f-45ae126f7058 tempest-FloatingIPsAssociationNegativeTestJSON-216368034 tempest-FloatingIPsAssociationNegativeTestJSON-216368034-project-member] [instance: 835bf8da-8d8f-4dfd-b0a9-fab02796f39e] Preparing fetch location {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 917.765524] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-96da241f-f17f-4952-914f-45ae126f7058 tempest-FloatingIPsAssociationNegativeTestJSON-216368034 tempest-FloatingIPsAssociationNegativeTestJSON-216368034-project-member] Creating directory with path [datastore2] vmware_temp/a1c33892-d83c-43cb-84a1-01f3ad737a74/285931c9-8b83-4997-8c4d-6a79005e36ba {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 917.765722] env[67169]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-d9117d06-29eb-4adf-a38c-c2e0bec71168 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 917.777210] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-96da241f-f17f-4952-914f-45ae126f7058 tempest-FloatingIPsAssociationNegativeTestJSON-216368034 tempest-FloatingIPsAssociationNegativeTestJSON-216368034-project-member] Created directory with path [datastore2] vmware_temp/a1c33892-d83c-43cb-84a1-01f3ad737a74/285931c9-8b83-4997-8c4d-6a79005e36ba {{(pid=67169) mkdir 
/opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 917.777396] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-96da241f-f17f-4952-914f-45ae126f7058 tempest-FloatingIPsAssociationNegativeTestJSON-216368034 tempest-FloatingIPsAssociationNegativeTestJSON-216368034-project-member] [instance: 835bf8da-8d8f-4dfd-b0a9-fab02796f39e] Fetch image to [datastore2] vmware_temp/a1c33892-d83c-43cb-84a1-01f3ad737a74/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 917.777602] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-96da241f-f17f-4952-914f-45ae126f7058 tempest-FloatingIPsAssociationNegativeTestJSON-216368034 tempest-FloatingIPsAssociationNegativeTestJSON-216368034-project-member] [instance: 835bf8da-8d8f-4dfd-b0a9-fab02796f39e] Downloading image file data 285931c9-8b83-4997-8c4d-6a79005e36ba to [datastore2] vmware_temp/a1c33892-d83c-43cb-84a1-01f3ad737a74/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk on the data store datastore2 {{(pid=67169) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 917.778335] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f38823c0-d631-44ed-809c-2bc9f9aba16f {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 917.784779] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7c47e028-a784-442a-ace1-53ca4fd7408c {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 917.793306] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d0464f7d-9fec-404e-b408-06576cb1821a {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 917.825050] env[67169]: DEBUG 
oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-73c98f00-4110-4c34-b944-47d5992cfec6 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 917.832297] env[67169]: DEBUG oslo_vmware.api [None req-3c83de1c-bff9-4913-ac4c-cf997776c0d4 tempest-ServersTestFqdnHostnames-427175016 tempest-ServersTestFqdnHostnames-427175016-project-member] Task: {'id': task-2819125, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.080791} completed successfully. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 917.833618] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-3c83de1c-bff9-4913-ac4c-cf997776c0d4 tempest-ServersTestFqdnHostnames-427175016 tempest-ServersTestFqdnHostnames-427175016-project-member] Deleted the datastore file {{(pid=67169) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 917.833813] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-3c83de1c-bff9-4913-ac4c-cf997776c0d4 tempest-ServersTestFqdnHostnames-427175016 tempest-ServersTestFqdnHostnames-427175016-project-member] [instance: 11e90c91-26ca-4397-81a4-975a1d714d19] Deleted contents of the VM from datastore datastore2 {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 917.833982] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-3c83de1c-bff9-4913-ac4c-cf997776c0d4 tempest-ServersTestFqdnHostnames-427175016 tempest-ServersTestFqdnHostnames-427175016-project-member] [instance: 11e90c91-26ca-4397-81a4-975a1d714d19] Instance destroyed {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 917.834165] env[67169]: INFO nova.compute.manager [None req-3c83de1c-bff9-4913-ac4c-cf997776c0d4 tempest-ServersTestFqdnHostnames-427175016 tempest-ServersTestFqdnHostnames-427175016-project-member] [instance: 11e90c91-26ca-4397-81a4-975a1d714d19] Took 0.60 seconds to destroy the instance 
on the hypervisor. [ 917.835885] env[67169]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-43484745-bf45-4b34-9979-0fe950efbdab {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 917.837731] env[67169]: DEBUG nova.compute.claims [None req-3c83de1c-bff9-4913-ac4c-cf997776c0d4 tempest-ServersTestFqdnHostnames-427175016 tempest-ServersTestFqdnHostnames-427175016-project-member] [instance: 11e90c91-26ca-4397-81a4-975a1d714d19] Aborting claim: {{(pid=67169) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 917.837921] env[67169]: DEBUG oslo_concurrency.lockutils [None req-3c83de1c-bff9-4913-ac4c-cf997776c0d4 tempest-ServersTestFqdnHostnames-427175016 tempest-ServersTestFqdnHostnames-427175016-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 917.838154] env[67169]: DEBUG oslo_concurrency.lockutils [None req-3c83de1c-bff9-4913-ac4c-cf997776c0d4 tempest-ServersTestFqdnHostnames-427175016 tempest-ServersTestFqdnHostnames-427175016-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 917.860424] env[67169]: DEBUG nova.virt.vmwareapi.images [None req-96da241f-f17f-4952-914f-45ae126f7058 tempest-FloatingIPsAssociationNegativeTestJSON-216368034 tempest-FloatingIPsAssociationNegativeTestJSON-216368034-project-member] [instance: 835bf8da-8d8f-4dfd-b0a9-fab02796f39e] Downloading image file data 285931c9-8b83-4997-8c4d-6a79005e36ba to the data store datastore2 {{(pid=67169) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 917.911037] env[67169]: DEBUG oslo_vmware.rw_handles 
[None req-96da241f-f17f-4952-914f-45ae126f7058 tempest-FloatingIPsAssociationNegativeTestJSON-216368034 tempest-FloatingIPsAssociationNegativeTestJSON-216368034-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/a1c33892-d83c-43cb-84a1-01f3ad737a74/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=67169) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 917.970641] env[67169]: DEBUG oslo_vmware.rw_handles [None req-96da241f-f17f-4952-914f-45ae126f7058 tempest-FloatingIPsAssociationNegativeTestJSON-216368034 tempest-FloatingIPsAssociationNegativeTestJSON-216368034-project-member] Completed reading data from the image iterator. {{(pid=67169) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 917.970863] env[67169]: DEBUG oslo_vmware.rw_handles [None req-96da241f-f17f-4952-914f-45ae126f7058 tempest-FloatingIPsAssociationNegativeTestJSON-216368034 tempest-FloatingIPsAssociationNegativeTestJSON-216368034-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/a1c33892-d83c-43cb-84a1-01f3ad737a74/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=67169) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 918.198166] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c2a9b8fa-43a0-4ee9-b11a-718433e0f78d {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 918.211972] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0c31244e-2d87-4d7c-b07b-ebf684a3cf68 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 918.247586] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-88e3b2d8-0a93-4c27-9482-214b0bf5343d {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 918.253488] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9a47f37b-c607-417e-8285-4fe26b324946 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 918.269052] env[67169]: DEBUG nova.compute.provider_tree [None req-3c83de1c-bff9-4913-ac4c-cf997776c0d4 tempest-ServersTestFqdnHostnames-427175016 tempest-ServersTestFqdnHostnames-427175016-project-member] Inventory has not changed in ProviderTree for provider: 6570906a-ac37-4859-b1f2-4bbacc48d3f3 {{(pid=67169) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 918.277602] env[67169]: DEBUG nova.scheduler.client.report [None req-3c83de1c-bff9-4913-ac4c-cf997776c0d4 tempest-ServersTestFqdnHostnames-427175016 tempest-ServersTestFqdnHostnames-427175016-project-member] Inventory has not changed for provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 
'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 171, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67169) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 918.292695] env[67169]: DEBUG oslo_concurrency.lockutils [None req-3c83de1c-bff9-4913-ac4c-cf997776c0d4 tempest-ServersTestFqdnHostnames-427175016 tempest-ServersTestFqdnHostnames-427175016-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.454s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 918.293237] env[67169]: ERROR nova.compute.manager [None req-3c83de1c-bff9-4913-ac4c-cf997776c0d4 tempest-ServersTestFqdnHostnames-427175016 tempest-ServersTestFqdnHostnames-427175016-project-member] [instance: 11e90c91-26ca-4397-81a4-975a1d714d19] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 918.293237] env[67169]: Faults: ['InvalidArgument'] [ 918.293237] env[67169]: ERROR nova.compute.manager [instance: 11e90c91-26ca-4397-81a4-975a1d714d19] Traceback (most recent call last): [ 918.293237] env[67169]: ERROR nova.compute.manager [instance: 11e90c91-26ca-4397-81a4-975a1d714d19] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 918.293237] env[67169]: ERROR nova.compute.manager [instance: 11e90c91-26ca-4397-81a4-975a1d714d19] self.driver.spawn(context, instance, image_meta, [ 918.293237] env[67169]: ERROR nova.compute.manager [instance: 11e90c91-26ca-4397-81a4-975a1d714d19] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 918.293237] env[67169]: ERROR nova.compute.manager [instance: 11e90c91-26ca-4397-81a4-975a1d714d19] self._vmops.spawn(context, instance, 
image_meta, injected_files, [ 918.293237] env[67169]: ERROR nova.compute.manager [instance: 11e90c91-26ca-4397-81a4-975a1d714d19] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 918.293237] env[67169]: ERROR nova.compute.manager [instance: 11e90c91-26ca-4397-81a4-975a1d714d19] self._fetch_image_if_missing(context, vi) [ 918.293237] env[67169]: ERROR nova.compute.manager [instance: 11e90c91-26ca-4397-81a4-975a1d714d19] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 918.293237] env[67169]: ERROR nova.compute.manager [instance: 11e90c91-26ca-4397-81a4-975a1d714d19] image_cache(vi, tmp_image_ds_loc) [ 918.293237] env[67169]: ERROR nova.compute.manager [instance: 11e90c91-26ca-4397-81a4-975a1d714d19] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 918.293237] env[67169]: ERROR nova.compute.manager [instance: 11e90c91-26ca-4397-81a4-975a1d714d19] vm_util.copy_virtual_disk( [ 918.293237] env[67169]: ERROR nova.compute.manager [instance: 11e90c91-26ca-4397-81a4-975a1d714d19] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 918.293237] env[67169]: ERROR nova.compute.manager [instance: 11e90c91-26ca-4397-81a4-975a1d714d19] session._wait_for_task(vmdk_copy_task) [ 918.293237] env[67169]: ERROR nova.compute.manager [instance: 11e90c91-26ca-4397-81a4-975a1d714d19] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 918.293237] env[67169]: ERROR nova.compute.manager [instance: 11e90c91-26ca-4397-81a4-975a1d714d19] return self.wait_for_task(task_ref) [ 918.293237] env[67169]: ERROR nova.compute.manager [instance: 11e90c91-26ca-4397-81a4-975a1d714d19] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 918.293237] env[67169]: ERROR nova.compute.manager [instance: 11e90c91-26ca-4397-81a4-975a1d714d19] return evt.wait() [ 918.293237] env[67169]: 
ERROR nova.compute.manager [instance: 11e90c91-26ca-4397-81a4-975a1d714d19] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 918.293237] env[67169]: ERROR nova.compute.manager [instance: 11e90c91-26ca-4397-81a4-975a1d714d19] result = hub.switch() [ 918.293237] env[67169]: ERROR nova.compute.manager [instance: 11e90c91-26ca-4397-81a4-975a1d714d19] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 918.293237] env[67169]: ERROR nova.compute.manager [instance: 11e90c91-26ca-4397-81a4-975a1d714d19] return self.greenlet.switch() [ 918.293237] env[67169]: ERROR nova.compute.manager [instance: 11e90c91-26ca-4397-81a4-975a1d714d19] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 918.293237] env[67169]: ERROR nova.compute.manager [instance: 11e90c91-26ca-4397-81a4-975a1d714d19] self.f(*self.args, **self.kw) [ 918.293237] env[67169]: ERROR nova.compute.manager [instance: 11e90c91-26ca-4397-81a4-975a1d714d19] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 918.293237] env[67169]: ERROR nova.compute.manager [instance: 11e90c91-26ca-4397-81a4-975a1d714d19] raise exceptions.translate_fault(task_info.error) [ 918.293237] env[67169]: ERROR nova.compute.manager [instance: 11e90c91-26ca-4397-81a4-975a1d714d19] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 918.293237] env[67169]: ERROR nova.compute.manager [instance: 11e90c91-26ca-4397-81a4-975a1d714d19] Faults: ['InvalidArgument'] [ 918.293237] env[67169]: ERROR nova.compute.manager [instance: 11e90c91-26ca-4397-81a4-975a1d714d19] [ 918.294267] env[67169]: DEBUG nova.compute.utils [None req-3c83de1c-bff9-4913-ac4c-cf997776c0d4 tempest-ServersTestFqdnHostnames-427175016 tempest-ServersTestFqdnHostnames-427175016-project-member] [instance: 11e90c91-26ca-4397-81a4-975a1d714d19] 
VimFaultException {{(pid=67169) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 918.295367] env[67169]: DEBUG nova.compute.manager [None req-3c83de1c-bff9-4913-ac4c-cf997776c0d4 tempest-ServersTestFqdnHostnames-427175016 tempest-ServersTestFqdnHostnames-427175016-project-member] [instance: 11e90c91-26ca-4397-81a4-975a1d714d19] Build of instance 11e90c91-26ca-4397-81a4-975a1d714d19 was re-scheduled: A specified parameter was not correct: fileType [ 918.295367] env[67169]: Faults: ['InvalidArgument'] {{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 918.295746] env[67169]: DEBUG nova.compute.manager [None req-3c83de1c-bff9-4913-ac4c-cf997776c0d4 tempest-ServersTestFqdnHostnames-427175016 tempest-ServersTestFqdnHostnames-427175016-project-member] [instance: 11e90c91-26ca-4397-81a4-975a1d714d19] Unplugging VIFs for instance {{(pid=67169) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 918.295942] env[67169]: DEBUG nova.compute.manager [None req-3c83de1c-bff9-4913-ac4c-cf997776c0d4 tempest-ServersTestFqdnHostnames-427175016 tempest-ServersTestFqdnHostnames-427175016-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67169) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 918.296139] env[67169]: DEBUG nova.compute.manager [None req-3c83de1c-bff9-4913-ac4c-cf997776c0d4 tempest-ServersTestFqdnHostnames-427175016 tempest-ServersTestFqdnHostnames-427175016-project-member] [instance: 11e90c91-26ca-4397-81a4-975a1d714d19] Deallocating network for instance {{(pid=67169) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 918.296308] env[67169]: DEBUG nova.network.neutron [None req-3c83de1c-bff9-4913-ac4c-cf997776c0d4 tempest-ServersTestFqdnHostnames-427175016 tempest-ServersTestFqdnHostnames-427175016-project-member] [instance: 11e90c91-26ca-4397-81a4-975a1d714d19] deallocate_for_instance() {{(pid=67169) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 918.737204] env[67169]: DEBUG nova.network.neutron [None req-3c83de1c-bff9-4913-ac4c-cf997776c0d4 tempest-ServersTestFqdnHostnames-427175016 tempest-ServersTestFqdnHostnames-427175016-project-member] [instance: 11e90c91-26ca-4397-81a4-975a1d714d19] Updating instance_info_cache with network_info: [] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 918.750858] env[67169]: INFO nova.compute.manager [None req-3c83de1c-bff9-4913-ac4c-cf997776c0d4 tempest-ServersTestFqdnHostnames-427175016 tempest-ServersTestFqdnHostnames-427175016-project-member] [instance: 11e90c91-26ca-4397-81a4-975a1d714d19] Took 0.45 seconds to deallocate network for instance. 
[ 918.864250] env[67169]: INFO nova.scheduler.client.report [None req-3c83de1c-bff9-4913-ac4c-cf997776c0d4 tempest-ServersTestFqdnHostnames-427175016 tempest-ServersTestFqdnHostnames-427175016-project-member] Deleted allocations for instance 11e90c91-26ca-4397-81a4-975a1d714d19 [ 918.889966] env[67169]: DEBUG oslo_concurrency.lockutils [None req-3c83de1c-bff9-4913-ac4c-cf997776c0d4 tempest-ServersTestFqdnHostnames-427175016 tempest-ServersTestFqdnHostnames-427175016-project-member] Lock "11e90c91-26ca-4397-81a4-975a1d714d19" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 338.461s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 918.891345] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "11e90c91-26ca-4397-81a4-975a1d714d19" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 333.205s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 918.891744] env[67169]: INFO nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 11e90c91-26ca-4397-81a4-975a1d714d19] During sync_power_state the instance has a pending task (spawning). Skip. 
[ 918.891831] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "11e90c91-26ca-4397-81a4-975a1d714d19" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.001s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 918.892532] env[67169]: DEBUG oslo_concurrency.lockutils [None req-fd272883-7a6e-4ce0-a77d-b8b9b5567e0a tempest-ServersTestFqdnHostnames-427175016 tempest-ServersTestFqdnHostnames-427175016-project-member] Lock "11e90c91-26ca-4397-81a4-975a1d714d19" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 137.498s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 918.893123] env[67169]: DEBUG oslo_concurrency.lockutils [None req-fd272883-7a6e-4ce0-a77d-b8b9b5567e0a tempest-ServersTestFqdnHostnames-427175016 tempest-ServersTestFqdnHostnames-427175016-project-member] Acquiring lock "11e90c91-26ca-4397-81a4-975a1d714d19-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 918.893411] env[67169]: DEBUG oslo_concurrency.lockutils [None req-fd272883-7a6e-4ce0-a77d-b8b9b5567e0a tempest-ServersTestFqdnHostnames-427175016 tempest-ServersTestFqdnHostnames-427175016-project-member] Lock "11e90c91-26ca-4397-81a4-975a1d714d19-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 918.893627] env[67169]: DEBUG oslo_concurrency.lockutils [None req-fd272883-7a6e-4ce0-a77d-b8b9b5567e0a tempest-ServersTestFqdnHostnames-427175016 
tempest-ServersTestFqdnHostnames-427175016-project-member] Lock "11e90c91-26ca-4397-81a4-975a1d714d19-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 918.895506] env[67169]: INFO nova.compute.manager [None req-fd272883-7a6e-4ce0-a77d-b8b9b5567e0a tempest-ServersTestFqdnHostnames-427175016 tempest-ServersTestFqdnHostnames-427175016-project-member] [instance: 11e90c91-26ca-4397-81a4-975a1d714d19] Terminating instance [ 918.897662] env[67169]: DEBUG nova.compute.manager [None req-fd272883-7a6e-4ce0-a77d-b8b9b5567e0a tempest-ServersTestFqdnHostnames-427175016 tempest-ServersTestFqdnHostnames-427175016-project-member] [instance: 11e90c91-26ca-4397-81a4-975a1d714d19] Start destroying the instance on the hypervisor. {{(pid=67169) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 918.897902] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-fd272883-7a6e-4ce0-a77d-b8b9b5567e0a tempest-ServersTestFqdnHostnames-427175016 tempest-ServersTestFqdnHostnames-427175016-project-member] [instance: 11e90c91-26ca-4397-81a4-975a1d714d19] Destroying instance {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 918.898270] env[67169]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-ba94de9c-75a5-4cf1-ab38-3236ec095080 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 918.902330] env[67169]: DEBUG nova.compute.manager [None req-3f08fb17-986a-4906-8db3-9cba171f3daf tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] Starting instance... 
{{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 918.909082] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9f4c0a8c-51c6-451e-ad3e-3cdb82f4fcb8 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 918.939784] env[67169]: WARNING nova.virt.vmwareapi.vmops [None req-fd272883-7a6e-4ce0-a77d-b8b9b5567e0a tempest-ServersTestFqdnHostnames-427175016 tempest-ServersTestFqdnHostnames-427175016-project-member] [instance: 11e90c91-26ca-4397-81a4-975a1d714d19] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 11e90c91-26ca-4397-81a4-975a1d714d19 could not be found. [ 918.939784] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-fd272883-7a6e-4ce0-a77d-b8b9b5567e0a tempest-ServersTestFqdnHostnames-427175016 tempest-ServersTestFqdnHostnames-427175016-project-member] [instance: 11e90c91-26ca-4397-81a4-975a1d714d19] Instance destroyed {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 918.939784] env[67169]: INFO nova.compute.manager [None req-fd272883-7a6e-4ce0-a77d-b8b9b5567e0a tempest-ServersTestFqdnHostnames-427175016 tempest-ServersTestFqdnHostnames-427175016-project-member] [instance: 11e90c91-26ca-4397-81a4-975a1d714d19] Took 0.04 seconds to destroy the instance on the hypervisor. [ 918.939784] env[67169]: DEBUG oslo.service.loopingcall [None req-fd272883-7a6e-4ce0-a77d-b8b9b5567e0a tempest-ServersTestFqdnHostnames-427175016 tempest-ServersTestFqdnHostnames-427175016-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=67169) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 918.939784] env[67169]: DEBUG nova.compute.manager [-] [instance: 11e90c91-26ca-4397-81a4-975a1d714d19] Deallocating network for instance {{(pid=67169) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 918.939784] env[67169]: DEBUG nova.network.neutron [-] [instance: 11e90c91-26ca-4397-81a4-975a1d714d19] deallocate_for_instance() {{(pid=67169) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 918.958721] env[67169]: DEBUG oslo_concurrency.lockutils [None req-3f08fb17-986a-4906-8db3-9cba171f3daf tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 918.958972] env[67169]: DEBUG oslo_concurrency.lockutils [None req-3f08fb17-986a-4906-8db3-9cba171f3daf tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 918.960474] env[67169]: INFO nova.compute.claims [None req-3f08fb17-986a-4906-8db3-9cba171f3daf tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 918.973472] env[67169]: DEBUG nova.network.neutron [-] [instance: 11e90c91-26ca-4397-81a4-975a1d714d19] Updating instance_info_cache with network_info: [] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 918.982126] env[67169]: 
INFO nova.compute.manager [-] [instance: 11e90c91-26ca-4397-81a4-975a1d714d19] Took 0.04 seconds to deallocate network for instance. [ 919.066864] env[67169]: DEBUG oslo_concurrency.lockutils [None req-fd272883-7a6e-4ce0-a77d-b8b9b5567e0a tempest-ServersTestFqdnHostnames-427175016 tempest-ServersTestFqdnHostnames-427175016-project-member] Lock "11e90c91-26ca-4397-81a4-975a1d714d19" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.174s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 919.264418] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-615c98c2-b8df-4963-926b-a3bb13070556 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 919.272553] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-916081c0-6b7c-43e0-90a1-818aaaf7f9b6 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 919.302447] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e082413c-2a28-4636-ab5d-a45ec70a07fd {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 919.309239] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fa11b3f3-2a7c-4c93-bc4e-3ec3f480bb74 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 919.322580] env[67169]: DEBUG nova.compute.provider_tree [None req-3f08fb17-986a-4906-8db3-9cba171f3daf tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] Inventory has not changed in ProviderTree for provider: 6570906a-ac37-4859-b1f2-4bbacc48d3f3 
{{(pid=67169) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 919.331749] env[67169]: DEBUG nova.scheduler.client.report [None req-3f08fb17-986a-4906-8db3-9cba171f3daf tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] Inventory has not changed for provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 171, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67169) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 919.344715] env[67169]: DEBUG oslo_concurrency.lockutils [None req-3f08fb17-986a-4906-8db3-9cba171f3daf tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.386s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 919.345151] env[67169]: DEBUG nova.compute.manager [None req-3f08fb17-986a-4906-8db3-9cba171f3daf tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] Start building networks asynchronously for instance. 
{{(pid=67169) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 919.380087] env[67169]: DEBUG nova.compute.utils [None req-3f08fb17-986a-4906-8db3-9cba171f3daf tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] Using /dev/sd instead of None {{(pid=67169) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 919.381563] env[67169]: DEBUG nova.compute.manager [None req-3f08fb17-986a-4906-8db3-9cba171f3daf tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] Allocating IP information in the background. {{(pid=67169) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 919.381734] env[67169]: DEBUG nova.network.neutron [None req-3f08fb17-986a-4906-8db3-9cba171f3daf tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] allocate_for_instance() {{(pid=67169) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 919.390758] env[67169]: DEBUG nova.compute.manager [None req-3f08fb17-986a-4906-8db3-9cba171f3daf tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] Start building block device mappings for instance. {{(pid=67169) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 919.461917] env[67169]: DEBUG nova.compute.manager [None req-3f08fb17-986a-4906-8db3-9cba171f3daf tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] Start spawning the instance on the hypervisor. 
{{(pid=67169) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 919.491106] env[67169]: DEBUG nova.virt.hardware [None req-3f08fb17-986a-4906-8db3-9cba171f3daf tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-03-06T19:55:10Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-03-06T19:54:55Z,direct_url=,disk_format='vmdk',id=285931c9-8b83-4997-8c4d-6a79005e36ba,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c31f6504bb73492890b262ff43fdf9bc',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-03-06T19:54:55Z,virtual_size=,visibility=), allow threads: False {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 919.491106] env[67169]: DEBUG nova.virt.hardware [None req-3f08fb17-986a-4906-8db3-9cba171f3daf tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] Flavor limits 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 919.491106] env[67169]: DEBUG nova.virt.hardware [None req-3f08fb17-986a-4906-8db3-9cba171f3daf tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] Image limits 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 919.491106] env[67169]: DEBUG nova.virt.hardware [None req-3f08fb17-986a-4906-8db3-9cba171f3daf tempest-DeleteServersAdminTestJSON-214609884 
tempest-DeleteServersAdminTestJSON-214609884-project-member] Flavor pref 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 919.491106] env[67169]: DEBUG nova.virt.hardware [None req-3f08fb17-986a-4906-8db3-9cba171f3daf tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] Image pref 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 919.491106] env[67169]: DEBUG nova.virt.hardware [None req-3f08fb17-986a-4906-8db3-9cba171f3daf tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 919.491106] env[67169]: DEBUG nova.virt.hardware [None req-3f08fb17-986a-4906-8db3-9cba171f3daf tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 919.491106] env[67169]: DEBUG nova.virt.hardware [None req-3f08fb17-986a-4906-8db3-9cba171f3daf tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67169) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 919.491106] env[67169]: DEBUG nova.virt.hardware [None req-3f08fb17-986a-4906-8db3-9cba171f3daf tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] Got 1 possible topologies {{(pid=67169) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 919.491106] env[67169]: DEBUG nova.virt.hardware [None 
req-3f08fb17-986a-4906-8db3-9cba171f3daf tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 919.491106] env[67169]: DEBUG nova.virt.hardware [None req-3f08fb17-986a-4906-8db3-9cba171f3daf tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 919.491741] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7d09c5a1-da49-41a2-bfe3-f19c0cdbfabf {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 919.498955] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b0584e2a-5862-434d-8f78-43a007577b74 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 919.505271] env[67169]: DEBUG nova.policy [None req-3f08fb17-986a-4906-8db3-9cba171f3daf tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '91c0570c9cec4ba2a7d248a66b2f70d5', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '202ec4287c3042809c86951050621ffc', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67169) authorize /opt/stack/nova/nova/policy.py:203}} [ 920.076526] env[67169]: DEBUG 
nova.network.neutron [None req-3f08fb17-986a-4906-8db3-9cba171f3daf tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] Successfully created port: b1c4485b-dfda-40c4-88ee-b860f0a189ab {{(pid=67169) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 921.286805] env[67169]: DEBUG nova.network.neutron [None req-3f08fb17-986a-4906-8db3-9cba171f3daf tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] Successfully updated port: b1c4485b-dfda-40c4-88ee-b860f0a189ab {{(pid=67169) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 921.301643] env[67169]: DEBUG oslo_concurrency.lockutils [None req-3f08fb17-986a-4906-8db3-9cba171f3daf tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] Acquiring lock "refresh_cache-1f0f1960-0c77-4e72-86ee-807819e75d2a" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 921.301643] env[67169]: DEBUG oslo_concurrency.lockutils [None req-3f08fb17-986a-4906-8db3-9cba171f3daf tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] Acquired lock "refresh_cache-1f0f1960-0c77-4e72-86ee-807819e75d2a" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 921.301643] env[67169]: DEBUG nova.network.neutron [None req-3f08fb17-986a-4906-8db3-9cba171f3daf tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] Building network info cache for instance {{(pid=67169) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 921.370331] env[67169]: DEBUG nova.network.neutron [None 
req-3f08fb17-986a-4906-8db3-9cba171f3daf tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] Instance cache missing network info. {{(pid=67169) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 921.473874] env[67169]: DEBUG nova.compute.manager [req-0f411aee-cbec-468f-a7a9-ebe31134bdf4 req-f6c88b0c-fd6c-47e0-844b-929e3577c875 service nova] [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] Received event network-vif-plugged-b1c4485b-dfda-40c4-88ee-b860f0a189ab {{(pid=67169) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 921.474114] env[67169]: DEBUG oslo_concurrency.lockutils [req-0f411aee-cbec-468f-a7a9-ebe31134bdf4 req-f6c88b0c-fd6c-47e0-844b-929e3577c875 service nova] Acquiring lock "1f0f1960-0c77-4e72-86ee-807819e75d2a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 921.474381] env[67169]: DEBUG oslo_concurrency.lockutils [req-0f411aee-cbec-468f-a7a9-ebe31134bdf4 req-f6c88b0c-fd6c-47e0-844b-929e3577c875 service nova] Lock "1f0f1960-0c77-4e72-86ee-807819e75d2a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 921.474490] env[67169]: DEBUG oslo_concurrency.lockutils [req-0f411aee-cbec-468f-a7a9-ebe31134bdf4 req-f6c88b0c-fd6c-47e0-844b-929e3577c875 service nova] Lock "1f0f1960-0c77-4e72-86ee-807819e75d2a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 921.474659] env[67169]: DEBUG nova.compute.manager [req-0f411aee-cbec-468f-a7a9-ebe31134bdf4 
req-f6c88b0c-fd6c-47e0-844b-929e3577c875 service nova] [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] No waiting events found dispatching network-vif-plugged-b1c4485b-dfda-40c4-88ee-b860f0a189ab {{(pid=67169) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 921.474828] env[67169]: WARNING nova.compute.manager [req-0f411aee-cbec-468f-a7a9-ebe31134bdf4 req-f6c88b0c-fd6c-47e0-844b-929e3577c875 service nova] [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] Received unexpected event network-vif-plugged-b1c4485b-dfda-40c4-88ee-b860f0a189ab for instance with vm_state building and task_state spawning. [ 921.663659] env[67169]: DEBUG nova.network.neutron [None req-3f08fb17-986a-4906-8db3-9cba171f3daf tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] Updating instance_info_cache with network_info: [{"id": "b1c4485b-dfda-40c4-88ee-b860f0a189ab", "address": "fa:16:3e:f2:d1:7f", "network": {"id": "617508ba-3567-4508-96b5-a01447ece634", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.214", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "c31f6504bb73492890b262ff43fdf9bc", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "c9bc2632-36f9-4912-8782-8bbb789f909d", "external-id": "nsx-vlan-transportzone-897", "segmentation_id": 897, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb1c4485b-df", "ovs_interfaceid": "b1c4485b-dfda-40c4-88ee-b860f0a189ab", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, 
"preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 921.679527] env[67169]: DEBUG oslo_concurrency.lockutils [None req-3f08fb17-986a-4906-8db3-9cba171f3daf tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] Releasing lock "refresh_cache-1f0f1960-0c77-4e72-86ee-807819e75d2a" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 921.679886] env[67169]: DEBUG nova.compute.manager [None req-3f08fb17-986a-4906-8db3-9cba171f3daf tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] Instance network_info: |[{"id": "b1c4485b-dfda-40c4-88ee-b860f0a189ab", "address": "fa:16:3e:f2:d1:7f", "network": {"id": "617508ba-3567-4508-96b5-a01447ece634", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.214", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "c31f6504bb73492890b262ff43fdf9bc", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "c9bc2632-36f9-4912-8782-8bbb789f909d", "external-id": "nsx-vlan-transportzone-897", "segmentation_id": 897, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb1c4485b-df", "ovs_interfaceid": "b1c4485b-dfda-40c4-88ee-b860f0a189ab", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67169) 
_allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 921.680301] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-3f08fb17-986a-4906-8db3-9cba171f3daf tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:f2:d1:7f', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'c9bc2632-36f9-4912-8782-8bbb789f909d', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'b1c4485b-dfda-40c4-88ee-b860f0a189ab', 'vif_model': 'vmxnet3'}] {{(pid=67169) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 921.687899] env[67169]: DEBUG oslo.service.loopingcall [None req-3f08fb17-986a-4906-8db3-9cba171f3daf tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67169) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 921.688673] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] Creating VM on the ESX host {{(pid=67169) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 921.689081] env[67169]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-1abd26d9-d8fc-468f-92b5-2de92125814b {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 921.709745] env[67169]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 921.709745] env[67169]: value = "task-2819126" [ 921.709745] env[67169]: _type = "Task" [ 921.709745] env[67169]: } to complete. 
{{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 921.718236] env[67169]: DEBUG oslo_vmware.api [-] Task: {'id': task-2819126, 'name': CreateVM_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 922.219718] env[67169]: DEBUG oslo_vmware.api [-] Task: {'id': task-2819126, 'name': CreateVM_Task} progress is 25%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 922.722040] env[67169]: DEBUG oslo_vmware.api [-] Task: {'id': task-2819126, 'name': CreateVM_Task, 'duration_secs': 0.637477} completed successfully. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 922.722040] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] Created VM on the ESX host {{(pid=67169) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 922.722040] env[67169]: DEBUG oslo_concurrency.lockutils [None req-3f08fb17-986a-4906-8db3-9cba171f3daf tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 922.722040] env[67169]: DEBUG oslo_concurrency.lockutils [None req-3f08fb17-986a-4906-8db3-9cba171f3daf tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] Acquired lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 922.722753] env[67169]: DEBUG oslo_concurrency.lockutils [None req-3f08fb17-986a-4906-8db3-9cba171f3daf tempest-DeleteServersAdminTestJSON-214609884 
tempest-DeleteServersAdminTestJSON-214609884-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 922.724018] env[67169]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-f5889458-acb5-4db4-9bde-cdf1b61fba01 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 922.728964] env[67169]: DEBUG oslo_vmware.api [None req-3f08fb17-986a-4906-8db3-9cba171f3daf tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] Waiting for the task: (returnval){ [ 922.728964] env[67169]: value = "session[52518c09-e5f2-035d-4dc1-3e29ddf94015]52c80426-a706-8483-ab57-3eba27f2d423" [ 922.728964] env[67169]: _type = "Task" [ 922.728964] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 922.739112] env[67169]: DEBUG oslo_vmware.api [None req-3f08fb17-986a-4906-8db3-9cba171f3daf tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] Task: {'id': session[52518c09-e5f2-035d-4dc1-3e29ddf94015]52c80426-a706-8483-ab57-3eba27f2d423, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 923.240475] env[67169]: DEBUG oslo_concurrency.lockutils [None req-3f08fb17-986a-4906-8db3-9cba171f3daf tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] Releasing lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 923.242382] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-3f08fb17-986a-4906-8db3-9cba171f3daf tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] Processing image 285931c9-8b83-4997-8c4d-6a79005e36ba {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 923.242382] env[67169]: DEBUG oslo_concurrency.lockutils [None req-3f08fb17-986a-4906-8db3-9cba171f3daf tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 923.396335] env[67169]: DEBUG oslo_concurrency.lockutils [None req-6adc6d6a-e56e-495d-b704-34db936e70c4 tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] Acquiring lock "1f0f1960-0c77-4e72-86ee-807819e75d2a" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 923.645284] env[67169]: DEBUG nova.compute.manager [req-6380ab42-3dc2-49a5-9728-f2d0245bf69f req-2812a4df-f395-478c-9efb-5051c1c97cae service nova] [instance: 
1f0f1960-0c77-4e72-86ee-807819e75d2a] Received event network-changed-b1c4485b-dfda-40c4-88ee-b860f0a189ab {{(pid=67169) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 923.645527] env[67169]: DEBUG nova.compute.manager [req-6380ab42-3dc2-49a5-9728-f2d0245bf69f req-2812a4df-f395-478c-9efb-5051c1c97cae service nova] [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] Refreshing instance network info cache due to event network-changed-b1c4485b-dfda-40c4-88ee-b860f0a189ab. {{(pid=67169) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 923.646216] env[67169]: DEBUG oslo_concurrency.lockutils [req-6380ab42-3dc2-49a5-9728-f2d0245bf69f req-2812a4df-f395-478c-9efb-5051c1c97cae service nova] Acquiring lock "refresh_cache-1f0f1960-0c77-4e72-86ee-807819e75d2a" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 923.646216] env[67169]: DEBUG oslo_concurrency.lockutils [req-6380ab42-3dc2-49a5-9728-f2d0245bf69f req-2812a4df-f395-478c-9efb-5051c1c97cae service nova] Acquired lock "refresh_cache-1f0f1960-0c77-4e72-86ee-807819e75d2a" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 923.646216] env[67169]: DEBUG nova.network.neutron [req-6380ab42-3dc2-49a5-9728-f2d0245bf69f req-2812a4df-f395-478c-9efb-5051c1c97cae service nova] [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] Refreshing network info cache for port b1c4485b-dfda-40c4-88ee-b860f0a189ab {{(pid=67169) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 924.295808] env[67169]: DEBUG nova.network.neutron [req-6380ab42-3dc2-49a5-9728-f2d0245bf69f req-2812a4df-f395-478c-9efb-5051c1c97cae service nova] [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] Updated VIF entry in instance network info cache for port b1c4485b-dfda-40c4-88ee-b860f0a189ab. 
{{(pid=67169) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 924.296217] env[67169]: DEBUG nova.network.neutron [req-6380ab42-3dc2-49a5-9728-f2d0245bf69f req-2812a4df-f395-478c-9efb-5051c1c97cae service nova] [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] Updating instance_info_cache with network_info: [{"id": "b1c4485b-dfda-40c4-88ee-b860f0a189ab", "address": "fa:16:3e:f2:d1:7f", "network": {"id": "617508ba-3567-4508-96b5-a01447ece634", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.214", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "c31f6504bb73492890b262ff43fdf9bc", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "c9bc2632-36f9-4912-8782-8bbb789f909d", "external-id": "nsx-vlan-transportzone-897", "segmentation_id": 897, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb1c4485b-df", "ovs_interfaceid": "b1c4485b-dfda-40c4-88ee-b860f0a189ab", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 924.318633] env[67169]: DEBUG oslo_concurrency.lockutils [req-6380ab42-3dc2-49a5-9728-f2d0245bf69f req-2812a4df-f395-478c-9efb-5051c1c97cae service nova] Releasing lock "refresh_cache-1f0f1960-0c77-4e72-86ee-807819e75d2a" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 928.044858] env[67169]: DEBUG oslo_concurrency.lockutils [None 
req-0e91d283-0c2c-4495-ad84-ebaaebe9122d tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Acquiring lock "7bf839c0-3ec8-4329-823d-de1fae4833cb" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 928.044858] env[67169]: DEBUG oslo_concurrency.lockutils [None req-0e91d283-0c2c-4495-ad84-ebaaebe9122d tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Lock "7bf839c0-3ec8-4329-823d-de1fae4833cb" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 943.659595] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 946.658662] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 946.658917] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Starting heal instance info cache {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 946.658917] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Rebuilding the list of instances to heal {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 946.681496] env[67169]: DEBUG 
nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 835bf8da-8d8f-4dfd-b0a9-fab02796f39e] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 946.681666] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 85978a3b-052a-4a05-84e6-75c723d49bd8] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 946.681804] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: e2e52693-153a-43dd-b786-dd0758caabe2] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 946.681935] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 946.682087] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 28552f70-695d-40cc-8dfa-bf40d6113220] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 946.682225] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] Skipping network cache update for instance because it is Building. 
{{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 946.682348] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 43b73a7c-eda8-4239-885f-d4fb8fa6f28a] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 946.682470] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 946.682588] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 47ffcce9-3afc-41be-b38e-dacfeb535a2c] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 946.682706] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 946.682823] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Didn't find any instances for network info cache update. 
{{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 946.683348] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 946.683495] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=67169) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 946.683657] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 946.695042] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 946.695233] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 946.695401] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 946.695553] env[67169]: DEBUG 
nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67169) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 946.696613] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e9642638-c544-4f9e-959b-8af46d96f2ab {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 946.705329] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a98fa858-6473-4a05-a4dc-e4cb1a48c7ae {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 946.719220] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-53ca6906-67b5-44fc-bed6-906d553b0b1e {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 946.725661] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-96857969-d7cf-4629-99fe-72581e3b24f5 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 946.755330] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181051MB free_disk=171GB free_vcpus=48 pci_devices=None {{(pid=67169) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 946.755475] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67169) 
inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 946.755665] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 946.828098] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 835bf8da-8d8f-4dfd-b0a9-fab02796f39e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 946.828260] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 85978a3b-052a-4a05-84e6-75c723d49bd8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 946.828392] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance e2e52693-153a-43dd-b786-dd0758caabe2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 946.828518] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 946.828640] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 28552f70-695d-40cc-8dfa-bf40d6113220 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 946.828760] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 7a42aeb9-0518-448d-a3a6-8e68d6497922 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 946.828878] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 43b73a7c-eda8-4239-885f-d4fb8fa6f28a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 946.828993] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 946.829119] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 47ffcce9-3afc-41be-b38e-dacfeb535a2c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 946.829233] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 1f0f1960-0c77-4e72-86ee-807819e75d2a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 946.840940] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance ceec0dd3-097b-4ab4-8e16-420d40bbe3d5 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 946.851535] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 41ebddaf-e07d-4925-b9da-758b8e83f545 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 946.861603] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 3c5bc03a-acc3-4601-8155-2cab101be865 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 946.871444] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 3ea7c620-5903-410d-8ca0-68789a5e5194 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 946.880773] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance d0a7fdac-3e41-4539-bef4-0442bc5ad674 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 946.892723] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 2aa47f31-4da8-4ef6-b28c-fe2a03bf8906 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 946.901566] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 310ae1ce-4717-4807-901c-5674677682c3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 946.911446] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 32412c58-a231-40f7-a248-3e46fad5f5b2 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 946.920209] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 86b2381b-676f-46fc-9317-81c0fd272069 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 946.928716] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance dac3617f-32fd-43c5-b8b5-fddf42d94f88 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 946.937274] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance fc12247e-bcca-4635-ba27-be1c9aeaa368 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 946.946235] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance c86c3850-39bb-4a08-8dbf-f69bd8ca21c9 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 946.955275] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 7bf839c0-3ec8-4329-823d-de1fae4833cb has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 946.955507] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67169) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 946.955654] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67169) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 947.198989] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ec45eb5c-4088-4772-871a-6fe9e533cfc1 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 947.206477] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1de8499a-41bf-48e0-8631-532cb06b7fbd {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 947.239559] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5cb3b0ca-ded0-4721-88aa-5d25fc8da8a2 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 947.247236] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-64d6315e-97cb-4f1e-b115-56a2ceae3e1d {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 947.260228] env[67169]: DEBUG 
nova.compute.provider_tree [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Inventory has not changed in ProviderTree for provider: 6570906a-ac37-4859-b1f2-4bbacc48d3f3 {{(pid=67169) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 947.269213] env[67169]: DEBUG nova.scheduler.client.report [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Inventory has not changed for provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 171, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67169) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 947.284178] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67169) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 947.284399] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.529s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 949.261031] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 949.261382] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] 
Running periodic task ComputeManager._poll_volume_usage {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 949.658761] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 950.658581] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 950.658964] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 966.723686] env[67169]: WARNING oslo_vmware.rw_handles [None req-96da241f-f17f-4952-914f-45ae126f7058 tempest-FloatingIPsAssociationNegativeTestJSON-216368034 tempest-FloatingIPsAssociationNegativeTestJSON-216368034-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 966.723686] env[67169]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 966.723686] env[67169]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 966.723686] env[67169]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 966.723686] env[67169]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 966.723686] env[67169]: ERROR oslo_vmware.rw_handles response.begin() [ 966.723686] env[67169]: ERROR 
oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 966.723686] env[67169]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 966.723686] env[67169]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 966.723686] env[67169]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 966.723686] env[67169]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 966.723686] env[67169]: ERROR oslo_vmware.rw_handles [ 966.724543] env[67169]: DEBUG nova.virt.vmwareapi.images [None req-96da241f-f17f-4952-914f-45ae126f7058 tempest-FloatingIPsAssociationNegativeTestJSON-216368034 tempest-FloatingIPsAssociationNegativeTestJSON-216368034-project-member] [instance: 835bf8da-8d8f-4dfd-b0a9-fab02796f39e] Downloaded image file data 285931c9-8b83-4997-8c4d-6a79005e36ba to vmware_temp/a1c33892-d83c-43cb-84a1-01f3ad737a74/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk on the data store datastore2 {{(pid=67169) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 966.725857] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-96da241f-f17f-4952-914f-45ae126f7058 tempest-FloatingIPsAssociationNegativeTestJSON-216368034 tempest-FloatingIPsAssociationNegativeTestJSON-216368034-project-member] [instance: 835bf8da-8d8f-4dfd-b0a9-fab02796f39e] Caching image {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 966.726130] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [None req-96da241f-f17f-4952-914f-45ae126f7058 tempest-FloatingIPsAssociationNegativeTestJSON-216368034 tempest-FloatingIPsAssociationNegativeTestJSON-216368034-project-member] Copying Virtual Disk [datastore2] vmware_temp/a1c33892-d83c-43cb-84a1-01f3ad737a74/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk to [datastore2] 
vmware_temp/a1c33892-d83c-43cb-84a1-01f3ad737a74/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk {{(pid=67169) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 966.726418] env[67169]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-bac8e9f1-04b7-435d-ad31-7a8331399d1e {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 966.734873] env[67169]: DEBUG oslo_vmware.api [None req-96da241f-f17f-4952-914f-45ae126f7058 tempest-FloatingIPsAssociationNegativeTestJSON-216368034 tempest-FloatingIPsAssociationNegativeTestJSON-216368034-project-member] Waiting for the task: (returnval){ [ 966.734873] env[67169]: value = "task-2819127" [ 966.734873] env[67169]: _type = "Task" [ 966.734873] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 966.742900] env[67169]: DEBUG oslo_vmware.api [None req-96da241f-f17f-4952-914f-45ae126f7058 tempest-FloatingIPsAssociationNegativeTestJSON-216368034 tempest-FloatingIPsAssociationNegativeTestJSON-216368034-project-member] Task: {'id': task-2819127, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 967.244415] env[67169]: DEBUG oslo_vmware.exceptions [None req-96da241f-f17f-4952-914f-45ae126f7058 tempest-FloatingIPsAssociationNegativeTestJSON-216368034 tempest-FloatingIPsAssociationNegativeTestJSON-216368034-project-member] Fault InvalidArgument not matched. 
{{(pid=67169) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 967.244702] env[67169]: DEBUG oslo_concurrency.lockutils [None req-96da241f-f17f-4952-914f-45ae126f7058 tempest-FloatingIPsAssociationNegativeTestJSON-216368034 tempest-FloatingIPsAssociationNegativeTestJSON-216368034-project-member] Releasing lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 967.245309] env[67169]: ERROR nova.compute.manager [None req-96da241f-f17f-4952-914f-45ae126f7058 tempest-FloatingIPsAssociationNegativeTestJSON-216368034 tempest-FloatingIPsAssociationNegativeTestJSON-216368034-project-member] [instance: 835bf8da-8d8f-4dfd-b0a9-fab02796f39e] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 967.245309] env[67169]: Faults: ['InvalidArgument'] [ 967.245309] env[67169]: ERROR nova.compute.manager [instance: 835bf8da-8d8f-4dfd-b0a9-fab02796f39e] Traceback (most recent call last): [ 967.245309] env[67169]: ERROR nova.compute.manager [instance: 835bf8da-8d8f-4dfd-b0a9-fab02796f39e] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 967.245309] env[67169]: ERROR nova.compute.manager [instance: 835bf8da-8d8f-4dfd-b0a9-fab02796f39e] yield resources [ 967.245309] env[67169]: ERROR nova.compute.manager [instance: 835bf8da-8d8f-4dfd-b0a9-fab02796f39e] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 967.245309] env[67169]: ERROR nova.compute.manager [instance: 835bf8da-8d8f-4dfd-b0a9-fab02796f39e] self.driver.spawn(context, instance, image_meta, [ 967.245309] env[67169]: ERROR nova.compute.manager [instance: 835bf8da-8d8f-4dfd-b0a9-fab02796f39e] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 
967.245309] env[67169]: ERROR nova.compute.manager [instance: 835bf8da-8d8f-4dfd-b0a9-fab02796f39e] self._vmops.spawn(context, instance, image_meta, injected_files, [ 967.245309] env[67169]: ERROR nova.compute.manager [instance: 835bf8da-8d8f-4dfd-b0a9-fab02796f39e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 967.245309] env[67169]: ERROR nova.compute.manager [instance: 835bf8da-8d8f-4dfd-b0a9-fab02796f39e] self._fetch_image_if_missing(context, vi) [ 967.245309] env[67169]: ERROR nova.compute.manager [instance: 835bf8da-8d8f-4dfd-b0a9-fab02796f39e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 967.245309] env[67169]: ERROR nova.compute.manager [instance: 835bf8da-8d8f-4dfd-b0a9-fab02796f39e] image_cache(vi, tmp_image_ds_loc) [ 967.245309] env[67169]: ERROR nova.compute.manager [instance: 835bf8da-8d8f-4dfd-b0a9-fab02796f39e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 967.245309] env[67169]: ERROR nova.compute.manager [instance: 835bf8da-8d8f-4dfd-b0a9-fab02796f39e] vm_util.copy_virtual_disk( [ 967.245309] env[67169]: ERROR nova.compute.manager [instance: 835bf8da-8d8f-4dfd-b0a9-fab02796f39e] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 967.245309] env[67169]: ERROR nova.compute.manager [instance: 835bf8da-8d8f-4dfd-b0a9-fab02796f39e] session._wait_for_task(vmdk_copy_task) [ 967.245309] env[67169]: ERROR nova.compute.manager [instance: 835bf8da-8d8f-4dfd-b0a9-fab02796f39e] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 967.245309] env[67169]: ERROR nova.compute.manager [instance: 835bf8da-8d8f-4dfd-b0a9-fab02796f39e] return self.wait_for_task(task_ref) [ 967.245309] env[67169]: ERROR nova.compute.manager [instance: 835bf8da-8d8f-4dfd-b0a9-fab02796f39e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 
967.245309] env[67169]: ERROR nova.compute.manager [instance: 835bf8da-8d8f-4dfd-b0a9-fab02796f39e] return evt.wait() [ 967.245309] env[67169]: ERROR nova.compute.manager [instance: 835bf8da-8d8f-4dfd-b0a9-fab02796f39e] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 967.245309] env[67169]: ERROR nova.compute.manager [instance: 835bf8da-8d8f-4dfd-b0a9-fab02796f39e] result = hub.switch() [ 967.245309] env[67169]: ERROR nova.compute.manager [instance: 835bf8da-8d8f-4dfd-b0a9-fab02796f39e] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 967.245309] env[67169]: ERROR nova.compute.manager [instance: 835bf8da-8d8f-4dfd-b0a9-fab02796f39e] return self.greenlet.switch() [ 967.245309] env[67169]: ERROR nova.compute.manager [instance: 835bf8da-8d8f-4dfd-b0a9-fab02796f39e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 967.245309] env[67169]: ERROR nova.compute.manager [instance: 835bf8da-8d8f-4dfd-b0a9-fab02796f39e] self.f(*self.args, **self.kw) [ 967.245309] env[67169]: ERROR nova.compute.manager [instance: 835bf8da-8d8f-4dfd-b0a9-fab02796f39e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 967.245309] env[67169]: ERROR nova.compute.manager [instance: 835bf8da-8d8f-4dfd-b0a9-fab02796f39e] raise exceptions.translate_fault(task_info.error) [ 967.245309] env[67169]: ERROR nova.compute.manager [instance: 835bf8da-8d8f-4dfd-b0a9-fab02796f39e] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 967.245309] env[67169]: ERROR nova.compute.manager [instance: 835bf8da-8d8f-4dfd-b0a9-fab02796f39e] Faults: ['InvalidArgument'] [ 967.245309] env[67169]: ERROR nova.compute.manager [instance: 835bf8da-8d8f-4dfd-b0a9-fab02796f39e] [ 967.246390] env[67169]: INFO nova.compute.manager [None req-96da241f-f17f-4952-914f-45ae126f7058 
tempest-FloatingIPsAssociationNegativeTestJSON-216368034 tempest-FloatingIPsAssociationNegativeTestJSON-216368034-project-member] [instance: 835bf8da-8d8f-4dfd-b0a9-fab02796f39e] Terminating instance [ 967.247169] env[67169]: DEBUG oslo_concurrency.lockutils [None req-91fc429c-9045-417c-aff0-aeb3f0fac0f7 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Acquired lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 967.247402] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-91fc429c-9045-417c-aff0-aeb3f0fac0f7 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 967.247625] env[67169]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-334a609a-bd10-4c4c-aa98-2956dba1a17d {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 967.249800] env[67169]: DEBUG nova.compute.manager [None req-96da241f-f17f-4952-914f-45ae126f7058 tempest-FloatingIPsAssociationNegativeTestJSON-216368034 tempest-FloatingIPsAssociationNegativeTestJSON-216368034-project-member] [instance: 835bf8da-8d8f-4dfd-b0a9-fab02796f39e] Start destroying the instance on the hypervisor. 
{{(pid=67169) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 967.250838] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-96da241f-f17f-4952-914f-45ae126f7058 tempest-FloatingIPsAssociationNegativeTestJSON-216368034 tempest-FloatingIPsAssociationNegativeTestJSON-216368034-project-member] [instance: 835bf8da-8d8f-4dfd-b0a9-fab02796f39e] Destroying instance {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 967.250838] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-75463f4e-a45e-47f2-9b51-488d551e2a90 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 967.257666] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-96da241f-f17f-4952-914f-45ae126f7058 tempest-FloatingIPsAssociationNegativeTestJSON-216368034 tempest-FloatingIPsAssociationNegativeTestJSON-216368034-project-member] [instance: 835bf8da-8d8f-4dfd-b0a9-fab02796f39e] Unregistering the VM {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 967.257892] env[67169]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-1ab129e6-58b8-429c-a1ab-e57424baf5c7 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 967.260167] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-91fc429c-9045-417c-aff0-aeb3f0fac0f7 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 967.260356] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-91fc429c-9045-417c-aff0-aeb3f0fac0f7 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=67169) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 967.261324] env[67169]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-7d2ca753-b0b3-4d2f-b6d2-d518a90e9edf {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 967.265986] env[67169]: DEBUG oslo_vmware.api [None req-91fc429c-9045-417c-aff0-aeb3f0fac0f7 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Waiting for the task: (returnval){ [ 967.265986] env[67169]: value = "session[52518c09-e5f2-035d-4dc1-3e29ddf94015]52990efe-8bb2-d440-9f14-b4c85d0aab6b" [ 967.265986] env[67169]: _type = "Task" [ 967.265986] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 967.273172] env[67169]: DEBUG oslo_vmware.api [None req-91fc429c-9045-417c-aff0-aeb3f0fac0f7 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Task: {'id': session[52518c09-e5f2-035d-4dc1-3e29ddf94015]52990efe-8bb2-d440-9f14-b4c85d0aab6b, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 967.326171] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-96da241f-f17f-4952-914f-45ae126f7058 tempest-FloatingIPsAssociationNegativeTestJSON-216368034 tempest-FloatingIPsAssociationNegativeTestJSON-216368034-project-member] [instance: 835bf8da-8d8f-4dfd-b0a9-fab02796f39e] Unregistered the VM {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 967.326425] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-96da241f-f17f-4952-914f-45ae126f7058 tempest-FloatingIPsAssociationNegativeTestJSON-216368034 tempest-FloatingIPsAssociationNegativeTestJSON-216368034-project-member] [instance: 835bf8da-8d8f-4dfd-b0a9-fab02796f39e] Deleting contents of the VM from datastore datastore2 {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 967.326689] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-96da241f-f17f-4952-914f-45ae126f7058 tempest-FloatingIPsAssociationNegativeTestJSON-216368034 tempest-FloatingIPsAssociationNegativeTestJSON-216368034-project-member] Deleting the datastore file [datastore2] 835bf8da-8d8f-4dfd-b0a9-fab02796f39e {{(pid=67169) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 967.326987] env[67169]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-f31e335a-a1da-4a75-a8a7-dabde4a4633d {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 967.332836] env[67169]: DEBUG oslo_vmware.api [None req-96da241f-f17f-4952-914f-45ae126f7058 tempest-FloatingIPsAssociationNegativeTestJSON-216368034 tempest-FloatingIPsAssociationNegativeTestJSON-216368034-project-member] Waiting for the task: (returnval){ [ 967.332836] env[67169]: value = "task-2819129" [ 967.332836] env[67169]: _type = "Task" [ 967.332836] env[67169]: } to complete. 
{{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 967.342626] env[67169]: DEBUG oslo_vmware.api [None req-96da241f-f17f-4952-914f-45ae126f7058 tempest-FloatingIPsAssociationNegativeTestJSON-216368034 tempest-FloatingIPsAssociationNegativeTestJSON-216368034-project-member] Task: {'id': task-2819129, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 967.776873] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-91fc429c-9045-417c-aff0-aeb3f0fac0f7 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: 85978a3b-052a-4a05-84e6-75c723d49bd8] Preparing fetch location {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 967.777213] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-91fc429c-9045-417c-aff0-aeb3f0fac0f7 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Creating directory with path [datastore2] vmware_temp/99155099-bdd2-483d-b8ea-a97dc199a30e/285931c9-8b83-4997-8c4d-6a79005e36ba {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 967.777402] env[67169]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-834b5dc1-7df3-4ee8-adee-53ed29e2f96e {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 967.788425] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-91fc429c-9045-417c-aff0-aeb3f0fac0f7 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Created directory with path [datastore2] vmware_temp/99155099-bdd2-483d-b8ea-a97dc199a30e/285931c9-8b83-4997-8c4d-6a79005e36ba {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 967.788575] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None 
req-91fc429c-9045-417c-aff0-aeb3f0fac0f7 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: 85978a3b-052a-4a05-84e6-75c723d49bd8] Fetch image to [datastore2] vmware_temp/99155099-bdd2-483d-b8ea-a97dc199a30e/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 967.788748] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-91fc429c-9045-417c-aff0-aeb3f0fac0f7 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: 85978a3b-052a-4a05-84e6-75c723d49bd8] Downloading image file data 285931c9-8b83-4997-8c4d-6a79005e36ba to [datastore2] vmware_temp/99155099-bdd2-483d-b8ea-a97dc199a30e/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk on the data store datastore2 {{(pid=67169) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 967.789483] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5f269b55-3782-45a6-a244-20c09d262c97 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 967.795911] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-48d9e20a-35ea-40fb-9348-7f5c9cce0595 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 967.804674] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f95ba191-2762-4d9b-8a9a-aff812b3d17e {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 967.838103] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c4cb6e8d-0c54-457d-98d5-8e9e5c495c36 {{(pid=67169) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 967.845235] env[67169]: DEBUG oslo_vmware.api [None req-96da241f-f17f-4952-914f-45ae126f7058 tempest-FloatingIPsAssociationNegativeTestJSON-216368034 tempest-FloatingIPsAssociationNegativeTestJSON-216368034-project-member] Task: {'id': task-2819129, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.0787} completed successfully. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 967.846661] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-96da241f-f17f-4952-914f-45ae126f7058 tempest-FloatingIPsAssociationNegativeTestJSON-216368034 tempest-FloatingIPsAssociationNegativeTestJSON-216368034-project-member] Deleted the datastore file {{(pid=67169) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 967.846855] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-96da241f-f17f-4952-914f-45ae126f7058 tempest-FloatingIPsAssociationNegativeTestJSON-216368034 tempest-FloatingIPsAssociationNegativeTestJSON-216368034-project-member] [instance: 835bf8da-8d8f-4dfd-b0a9-fab02796f39e] Deleted contents of the VM from datastore datastore2 {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 967.847036] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-96da241f-f17f-4952-914f-45ae126f7058 tempest-FloatingIPsAssociationNegativeTestJSON-216368034 tempest-FloatingIPsAssociationNegativeTestJSON-216368034-project-member] [instance: 835bf8da-8d8f-4dfd-b0a9-fab02796f39e] Instance destroyed {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 967.847215] env[67169]: INFO nova.compute.manager [None req-96da241f-f17f-4952-914f-45ae126f7058 tempest-FloatingIPsAssociationNegativeTestJSON-216368034 tempest-FloatingIPsAssociationNegativeTestJSON-216368034-project-member] [instance: 835bf8da-8d8f-4dfd-b0a9-fab02796f39e] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 967.849374] env[67169]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-5d25f47c-5083-41fe-b0f6-5d216e0b1758 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 967.851286] env[67169]: DEBUG nova.compute.claims [None req-96da241f-f17f-4952-914f-45ae126f7058 tempest-FloatingIPsAssociationNegativeTestJSON-216368034 tempest-FloatingIPsAssociationNegativeTestJSON-216368034-project-member] [instance: 835bf8da-8d8f-4dfd-b0a9-fab02796f39e] Aborting claim: {{(pid=67169) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 967.851456] env[67169]: DEBUG oslo_concurrency.lockutils [None req-96da241f-f17f-4952-914f-45ae126f7058 tempest-FloatingIPsAssociationNegativeTestJSON-216368034 tempest-FloatingIPsAssociationNegativeTestJSON-216368034-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 967.851672] env[67169]: DEBUG oslo_concurrency.lockutils [None req-96da241f-f17f-4952-914f-45ae126f7058 tempest-FloatingIPsAssociationNegativeTestJSON-216368034 tempest-FloatingIPsAssociationNegativeTestJSON-216368034-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 967.873486] env[67169]: DEBUG nova.virt.vmwareapi.images [None req-91fc429c-9045-417c-aff0-aeb3f0fac0f7 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: 85978a3b-052a-4a05-84e6-75c723d49bd8] Downloading image file data 285931c9-8b83-4997-8c4d-6a79005e36ba to the data store datastore2 {{(pid=67169) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 967.930451] env[67169]: 
DEBUG oslo_vmware.rw_handles [None req-91fc429c-9045-417c-aff0-aeb3f0fac0f7 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/99155099-bdd2-483d-b8ea-a97dc199a30e/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=67169) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 968.061234] env[67169]: DEBUG oslo_vmware.rw_handles [None req-91fc429c-9045-417c-aff0-aeb3f0fac0f7 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Completed reading data from the image iterator. {{(pid=67169) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 968.061234] env[67169]: DEBUG oslo_vmware.rw_handles [None req-91fc429c-9045-417c-aff0-aeb3f0fac0f7 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/99155099-bdd2-483d-b8ea-a97dc199a30e/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=67169) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 968.229073] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f86fb261-5cea-4b9b-b379-36b81a78b427 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 968.236845] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-63b60a7c-81e6-425e-9ce1-f75db8de67e8 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 968.266791] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-af23295f-6cfa-4540-b3dd-56bd7b22612f {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 968.274055] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e4f35c1c-d178-44ad-a426-994b05421f4b {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 968.286909] env[67169]: DEBUG nova.compute.provider_tree [None req-96da241f-f17f-4952-914f-45ae126f7058 tempest-FloatingIPsAssociationNegativeTestJSON-216368034 tempest-FloatingIPsAssociationNegativeTestJSON-216368034-project-member] Inventory has not changed in ProviderTree for provider: 6570906a-ac37-4859-b1f2-4bbacc48d3f3 {{(pid=67169) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 968.299036] env[67169]: DEBUG nova.scheduler.client.report [None req-96da241f-f17f-4952-914f-45ae126f7058 tempest-FloatingIPsAssociationNegativeTestJSON-216368034 tempest-FloatingIPsAssociationNegativeTestJSON-216368034-project-member] Inventory has not changed for provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 
'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 171, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67169) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 968.313778] env[67169]: DEBUG oslo_concurrency.lockutils [None req-96da241f-f17f-4952-914f-45ae126f7058 tempest-FloatingIPsAssociationNegativeTestJSON-216368034 tempest-FloatingIPsAssociationNegativeTestJSON-216368034-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.462s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 968.314019] env[67169]: ERROR nova.compute.manager [None req-96da241f-f17f-4952-914f-45ae126f7058 tempest-FloatingIPsAssociationNegativeTestJSON-216368034 tempest-FloatingIPsAssociationNegativeTestJSON-216368034-project-member] [instance: 835bf8da-8d8f-4dfd-b0a9-fab02796f39e] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 968.314019] env[67169]: Faults: ['InvalidArgument'] [ 968.314019] env[67169]: ERROR nova.compute.manager [instance: 835bf8da-8d8f-4dfd-b0a9-fab02796f39e] Traceback (most recent call last): [ 968.314019] env[67169]: ERROR nova.compute.manager [instance: 835bf8da-8d8f-4dfd-b0a9-fab02796f39e] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 968.314019] env[67169]: ERROR nova.compute.manager [instance: 835bf8da-8d8f-4dfd-b0a9-fab02796f39e] self.driver.spawn(context, instance, image_meta, [ 968.314019] env[67169]: ERROR nova.compute.manager [instance: 835bf8da-8d8f-4dfd-b0a9-fab02796f39e] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 968.314019] env[67169]: ERROR 
nova.compute.manager [instance: 835bf8da-8d8f-4dfd-b0a9-fab02796f39e] self._vmops.spawn(context, instance, image_meta, injected_files, [ 968.314019] env[67169]: ERROR nova.compute.manager [instance: 835bf8da-8d8f-4dfd-b0a9-fab02796f39e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 968.314019] env[67169]: ERROR nova.compute.manager [instance: 835bf8da-8d8f-4dfd-b0a9-fab02796f39e] self._fetch_image_if_missing(context, vi) [ 968.314019] env[67169]: ERROR nova.compute.manager [instance: 835bf8da-8d8f-4dfd-b0a9-fab02796f39e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 968.314019] env[67169]: ERROR nova.compute.manager [instance: 835bf8da-8d8f-4dfd-b0a9-fab02796f39e] image_cache(vi, tmp_image_ds_loc) [ 968.314019] env[67169]: ERROR nova.compute.manager [instance: 835bf8da-8d8f-4dfd-b0a9-fab02796f39e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 968.314019] env[67169]: ERROR nova.compute.manager [instance: 835bf8da-8d8f-4dfd-b0a9-fab02796f39e] vm_util.copy_virtual_disk( [ 968.314019] env[67169]: ERROR nova.compute.manager [instance: 835bf8da-8d8f-4dfd-b0a9-fab02796f39e] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 968.314019] env[67169]: ERROR nova.compute.manager [instance: 835bf8da-8d8f-4dfd-b0a9-fab02796f39e] session._wait_for_task(vmdk_copy_task) [ 968.314019] env[67169]: ERROR nova.compute.manager [instance: 835bf8da-8d8f-4dfd-b0a9-fab02796f39e] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 968.314019] env[67169]: ERROR nova.compute.manager [instance: 835bf8da-8d8f-4dfd-b0a9-fab02796f39e] return self.wait_for_task(task_ref) [ 968.314019] env[67169]: ERROR nova.compute.manager [instance: 835bf8da-8d8f-4dfd-b0a9-fab02796f39e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 968.314019] env[67169]: ERROR 
nova.compute.manager [instance: 835bf8da-8d8f-4dfd-b0a9-fab02796f39e] return evt.wait() [ 968.314019] env[67169]: ERROR nova.compute.manager [instance: 835bf8da-8d8f-4dfd-b0a9-fab02796f39e] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 968.314019] env[67169]: ERROR nova.compute.manager [instance: 835bf8da-8d8f-4dfd-b0a9-fab02796f39e] result = hub.switch() [ 968.314019] env[67169]: ERROR nova.compute.manager [instance: 835bf8da-8d8f-4dfd-b0a9-fab02796f39e] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 968.314019] env[67169]: ERROR nova.compute.manager [instance: 835bf8da-8d8f-4dfd-b0a9-fab02796f39e] return self.greenlet.switch() [ 968.314019] env[67169]: ERROR nova.compute.manager [instance: 835bf8da-8d8f-4dfd-b0a9-fab02796f39e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 968.314019] env[67169]: ERROR nova.compute.manager [instance: 835bf8da-8d8f-4dfd-b0a9-fab02796f39e] self.f(*self.args, **self.kw) [ 968.314019] env[67169]: ERROR nova.compute.manager [instance: 835bf8da-8d8f-4dfd-b0a9-fab02796f39e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 968.314019] env[67169]: ERROR nova.compute.manager [instance: 835bf8da-8d8f-4dfd-b0a9-fab02796f39e] raise exceptions.translate_fault(task_info.error) [ 968.314019] env[67169]: ERROR nova.compute.manager [instance: 835bf8da-8d8f-4dfd-b0a9-fab02796f39e] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 968.314019] env[67169]: ERROR nova.compute.manager [instance: 835bf8da-8d8f-4dfd-b0a9-fab02796f39e] Faults: ['InvalidArgument'] [ 968.314019] env[67169]: ERROR nova.compute.manager [instance: 835bf8da-8d8f-4dfd-b0a9-fab02796f39e] [ 968.315402] env[67169]: DEBUG nova.compute.utils [None req-96da241f-f17f-4952-914f-45ae126f7058 
tempest-FloatingIPsAssociationNegativeTestJSON-216368034 tempest-FloatingIPsAssociationNegativeTestJSON-216368034-project-member] [instance: 835bf8da-8d8f-4dfd-b0a9-fab02796f39e] VimFaultException {{(pid=67169) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 968.316076] env[67169]: DEBUG nova.compute.manager [None req-96da241f-f17f-4952-914f-45ae126f7058 tempest-FloatingIPsAssociationNegativeTestJSON-216368034 tempest-FloatingIPsAssociationNegativeTestJSON-216368034-project-member] [instance: 835bf8da-8d8f-4dfd-b0a9-fab02796f39e] Build of instance 835bf8da-8d8f-4dfd-b0a9-fab02796f39e was re-scheduled: A specified parameter was not correct: fileType [ 968.316076] env[67169]: Faults: ['InvalidArgument'] {{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 968.316441] env[67169]: DEBUG nova.compute.manager [None req-96da241f-f17f-4952-914f-45ae126f7058 tempest-FloatingIPsAssociationNegativeTestJSON-216368034 tempest-FloatingIPsAssociationNegativeTestJSON-216368034-project-member] [instance: 835bf8da-8d8f-4dfd-b0a9-fab02796f39e] Unplugging VIFs for instance {{(pid=67169) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 968.316611] env[67169]: DEBUG nova.compute.manager [None req-96da241f-f17f-4952-914f-45ae126f7058 tempest-FloatingIPsAssociationNegativeTestJSON-216368034 tempest-FloatingIPsAssociationNegativeTestJSON-216368034-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67169) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 968.316779] env[67169]: DEBUG nova.compute.manager [None req-96da241f-f17f-4952-914f-45ae126f7058 tempest-FloatingIPsAssociationNegativeTestJSON-216368034 tempest-FloatingIPsAssociationNegativeTestJSON-216368034-project-member] [instance: 835bf8da-8d8f-4dfd-b0a9-fab02796f39e] Deallocating network for instance {{(pid=67169) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 968.316941] env[67169]: DEBUG nova.network.neutron [None req-96da241f-f17f-4952-914f-45ae126f7058 tempest-FloatingIPsAssociationNegativeTestJSON-216368034 tempest-FloatingIPsAssociationNegativeTestJSON-216368034-project-member] [instance: 835bf8da-8d8f-4dfd-b0a9-fab02796f39e] deallocate_for_instance() {{(pid=67169) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 968.765985] env[67169]: DEBUG nova.network.neutron [None req-96da241f-f17f-4952-914f-45ae126f7058 tempest-FloatingIPsAssociationNegativeTestJSON-216368034 tempest-FloatingIPsAssociationNegativeTestJSON-216368034-project-member] [instance: 835bf8da-8d8f-4dfd-b0a9-fab02796f39e] Updating instance_info_cache with network_info: [] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 968.780691] env[67169]: INFO nova.compute.manager [None req-96da241f-f17f-4952-914f-45ae126f7058 tempest-FloatingIPsAssociationNegativeTestJSON-216368034 tempest-FloatingIPsAssociationNegativeTestJSON-216368034-project-member] [instance: 835bf8da-8d8f-4dfd-b0a9-fab02796f39e] Took 0.46 seconds to deallocate network for instance. 
[ 968.888190] env[67169]: INFO nova.scheduler.client.report [None req-96da241f-f17f-4952-914f-45ae126f7058 tempest-FloatingIPsAssociationNegativeTestJSON-216368034 tempest-FloatingIPsAssociationNegativeTestJSON-216368034-project-member] Deleted allocations for instance 835bf8da-8d8f-4dfd-b0a9-fab02796f39e [ 968.909031] env[67169]: DEBUG oslo_concurrency.lockutils [None req-96da241f-f17f-4952-914f-45ae126f7058 tempest-FloatingIPsAssociationNegativeTestJSON-216368034 tempest-FloatingIPsAssociationNegativeTestJSON-216368034-project-member] Lock "835bf8da-8d8f-4dfd-b0a9-fab02796f39e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 385.043s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 968.910335] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "835bf8da-8d8f-4dfd-b0a9-fab02796f39e" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 383.224s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 968.910528] env[67169]: INFO nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 835bf8da-8d8f-4dfd-b0a9-fab02796f39e] During sync_power_state the instance has a pending task (spawning). Skip. 
[ 968.910704] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "835bf8da-8d8f-4dfd-b0a9-fab02796f39e" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.001s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 968.911336] env[67169]: DEBUG oslo_concurrency.lockutils [None req-48dea7cb-59e8-484a-8988-68cd72405f44 tempest-FloatingIPsAssociationNegativeTestJSON-216368034 tempest-FloatingIPsAssociationNegativeTestJSON-216368034-project-member] Lock "835bf8da-8d8f-4dfd-b0a9-fab02796f39e" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 186.145s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 968.911551] env[67169]: DEBUG oslo_concurrency.lockutils [None req-48dea7cb-59e8-484a-8988-68cd72405f44 tempest-FloatingIPsAssociationNegativeTestJSON-216368034 tempest-FloatingIPsAssociationNegativeTestJSON-216368034-project-member] Acquiring lock "835bf8da-8d8f-4dfd-b0a9-fab02796f39e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 968.911756] env[67169]: DEBUG oslo_concurrency.lockutils [None req-48dea7cb-59e8-484a-8988-68cd72405f44 tempest-FloatingIPsAssociationNegativeTestJSON-216368034 tempest-FloatingIPsAssociationNegativeTestJSON-216368034-project-member] Lock "835bf8da-8d8f-4dfd-b0a9-fab02796f39e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 968.911917] env[67169]: DEBUG oslo_concurrency.lockutils [None req-48dea7cb-59e8-484a-8988-68cd72405f44 
tempest-FloatingIPsAssociationNegativeTestJSON-216368034 tempest-FloatingIPsAssociationNegativeTestJSON-216368034-project-member] Lock "835bf8da-8d8f-4dfd-b0a9-fab02796f39e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 968.914087] env[67169]: INFO nova.compute.manager [None req-48dea7cb-59e8-484a-8988-68cd72405f44 tempest-FloatingIPsAssociationNegativeTestJSON-216368034 tempest-FloatingIPsAssociationNegativeTestJSON-216368034-project-member] [instance: 835bf8da-8d8f-4dfd-b0a9-fab02796f39e] Terminating instance [ 968.915884] env[67169]: DEBUG nova.compute.manager [None req-48dea7cb-59e8-484a-8988-68cd72405f44 tempest-FloatingIPsAssociationNegativeTestJSON-216368034 tempest-FloatingIPsAssociationNegativeTestJSON-216368034-project-member] [instance: 835bf8da-8d8f-4dfd-b0a9-fab02796f39e] Start destroying the instance on the hypervisor. 
{{(pid=67169) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 968.917046] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-48dea7cb-59e8-484a-8988-68cd72405f44 tempest-FloatingIPsAssociationNegativeTestJSON-216368034 tempest-FloatingIPsAssociationNegativeTestJSON-216368034-project-member] [instance: 835bf8da-8d8f-4dfd-b0a9-fab02796f39e] Destroying instance {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 968.917368] env[67169]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-02b0ebb6-6d56-4604-9de4-9c9d257eff30 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 968.920533] env[67169]: DEBUG nova.compute.manager [None req-ce2d0787-afc0-4f58-9a98-4b3cd03f7ccd tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: ceec0dd3-097b-4ab4-8e16-420d40bbe3d5] Starting instance... {{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 968.930306] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-acd881ec-2b64-4e09-ae08-e4c76cc91ea3 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 968.963029] env[67169]: WARNING nova.virt.vmwareapi.vmops [None req-48dea7cb-59e8-484a-8988-68cd72405f44 tempest-FloatingIPsAssociationNegativeTestJSON-216368034 tempest-FloatingIPsAssociationNegativeTestJSON-216368034-project-member] [instance: 835bf8da-8d8f-4dfd-b0a9-fab02796f39e] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 835bf8da-8d8f-4dfd-b0a9-fab02796f39e could not be found. 
[ 968.963261] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-48dea7cb-59e8-484a-8988-68cd72405f44 tempest-FloatingIPsAssociationNegativeTestJSON-216368034 tempest-FloatingIPsAssociationNegativeTestJSON-216368034-project-member] [instance: 835bf8da-8d8f-4dfd-b0a9-fab02796f39e] Instance destroyed {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 968.963443] env[67169]: INFO nova.compute.manager [None req-48dea7cb-59e8-484a-8988-68cd72405f44 tempest-FloatingIPsAssociationNegativeTestJSON-216368034 tempest-FloatingIPsAssociationNegativeTestJSON-216368034-project-member] [instance: 835bf8da-8d8f-4dfd-b0a9-fab02796f39e] Took 0.05 seconds to destroy the instance on the hypervisor. [ 968.963693] env[67169]: DEBUG oslo.service.loopingcall [None req-48dea7cb-59e8-484a-8988-68cd72405f44 tempest-FloatingIPsAssociationNegativeTestJSON-216368034 tempest-FloatingIPsAssociationNegativeTestJSON-216368034-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=67169) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 968.965954] env[67169]: DEBUG nova.compute.manager [-] [instance: 835bf8da-8d8f-4dfd-b0a9-fab02796f39e] Deallocating network for instance {{(pid=67169) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 968.966078] env[67169]: DEBUG nova.network.neutron [-] [instance: 835bf8da-8d8f-4dfd-b0a9-fab02796f39e] deallocate_for_instance() {{(pid=67169) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 968.979438] env[67169]: DEBUG oslo_concurrency.lockutils [None req-ce2d0787-afc0-4f58-9a98-4b3cd03f7ccd tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 968.979670] env[67169]: DEBUG oslo_concurrency.lockutils [None req-ce2d0787-afc0-4f58-9a98-4b3cd03f7ccd tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 968.981210] env[67169]: INFO nova.compute.claims [None req-ce2d0787-afc0-4f58-9a98-4b3cd03f7ccd tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: ceec0dd3-097b-4ab4-8e16-420d40bbe3d5] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 969.000993] env[67169]: DEBUG nova.network.neutron [-] [instance: 835bf8da-8d8f-4dfd-b0a9-fab02796f39e] Updating instance_info_cache with network_info: [] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 969.018142] env[67169]: INFO 
nova.compute.manager [-] [instance: 835bf8da-8d8f-4dfd-b0a9-fab02796f39e] Took 0.05 seconds to deallocate network for instance. [ 969.108182] env[67169]: DEBUG oslo_concurrency.lockutils [None req-48dea7cb-59e8-484a-8988-68cd72405f44 tempest-FloatingIPsAssociationNegativeTestJSON-216368034 tempest-FloatingIPsAssociationNegativeTestJSON-216368034-project-member] Lock "835bf8da-8d8f-4dfd-b0a9-fab02796f39e" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.197s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 969.298526] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-58c3f9f3-9dcf-49a6-98b6-d9f6af3d8ff1 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 969.305922] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e68dc28a-0500-479b-8f6c-f9b987ed3184 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 969.336029] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b066f84d-db9c-47a4-ac39-49bc03197264 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 969.343151] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-27b782d8-002a-4f3d-9cfe-6b87f08a93e7 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 969.360303] env[67169]: DEBUG nova.compute.provider_tree [None req-ce2d0787-afc0-4f58-9a98-4b3cd03f7ccd tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Inventory has not changed in ProviderTree for provider: 
6570906a-ac37-4859-b1f2-4bbacc48d3f3 {{(pid=67169) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 969.374309] env[67169]: DEBUG nova.scheduler.client.report [None req-ce2d0787-afc0-4f58-9a98-4b3cd03f7ccd tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Inventory has not changed for provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 171, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67169) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 969.392828] env[67169]: DEBUG oslo_concurrency.lockutils [None req-ce2d0787-afc0-4f58-9a98-4b3cd03f7ccd tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.413s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 969.393484] env[67169]: DEBUG nova.compute.manager [None req-ce2d0787-afc0-4f58-9a98-4b3cd03f7ccd tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: ceec0dd3-097b-4ab4-8e16-420d40bbe3d5] Start building networks asynchronously for instance. 
{{(pid=67169) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 969.430983] env[67169]: DEBUG nova.compute.utils [None req-ce2d0787-afc0-4f58-9a98-4b3cd03f7ccd tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Using /dev/sd instead of None {{(pid=67169) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 969.433759] env[67169]: DEBUG nova.compute.manager [None req-ce2d0787-afc0-4f58-9a98-4b3cd03f7ccd tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: ceec0dd3-097b-4ab4-8e16-420d40bbe3d5] Allocating IP information in the background. {{(pid=67169) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 969.433759] env[67169]: DEBUG nova.network.neutron [None req-ce2d0787-afc0-4f58-9a98-4b3cd03f7ccd tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: ceec0dd3-097b-4ab4-8e16-420d40bbe3d5] allocate_for_instance() {{(pid=67169) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 969.441782] env[67169]: DEBUG nova.compute.manager [None req-ce2d0787-afc0-4f58-9a98-4b3cd03f7ccd tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: ceec0dd3-097b-4ab4-8e16-420d40bbe3d5] Start building block device mappings for instance. {{(pid=67169) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 969.508199] env[67169]: DEBUG nova.compute.manager [None req-ce2d0787-afc0-4f58-9a98-4b3cd03f7ccd tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: ceec0dd3-097b-4ab4-8e16-420d40bbe3d5] Start spawning the instance on the hypervisor. 
{{(pid=67169) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 969.522623] env[67169]: DEBUG nova.policy [None req-ce2d0787-afc0-4f58-9a98-4b3cd03f7ccd tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ce57069286b34b5da298e9b01f4bd39e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '3d3275803e654637b85c8f15583e2e25', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67169) authorize /opt/stack/nova/nova/policy.py:203}} [ 969.532688] env[67169]: DEBUG nova.virt.hardware [None req-ce2d0787-afc0-4f58-9a98-4b3cd03f7ccd tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-03-06T19:55:10Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-03-06T19:54:55Z,direct_url=,disk_format='vmdk',id=285931c9-8b83-4997-8c4d-6a79005e36ba,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c31f6504bb73492890b262ff43fdf9bc',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-03-06T19:54:55Z,virtual_size=,visibility=), allow threads: False {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 969.532945] env[67169]: DEBUG nova.virt.hardware [None 
req-ce2d0787-afc0-4f58-9a98-4b3cd03f7ccd tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Flavor limits 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 969.533122] env[67169]: DEBUG nova.virt.hardware [None req-ce2d0787-afc0-4f58-9a98-4b3cd03f7ccd tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Image limits 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 969.533312] env[67169]: DEBUG nova.virt.hardware [None req-ce2d0787-afc0-4f58-9a98-4b3cd03f7ccd tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Flavor pref 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 969.533460] env[67169]: DEBUG nova.virt.hardware [None req-ce2d0787-afc0-4f58-9a98-4b3cd03f7ccd tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Image pref 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 969.533605] env[67169]: DEBUG nova.virt.hardware [None req-ce2d0787-afc0-4f58-9a98-4b3cd03f7ccd tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 969.533809] env[67169]: DEBUG nova.virt.hardware [None req-ce2d0787-afc0-4f58-9a98-4b3cd03f7ccd tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 969.533962] env[67169]: 
DEBUG nova.virt.hardware [None req-ce2d0787-afc0-4f58-9a98-4b3cd03f7ccd tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67169) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 969.534144] env[67169]: DEBUG nova.virt.hardware [None req-ce2d0787-afc0-4f58-9a98-4b3cd03f7ccd tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Got 1 possible topologies {{(pid=67169) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 969.534348] env[67169]: DEBUG nova.virt.hardware [None req-ce2d0787-afc0-4f58-9a98-4b3cd03f7ccd tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 969.534536] env[67169]: DEBUG nova.virt.hardware [None req-ce2d0787-afc0-4f58-9a98-4b3cd03f7ccd tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 969.535393] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-de3d51fc-92e4-4261-8107-18db0b12f618 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 969.543110] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e49a5790-1193-4575-9f54-535f1fc92919 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 970.063676] env[67169]: DEBUG nova.network.neutron [None req-ce2d0787-afc0-4f58-9a98-4b3cd03f7ccd 
tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: ceec0dd3-097b-4ab4-8e16-420d40bbe3d5] Successfully created port: 7febfc79-1e3a-4e66-974b-ebd78be598ce {{(pid=67169) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 971.410585] env[67169]: DEBUG nova.network.neutron [None req-ce2d0787-afc0-4f58-9a98-4b3cd03f7ccd tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: ceec0dd3-097b-4ab4-8e16-420d40bbe3d5] Successfully updated port: 7febfc79-1e3a-4e66-974b-ebd78be598ce {{(pid=67169) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 971.422245] env[67169]: DEBUG oslo_concurrency.lockutils [None req-ce2d0787-afc0-4f58-9a98-4b3cd03f7ccd tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Acquiring lock "refresh_cache-ceec0dd3-097b-4ab4-8e16-420d40bbe3d5" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 971.422403] env[67169]: DEBUG oslo_concurrency.lockutils [None req-ce2d0787-afc0-4f58-9a98-4b3cd03f7ccd tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Acquired lock "refresh_cache-ceec0dd3-097b-4ab4-8e16-420d40bbe3d5" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 971.422637] env[67169]: DEBUG nova.network.neutron [None req-ce2d0787-afc0-4f58-9a98-4b3cd03f7ccd tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: ceec0dd3-097b-4ab4-8e16-420d40bbe3d5] Building network info cache for instance {{(pid=67169) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 971.500218] env[67169]: DEBUG nova.network.neutron [None req-ce2d0787-afc0-4f58-9a98-4b3cd03f7ccd tempest-AttachInterfacesTestJSON-1789409843 
tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: ceec0dd3-097b-4ab4-8e16-420d40bbe3d5] Instance cache missing network info. {{(pid=67169) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 971.678652] env[67169]: DEBUG nova.compute.manager [req-347be988-b581-44f2-b143-dbe726caede2 req-b3948e9e-cb20-4090-a541-e653b773c621 service nova] [instance: ceec0dd3-097b-4ab4-8e16-420d40bbe3d5] Received event network-vif-plugged-7febfc79-1e3a-4e66-974b-ebd78be598ce {{(pid=67169) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 971.679293] env[67169]: DEBUG oslo_concurrency.lockutils [req-347be988-b581-44f2-b143-dbe726caede2 req-b3948e9e-cb20-4090-a541-e653b773c621 service nova] Acquiring lock "ceec0dd3-097b-4ab4-8e16-420d40bbe3d5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 971.679293] env[67169]: DEBUG oslo_concurrency.lockutils [req-347be988-b581-44f2-b143-dbe726caede2 req-b3948e9e-cb20-4090-a541-e653b773c621 service nova] Lock "ceec0dd3-097b-4ab4-8e16-420d40bbe3d5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 971.679293] env[67169]: DEBUG oslo_concurrency.lockutils [req-347be988-b581-44f2-b143-dbe726caede2 req-b3948e9e-cb20-4090-a541-e653b773c621 service nova] Lock "ceec0dd3-097b-4ab4-8e16-420d40bbe3d5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 971.679646] env[67169]: DEBUG nova.compute.manager [req-347be988-b581-44f2-b143-dbe726caede2 req-b3948e9e-cb20-4090-a541-e653b773c621 service nova] [instance: 
ceec0dd3-097b-4ab4-8e16-420d40bbe3d5] No waiting events found dispatching network-vif-plugged-7febfc79-1e3a-4e66-974b-ebd78be598ce {{(pid=67169) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 971.681635] env[67169]: WARNING nova.compute.manager [req-347be988-b581-44f2-b143-dbe726caede2 req-b3948e9e-cb20-4090-a541-e653b773c621 service nova] [instance: ceec0dd3-097b-4ab4-8e16-420d40bbe3d5] Received unexpected event network-vif-plugged-7febfc79-1e3a-4e66-974b-ebd78be598ce for instance with vm_state building and task_state spawning. [ 971.899881] env[67169]: DEBUG nova.network.neutron [None req-ce2d0787-afc0-4f58-9a98-4b3cd03f7ccd tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: ceec0dd3-097b-4ab4-8e16-420d40bbe3d5] Updating instance_info_cache with network_info: [{"id": "7febfc79-1e3a-4e66-974b-ebd78be598ce", "address": "fa:16:3e:d9:9a:c6", "network": {"id": "4e24bc87-3a15-4231-a607-f93bb9122dca", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-93817792-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "3d3275803e654637b85c8f15583e2e25", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "298bb8ef-4765-494c-b157-7a349218bd1e", "external-id": "nsx-vlan-transportzone-905", "segmentation_id": 905, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap7febfc79-1e", "ovs_interfaceid": "7febfc79-1e3a-4e66-974b-ebd78be598ce", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, 
"delegate_create": true, "meta": {}}] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 971.912855] env[67169]: DEBUG oslo_concurrency.lockutils [None req-ce2d0787-afc0-4f58-9a98-4b3cd03f7ccd tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Releasing lock "refresh_cache-ceec0dd3-097b-4ab4-8e16-420d40bbe3d5" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 971.913323] env[67169]: DEBUG nova.compute.manager [None req-ce2d0787-afc0-4f58-9a98-4b3cd03f7ccd tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: ceec0dd3-097b-4ab4-8e16-420d40bbe3d5] Instance network_info: |[{"id": "7febfc79-1e3a-4e66-974b-ebd78be598ce", "address": "fa:16:3e:d9:9a:c6", "network": {"id": "4e24bc87-3a15-4231-a607-f93bb9122dca", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-93817792-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "3d3275803e654637b85c8f15583e2e25", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "298bb8ef-4765-494c-b157-7a349218bd1e", "external-id": "nsx-vlan-transportzone-905", "segmentation_id": 905, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap7febfc79-1e", "ovs_interfaceid": "7febfc79-1e3a-4e66-974b-ebd78be598ce", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67169) 
_allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 971.914246] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-ce2d0787-afc0-4f58-9a98-4b3cd03f7ccd tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: ceec0dd3-097b-4ab4-8e16-420d40bbe3d5] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:d9:9a:c6', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '298bb8ef-4765-494c-b157-7a349218bd1e', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '7febfc79-1e3a-4e66-974b-ebd78be598ce', 'vif_model': 'vmxnet3'}] {{(pid=67169) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 971.926224] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [None req-ce2d0787-afc0-4f58-9a98-4b3cd03f7ccd tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Creating folder: Project (3d3275803e654637b85c8f15583e2e25). Parent ref: group-v566843. {{(pid=67169) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 971.926904] env[67169]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-cb0d10f5-adb3-4307-9d8a-b05f73daa82c {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 971.940143] env[67169]: INFO nova.virt.vmwareapi.vm_util [None req-ce2d0787-afc0-4f58-9a98-4b3cd03f7ccd tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Created folder: Project (3d3275803e654637b85c8f15583e2e25) in parent group-v566843. [ 971.940937] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [None req-ce2d0787-afc0-4f58-9a98-4b3cd03f7ccd tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Creating folder: Instances. Parent ref: group-v566897. 
{{(pid=67169) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 971.941181] env[67169]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-c56b5148-abaa-4639-9296-5cd4b66a4494 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 971.951589] env[67169]: INFO nova.virt.vmwareapi.vm_util [None req-ce2d0787-afc0-4f58-9a98-4b3cd03f7ccd tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Created folder: Instances in parent group-v566897. [ 971.952020] env[67169]: DEBUG oslo.service.loopingcall [None req-ce2d0787-afc0-4f58-9a98-4b3cd03f7ccd tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67169) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 971.952020] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: ceec0dd3-097b-4ab4-8e16-420d40bbe3d5] Creating VM on the ESX host {{(pid=67169) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 971.952307] env[67169]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-741956d3-0892-46b1-850d-7826ab132578 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 971.974460] env[67169]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 971.974460] env[67169]: value = "task-2819132" [ 971.974460] env[67169]: _type = "Task" [ 971.974460] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 971.982537] env[67169]: DEBUG oslo_vmware.api [-] Task: {'id': task-2819132, 'name': CreateVM_Task} progress is 0%. 
{{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 972.486049] env[67169]: DEBUG oslo_vmware.api [-] Task: {'id': task-2819132, 'name': CreateVM_Task, 'duration_secs': 0.324749} completed successfully. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 972.486352] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: ceec0dd3-097b-4ab4-8e16-420d40bbe3d5] Created VM on the ESX host {{(pid=67169) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 972.486845] env[67169]: DEBUG oslo_concurrency.lockutils [None req-ce2d0787-afc0-4f58-9a98-4b3cd03f7ccd tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 972.487018] env[67169]: DEBUG oslo_concurrency.lockutils [None req-ce2d0787-afc0-4f58-9a98-4b3cd03f7ccd tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Acquired lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 972.487345] env[67169]: DEBUG oslo_concurrency.lockutils [None req-ce2d0787-afc0-4f58-9a98-4b3cd03f7ccd tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 972.487602] env[67169]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-67bcaee2-f170-49ac-9709-83873a40d040 {{(pid=67169) 
request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 972.492557] env[67169]: DEBUG oslo_vmware.api [None req-ce2d0787-afc0-4f58-9a98-4b3cd03f7ccd tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Waiting for the task: (returnval){ [ 972.492557] env[67169]: value = "session[52518c09-e5f2-035d-4dc1-3e29ddf94015]522cd8b4-77a6-3e3b-fdf0-84abc74ed2f4" [ 972.492557] env[67169]: _type = "Task" [ 972.492557] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 972.500345] env[67169]: DEBUG oslo_vmware.api [None req-ce2d0787-afc0-4f58-9a98-4b3cd03f7ccd tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Task: {'id': session[52518c09-e5f2-035d-4dc1-3e29ddf94015]522cd8b4-77a6-3e3b-fdf0-84abc74ed2f4, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 973.004985] env[67169]: DEBUG oslo_concurrency.lockutils [None req-ce2d0787-afc0-4f58-9a98-4b3cd03f7ccd tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Releasing lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 973.005145] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-ce2d0787-afc0-4f58-9a98-4b3cd03f7ccd tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: ceec0dd3-097b-4ab4-8e16-420d40bbe3d5] Processing image 285931c9-8b83-4997-8c4d-6a79005e36ba {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 973.005389] env[67169]: DEBUG oslo_concurrency.lockutils [None req-ce2d0787-afc0-4f58-9a98-4b3cd03f7ccd 
tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 973.891221] env[67169]: DEBUG nova.compute.manager [req-ddfbeb0b-db49-4226-92b5-5479a14c4f90 req-afc3b04f-8b3d-4a25-b58a-438463e75abf service nova] [instance: ceec0dd3-097b-4ab4-8e16-420d40bbe3d5] Received event network-changed-7febfc79-1e3a-4e66-974b-ebd78be598ce {{(pid=67169) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 973.891629] env[67169]: DEBUG nova.compute.manager [req-ddfbeb0b-db49-4226-92b5-5479a14c4f90 req-afc3b04f-8b3d-4a25-b58a-438463e75abf service nova] [instance: ceec0dd3-097b-4ab4-8e16-420d40bbe3d5] Refreshing instance network info cache due to event network-changed-7febfc79-1e3a-4e66-974b-ebd78be598ce. 
{{(pid=67169) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 973.891883] env[67169]: DEBUG oslo_concurrency.lockutils [req-ddfbeb0b-db49-4226-92b5-5479a14c4f90 req-afc3b04f-8b3d-4a25-b58a-438463e75abf service nova] Acquiring lock "refresh_cache-ceec0dd3-097b-4ab4-8e16-420d40bbe3d5" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 973.892156] env[67169]: DEBUG oslo_concurrency.lockutils [req-ddfbeb0b-db49-4226-92b5-5479a14c4f90 req-afc3b04f-8b3d-4a25-b58a-438463e75abf service nova] Acquired lock "refresh_cache-ceec0dd3-097b-4ab4-8e16-420d40bbe3d5" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 973.892439] env[67169]: DEBUG nova.network.neutron [req-ddfbeb0b-db49-4226-92b5-5479a14c4f90 req-afc3b04f-8b3d-4a25-b58a-438463e75abf service nova] [instance: ceec0dd3-097b-4ab4-8e16-420d40bbe3d5] Refreshing network info cache for port 7febfc79-1e3a-4e66-974b-ebd78be598ce {{(pid=67169) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 974.380578] env[67169]: DEBUG nova.network.neutron [req-ddfbeb0b-db49-4226-92b5-5479a14c4f90 req-afc3b04f-8b3d-4a25-b58a-438463e75abf service nova] [instance: ceec0dd3-097b-4ab4-8e16-420d40bbe3d5] Updated VIF entry in instance network info cache for port 7febfc79-1e3a-4e66-974b-ebd78be598ce. 
{{(pid=67169) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 974.380918] env[67169]: DEBUG nova.network.neutron [req-ddfbeb0b-db49-4226-92b5-5479a14c4f90 req-afc3b04f-8b3d-4a25-b58a-438463e75abf service nova] [instance: ceec0dd3-097b-4ab4-8e16-420d40bbe3d5] Updating instance_info_cache with network_info: [{"id": "7febfc79-1e3a-4e66-974b-ebd78be598ce", "address": "fa:16:3e:d9:9a:c6", "network": {"id": "4e24bc87-3a15-4231-a607-f93bb9122dca", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-93817792-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "3d3275803e654637b85c8f15583e2e25", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "298bb8ef-4765-494c-b157-7a349218bd1e", "external-id": "nsx-vlan-transportzone-905", "segmentation_id": 905, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap7febfc79-1e", "ovs_interfaceid": "7febfc79-1e3a-4e66-974b-ebd78be598ce", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 974.391822] env[67169]: DEBUG oslo_concurrency.lockutils [req-ddfbeb0b-db49-4226-92b5-5479a14c4f90 req-afc3b04f-8b3d-4a25-b58a-438463e75abf service nova] Releasing lock "refresh_cache-ceec0dd3-097b-4ab4-8e16-420d40bbe3d5" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 977.346921] env[67169]: DEBUG oslo_concurrency.lockutils 
[None req-bc4f14a6-29e8-4fae-84bb-02f11896dc72 tempest-AttachInterfacesV270Test-2005290711 tempest-AttachInterfacesV270Test-2005290711-project-member] Acquiring lock "bab5d630-fec0-44e5-8088-12c8855aad66" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 977.347458] env[67169]: DEBUG oslo_concurrency.lockutils [None req-bc4f14a6-29e8-4fae-84bb-02f11896dc72 tempest-AttachInterfacesV270Test-2005290711 tempest-AttachInterfacesV270Test-2005290711-project-member] Lock "bab5d630-fec0-44e5-8088-12c8855aad66" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 981.654536] env[67169]: DEBUG oslo_concurrency.lockutils [None req-6fed50b9-66de-4f8f-885d-5a98660d29fa tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Acquiring lock "ceec0dd3-097b-4ab4-8e16-420d40bbe3d5" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 981.755166] env[67169]: DEBUG oslo_concurrency.lockutils [None req-e3b2eed0-520e-4d7c-949f-bbfa16b0198a tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Acquiring lock "a86fa702-2040-4e22-9eaa-5d64bc16f036" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 981.755395] env[67169]: DEBUG oslo_concurrency.lockutils [None req-e3b2eed0-520e-4d7c-949f-bbfa16b0198a tempest-DeleteServersTestJSON-867121436 
tempest-DeleteServersTestJSON-867121436-project-member] Lock "a86fa702-2040-4e22-9eaa-5d64bc16f036" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 994.831936] env[67169]: DEBUG oslo_concurrency.lockutils [None req-f7e059ad-cf85-4f1e-9612-3b3d088114e3 tempest-ServerAddressesNegativeTestJSON-244048613 tempest-ServerAddressesNegativeTestJSON-244048613-project-member] Acquiring lock "ca657a42-3745-46e1-8fc9-61de31f661d8" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 994.831936] env[67169]: DEBUG oslo_concurrency.lockutils [None req-f7e059ad-cf85-4f1e-9612-3b3d088114e3 tempest-ServerAddressesNegativeTestJSON-244048613 tempest-ServerAddressesNegativeTestJSON-244048613-project-member] Lock "ca657a42-3745-46e1-8fc9-61de31f661d8" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 996.028894] env[67169]: DEBUG oslo_concurrency.lockutils [None req-5678a70a-1a33-4b69-862c-ad483d1e90f6 tempest-MigrationsAdminTest-2069277652 tempest-MigrationsAdminTest-2069277652-project-member] Acquiring lock "3f90c9a4-650d-4280-b155-1315d2f0f281" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 996.028894] env[67169]: DEBUG oslo_concurrency.lockutils [None req-5678a70a-1a33-4b69-862c-ad483d1e90f6 tempest-MigrationsAdminTest-2069277652 tempest-MigrationsAdminTest-2069277652-project-member] Lock 
"3f90c9a4-650d-4280-b155-1315d2f0f281" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 998.997488] env[67169]: DEBUG oslo_concurrency.lockutils [None req-a3591b6f-cb28-42b6-8fac-da83059b80ed tempest-ListServerFiltersTestJSON-1026077778 tempest-ListServerFiltersTestJSON-1026077778-project-member] Acquiring lock "d964ad35-8d3f-45f3-b799-aebddf295012" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 998.998020] env[67169]: DEBUG oslo_concurrency.lockutils [None req-a3591b6f-cb28-42b6-8fac-da83059b80ed tempest-ListServerFiltersTestJSON-1026077778 tempest-ListServerFiltersTestJSON-1026077778-project-member] Lock "d964ad35-8d3f-45f3-b799-aebddf295012" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 999.742153] env[67169]: DEBUG oslo_concurrency.lockutils [None req-90998c77-a1b7-454e-8f72-88bd859f39b9 tempest-ListServerFiltersTestJSON-1026077778 tempest-ListServerFiltersTestJSON-1026077778-project-member] Acquiring lock "0b78afae-71e9-4ba9-903a-03c8a98cd91e" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 999.742857] env[67169]: DEBUG oslo_concurrency.lockutils [None req-90998c77-a1b7-454e-8f72-88bd859f39b9 tempest-ListServerFiltersTestJSON-1026077778 tempest-ListServerFiltersTestJSON-1026077778-project-member] Lock "0b78afae-71e9-4ba9-903a-03c8a98cd91e" acquired by 
"nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1000.662985] env[67169]: DEBUG oslo_concurrency.lockutils [None req-18d319a2-8ae3-447e-a465-6e3b089fcdc4 tempest-ListServerFiltersTestJSON-1026077778 tempest-ListServerFiltersTestJSON-1026077778-project-member] Acquiring lock "54b1337f-4ac8-4718-b273-2f078782b491" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1000.663721] env[67169]: DEBUG oslo_concurrency.lockutils [None req-18d319a2-8ae3-447e-a465-6e3b089fcdc4 tempest-ListServerFiltersTestJSON-1026077778 tempest-ListServerFiltersTestJSON-1026077778-project-member] Lock "54b1337f-4ac8-4718-b273-2f078782b491" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1003.659768] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1007.658593] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1007.658855] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=67169) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1008.654607] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1008.654828] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1008.682549] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1008.682549] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Starting heal instance info cache {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1008.682549] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Rebuilding the list of instances to heal {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1008.715022] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 85978a3b-052a-4a05-84e6-75c723d49bd8] Skipping network cache update for instance because it is Building. 
{{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1008.715022] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: e2e52693-153a-43dd-b786-dd0758caabe2] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1008.715022] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1008.715022] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 28552f70-695d-40cc-8dfa-bf40d6113220] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1008.715022] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1008.715022] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 43b73a7c-eda8-4239-885f-d4fb8fa6f28a] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1008.715022] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b] Skipping network cache update for instance because it is Building. 
{{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1008.715022] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 47ffcce9-3afc-41be-b38e-dacfeb535a2c] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1008.715022] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1008.715022] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: ceec0dd3-097b-4ab4-8e16-420d40bbe3d5] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1008.715022] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Didn't find any instances for network info cache update. 
{{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1008.715022] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1008.715489] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1008.731968] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1008.732788] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1008.732788] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1008.732788] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67169) update_available_resource 
/opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1008.734570] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d822b772-6b73-4985-b44f-4e68cc8c735a {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1008.747508] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c5ba8ade-dd66-424e-a9c8-c060101c2c4a {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1008.764668] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-deeae689-5ad5-4892-84fb-e09c69047066 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1008.771888] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2153f750-d926-477d-b966-e46a00bbc959 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1008.806213] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181021MB free_disk=171GB free_vcpus=48 pci_devices=None {{(pid=67169) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1008.806213] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1008.806213] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "compute_resources" 
acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1008.916449] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 85978a3b-052a-4a05-84e6-75c723d49bd8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1008.916674] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance e2e52693-153a-43dd-b786-dd0758caabe2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1008.916809] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1008.916933] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 28552f70-695d-40cc-8dfa-bf40d6113220 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1008.917218] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 7a42aeb9-0518-448d-a3a6-8e68d6497922 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1008.917218] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 43b73a7c-eda8-4239-885f-d4fb8fa6f28a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1008.917504] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1008.917504] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 47ffcce9-3afc-41be-b38e-dacfeb535a2c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1008.917504] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 1f0f1960-0c77-4e72-86ee-807819e75d2a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1008.917669] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance ceec0dd3-097b-4ab4-8e16-420d40bbe3d5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1008.932188] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 2aa47f31-4da8-4ef6-b28c-fe2a03bf8906 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1008.945410] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 310ae1ce-4717-4807-901c-5674677682c3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1008.956839] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 32412c58-a231-40f7-a248-3e46fad5f5b2 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1008.968295] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 86b2381b-676f-46fc-9317-81c0fd272069 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1008.985731] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance dac3617f-32fd-43c5-b8b5-fddf42d94f88 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1008.992948] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance fc12247e-bcca-4635-ba27-be1c9aeaa368 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1009.006243] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance c86c3850-39bb-4a08-8dbf-f69bd8ca21c9 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1009.018433] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 7bf839c0-3ec8-4329-823d-de1fae4833cb has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1009.031472] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance bab5d630-fec0-44e5-8088-12c8855aad66 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1009.044589] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance a86fa702-2040-4e22-9eaa-5d64bc16f036 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1009.056639] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance ca657a42-3745-46e1-8fc9-61de31f661d8 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1009.074070] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 3f90c9a4-650d-4280-b155-1315d2f0f281 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1009.086613] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance d964ad35-8d3f-45f3-b799-aebddf295012 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1009.102031] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 0b78afae-71e9-4ba9-903a-03c8a98cd91e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1009.113282] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 54b1337f-4ac8-4718-b273-2f078782b491 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. 
{{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1009.113501] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67169) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1009.113653] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67169) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1009.295754] env[67169]: DEBUG oslo_concurrency.lockutils [None req-e7183d51-1057-45cf-8e24-24eae5cf8020 tempest-ServerRescueNegativeTestJSON-303097852 tempest-ServerRescueNegativeTestJSON-303097852-project-member] Acquiring lock "bf6857fb-2088-4e2c-b1a4-4c4b631f0153" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1009.296000] env[67169]: DEBUG oslo_concurrency.lockutils [None req-e7183d51-1057-45cf-8e24-24eae5cf8020 tempest-ServerRescueNegativeTestJSON-303097852 tempest-ServerRescueNegativeTestJSON-303097852-project-member] Lock "bf6857fb-2088-4e2c-b1a4-4c4b631f0153" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1009.543118] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a47d5a11-ec0b-457d-91b2-cf5cb29a1e3e {{(pid=67169) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1009.551280] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ca891c99-a5d5-4d37-acda-d7e6c288dc93 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1009.583231] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-79095245-73f6-4933-96aa-6f9b0807d252 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1009.590730] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-acbabeed-c826-417e-b096-0e226605fcfe {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1009.609073] env[67169]: DEBUG nova.compute.provider_tree [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Inventory has not changed in ProviderTree for provider: 6570906a-ac37-4859-b1f2-4bbacc48d3f3 {{(pid=67169) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1009.619133] env[67169]: DEBUG nova.scheduler.client.report [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Inventory has not changed for provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 171, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67169) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1009.635302] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Compute_service 
record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67169) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1009.635510] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.831s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1011.542612] env[67169]: DEBUG oslo_concurrency.lockutils [None req-85a1e9a2-871f-47fc-bc64-588687eba07f tempest-ServerRescueNegativeTestJSON-303097852 tempest-ServerRescueNegativeTestJSON-303097852-project-member] Acquiring lock "cdca51b4-b059-48b6-ae81-ced1a447f10d" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1011.542939] env[67169]: DEBUG oslo_concurrency.lockutils [None req-85a1e9a2-871f-47fc-bc64-588687eba07f tempest-ServerRescueNegativeTestJSON-303097852 tempest-ServerRescueNegativeTestJSON-303097852-project-member] Lock "cdca51b4-b059-48b6-ae81-ced1a447f10d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1011.580454] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1011.580454] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67169) 
run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1012.660325] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1015.431134] env[67169]: WARNING oslo_vmware.rw_handles [None req-91fc429c-9045-417c-aff0-aeb3f0fac0f7 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1015.431134] env[67169]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1015.431134] env[67169]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1015.431134] env[67169]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1015.431134] env[67169]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1015.431134] env[67169]: ERROR oslo_vmware.rw_handles response.begin() [ 1015.431134] env[67169]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1015.431134] env[67169]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1015.431134] env[67169]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1015.431134] env[67169]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1015.431134] env[67169]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1015.431134] env[67169]: ERROR oslo_vmware.rw_handles [ 1015.431983] env[67169]: DEBUG nova.virt.vmwareapi.images [None 
req-91fc429c-9045-417c-aff0-aeb3f0fac0f7 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: 85978a3b-052a-4a05-84e6-75c723d49bd8] Downloaded image file data 285931c9-8b83-4997-8c4d-6a79005e36ba to vmware_temp/99155099-bdd2-483d-b8ea-a97dc199a30e/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk on the data store datastore2 {{(pid=67169) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1015.433919] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-91fc429c-9045-417c-aff0-aeb3f0fac0f7 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: 85978a3b-052a-4a05-84e6-75c723d49bd8] Caching image {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1015.435188] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [None req-91fc429c-9045-417c-aff0-aeb3f0fac0f7 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Copying Virtual Disk [datastore2] vmware_temp/99155099-bdd2-483d-b8ea-a97dc199a30e/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk to [datastore2] vmware_temp/99155099-bdd2-483d-b8ea-a97dc199a30e/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk {{(pid=67169) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1015.435188] env[67169]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-007dd241-0c7e-400f-945f-6153b0cc79a8 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1015.442096] env[67169]: DEBUG oslo_vmware.api [None req-91fc429c-9045-417c-aff0-aeb3f0fac0f7 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Waiting for the task: (returnval){ [ 1015.442096] env[67169]: value = "task-2819136" [ 1015.442096] env[67169]: _type = "Task" [ 1015.442096] 
env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1015.450543] env[67169]: DEBUG oslo_vmware.api [None req-91fc429c-9045-417c-aff0-aeb3f0fac0f7 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Task: {'id': task-2819136, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1015.954667] env[67169]: DEBUG oslo_vmware.exceptions [None req-91fc429c-9045-417c-aff0-aeb3f0fac0f7 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Fault InvalidArgument not matched. {{(pid=67169) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1015.955016] env[67169]: DEBUG oslo_concurrency.lockutils [None req-91fc429c-9045-417c-aff0-aeb3f0fac0f7 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Releasing lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1015.955892] env[67169]: ERROR nova.compute.manager [None req-91fc429c-9045-417c-aff0-aeb3f0fac0f7 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: 85978a3b-052a-4a05-84e6-75c723d49bd8] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1015.955892] env[67169]: Faults: ['InvalidArgument'] [ 1015.955892] env[67169]: ERROR nova.compute.manager [instance: 85978a3b-052a-4a05-84e6-75c723d49bd8] Traceback (most recent call last): [ 1015.955892] env[67169]: ERROR nova.compute.manager [instance: 85978a3b-052a-4a05-84e6-75c723d49bd8] File "/opt/stack/nova/nova/compute/manager.py", line 
2868, in _build_resources [ 1015.955892] env[67169]: ERROR nova.compute.manager [instance: 85978a3b-052a-4a05-84e6-75c723d49bd8] yield resources [ 1015.955892] env[67169]: ERROR nova.compute.manager [instance: 85978a3b-052a-4a05-84e6-75c723d49bd8] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1015.955892] env[67169]: ERROR nova.compute.manager [instance: 85978a3b-052a-4a05-84e6-75c723d49bd8] self.driver.spawn(context, instance, image_meta, [ 1015.955892] env[67169]: ERROR nova.compute.manager [instance: 85978a3b-052a-4a05-84e6-75c723d49bd8] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1015.955892] env[67169]: ERROR nova.compute.manager [instance: 85978a3b-052a-4a05-84e6-75c723d49bd8] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1015.955892] env[67169]: ERROR nova.compute.manager [instance: 85978a3b-052a-4a05-84e6-75c723d49bd8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1015.955892] env[67169]: ERROR nova.compute.manager [instance: 85978a3b-052a-4a05-84e6-75c723d49bd8] self._fetch_image_if_missing(context, vi) [ 1015.955892] env[67169]: ERROR nova.compute.manager [instance: 85978a3b-052a-4a05-84e6-75c723d49bd8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1015.955892] env[67169]: ERROR nova.compute.manager [instance: 85978a3b-052a-4a05-84e6-75c723d49bd8] image_cache(vi, tmp_image_ds_loc) [ 1015.955892] env[67169]: ERROR nova.compute.manager [instance: 85978a3b-052a-4a05-84e6-75c723d49bd8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1015.955892] env[67169]: ERROR nova.compute.manager [instance: 85978a3b-052a-4a05-84e6-75c723d49bd8] vm_util.copy_virtual_disk( [ 1015.955892] env[67169]: ERROR nova.compute.manager [instance: 85978a3b-052a-4a05-84e6-75c723d49bd8] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 
1015.955892] env[67169]: ERROR nova.compute.manager [instance: 85978a3b-052a-4a05-84e6-75c723d49bd8] session._wait_for_task(vmdk_copy_task) [ 1015.955892] env[67169]: ERROR nova.compute.manager [instance: 85978a3b-052a-4a05-84e6-75c723d49bd8] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1015.955892] env[67169]: ERROR nova.compute.manager [instance: 85978a3b-052a-4a05-84e6-75c723d49bd8] return self.wait_for_task(task_ref) [ 1015.955892] env[67169]: ERROR nova.compute.manager [instance: 85978a3b-052a-4a05-84e6-75c723d49bd8] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1015.955892] env[67169]: ERROR nova.compute.manager [instance: 85978a3b-052a-4a05-84e6-75c723d49bd8] return evt.wait() [ 1015.955892] env[67169]: ERROR nova.compute.manager [instance: 85978a3b-052a-4a05-84e6-75c723d49bd8] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1015.955892] env[67169]: ERROR nova.compute.manager [instance: 85978a3b-052a-4a05-84e6-75c723d49bd8] result = hub.switch() [ 1015.955892] env[67169]: ERROR nova.compute.manager [instance: 85978a3b-052a-4a05-84e6-75c723d49bd8] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1015.955892] env[67169]: ERROR nova.compute.manager [instance: 85978a3b-052a-4a05-84e6-75c723d49bd8] return self.greenlet.switch() [ 1015.955892] env[67169]: ERROR nova.compute.manager [instance: 85978a3b-052a-4a05-84e6-75c723d49bd8] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1015.955892] env[67169]: ERROR nova.compute.manager [instance: 85978a3b-052a-4a05-84e6-75c723d49bd8] self.f(*self.args, **self.kw) [ 1015.955892] env[67169]: ERROR nova.compute.manager [instance: 85978a3b-052a-4a05-84e6-75c723d49bd8] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in 
_poll_task [ 1015.955892] env[67169]: ERROR nova.compute.manager [instance: 85978a3b-052a-4a05-84e6-75c723d49bd8] raise exceptions.translate_fault(task_info.error) [ 1015.955892] env[67169]: ERROR nova.compute.manager [instance: 85978a3b-052a-4a05-84e6-75c723d49bd8] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1015.955892] env[67169]: ERROR nova.compute.manager [instance: 85978a3b-052a-4a05-84e6-75c723d49bd8] Faults: ['InvalidArgument'] [ 1015.955892] env[67169]: ERROR nova.compute.manager [instance: 85978a3b-052a-4a05-84e6-75c723d49bd8] [ 1015.956687] env[67169]: INFO nova.compute.manager [None req-91fc429c-9045-417c-aff0-aeb3f0fac0f7 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: 85978a3b-052a-4a05-84e6-75c723d49bd8] Terminating instance [ 1015.957810] env[67169]: DEBUG oslo_concurrency.lockutils [None req-efa53bcc-a9ef-46a2-ad67-84fb4c5f88c0 tempest-ServerDiagnosticsV248Test-1196521581 tempest-ServerDiagnosticsV248Test-1196521581-project-member] Acquired lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1015.958041] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-efa53bcc-a9ef-46a2-ad67-84fb4c5f88c0 tempest-ServerDiagnosticsV248Test-1196521581 tempest-ServerDiagnosticsV248Test-1196521581-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1015.958432] env[67169]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-263b237f-7c0e-4359-8b14-a09313af0808 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1015.963130] env[67169]: DEBUG nova.compute.manager [None 
req-91fc429c-9045-417c-aff0-aeb3f0fac0f7 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: 85978a3b-052a-4a05-84e6-75c723d49bd8] Start destroying the instance on the hypervisor. {{(pid=67169) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1015.963130] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-91fc429c-9045-417c-aff0-aeb3f0fac0f7 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: 85978a3b-052a-4a05-84e6-75c723d49bd8] Destroying instance {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1015.963130] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-84d28652-a3e5-401a-8b66-c87a325f0e7e {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1015.971187] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-91fc429c-9045-417c-aff0-aeb3f0fac0f7 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: 85978a3b-052a-4a05-84e6-75c723d49bd8] Unregistering the VM {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1015.971187] env[67169]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-7ec9cc47-9d26-43bb-b246-fcf546c8eacb {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1015.972170] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-efa53bcc-a9ef-46a2-ad67-84fb4c5f88c0 tempest-ServerDiagnosticsV248Test-1196521581 tempest-ServerDiagnosticsV248Test-1196521581-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1015.972600] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None 
req-efa53bcc-a9ef-46a2-ad67-84fb4c5f88c0 tempest-ServerDiagnosticsV248Test-1196521581 tempest-ServerDiagnosticsV248Test-1196521581-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=67169) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1015.973304] env[67169]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-aa22143a-3b06-45a4-8a21-69abbefc8743 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1015.981120] env[67169]: DEBUG oslo_vmware.api [None req-efa53bcc-a9ef-46a2-ad67-84fb4c5f88c0 tempest-ServerDiagnosticsV248Test-1196521581 tempest-ServerDiagnosticsV248Test-1196521581-project-member] Waiting for the task: (returnval){ [ 1015.981120] env[67169]: value = "session[52518c09-e5f2-035d-4dc1-3e29ddf94015]52c11496-1f76-5ff0-4512-861bb49997f6" [ 1015.981120] env[67169]: _type = "Task" [ 1015.981120] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1015.990015] env[67169]: DEBUG oslo_vmware.api [None req-efa53bcc-a9ef-46a2-ad67-84fb4c5f88c0 tempest-ServerDiagnosticsV248Test-1196521581 tempest-ServerDiagnosticsV248Test-1196521581-project-member] Task: {'id': session[52518c09-e5f2-035d-4dc1-3e29ddf94015]52c11496-1f76-5ff0-4512-861bb49997f6, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1016.043947] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-91fc429c-9045-417c-aff0-aeb3f0fac0f7 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: 85978a3b-052a-4a05-84e6-75c723d49bd8] Unregistered the VM {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1016.044210] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-91fc429c-9045-417c-aff0-aeb3f0fac0f7 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: 85978a3b-052a-4a05-84e6-75c723d49bd8] Deleting contents of the VM from datastore datastore2 {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1016.044404] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-91fc429c-9045-417c-aff0-aeb3f0fac0f7 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Deleting the datastore file [datastore2] 85978a3b-052a-4a05-84e6-75c723d49bd8 {{(pid=67169) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1016.044689] env[67169]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-68f43293-c5d4-44f9-8bcb-0fe016dc578d {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1016.051722] env[67169]: DEBUG oslo_vmware.api [None req-91fc429c-9045-417c-aff0-aeb3f0fac0f7 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Waiting for the task: (returnval){ [ 1016.051722] env[67169]: value = "task-2819138" [ 1016.051722] env[67169]: _type = "Task" [ 1016.051722] env[67169]: } to complete. 
{{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1016.060294] env[67169]: DEBUG oslo_vmware.api [None req-91fc429c-9045-417c-aff0-aeb3f0fac0f7 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Task: {'id': task-2819138, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1016.489710] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-efa53bcc-a9ef-46a2-ad67-84fb4c5f88c0 tempest-ServerDiagnosticsV248Test-1196521581 tempest-ServerDiagnosticsV248Test-1196521581-project-member] [instance: e2e52693-153a-43dd-b786-dd0758caabe2] Preparing fetch location {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1016.490016] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-efa53bcc-a9ef-46a2-ad67-84fb4c5f88c0 tempest-ServerDiagnosticsV248Test-1196521581 tempest-ServerDiagnosticsV248Test-1196521581-project-member] Creating directory with path [datastore2] vmware_temp/32dc1870-cde8-4dea-9b14-3470fd1e7df7/285931c9-8b83-4997-8c4d-6a79005e36ba {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1016.490320] env[67169]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-ddbfe58f-1687-4db9-9e07-8fabc394f8b6 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1016.501964] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-efa53bcc-a9ef-46a2-ad67-84fb4c5f88c0 tempest-ServerDiagnosticsV248Test-1196521581 tempest-ServerDiagnosticsV248Test-1196521581-project-member] Created directory with path [datastore2] vmware_temp/32dc1870-cde8-4dea-9b14-3470fd1e7df7/285931c9-8b83-4997-8c4d-6a79005e36ba {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1016.502308] env[67169]: DEBUG nova.virt.vmwareapi.vmops 
[None req-efa53bcc-a9ef-46a2-ad67-84fb4c5f88c0 tempest-ServerDiagnosticsV248Test-1196521581 tempest-ServerDiagnosticsV248Test-1196521581-project-member] [instance: e2e52693-153a-43dd-b786-dd0758caabe2] Fetch image to [datastore2] vmware_temp/32dc1870-cde8-4dea-9b14-3470fd1e7df7/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1016.502719] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-efa53bcc-a9ef-46a2-ad67-84fb4c5f88c0 tempest-ServerDiagnosticsV248Test-1196521581 tempest-ServerDiagnosticsV248Test-1196521581-project-member] [instance: e2e52693-153a-43dd-b786-dd0758caabe2] Downloading image file data 285931c9-8b83-4997-8c4d-6a79005e36ba to [datastore2] vmware_temp/32dc1870-cde8-4dea-9b14-3470fd1e7df7/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk on the data store datastore2 {{(pid=67169) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1016.503307] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4ba611fd-a278-44c8-a11f-e6e16ce46dca {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1016.511058] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0a0c5928-3de0-478b-b735-3c8a3c16cbb4 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1016.522257] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-28a56641-675e-4b93-8563-d25a6710a928 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1016.564374] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-791d56de-8a2a-4278-a02a-f2c7437b4bde {{(pid=67169) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1016.573655] env[67169]: DEBUG oslo_vmware.api [None req-91fc429c-9045-417c-aff0-aeb3f0fac0f7 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Task: {'id': task-2819138, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.079258} completed successfully. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1016.574217] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-91fc429c-9045-417c-aff0-aeb3f0fac0f7 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Deleted the datastore file {{(pid=67169) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1016.574411] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-91fc429c-9045-417c-aff0-aeb3f0fac0f7 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: 85978a3b-052a-4a05-84e6-75c723d49bd8] Deleted contents of the VM from datastore datastore2 {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1016.574584] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-91fc429c-9045-417c-aff0-aeb3f0fac0f7 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: 85978a3b-052a-4a05-84e6-75c723d49bd8] Instance destroyed {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1016.574757] env[67169]: INFO nova.compute.manager [None req-91fc429c-9045-417c-aff0-aeb3f0fac0f7 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: 85978a3b-052a-4a05-84e6-75c723d49bd8] Took 0.61 seconds to destroy the instance on the hypervisor. 
[ 1016.576388] env[67169]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-3ba22f13-09ba-4782-94c8-ed9853f19561 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1016.578425] env[67169]: DEBUG nova.compute.claims [None req-91fc429c-9045-417c-aff0-aeb3f0fac0f7 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: 85978a3b-052a-4a05-84e6-75c723d49bd8] Aborting claim: {{(pid=67169) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1016.578643] env[67169]: DEBUG oslo_concurrency.lockutils [None req-91fc429c-9045-417c-aff0-aeb3f0fac0f7 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1016.579354] env[67169]: DEBUG oslo_concurrency.lockutils [None req-91fc429c-9045-417c-aff0-aeb3f0fac0f7 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1016.602941] env[67169]: DEBUG nova.virt.vmwareapi.images [None req-efa53bcc-a9ef-46a2-ad67-84fb4c5f88c0 tempest-ServerDiagnosticsV248Test-1196521581 tempest-ServerDiagnosticsV248Test-1196521581-project-member] [instance: e2e52693-153a-43dd-b786-dd0758caabe2] Downloading image file data 285931c9-8b83-4997-8c4d-6a79005e36ba to the data store datastore2 {{(pid=67169) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1016.690689] env[67169]: DEBUG oslo_vmware.rw_handles [None req-efa53bcc-a9ef-46a2-ad67-84fb4c5f88c0 
tempest-ServerDiagnosticsV248Test-1196521581 tempest-ServerDiagnosticsV248Test-1196521581-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/32dc1870-cde8-4dea-9b14-3470fd1e7df7/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=67169) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1016.758270] env[67169]: DEBUG oslo_vmware.rw_handles [None req-efa53bcc-a9ef-46a2-ad67-84fb4c5f88c0 tempest-ServerDiagnosticsV248Test-1196521581 tempest-ServerDiagnosticsV248Test-1196521581-project-member] Completed reading data from the image iterator. {{(pid=67169) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1016.758515] env[67169]: DEBUG oslo_vmware.rw_handles [None req-efa53bcc-a9ef-46a2-ad67-84fb4c5f88c0 tempest-ServerDiagnosticsV248Test-1196521581 tempest-ServerDiagnosticsV248Test-1196521581-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/32dc1870-cde8-4dea-9b14-3470fd1e7df7/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=67169) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1017.141384] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3dd962d3-dda4-47d0-bd42-18cb1a60afa4 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1017.149439] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d27bb014-a3c8-4ea0-9c95-a011a82a269d {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1017.179967] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6935663a-1a2e-4ffb-927d-523704140ac7 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1017.187882] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-acb81b85-adc5-4d96-a15e-f30b92a1844b {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1017.201624] env[67169]: DEBUG nova.compute.provider_tree [None req-91fc429c-9045-417c-aff0-aeb3f0fac0f7 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Inventory has not changed in ProviderTree for provider: 6570906a-ac37-4859-b1f2-4bbacc48d3f3 {{(pid=67169) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1017.216242] env[67169]: DEBUG nova.scheduler.client.report [None req-91fc429c-9045-417c-aff0-aeb3f0fac0f7 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Inventory has not changed for provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 
'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 171, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67169) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1017.238900] env[67169]: DEBUG oslo_concurrency.lockutils [None req-91fc429c-9045-417c-aff0-aeb3f0fac0f7 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.660s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1017.239477] env[67169]: ERROR nova.compute.manager [None req-91fc429c-9045-417c-aff0-aeb3f0fac0f7 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: 85978a3b-052a-4a05-84e6-75c723d49bd8] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1017.239477] env[67169]: Faults: ['InvalidArgument'] [ 1017.239477] env[67169]: ERROR nova.compute.manager [instance: 85978a3b-052a-4a05-84e6-75c723d49bd8] Traceback (most recent call last): [ 1017.239477] env[67169]: ERROR nova.compute.manager [instance: 85978a3b-052a-4a05-84e6-75c723d49bd8] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1017.239477] env[67169]: ERROR nova.compute.manager [instance: 85978a3b-052a-4a05-84e6-75c723d49bd8] self.driver.spawn(context, instance, image_meta, [ 1017.239477] env[67169]: ERROR nova.compute.manager [instance: 85978a3b-052a-4a05-84e6-75c723d49bd8] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1017.239477] env[67169]: ERROR nova.compute.manager [instance: 85978a3b-052a-4a05-84e6-75c723d49bd8] self._vmops.spawn(context, instance, image_meta, 
injected_files, [ 1017.239477] env[67169]: ERROR nova.compute.manager [instance: 85978a3b-052a-4a05-84e6-75c723d49bd8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1017.239477] env[67169]: ERROR nova.compute.manager [instance: 85978a3b-052a-4a05-84e6-75c723d49bd8] self._fetch_image_if_missing(context, vi) [ 1017.239477] env[67169]: ERROR nova.compute.manager [instance: 85978a3b-052a-4a05-84e6-75c723d49bd8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1017.239477] env[67169]: ERROR nova.compute.manager [instance: 85978a3b-052a-4a05-84e6-75c723d49bd8] image_cache(vi, tmp_image_ds_loc) [ 1017.239477] env[67169]: ERROR nova.compute.manager [instance: 85978a3b-052a-4a05-84e6-75c723d49bd8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1017.239477] env[67169]: ERROR nova.compute.manager [instance: 85978a3b-052a-4a05-84e6-75c723d49bd8] vm_util.copy_virtual_disk( [ 1017.239477] env[67169]: ERROR nova.compute.manager [instance: 85978a3b-052a-4a05-84e6-75c723d49bd8] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1017.239477] env[67169]: ERROR nova.compute.manager [instance: 85978a3b-052a-4a05-84e6-75c723d49bd8] session._wait_for_task(vmdk_copy_task) [ 1017.239477] env[67169]: ERROR nova.compute.manager [instance: 85978a3b-052a-4a05-84e6-75c723d49bd8] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1017.239477] env[67169]: ERROR nova.compute.manager [instance: 85978a3b-052a-4a05-84e6-75c723d49bd8] return self.wait_for_task(task_ref) [ 1017.239477] env[67169]: ERROR nova.compute.manager [instance: 85978a3b-052a-4a05-84e6-75c723d49bd8] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1017.239477] env[67169]: ERROR nova.compute.manager [instance: 85978a3b-052a-4a05-84e6-75c723d49bd8] return evt.wait() [ 1017.239477] env[67169]: 
ERROR nova.compute.manager [instance: 85978a3b-052a-4a05-84e6-75c723d49bd8] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1017.239477] env[67169]: ERROR nova.compute.manager [instance: 85978a3b-052a-4a05-84e6-75c723d49bd8] result = hub.switch() [ 1017.239477] env[67169]: ERROR nova.compute.manager [instance: 85978a3b-052a-4a05-84e6-75c723d49bd8] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1017.239477] env[67169]: ERROR nova.compute.manager [instance: 85978a3b-052a-4a05-84e6-75c723d49bd8] return self.greenlet.switch() [ 1017.239477] env[67169]: ERROR nova.compute.manager [instance: 85978a3b-052a-4a05-84e6-75c723d49bd8] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1017.239477] env[67169]: ERROR nova.compute.manager [instance: 85978a3b-052a-4a05-84e6-75c723d49bd8] self.f(*self.args, **self.kw) [ 1017.239477] env[67169]: ERROR nova.compute.manager [instance: 85978a3b-052a-4a05-84e6-75c723d49bd8] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1017.239477] env[67169]: ERROR nova.compute.manager [instance: 85978a3b-052a-4a05-84e6-75c723d49bd8] raise exceptions.translate_fault(task_info.error) [ 1017.239477] env[67169]: ERROR nova.compute.manager [instance: 85978a3b-052a-4a05-84e6-75c723d49bd8] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1017.239477] env[67169]: ERROR nova.compute.manager [instance: 85978a3b-052a-4a05-84e6-75c723d49bd8] Faults: ['InvalidArgument'] [ 1017.239477] env[67169]: ERROR nova.compute.manager [instance: 85978a3b-052a-4a05-84e6-75c723d49bd8] [ 1017.240209] env[67169]: DEBUG nova.compute.utils [None req-91fc429c-9045-417c-aff0-aeb3f0fac0f7 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: 
85978a3b-052a-4a05-84e6-75c723d49bd8] VimFaultException {{(pid=67169) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1017.242068] env[67169]: DEBUG nova.compute.manager [None req-91fc429c-9045-417c-aff0-aeb3f0fac0f7 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: 85978a3b-052a-4a05-84e6-75c723d49bd8] Build of instance 85978a3b-052a-4a05-84e6-75c723d49bd8 was re-scheduled: A specified parameter was not correct: fileType [ 1017.242068] env[67169]: Faults: ['InvalidArgument'] {{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1017.242636] env[67169]: DEBUG nova.compute.manager [None req-91fc429c-9045-417c-aff0-aeb3f0fac0f7 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: 85978a3b-052a-4a05-84e6-75c723d49bd8] Unplugging VIFs for instance {{(pid=67169) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1017.243063] env[67169]: DEBUG nova.compute.manager [None req-91fc429c-9045-417c-aff0-aeb3f0fac0f7 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67169) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1017.243309] env[67169]: DEBUG nova.compute.manager [None req-91fc429c-9045-417c-aff0-aeb3f0fac0f7 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: 85978a3b-052a-4a05-84e6-75c723d49bd8] Deallocating network for instance {{(pid=67169) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1017.243523] env[67169]: DEBUG nova.network.neutron [None req-91fc429c-9045-417c-aff0-aeb3f0fac0f7 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: 85978a3b-052a-4a05-84e6-75c723d49bd8] deallocate_for_instance() {{(pid=67169) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1017.882359] env[67169]: DEBUG nova.network.neutron [None req-91fc429c-9045-417c-aff0-aeb3f0fac0f7 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: 85978a3b-052a-4a05-84e6-75c723d49bd8] Updating instance_info_cache with network_info: [] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1017.897124] env[67169]: INFO nova.compute.manager [None req-91fc429c-9045-417c-aff0-aeb3f0fac0f7 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: 85978a3b-052a-4a05-84e6-75c723d49bd8] Took 0.65 seconds to deallocate network for instance. 
[ 1018.029876] env[67169]: INFO nova.scheduler.client.report [None req-91fc429c-9045-417c-aff0-aeb3f0fac0f7 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Deleted allocations for instance 85978a3b-052a-4a05-84e6-75c723d49bd8 [ 1018.059217] env[67169]: DEBUG oslo_concurrency.lockutils [None req-91fc429c-9045-417c-aff0-aeb3f0fac0f7 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Lock "85978a3b-052a-4a05-84e6-75c723d49bd8" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 432.626s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1018.061195] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "85978a3b-052a-4a05-84e6-75c723d49bd8" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 432.374s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1018.061195] env[67169]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-c280ecac-e0b7-44c6-a8ac-cd657a11e2b4 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1018.072043] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7ba67817-627f-4350-b507-2872475efda0 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1018.084846] env[67169]: DEBUG nova.compute.manager [None req-460b6927-69b3-4804-a4cd-b807c665f771 tempest-TenantUsagesTestJSON-1528703859 tempest-TenantUsagesTestJSON-1528703859-project-member] [instance: 41ebddaf-e07d-4925-b9da-758b8e83f545] Starting instance... 
{{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1018.119969] env[67169]: DEBUG nova.compute.manager [None req-460b6927-69b3-4804-a4cd-b807c665f771 tempest-TenantUsagesTestJSON-1528703859 tempest-TenantUsagesTestJSON-1528703859-project-member] [instance: 41ebddaf-e07d-4925-b9da-758b8e83f545] Instance disappeared before build. {{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1018.143732] env[67169]: INFO nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 85978a3b-052a-4a05-84e6-75c723d49bd8] During the sync_power process the instance has moved from host None to host cpu-1 [ 1018.143863] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "85978a3b-052a-4a05-84e6-75c723d49bd8" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.083s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1018.144645] env[67169]: DEBUG oslo_concurrency.lockutils [None req-21f766a3-9d39-453b-aab0-47af759d94fc tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Lock "85978a3b-052a-4a05-84e6-75c723d49bd8" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 234.269s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1018.144645] env[67169]: DEBUG oslo_concurrency.lockutils [None req-21f766a3-9d39-453b-aab0-47af759d94fc tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Acquiring lock "85978a3b-052a-4a05-84e6-75c723d49bd8-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67169) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1018.144645] env[67169]: DEBUG oslo_concurrency.lockutils [None req-21f766a3-9d39-453b-aab0-47af759d94fc tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Lock "85978a3b-052a-4a05-84e6-75c723d49bd8-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1018.144645] env[67169]: DEBUG oslo_concurrency.lockutils [None req-21f766a3-9d39-453b-aab0-47af759d94fc tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Lock "85978a3b-052a-4a05-84e6-75c723d49bd8-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1018.147075] env[67169]: DEBUG oslo_concurrency.lockutils [None req-460b6927-69b3-4804-a4cd-b807c665f771 tempest-TenantUsagesTestJSON-1528703859 tempest-TenantUsagesTestJSON-1528703859-project-member] Lock "41ebddaf-e07d-4925-b9da-758b8e83f545" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 211.196s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1018.148819] env[67169]: INFO nova.compute.manager [None req-21f766a3-9d39-453b-aab0-47af759d94fc tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: 85978a3b-052a-4a05-84e6-75c723d49bd8] Terminating instance [ 1018.150892] env[67169]: DEBUG nova.compute.manager [None req-21f766a3-9d39-453b-aab0-47af759d94fc tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: 
85978a3b-052a-4a05-84e6-75c723d49bd8] Start destroying the instance on the hypervisor. {{(pid=67169) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1018.151139] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-21f766a3-9d39-453b-aab0-47af759d94fc tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: 85978a3b-052a-4a05-84e6-75c723d49bd8] Destroying instance {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1018.151409] env[67169]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-c2551eae-04a7-4d56-b8e3-78ec98474fa2 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1018.163398] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d0a9c5f5-2fbe-44cf-9d18-05a0f242ee29 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1018.174491] env[67169]: DEBUG nova.compute.manager [None req-2a59dd09-c1a1-4dc5-8341-628a27b9a7ff tempest-ServerMetadataTestJSON-895650985 tempest-ServerMetadataTestJSON-895650985-project-member] [instance: 3c5bc03a-acc3-4601-8155-2cab101be865] Starting instance... {{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1018.196034] env[67169]: WARNING nova.virt.vmwareapi.vmops [None req-21f766a3-9d39-453b-aab0-47af759d94fc tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: 85978a3b-052a-4a05-84e6-75c723d49bd8] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 85978a3b-052a-4a05-84e6-75c723d49bd8 could not be found. 
[ 1018.196242] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-21f766a3-9d39-453b-aab0-47af759d94fc tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: 85978a3b-052a-4a05-84e6-75c723d49bd8] Instance destroyed {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1018.196511] env[67169]: INFO nova.compute.manager [None req-21f766a3-9d39-453b-aab0-47af759d94fc tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: 85978a3b-052a-4a05-84e6-75c723d49bd8] Took 0.05 seconds to destroy the instance on the hypervisor. [ 1018.196654] env[67169]: DEBUG oslo.service.loopingcall [None req-21f766a3-9d39-453b-aab0-47af759d94fc tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=67169) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1018.196865] env[67169]: DEBUG nova.compute.manager [-] [instance: 85978a3b-052a-4a05-84e6-75c723d49bd8] Deallocating network for instance {{(pid=67169) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1018.196962] env[67169]: DEBUG nova.network.neutron [-] [instance: 85978a3b-052a-4a05-84e6-75c723d49bd8] deallocate_for_instance() {{(pid=67169) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1018.200269] env[67169]: DEBUG nova.compute.manager [None req-2a59dd09-c1a1-4dc5-8341-628a27b9a7ff tempest-ServerMetadataTestJSON-895650985 tempest-ServerMetadataTestJSON-895650985-project-member] [instance: 3c5bc03a-acc3-4601-8155-2cab101be865] Instance disappeared before build. 
{{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1018.225054] env[67169]: DEBUG nova.network.neutron [-] [instance: 85978a3b-052a-4a05-84e6-75c723d49bd8] Updating instance_info_cache with network_info: [] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1018.226940] env[67169]: DEBUG oslo_concurrency.lockutils [None req-2a59dd09-c1a1-4dc5-8341-628a27b9a7ff tempest-ServerMetadataTestJSON-895650985 tempest-ServerMetadataTestJSON-895650985-project-member] Lock "3c5bc03a-acc3-4601-8155-2cab101be865" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 208.541s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1018.236870] env[67169]: INFO nova.compute.manager [-] [instance: 85978a3b-052a-4a05-84e6-75c723d49bd8] Took 0.04 seconds to deallocate network for instance. [ 1018.242125] env[67169]: DEBUG nova.compute.manager [None req-f501c074-65dc-493f-a6db-8ace51085315 tempest-ServerMetadataNegativeTestJSON-1861165825 tempest-ServerMetadataNegativeTestJSON-1861165825-project-member] [instance: 3ea7c620-5903-410d-8ca0-68789a5e5194] Starting instance... {{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1018.271186] env[67169]: DEBUG nova.compute.manager [None req-f501c074-65dc-493f-a6db-8ace51085315 tempest-ServerMetadataNegativeTestJSON-1861165825 tempest-ServerMetadataNegativeTestJSON-1861165825-project-member] [instance: 3ea7c620-5903-410d-8ca0-68789a5e5194] Instance disappeared before build. 
{{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1018.305433] env[67169]: DEBUG oslo_concurrency.lockutils [None req-f501c074-65dc-493f-a6db-8ace51085315 tempest-ServerMetadataNegativeTestJSON-1861165825 tempest-ServerMetadataNegativeTestJSON-1861165825-project-member] Lock "3ea7c620-5903-410d-8ca0-68789a5e5194" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 207.917s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1018.318781] env[67169]: DEBUG nova.compute.manager [None req-c5227c3b-bf91-43b9-b202-c55d927f7c0d tempest-ServerShowV254Test-307983608 tempest-ServerShowV254Test-307983608-project-member] [instance: d0a7fdac-3e41-4539-bef4-0442bc5ad674] Starting instance... {{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1018.361589] env[67169]: DEBUG nova.compute.manager [None req-c5227c3b-bf91-43b9-b202-c55d927f7c0d tempest-ServerShowV254Test-307983608 tempest-ServerShowV254Test-307983608-project-member] [instance: d0a7fdac-3e41-4539-bef4-0442bc5ad674] Instance disappeared before build. 
{{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1018.380813] env[67169]: DEBUG oslo_concurrency.lockutils [None req-21f766a3-9d39-453b-aab0-47af759d94fc tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Lock "85978a3b-052a-4a05-84e6-75c723d49bd8" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.236s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1018.397318] env[67169]: DEBUG oslo_concurrency.lockutils [None req-c5227c3b-bf91-43b9-b202-c55d927f7c0d tempest-ServerShowV254Test-307983608 tempest-ServerShowV254Test-307983608-project-member] Lock "d0a7fdac-3e41-4539-bef4-0442bc5ad674" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 207.910s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1018.410110] env[67169]: DEBUG nova.compute.manager [None req-444db087-1355-4d7a-9f19-a97c8aa302b0 tempest-AttachVolumeShelveTestJSON-1479191573 tempest-AttachVolumeShelveTestJSON-1479191573-project-member] [instance: 2aa47f31-4da8-4ef6-b28c-fe2a03bf8906] Starting instance... {{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1018.437967] env[67169]: DEBUG nova.compute.manager [None req-444db087-1355-4d7a-9f19-a97c8aa302b0 tempest-AttachVolumeShelveTestJSON-1479191573 tempest-AttachVolumeShelveTestJSON-1479191573-project-member] [instance: 2aa47f31-4da8-4ef6-b28c-fe2a03bf8906] Instance disappeared before build. 
{{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1018.465267] env[67169]: DEBUG oslo_concurrency.lockutils [None req-444db087-1355-4d7a-9f19-a97c8aa302b0 tempest-AttachVolumeShelveTestJSON-1479191573 tempest-AttachVolumeShelveTestJSON-1479191573-project-member] Lock "2aa47f31-4da8-4ef6-b28c-fe2a03bf8906" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 202.823s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1018.476657] env[67169]: DEBUG nova.compute.manager [None req-fbe3d5d6-785f-47de-98ab-b4e9aae11143 tempest-InstanceActionsTestJSON-34793226 tempest-InstanceActionsTestJSON-34793226-project-member] [instance: 310ae1ce-4717-4807-901c-5674677682c3] Starting instance... {{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1018.545344] env[67169]: DEBUG oslo_concurrency.lockutils [None req-fbe3d5d6-785f-47de-98ab-b4e9aae11143 tempest-InstanceActionsTestJSON-34793226 tempest-InstanceActionsTestJSON-34793226-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1018.545604] env[67169]: DEBUG oslo_concurrency.lockutils [None req-fbe3d5d6-785f-47de-98ab-b4e9aae11143 tempest-InstanceActionsTestJSON-34793226 tempest-InstanceActionsTestJSON-34793226-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1018.547109] env[67169]: INFO nova.compute.claims [None req-fbe3d5d6-785f-47de-98ab-b4e9aae11143 tempest-InstanceActionsTestJSON-34793226 tempest-InstanceActionsTestJSON-34793226-project-member] [instance: 
310ae1ce-4717-4807-901c-5674677682c3] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1018.999980] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5bc2b5d6-3b0e-45f3-9184-74b9e799f35c {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1019.007866] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3c30be2e-435e-446d-8390-445ec92ce07c {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1019.039193] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-31c4f198-feb4-4ed5-b55b-c9e7387cc089 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1019.046496] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b224ed03-e39d-4f35-ba31-e613ca7eaa53 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1019.061949] env[67169]: DEBUG nova.compute.provider_tree [None req-fbe3d5d6-785f-47de-98ab-b4e9aae11143 tempest-InstanceActionsTestJSON-34793226 tempest-InstanceActionsTestJSON-34793226-project-member] Inventory has not changed in ProviderTree for provider: 6570906a-ac37-4859-b1f2-4bbacc48d3f3 {{(pid=67169) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1019.070062] env[67169]: DEBUG nova.scheduler.client.report [None req-fbe3d5d6-785f-47de-98ab-b4e9aae11143 tempest-InstanceActionsTestJSON-34793226 tempest-InstanceActionsTestJSON-34793226-project-member] Inventory has not changed for provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 
4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 171, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67169) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1019.086417] env[67169]: DEBUG oslo_concurrency.lockutils [None req-fbe3d5d6-785f-47de-98ab-b4e9aae11143 tempest-InstanceActionsTestJSON-34793226 tempest-InstanceActionsTestJSON-34793226-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.540s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1019.086597] env[67169]: DEBUG nova.compute.manager [None req-fbe3d5d6-785f-47de-98ab-b4e9aae11143 tempest-InstanceActionsTestJSON-34793226 tempest-InstanceActionsTestJSON-34793226-project-member] [instance: 310ae1ce-4717-4807-901c-5674677682c3] Start building networks asynchronously for instance. {{(pid=67169) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1019.121685] env[67169]: DEBUG nova.compute.utils [None req-fbe3d5d6-785f-47de-98ab-b4e9aae11143 tempest-InstanceActionsTestJSON-34793226 tempest-InstanceActionsTestJSON-34793226-project-member] Using /dev/sd instead of None {{(pid=67169) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1019.123945] env[67169]: DEBUG nova.compute.manager [None req-fbe3d5d6-785f-47de-98ab-b4e9aae11143 tempest-InstanceActionsTestJSON-34793226 tempest-InstanceActionsTestJSON-34793226-project-member] [instance: 310ae1ce-4717-4807-901c-5674677682c3] Allocating IP information in the background. 
{{(pid=67169) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1019.124173] env[67169]: DEBUG nova.network.neutron [None req-fbe3d5d6-785f-47de-98ab-b4e9aae11143 tempest-InstanceActionsTestJSON-34793226 tempest-InstanceActionsTestJSON-34793226-project-member] [instance: 310ae1ce-4717-4807-901c-5674677682c3] allocate_for_instance() {{(pid=67169) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1019.139870] env[67169]: DEBUG nova.compute.manager [None req-fbe3d5d6-785f-47de-98ab-b4e9aae11143 tempest-InstanceActionsTestJSON-34793226 tempest-InstanceActionsTestJSON-34793226-project-member] [instance: 310ae1ce-4717-4807-901c-5674677682c3] Start building block device mappings for instance. {{(pid=67169) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1019.215582] env[67169]: DEBUG nova.compute.manager [None req-fbe3d5d6-785f-47de-98ab-b4e9aae11143 tempest-InstanceActionsTestJSON-34793226 tempest-InstanceActionsTestJSON-34793226-project-member] [instance: 310ae1ce-4717-4807-901c-5674677682c3] Start spawning the instance on the hypervisor. 
{{(pid=67169) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1019.222954] env[67169]: DEBUG nova.policy [None req-fbe3d5d6-785f-47de-98ab-b4e9aae11143 tempest-InstanceActionsTestJSON-34793226 tempest-InstanceActionsTestJSON-34793226-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '934e7822c4b64eae918a7bf4a8cc9156', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '81f09255d4d24106b7989baf1ea104bb', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67169) authorize /opt/stack/nova/nova/policy.py:203}} [ 1019.241847] env[67169]: DEBUG nova.virt.hardware [None req-fbe3d5d6-785f-47de-98ab-b4e9aae11143 tempest-InstanceActionsTestJSON-34793226 tempest-InstanceActionsTestJSON-34793226-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-03-06T19:55:10Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-03-06T19:54:55Z,direct_url=,disk_format='vmdk',id=285931c9-8b83-4997-8c4d-6a79005e36ba,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c31f6504bb73492890b262ff43fdf9bc',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-03-06T19:54:55Z,virtual_size=,visibility=), allow threads: False {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1019.241847] env[67169]: DEBUG nova.virt.hardware [None 
req-fbe3d5d6-785f-47de-98ab-b4e9aae11143 tempest-InstanceActionsTestJSON-34793226 tempest-InstanceActionsTestJSON-34793226-project-member] Flavor limits 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1019.241847] env[67169]: DEBUG nova.virt.hardware [None req-fbe3d5d6-785f-47de-98ab-b4e9aae11143 tempest-InstanceActionsTestJSON-34793226 tempest-InstanceActionsTestJSON-34793226-project-member] Image limits 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1019.241847] env[67169]: DEBUG nova.virt.hardware [None req-fbe3d5d6-785f-47de-98ab-b4e9aae11143 tempest-InstanceActionsTestJSON-34793226 tempest-InstanceActionsTestJSON-34793226-project-member] Flavor pref 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1019.241847] env[67169]: DEBUG nova.virt.hardware [None req-fbe3d5d6-785f-47de-98ab-b4e9aae11143 tempest-InstanceActionsTestJSON-34793226 tempest-InstanceActionsTestJSON-34793226-project-member] Image pref 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1019.241847] env[67169]: DEBUG nova.virt.hardware [None req-fbe3d5d6-785f-47de-98ab-b4e9aae11143 tempest-InstanceActionsTestJSON-34793226 tempest-InstanceActionsTestJSON-34793226-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1019.241847] env[67169]: DEBUG nova.virt.hardware [None req-fbe3d5d6-785f-47de-98ab-b4e9aae11143 tempest-InstanceActionsTestJSON-34793226 tempest-InstanceActionsTestJSON-34793226-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1019.241847] env[67169]: DEBUG nova.virt.hardware [None 
req-fbe3d5d6-785f-47de-98ab-b4e9aae11143 tempest-InstanceActionsTestJSON-34793226 tempest-InstanceActionsTestJSON-34793226-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67169) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1019.242678] env[67169]: DEBUG nova.virt.hardware [None req-fbe3d5d6-785f-47de-98ab-b4e9aae11143 tempest-InstanceActionsTestJSON-34793226 tempest-InstanceActionsTestJSON-34793226-project-member] Got 1 possible topologies {{(pid=67169) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1019.243027] env[67169]: DEBUG nova.virt.hardware [None req-fbe3d5d6-785f-47de-98ab-b4e9aae11143 tempest-InstanceActionsTestJSON-34793226 tempest-InstanceActionsTestJSON-34793226-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1019.243350] env[67169]: DEBUG nova.virt.hardware [None req-fbe3d5d6-785f-47de-98ab-b4e9aae11143 tempest-InstanceActionsTestJSON-34793226 tempest-InstanceActionsTestJSON-34793226-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1019.244556] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-10dbc1c5-9266-4350-825f-068225b4c9a6 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1019.253066] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-016a800d-b474-4301-aed2-2130b7d514de {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1020.023240] env[67169]: DEBUG nova.network.neutron [None req-fbe3d5d6-785f-47de-98ab-b4e9aae11143 tempest-InstanceActionsTestJSON-34793226 
tempest-InstanceActionsTestJSON-34793226-project-member] [instance: 310ae1ce-4717-4807-901c-5674677682c3] Successfully created port: 132381eb-7e6b-4919-933c-38fe84614f37 {{(pid=67169) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1020.295440] env[67169]: DEBUG oslo_concurrency.lockutils [None req-c978442b-871a-4422-ae40-03e18147f708 tempest-ServerRescueTestJSON-1216610948 tempest-ServerRescueTestJSON-1216610948-project-member] Acquiring lock "1a04a0fd-11d5-4fce-ba32-d90e39a13ff9" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1020.298833] env[67169]: DEBUG oslo_concurrency.lockutils [None req-c978442b-871a-4422-ae40-03e18147f708 tempest-ServerRescueTestJSON-1216610948 tempest-ServerRescueTestJSON-1216610948-project-member] Lock "1a04a0fd-11d5-4fce-ba32-d90e39a13ff9" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1020.933330] env[67169]: DEBUG nova.compute.manager [req-48a26fd6-8ff4-41dd-a26b-6591e2b8cf94 req-0c09334b-69bd-4306-9545-508d8a009bc9 service nova] [instance: 310ae1ce-4717-4807-901c-5674677682c3] Received event network-vif-plugged-132381eb-7e6b-4919-933c-38fe84614f37 {{(pid=67169) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1020.933330] env[67169]: DEBUG oslo_concurrency.lockutils [req-48a26fd6-8ff4-41dd-a26b-6591e2b8cf94 req-0c09334b-69bd-4306-9545-508d8a009bc9 service nova] Acquiring lock "310ae1ce-4717-4807-901c-5674677682c3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1020.933486] env[67169]: DEBUG 
oslo_concurrency.lockutils [req-48a26fd6-8ff4-41dd-a26b-6591e2b8cf94 req-0c09334b-69bd-4306-9545-508d8a009bc9 service nova] Lock "310ae1ce-4717-4807-901c-5674677682c3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1020.933746] env[67169]: DEBUG oslo_concurrency.lockutils [req-48a26fd6-8ff4-41dd-a26b-6591e2b8cf94 req-0c09334b-69bd-4306-9545-508d8a009bc9 service nova] Lock "310ae1ce-4717-4807-901c-5674677682c3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1020.933925] env[67169]: DEBUG nova.compute.manager [req-48a26fd6-8ff4-41dd-a26b-6591e2b8cf94 req-0c09334b-69bd-4306-9545-508d8a009bc9 service nova] [instance: 310ae1ce-4717-4807-901c-5674677682c3] No waiting events found dispatching network-vif-plugged-132381eb-7e6b-4919-933c-38fe84614f37 {{(pid=67169) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1020.937801] env[67169]: WARNING nova.compute.manager [req-48a26fd6-8ff4-41dd-a26b-6591e2b8cf94 req-0c09334b-69bd-4306-9545-508d8a009bc9 service nova] [instance: 310ae1ce-4717-4807-901c-5674677682c3] Received unexpected event network-vif-plugged-132381eb-7e6b-4919-933c-38fe84614f37 for instance with vm_state building and task_state spawning. 
[ 1021.057144] env[67169]: DEBUG nova.network.neutron [None req-fbe3d5d6-785f-47de-98ab-b4e9aae11143 tempest-InstanceActionsTestJSON-34793226 tempest-InstanceActionsTestJSON-34793226-project-member] [instance: 310ae1ce-4717-4807-901c-5674677682c3] Successfully updated port: 132381eb-7e6b-4919-933c-38fe84614f37 {{(pid=67169) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1021.072778] env[67169]: DEBUG oslo_concurrency.lockutils [None req-fbe3d5d6-785f-47de-98ab-b4e9aae11143 tempest-InstanceActionsTestJSON-34793226 tempest-InstanceActionsTestJSON-34793226-project-member] Acquiring lock "refresh_cache-310ae1ce-4717-4807-901c-5674677682c3" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1021.072950] env[67169]: DEBUG oslo_concurrency.lockutils [None req-fbe3d5d6-785f-47de-98ab-b4e9aae11143 tempest-InstanceActionsTestJSON-34793226 tempest-InstanceActionsTestJSON-34793226-project-member] Acquired lock "refresh_cache-310ae1ce-4717-4807-901c-5674677682c3" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1021.073126] env[67169]: DEBUG nova.network.neutron [None req-fbe3d5d6-785f-47de-98ab-b4e9aae11143 tempest-InstanceActionsTestJSON-34793226 tempest-InstanceActionsTestJSON-34793226-project-member] [instance: 310ae1ce-4717-4807-901c-5674677682c3] Building network info cache for instance {{(pid=67169) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1021.128733] env[67169]: DEBUG nova.network.neutron [None req-fbe3d5d6-785f-47de-98ab-b4e9aae11143 tempest-InstanceActionsTestJSON-34793226 tempest-InstanceActionsTestJSON-34793226-project-member] [instance: 310ae1ce-4717-4807-901c-5674677682c3] Instance cache missing network info. 
{{(pid=67169) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1021.378791] env[67169]: DEBUG nova.network.neutron [None req-fbe3d5d6-785f-47de-98ab-b4e9aae11143 tempest-InstanceActionsTestJSON-34793226 tempest-InstanceActionsTestJSON-34793226-project-member] [instance: 310ae1ce-4717-4807-901c-5674677682c3] Updating instance_info_cache with network_info: [{"id": "132381eb-7e6b-4919-933c-38fe84614f37", "address": "fa:16:3e:79:0e:e4", "network": {"id": "103592f9-9e79-499a-b714-36cdf5639f5e", "bridge": "br-int", "label": "tempest-InstanceActionsTestJSON-2056876988-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "81f09255d4d24106b7989baf1ea104bb", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "20641d67-1612-4b9c-8924-7a77df9c8e6d", "external-id": "nsx-vlan-transportzone-884", "segmentation_id": 884, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap132381eb-7e", "ovs_interfaceid": "132381eb-7e6b-4919-933c-38fe84614f37", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1021.395575] env[67169]: DEBUG oslo_concurrency.lockutils [None req-fbe3d5d6-785f-47de-98ab-b4e9aae11143 tempest-InstanceActionsTestJSON-34793226 tempest-InstanceActionsTestJSON-34793226-project-member] Releasing lock "refresh_cache-310ae1ce-4717-4807-901c-5674677682c3" {{(pid=67169) lock 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1021.395965] env[67169]: DEBUG nova.compute.manager [None req-fbe3d5d6-785f-47de-98ab-b4e9aae11143 tempest-InstanceActionsTestJSON-34793226 tempest-InstanceActionsTestJSON-34793226-project-member] [instance: 310ae1ce-4717-4807-901c-5674677682c3] Instance network_info: |[{"id": "132381eb-7e6b-4919-933c-38fe84614f37", "address": "fa:16:3e:79:0e:e4", "network": {"id": "103592f9-9e79-499a-b714-36cdf5639f5e", "bridge": "br-int", "label": "tempest-InstanceActionsTestJSON-2056876988-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "81f09255d4d24106b7989baf1ea104bb", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "20641d67-1612-4b9c-8924-7a77df9c8e6d", "external-id": "nsx-vlan-transportzone-884", "segmentation_id": 884, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap132381eb-7e", "ovs_interfaceid": "132381eb-7e6b-4919-933c-38fe84614f37", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67169) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1021.396411] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-fbe3d5d6-785f-47de-98ab-b4e9aae11143 tempest-InstanceActionsTestJSON-34793226 tempest-InstanceActionsTestJSON-34793226-project-member] [instance: 310ae1ce-4717-4807-901c-5674677682c3] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:79:0e:e4', 'network_ref': {'type': 
'OpaqueNetwork', 'network-id': '20641d67-1612-4b9c-8924-7a77df9c8e6d', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '132381eb-7e6b-4919-933c-38fe84614f37', 'vif_model': 'vmxnet3'}] {{(pid=67169) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1021.405650] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [None req-fbe3d5d6-785f-47de-98ab-b4e9aae11143 tempest-InstanceActionsTestJSON-34793226 tempest-InstanceActionsTestJSON-34793226-project-member] Creating folder: Project (81f09255d4d24106b7989baf1ea104bb). Parent ref: group-v566843. {{(pid=67169) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1021.406259] env[67169]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-a9379951-f715-480e-bd05-d5b550795f3c {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1021.417182] env[67169]: INFO nova.virt.vmwareapi.vm_util [None req-fbe3d5d6-785f-47de-98ab-b4e9aae11143 tempest-InstanceActionsTestJSON-34793226 tempest-InstanceActionsTestJSON-34793226-project-member] Created folder: Project (81f09255d4d24106b7989baf1ea104bb) in parent group-v566843. [ 1021.417394] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [None req-fbe3d5d6-785f-47de-98ab-b4e9aae11143 tempest-InstanceActionsTestJSON-34793226 tempest-InstanceActionsTestJSON-34793226-project-member] Creating folder: Instances. Parent ref: group-v566903. 
{{(pid=67169) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1021.417645] env[67169]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-5853c2fd-d571-42c3-8d62-3253c0b8ab40 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1021.426456] env[67169]: INFO nova.virt.vmwareapi.vm_util [None req-fbe3d5d6-785f-47de-98ab-b4e9aae11143 tempest-InstanceActionsTestJSON-34793226 tempest-InstanceActionsTestJSON-34793226-project-member] Created folder: Instances in parent group-v566903. [ 1021.426715] env[67169]: DEBUG oslo.service.loopingcall [None req-fbe3d5d6-785f-47de-98ab-b4e9aae11143 tempest-InstanceActionsTestJSON-34793226 tempest-InstanceActionsTestJSON-34793226-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67169) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1021.426902] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 310ae1ce-4717-4807-901c-5674677682c3] Creating VM on the ESX host {{(pid=67169) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1021.427117] env[67169]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-922563f7-6511-4af6-bae3-a65e92da8cf1 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1021.449021] env[67169]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1021.449021] env[67169]: value = "task-2819143" [ 1021.449021] env[67169]: _type = "Task" [ 1021.449021] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1021.457535] env[67169]: DEBUG oslo_vmware.api [-] Task: {'id': task-2819143, 'name': CreateVM_Task} progress is 0%. 
{{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1021.957013] env[67169]: DEBUG oslo_vmware.api [-] Task: {'id': task-2819143, 'name': CreateVM_Task, 'duration_secs': 0.324138} completed successfully. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1021.957168] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 310ae1ce-4717-4807-901c-5674677682c3] Created VM on the ESX host {{(pid=67169) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1021.958280] env[67169]: DEBUG oslo_vmware.service [None req-fbe3d5d6-785f-47de-98ab-b4e9aae11143 tempest-InstanceActionsTestJSON-34793226 tempest-InstanceActionsTestJSON-34793226-project-member] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ccd096e4-5ef4-494f-98d2-3814b0c8d553 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1021.963903] env[67169]: DEBUG oslo_concurrency.lockutils [None req-fbe3d5d6-785f-47de-98ab-b4e9aae11143 tempest-InstanceActionsTestJSON-34793226 tempest-InstanceActionsTestJSON-34793226-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1021.964077] env[67169]: DEBUG oslo_concurrency.lockutils [None req-fbe3d5d6-785f-47de-98ab-b4e9aae11143 tempest-InstanceActionsTestJSON-34793226 tempest-InstanceActionsTestJSON-34793226-project-member] Acquired lock "[datastore1] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1021.964455] env[67169]: DEBUG oslo_concurrency.lockutils [None req-fbe3d5d6-785f-47de-98ab-b4e9aae11143 tempest-InstanceActionsTestJSON-34793226 tempest-InstanceActionsTestJSON-34793226-project-member] 
Acquired external semaphore "[datastore1] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1021.965016] env[67169]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-4fee860f-72ff-41c8-b968-69a8d75d6b07 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1021.969223] env[67169]: DEBUG oslo_vmware.api [None req-fbe3d5d6-785f-47de-98ab-b4e9aae11143 tempest-InstanceActionsTestJSON-34793226 tempest-InstanceActionsTestJSON-34793226-project-member] Waiting for the task: (returnval){ [ 1021.969223] env[67169]: value = "session[52518c09-e5f2-035d-4dc1-3e29ddf94015]5214f987-33c1-31c5-cb77-5dcc2cd20280" [ 1021.969223] env[67169]: _type = "Task" [ 1021.969223] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1021.980047] env[67169]: DEBUG oslo_vmware.api [None req-fbe3d5d6-785f-47de-98ab-b4e9aae11143 tempest-InstanceActionsTestJSON-34793226 tempest-InstanceActionsTestJSON-34793226-project-member] Task: {'id': session[52518c09-e5f2-035d-4dc1-3e29ddf94015]5214f987-33c1-31c5-cb77-5dcc2cd20280, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1022.480672] env[67169]: DEBUG oslo_concurrency.lockutils [None req-fbe3d5d6-785f-47de-98ab-b4e9aae11143 tempest-InstanceActionsTestJSON-34793226 tempest-InstanceActionsTestJSON-34793226-project-member] Releasing lock "[datastore1] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1022.480958] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-fbe3d5d6-785f-47de-98ab-b4e9aae11143 tempest-InstanceActionsTestJSON-34793226 tempest-InstanceActionsTestJSON-34793226-project-member] [instance: 310ae1ce-4717-4807-901c-5674677682c3] Processing image 285931c9-8b83-4997-8c4d-6a79005e36ba {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1022.481263] env[67169]: DEBUG oslo_concurrency.lockutils [None req-fbe3d5d6-785f-47de-98ab-b4e9aae11143 tempest-InstanceActionsTestJSON-34793226 tempest-InstanceActionsTestJSON-34793226-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1022.481396] env[67169]: DEBUG oslo_concurrency.lockutils [None req-fbe3d5d6-785f-47de-98ab-b4e9aae11143 tempest-InstanceActionsTestJSON-34793226 tempest-InstanceActionsTestJSON-34793226-project-member] Acquired lock "[datastore1] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1022.481568] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-fbe3d5d6-785f-47de-98ab-b4e9aae11143 tempest-InstanceActionsTestJSON-34793226 tempest-InstanceActionsTestJSON-34793226-project-member] 
Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1022.482846] env[67169]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-009b0961-c7bf-4bf8-abb0-00a25d636f96 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1022.489769] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-fbe3d5d6-785f-47de-98ab-b4e9aae11143 tempest-InstanceActionsTestJSON-34793226 tempest-InstanceActionsTestJSON-34793226-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1022.489949] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-fbe3d5d6-785f-47de-98ab-b4e9aae11143 tempest-InstanceActionsTestJSON-34793226 tempest-InstanceActionsTestJSON-34793226-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=67169) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1022.490726] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b91b4d0f-8d32-4f9b-a53f-769822ce1241 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1022.496845] env[67169]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-99e6e7e1-4bea-444d-91e8-fde529c3fd31 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1022.501978] env[67169]: DEBUG oslo_vmware.api [None req-fbe3d5d6-785f-47de-98ab-b4e9aae11143 tempest-InstanceActionsTestJSON-34793226 tempest-InstanceActionsTestJSON-34793226-project-member] Waiting for the task: (returnval){ [ 1022.501978] env[67169]: value = "session[52518c09-e5f2-035d-4dc1-3e29ddf94015]52a91da9-ab58-4bf2-7fd4-49eb565b8357" [ 1022.501978] env[67169]: _type = "Task" [ 1022.501978] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1022.509145] env[67169]: DEBUG oslo_vmware.api [None req-fbe3d5d6-785f-47de-98ab-b4e9aae11143 tempest-InstanceActionsTestJSON-34793226 tempest-InstanceActionsTestJSON-34793226-project-member] Task: {'id': session[52518c09-e5f2-035d-4dc1-3e29ddf94015]52a91da9-ab58-4bf2-7fd4-49eb565b8357, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1022.957589] env[67169]: DEBUG nova.compute.manager [req-f1755ed2-f11c-4af4-96d1-c7ffc1abe547 req-d90b4760-7b47-48ba-b3fb-b9dd1a73b84f service nova] [instance: 310ae1ce-4717-4807-901c-5674677682c3] Received event network-changed-132381eb-7e6b-4919-933c-38fe84614f37 {{(pid=67169) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1022.957800] env[67169]: DEBUG nova.compute.manager [req-f1755ed2-f11c-4af4-96d1-c7ffc1abe547 req-d90b4760-7b47-48ba-b3fb-b9dd1a73b84f service nova] [instance: 310ae1ce-4717-4807-901c-5674677682c3] Refreshing instance network info cache due to event network-changed-132381eb-7e6b-4919-933c-38fe84614f37. {{(pid=67169) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1022.958015] env[67169]: DEBUG oslo_concurrency.lockutils [req-f1755ed2-f11c-4af4-96d1-c7ffc1abe547 req-d90b4760-7b47-48ba-b3fb-b9dd1a73b84f service nova] Acquiring lock "refresh_cache-310ae1ce-4717-4807-901c-5674677682c3" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1022.958168] env[67169]: DEBUG oslo_concurrency.lockutils [req-f1755ed2-f11c-4af4-96d1-c7ffc1abe547 req-d90b4760-7b47-48ba-b3fb-b9dd1a73b84f service nova] Acquired lock "refresh_cache-310ae1ce-4717-4807-901c-5674677682c3" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1022.958330] env[67169]: DEBUG nova.network.neutron [req-f1755ed2-f11c-4af4-96d1-c7ffc1abe547 req-d90b4760-7b47-48ba-b3fb-b9dd1a73b84f service nova] [instance: 310ae1ce-4717-4807-901c-5674677682c3] Refreshing network info cache for port 132381eb-7e6b-4919-933c-38fe84614f37 {{(pid=67169) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1023.013324] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-fbe3d5d6-785f-47de-98ab-b4e9aae11143 
tempest-InstanceActionsTestJSON-34793226 tempest-InstanceActionsTestJSON-34793226-project-member] [instance: 310ae1ce-4717-4807-901c-5674677682c3] Preparing fetch location {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1023.013589] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-fbe3d5d6-785f-47de-98ab-b4e9aae11143 tempest-InstanceActionsTestJSON-34793226 tempest-InstanceActionsTestJSON-34793226-project-member] Creating directory with path [datastore1] vmware_temp/64e80115-39a8-4b58-8367-786c1673ba1a/285931c9-8b83-4997-8c4d-6a79005e36ba {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1023.013822] env[67169]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-8176c0ce-8424-43e4-bf38-8425503779c5 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1023.037139] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-fbe3d5d6-785f-47de-98ab-b4e9aae11143 tempest-InstanceActionsTestJSON-34793226 tempest-InstanceActionsTestJSON-34793226-project-member] Created directory with path [datastore1] vmware_temp/64e80115-39a8-4b58-8367-786c1673ba1a/285931c9-8b83-4997-8c4d-6a79005e36ba {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1023.037359] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-fbe3d5d6-785f-47de-98ab-b4e9aae11143 tempest-InstanceActionsTestJSON-34793226 tempest-InstanceActionsTestJSON-34793226-project-member] [instance: 310ae1ce-4717-4807-901c-5674677682c3] Fetch image to [datastore1] vmware_temp/64e80115-39a8-4b58-8367-786c1673ba1a/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1023.037532] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-fbe3d5d6-785f-47de-98ab-b4e9aae11143 tempest-InstanceActionsTestJSON-34793226 
tempest-InstanceActionsTestJSON-34793226-project-member] [instance: 310ae1ce-4717-4807-901c-5674677682c3] Downloading image file data 285931c9-8b83-4997-8c4d-6a79005e36ba to [datastore1] vmware_temp/64e80115-39a8-4b58-8367-786c1673ba1a/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk on the data store datastore1 {{(pid=67169) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1023.038307] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3de9aece-3f4f-47f9-acb4-5dcd74639c08 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1023.045174] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-33d1f85c-334e-46b6-9d88-81531e601e44 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1023.053984] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-667266c0-b500-42bc-864b-4e63de8b7ebd {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1023.084877] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cb0ec339-4b8d-4a94-8d07-ba7ab61decb3 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1023.090789] env[67169]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-866f0e9d-4162-45e2-9c54-d7727218e6ed {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1023.113255] env[67169]: DEBUG nova.virt.vmwareapi.images [None req-fbe3d5d6-785f-47de-98ab-b4e9aae11143 tempest-InstanceActionsTestJSON-34793226 tempest-InstanceActionsTestJSON-34793226-project-member] [instance: 
310ae1ce-4717-4807-901c-5674677682c3] Downloading image file data 285931c9-8b83-4997-8c4d-6a79005e36ba to the data store datastore1 {{(pid=67169) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1023.165043] env[67169]: DEBUG oslo_vmware.rw_handles [None req-fbe3d5d6-785f-47de-98ab-b4e9aae11143 tempest-InstanceActionsTestJSON-34793226 tempest-InstanceActionsTestJSON-34793226-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/64e80115-39a8-4b58-8367-786c1673ba1a/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67169) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1023.225582] env[67169]: DEBUG oslo_vmware.rw_handles [None req-fbe3d5d6-785f-47de-98ab-b4e9aae11143 tempest-InstanceActionsTestJSON-34793226 tempest-InstanceActionsTestJSON-34793226-project-member] Completed reading data from the image iterator. {{(pid=67169) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1023.226115] env[67169]: DEBUG oslo_vmware.rw_handles [None req-fbe3d5d6-785f-47de-98ab-b4e9aae11143 tempest-InstanceActionsTestJSON-34793226 tempest-InstanceActionsTestJSON-34793226-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/64e80115-39a8-4b58-8367-786c1673ba1a/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67169) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1023.552917] env[67169]: DEBUG nova.network.neutron [req-f1755ed2-f11c-4af4-96d1-c7ffc1abe547 req-d90b4760-7b47-48ba-b3fb-b9dd1a73b84f service nova] [instance: 310ae1ce-4717-4807-901c-5674677682c3] Updated VIF entry in instance network info cache for port 132381eb-7e6b-4919-933c-38fe84614f37. 
{{(pid=67169) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1023.553275] env[67169]: DEBUG nova.network.neutron [req-f1755ed2-f11c-4af4-96d1-c7ffc1abe547 req-d90b4760-7b47-48ba-b3fb-b9dd1a73b84f service nova] [instance: 310ae1ce-4717-4807-901c-5674677682c3] Updating instance_info_cache with network_info: [{"id": "132381eb-7e6b-4919-933c-38fe84614f37", "address": "fa:16:3e:79:0e:e4", "network": {"id": "103592f9-9e79-499a-b714-36cdf5639f5e", "bridge": "br-int", "label": "tempest-InstanceActionsTestJSON-2056876988-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "81f09255d4d24106b7989baf1ea104bb", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "20641d67-1612-4b9c-8924-7a77df9c8e6d", "external-id": "nsx-vlan-transportzone-884", "segmentation_id": 884, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap132381eb-7e", "ovs_interfaceid": "132381eb-7e6b-4919-933c-38fe84614f37", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1023.562906] env[67169]: DEBUG oslo_concurrency.lockutils [req-f1755ed2-f11c-4af4-96d1-c7ffc1abe547 req-d90b4760-7b47-48ba-b3fb-b9dd1a73b84f service nova] Releasing lock "refresh_cache-310ae1ce-4717-4807-901c-5674677682c3" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1024.692734] env[67169]: DEBUG 
oslo_concurrency.lockutils [None req-ed9e34ee-fd50-4e03-852b-e42a37baf99a tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Acquiring lock "37d7b647-f1ab-494a-8b5a-8e25eec0b9ec" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1024.693061] env[67169]: DEBUG oslo_concurrency.lockutils [None req-ed9e34ee-fd50-4e03-852b-e42a37baf99a tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Lock "37d7b647-f1ab-494a-8b5a-8e25eec0b9ec" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1026.810020] env[67169]: DEBUG oslo_concurrency.lockutils [None req-c59878d0-f22c-4587-867a-88ea565f4462 tempest-InstanceActionsTestJSON-34793226 tempest-InstanceActionsTestJSON-34793226-project-member] Acquiring lock "310ae1ce-4717-4807-901c-5674677682c3" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1032.947495] env[67169]: DEBUG oslo_concurrency.lockutils [None req-57d7723f-f308-4517-8973-d308d992f41e tempest-ServersTestBootFromVolume-711128650 tempest-ServersTestBootFromVolume-711128650-project-member] Acquiring lock "3966e1f7-2107-4ddb-8077-ab37ef1a9b92" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1032.947900] env[67169]: DEBUG oslo_concurrency.lockutils [None req-57d7723f-f308-4517-8973-d308d992f41e tempest-ServersTestBootFromVolume-711128650 
tempest-ServersTestBootFromVolume-711128650-project-member] Lock "3966e1f7-2107-4ddb-8077-ab37ef1a9b92" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1053.766304] env[67169]: DEBUG oslo_concurrency.lockutils [None req-f4774cbe-50db-4e49-ba06-358bb98216ff tempest-ServersTestMultiNic-730437419 tempest-ServersTestMultiNic-730437419-project-member] Acquiring lock "c57df23b-3348-41fa-a976-421f98cab569" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1053.766636] env[67169]: DEBUG oslo_concurrency.lockutils [None req-f4774cbe-50db-4e49-ba06-358bb98216ff tempest-ServersTestMultiNic-730437419 tempest-ServersTestMultiNic-730437419-project-member] Lock "c57df23b-3348-41fa-a976-421f98cab569" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1054.268519] env[67169]: DEBUG oslo_concurrency.lockutils [None req-ef0a7f94-c032-4a1f-af92-8f4abde4bd65 tempest-AttachVolumeNegativeTest-2045904794 tempest-AttachVolumeNegativeTest-2045904794-project-member] Acquiring lock "c769a8f3-6f9f-4e5b-bfec-345c97da5d83" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1054.268714] env[67169]: DEBUG oslo_concurrency.lockutils [None req-ef0a7f94-c032-4a1f-af92-8f4abde4bd65 tempest-AttachVolumeNegativeTest-2045904794 tempest-AttachVolumeNegativeTest-2045904794-project-member] Lock "c769a8f3-6f9f-4e5b-bfec-345c97da5d83" 
acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1064.660757] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1065.444453] env[67169]: WARNING oslo_vmware.rw_handles [None req-efa53bcc-a9ef-46a2-ad67-84fb4c5f88c0 tempest-ServerDiagnosticsV248Test-1196521581 tempest-ServerDiagnosticsV248Test-1196521581-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1065.444453] env[67169]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1065.444453] env[67169]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1065.444453] env[67169]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1065.444453] env[67169]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1065.444453] env[67169]: ERROR oslo_vmware.rw_handles response.begin() [ 1065.444453] env[67169]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1065.444453] env[67169]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1065.444453] env[67169]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1065.444453] env[67169]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1065.444453] env[67169]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 
1065.444453] env[67169]: ERROR oslo_vmware.rw_handles [ 1065.445066] env[67169]: DEBUG nova.virt.vmwareapi.images [None req-efa53bcc-a9ef-46a2-ad67-84fb4c5f88c0 tempest-ServerDiagnosticsV248Test-1196521581 tempest-ServerDiagnosticsV248Test-1196521581-project-member] [instance: e2e52693-153a-43dd-b786-dd0758caabe2] Downloaded image file data 285931c9-8b83-4997-8c4d-6a79005e36ba to vmware_temp/32dc1870-cde8-4dea-9b14-3470fd1e7df7/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk on the data store datastore2 {{(pid=67169) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1065.446735] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-efa53bcc-a9ef-46a2-ad67-84fb4c5f88c0 tempest-ServerDiagnosticsV248Test-1196521581 tempest-ServerDiagnosticsV248Test-1196521581-project-member] [instance: e2e52693-153a-43dd-b786-dd0758caabe2] Caching image {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1065.446985] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [None req-efa53bcc-a9ef-46a2-ad67-84fb4c5f88c0 tempest-ServerDiagnosticsV248Test-1196521581 tempest-ServerDiagnosticsV248Test-1196521581-project-member] Copying Virtual Disk [datastore2] vmware_temp/32dc1870-cde8-4dea-9b14-3470fd1e7df7/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk to [datastore2] vmware_temp/32dc1870-cde8-4dea-9b14-3470fd1e7df7/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk {{(pid=67169) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1065.447351] env[67169]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-33d36297-8a21-4824-b879-fa8b630e7855 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1065.456279] env[67169]: DEBUG oslo_vmware.api [None req-efa53bcc-a9ef-46a2-ad67-84fb4c5f88c0 tempest-ServerDiagnosticsV248Test-1196521581 
tempest-ServerDiagnosticsV248Test-1196521581-project-member] Waiting for the task: (returnval){ [ 1065.456279] env[67169]: value = "task-2819149" [ 1065.456279] env[67169]: _type = "Task" [ 1065.456279] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1065.464453] env[67169]: DEBUG oslo_vmware.api [None req-efa53bcc-a9ef-46a2-ad67-84fb4c5f88c0 tempest-ServerDiagnosticsV248Test-1196521581 tempest-ServerDiagnosticsV248Test-1196521581-project-member] Task: {'id': task-2819149, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1065.967493] env[67169]: DEBUG oslo_vmware.exceptions [None req-efa53bcc-a9ef-46a2-ad67-84fb4c5f88c0 tempest-ServerDiagnosticsV248Test-1196521581 tempest-ServerDiagnosticsV248Test-1196521581-project-member] Fault InvalidArgument not matched. {{(pid=67169) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1065.967878] env[67169]: DEBUG oslo_concurrency.lockutils [None req-efa53bcc-a9ef-46a2-ad67-84fb4c5f88c0 tempest-ServerDiagnosticsV248Test-1196521581 tempest-ServerDiagnosticsV248Test-1196521581-project-member] Releasing lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1065.968369] env[67169]: ERROR nova.compute.manager [None req-efa53bcc-a9ef-46a2-ad67-84fb4c5f88c0 tempest-ServerDiagnosticsV248Test-1196521581 tempest-ServerDiagnosticsV248Test-1196521581-project-member] [instance: e2e52693-153a-43dd-b786-dd0758caabe2] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1065.968369] env[67169]: Faults: ['InvalidArgument'] [ 1065.968369] env[67169]: ERROR nova.compute.manager 
[instance: e2e52693-153a-43dd-b786-dd0758caabe2] Traceback (most recent call last): [ 1065.968369] env[67169]: ERROR nova.compute.manager [instance: e2e52693-153a-43dd-b786-dd0758caabe2] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1065.968369] env[67169]: ERROR nova.compute.manager [instance: e2e52693-153a-43dd-b786-dd0758caabe2] yield resources [ 1065.968369] env[67169]: ERROR nova.compute.manager [instance: e2e52693-153a-43dd-b786-dd0758caabe2] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1065.968369] env[67169]: ERROR nova.compute.manager [instance: e2e52693-153a-43dd-b786-dd0758caabe2] self.driver.spawn(context, instance, image_meta, [ 1065.968369] env[67169]: ERROR nova.compute.manager [instance: e2e52693-153a-43dd-b786-dd0758caabe2] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1065.968369] env[67169]: ERROR nova.compute.manager [instance: e2e52693-153a-43dd-b786-dd0758caabe2] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1065.968369] env[67169]: ERROR nova.compute.manager [instance: e2e52693-153a-43dd-b786-dd0758caabe2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1065.968369] env[67169]: ERROR nova.compute.manager [instance: e2e52693-153a-43dd-b786-dd0758caabe2] self._fetch_image_if_missing(context, vi) [ 1065.968369] env[67169]: ERROR nova.compute.manager [instance: e2e52693-153a-43dd-b786-dd0758caabe2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1065.968369] env[67169]: ERROR nova.compute.manager [instance: e2e52693-153a-43dd-b786-dd0758caabe2] image_cache(vi, tmp_image_ds_loc) [ 1065.968369] env[67169]: ERROR nova.compute.manager [instance: e2e52693-153a-43dd-b786-dd0758caabe2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1065.968369] env[67169]: ERROR nova.compute.manager [instance: 
e2e52693-153a-43dd-b786-dd0758caabe2] vm_util.copy_virtual_disk( [ 1065.968369] env[67169]: ERROR nova.compute.manager [instance: e2e52693-153a-43dd-b786-dd0758caabe2] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1065.968369] env[67169]: ERROR nova.compute.manager [instance: e2e52693-153a-43dd-b786-dd0758caabe2] session._wait_for_task(vmdk_copy_task) [ 1065.968369] env[67169]: ERROR nova.compute.manager [instance: e2e52693-153a-43dd-b786-dd0758caabe2] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1065.968369] env[67169]: ERROR nova.compute.manager [instance: e2e52693-153a-43dd-b786-dd0758caabe2] return self.wait_for_task(task_ref) [ 1065.968369] env[67169]: ERROR nova.compute.manager [instance: e2e52693-153a-43dd-b786-dd0758caabe2] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1065.968369] env[67169]: ERROR nova.compute.manager [instance: e2e52693-153a-43dd-b786-dd0758caabe2] return evt.wait() [ 1065.968369] env[67169]: ERROR nova.compute.manager [instance: e2e52693-153a-43dd-b786-dd0758caabe2] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1065.968369] env[67169]: ERROR nova.compute.manager [instance: e2e52693-153a-43dd-b786-dd0758caabe2] result = hub.switch() [ 1065.968369] env[67169]: ERROR nova.compute.manager [instance: e2e52693-153a-43dd-b786-dd0758caabe2] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1065.968369] env[67169]: ERROR nova.compute.manager [instance: e2e52693-153a-43dd-b786-dd0758caabe2] return self.greenlet.switch() [ 1065.968369] env[67169]: ERROR nova.compute.manager [instance: e2e52693-153a-43dd-b786-dd0758caabe2] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1065.968369] env[67169]: ERROR nova.compute.manager [instance: 
e2e52693-153a-43dd-b786-dd0758caabe2] self.f(*self.args, **self.kw) [ 1065.968369] env[67169]: ERROR nova.compute.manager [instance: e2e52693-153a-43dd-b786-dd0758caabe2] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1065.968369] env[67169]: ERROR nova.compute.manager [instance: e2e52693-153a-43dd-b786-dd0758caabe2] raise exceptions.translate_fault(task_info.error) [ 1065.968369] env[67169]: ERROR nova.compute.manager [instance: e2e52693-153a-43dd-b786-dd0758caabe2] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1065.968369] env[67169]: ERROR nova.compute.manager [instance: e2e52693-153a-43dd-b786-dd0758caabe2] Faults: ['InvalidArgument'] [ 1065.968369] env[67169]: ERROR nova.compute.manager [instance: e2e52693-153a-43dd-b786-dd0758caabe2] [ 1065.969648] env[67169]: INFO nova.compute.manager [None req-efa53bcc-a9ef-46a2-ad67-84fb4c5f88c0 tempest-ServerDiagnosticsV248Test-1196521581 tempest-ServerDiagnosticsV248Test-1196521581-project-member] [instance: e2e52693-153a-43dd-b786-dd0758caabe2] Terminating instance [ 1065.970253] env[67169]: DEBUG oslo_concurrency.lockutils [None req-1d27fda4-25e5-4bf7-a655-5b02544fb343 tempest-ServersAdminTestJSON-608063042 tempest-ServersAdminTestJSON-608063042-project-member] Acquired lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1065.970463] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-1d27fda4-25e5-4bf7-a655-5b02544fb343 tempest-ServersAdminTestJSON-608063042 tempest-ServersAdminTestJSON-608063042-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1065.970949] env[67169]: DEBUG oslo_concurrency.lockutils [None 
req-efa53bcc-a9ef-46a2-ad67-84fb4c5f88c0 tempest-ServerDiagnosticsV248Test-1196521581 tempest-ServerDiagnosticsV248Test-1196521581-project-member] Acquiring lock "refresh_cache-e2e52693-153a-43dd-b786-dd0758caabe2" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1065.971160] env[67169]: DEBUG oslo_concurrency.lockutils [None req-efa53bcc-a9ef-46a2-ad67-84fb4c5f88c0 tempest-ServerDiagnosticsV248Test-1196521581 tempest-ServerDiagnosticsV248Test-1196521581-project-member] Acquired lock "refresh_cache-e2e52693-153a-43dd-b786-dd0758caabe2" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1065.971305] env[67169]: DEBUG nova.network.neutron [None req-efa53bcc-a9ef-46a2-ad67-84fb4c5f88c0 tempest-ServerDiagnosticsV248Test-1196521581 tempest-ServerDiagnosticsV248Test-1196521581-project-member] [instance: e2e52693-153a-43dd-b786-dd0758caabe2] Building network info cache for instance {{(pid=67169) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1065.972231] env[67169]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-2e353429-1163-43fd-9e57-ed35f6d030e4 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1065.982042] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-1d27fda4-25e5-4bf7-a655-5b02544fb343 tempest-ServersAdminTestJSON-608063042 tempest-ServersAdminTestJSON-608063042-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1065.982260] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-1d27fda4-25e5-4bf7-a655-5b02544fb343 tempest-ServersAdminTestJSON-608063042 tempest-ServersAdminTestJSON-608063042-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=67169) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1065.983270] env[67169]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-70598b95-db28-4249-bc66-8e968a23e048 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1065.988648] env[67169]: DEBUG oslo_vmware.api [None req-1d27fda4-25e5-4bf7-a655-5b02544fb343 tempest-ServersAdminTestJSON-608063042 tempest-ServersAdminTestJSON-608063042-project-member] Waiting for the task: (returnval){ [ 1065.988648] env[67169]: value = "session[52518c09-e5f2-035d-4dc1-3e29ddf94015]5282727e-bc71-c891-a154-ee580e873e5c" [ 1065.988648] env[67169]: _type = "Task" [ 1065.988648] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1065.996801] env[67169]: DEBUG oslo_vmware.api [None req-1d27fda4-25e5-4bf7-a655-5b02544fb343 tempest-ServersAdminTestJSON-608063042 tempest-ServersAdminTestJSON-608063042-project-member] Task: {'id': session[52518c09-e5f2-035d-4dc1-3e29ddf94015]5282727e-bc71-c891-a154-ee580e873e5c, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1066.010335] env[67169]: DEBUG nova.network.neutron [None req-efa53bcc-a9ef-46a2-ad67-84fb4c5f88c0 tempest-ServerDiagnosticsV248Test-1196521581 tempest-ServerDiagnosticsV248Test-1196521581-project-member] [instance: e2e52693-153a-43dd-b786-dd0758caabe2] Instance cache missing network info. 
{{(pid=67169) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1066.085958] env[67169]: DEBUG nova.network.neutron [None req-efa53bcc-a9ef-46a2-ad67-84fb4c5f88c0 tempest-ServerDiagnosticsV248Test-1196521581 tempest-ServerDiagnosticsV248Test-1196521581-project-member] [instance: e2e52693-153a-43dd-b786-dd0758caabe2] Updating instance_info_cache with network_info: [] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1066.095053] env[67169]: DEBUG oslo_concurrency.lockutils [None req-efa53bcc-a9ef-46a2-ad67-84fb4c5f88c0 tempest-ServerDiagnosticsV248Test-1196521581 tempest-ServerDiagnosticsV248Test-1196521581-project-member] Releasing lock "refresh_cache-e2e52693-153a-43dd-b786-dd0758caabe2" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1066.095464] env[67169]: DEBUG nova.compute.manager [None req-efa53bcc-a9ef-46a2-ad67-84fb4c5f88c0 tempest-ServerDiagnosticsV248Test-1196521581 tempest-ServerDiagnosticsV248Test-1196521581-project-member] [instance: e2e52693-153a-43dd-b786-dd0758caabe2] Start destroying the instance on the hypervisor. 
{{(pid=67169) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1066.095659] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-efa53bcc-a9ef-46a2-ad67-84fb4c5f88c0 tempest-ServerDiagnosticsV248Test-1196521581 tempest-ServerDiagnosticsV248Test-1196521581-project-member] [instance: e2e52693-153a-43dd-b786-dd0758caabe2] Destroying instance {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1066.096720] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6cfeaca4-f55b-4e34-b9ab-99a3def06b79 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1066.104864] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-efa53bcc-a9ef-46a2-ad67-84fb4c5f88c0 tempest-ServerDiagnosticsV248Test-1196521581 tempest-ServerDiagnosticsV248Test-1196521581-project-member] [instance: e2e52693-153a-43dd-b786-dd0758caabe2] Unregistering the VM {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1066.104864] env[67169]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-915e1c64-1e34-4178-916f-b4cb21372505 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1066.130030] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-efa53bcc-a9ef-46a2-ad67-84fb4c5f88c0 tempest-ServerDiagnosticsV248Test-1196521581 tempest-ServerDiagnosticsV248Test-1196521581-project-member] [instance: e2e52693-153a-43dd-b786-dd0758caabe2] Unregistered the VM {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1066.130245] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-efa53bcc-a9ef-46a2-ad67-84fb4c5f88c0 tempest-ServerDiagnosticsV248Test-1196521581 tempest-ServerDiagnosticsV248Test-1196521581-project-member] [instance: e2e52693-153a-43dd-b786-dd0758caabe2] Deleting contents of the VM from 
datastore datastore2 {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1066.130423] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-efa53bcc-a9ef-46a2-ad67-84fb4c5f88c0 tempest-ServerDiagnosticsV248Test-1196521581 tempest-ServerDiagnosticsV248Test-1196521581-project-member] Deleting the datastore file [datastore2] e2e52693-153a-43dd-b786-dd0758caabe2 {{(pid=67169) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1066.130653] env[67169]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-62bafe59-222f-4060-8d52-d18c809bfa60 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1066.136532] env[67169]: DEBUG oslo_vmware.api [None req-efa53bcc-a9ef-46a2-ad67-84fb4c5f88c0 tempest-ServerDiagnosticsV248Test-1196521581 tempest-ServerDiagnosticsV248Test-1196521581-project-member] Waiting for the task: (returnval){ [ 1066.136532] env[67169]: value = "task-2819151" [ 1066.136532] env[67169]: _type = "Task" [ 1066.136532] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1066.143254] env[67169]: DEBUG oslo_vmware.api [None req-efa53bcc-a9ef-46a2-ad67-84fb4c5f88c0 tempest-ServerDiagnosticsV248Test-1196521581 tempest-ServerDiagnosticsV248Test-1196521581-project-member] Task: {'id': task-2819151, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1066.498271] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-1d27fda4-25e5-4bf7-a655-5b02544fb343 tempest-ServersAdminTestJSON-608063042 tempest-ServersAdminTestJSON-608063042-project-member] [instance: 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4] Preparing fetch location {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1066.498532] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-1d27fda4-25e5-4bf7-a655-5b02544fb343 tempest-ServersAdminTestJSON-608063042 tempest-ServersAdminTestJSON-608063042-project-member] Creating directory with path [datastore2] vmware_temp/028de748-8cfe-485d-ae8e-8db51841361b/285931c9-8b83-4997-8c4d-6a79005e36ba {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1066.498768] env[67169]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-1f172173-cfc2-4ebb-9a34-afb2e4ab0351 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1066.509542] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-1d27fda4-25e5-4bf7-a655-5b02544fb343 tempest-ServersAdminTestJSON-608063042 tempest-ServersAdminTestJSON-608063042-project-member] Created directory with path [datastore2] vmware_temp/028de748-8cfe-485d-ae8e-8db51841361b/285931c9-8b83-4997-8c4d-6a79005e36ba {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1066.509729] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-1d27fda4-25e5-4bf7-a655-5b02544fb343 tempest-ServersAdminTestJSON-608063042 tempest-ServersAdminTestJSON-608063042-project-member] [instance: 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4] Fetch image to [datastore2] vmware_temp/028de748-8cfe-485d-ae8e-8db51841361b/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 
1066.509896] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-1d27fda4-25e5-4bf7-a655-5b02544fb343 tempest-ServersAdminTestJSON-608063042 tempest-ServersAdminTestJSON-608063042-project-member] [instance: 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4] Downloading image file data 285931c9-8b83-4997-8c4d-6a79005e36ba to [datastore2] vmware_temp/028de748-8cfe-485d-ae8e-8db51841361b/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk on the data store datastore2 {{(pid=67169) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1066.510920] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-919c66e7-c00d-411d-8631-fe3351da982a {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1066.517209] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5b5f915a-bc37-4acc-8907-e05373bafbab {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1066.527368] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-63142a14-79b1-4643-82c1-0e6b702a184e {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1066.557745] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7c0ef387-68c0-4408-ac90-b556ba045ee7 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1066.563996] env[67169]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-529434e7-5771-4901-96ec-f32a6ce46876 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1066.585554] env[67169]: DEBUG nova.virt.vmwareapi.images [None 
req-1d27fda4-25e5-4bf7-a655-5b02544fb343 tempest-ServersAdminTestJSON-608063042 tempest-ServersAdminTestJSON-608063042-project-member] [instance: 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4] Downloading image file data 285931c9-8b83-4997-8c4d-6a79005e36ba to the data store datastore2 {{(pid=67169) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1066.638735] env[67169]: DEBUG oslo_vmware.rw_handles [None req-1d27fda4-25e5-4bf7-a655-5b02544fb343 tempest-ServersAdminTestJSON-608063042 tempest-ServersAdminTestJSON-608063042-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/028de748-8cfe-485d-ae8e-8db51841361b/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=67169) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1066.716053] env[67169]: DEBUG oslo_vmware.api [None req-efa53bcc-a9ef-46a2-ad67-84fb4c5f88c0 tempest-ServerDiagnosticsV248Test-1196521581 tempest-ServerDiagnosticsV248Test-1196521581-project-member] Task: {'id': task-2819151, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.042477} completed successfully. 
{{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1066.717399] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-efa53bcc-a9ef-46a2-ad67-84fb4c5f88c0 tempest-ServerDiagnosticsV248Test-1196521581 tempest-ServerDiagnosticsV248Test-1196521581-project-member] Deleted the datastore file {{(pid=67169) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1066.717595] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-efa53bcc-a9ef-46a2-ad67-84fb4c5f88c0 tempest-ServerDiagnosticsV248Test-1196521581 tempest-ServerDiagnosticsV248Test-1196521581-project-member] [instance: e2e52693-153a-43dd-b786-dd0758caabe2] Deleted contents of the VM from datastore datastore2 {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1066.717773] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-efa53bcc-a9ef-46a2-ad67-84fb4c5f88c0 tempest-ServerDiagnosticsV248Test-1196521581 tempest-ServerDiagnosticsV248Test-1196521581-project-member] [instance: e2e52693-153a-43dd-b786-dd0758caabe2] Instance destroyed {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1066.717982] env[67169]: INFO nova.compute.manager [None req-efa53bcc-a9ef-46a2-ad67-84fb4c5f88c0 tempest-ServerDiagnosticsV248Test-1196521581 tempest-ServerDiagnosticsV248Test-1196521581-project-member] [instance: e2e52693-153a-43dd-b786-dd0758caabe2] Took 0.62 seconds to destroy the instance on the hypervisor. [ 1066.718251] env[67169]: DEBUG oslo.service.loopingcall [None req-efa53bcc-a9ef-46a2-ad67-84fb4c5f88c0 tempest-ServerDiagnosticsV248Test-1196521581 tempest-ServerDiagnosticsV248Test-1196521581-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=67169) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1066.718687] env[67169]: DEBUG oslo_vmware.rw_handles [None req-1d27fda4-25e5-4bf7-a655-5b02544fb343 tempest-ServersAdminTestJSON-608063042 tempest-ServersAdminTestJSON-608063042-project-member] Completed reading data from the image iterator. {{(pid=67169) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1066.718846] env[67169]: DEBUG oslo_vmware.rw_handles [None req-1d27fda4-25e5-4bf7-a655-5b02544fb343 tempest-ServersAdminTestJSON-608063042 tempest-ServersAdminTestJSON-608063042-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/028de748-8cfe-485d-ae8e-8db51841361b/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=67169) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1066.719142] env[67169]: DEBUG nova.compute.manager [-] [instance: e2e52693-153a-43dd-b786-dd0758caabe2] Skipping network deallocation for instance since networking was not requested. 
{{(pid=67169) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1066.721371] env[67169]: DEBUG nova.compute.claims [None req-efa53bcc-a9ef-46a2-ad67-84fb4c5f88c0 tempest-ServerDiagnosticsV248Test-1196521581 tempest-ServerDiagnosticsV248Test-1196521581-project-member] [instance: e2e52693-153a-43dd-b786-dd0758caabe2] Aborting claim: {{(pid=67169) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1066.721587] env[67169]: DEBUG oslo_concurrency.lockutils [None req-efa53bcc-a9ef-46a2-ad67-84fb4c5f88c0 tempest-ServerDiagnosticsV248Test-1196521581 tempest-ServerDiagnosticsV248Test-1196521581-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1066.721690] env[67169]: DEBUG oslo_concurrency.lockutils [None req-efa53bcc-a9ef-46a2-ad67-84fb4c5f88c0 tempest-ServerDiagnosticsV248Test-1196521581 tempest-ServerDiagnosticsV248Test-1196521581-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1067.049508] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-42d73b9d-f382-4767-88c3-8761794b05a7 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1067.057400] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6e5b830c-7cc9-4fbb-a042-8f8541295d24 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1067.087307] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aefa4e50-ac22-4fc3-b447-be76a3c0ec8a 
{{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1067.094203] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-024ea994-199c-4c3c-84cc-7a112ea88c0d {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1067.106830] env[67169]: DEBUG nova.compute.provider_tree [None req-efa53bcc-a9ef-46a2-ad67-84fb4c5f88c0 tempest-ServerDiagnosticsV248Test-1196521581 tempest-ServerDiagnosticsV248Test-1196521581-project-member] Inventory has not changed in ProviderTree for provider: 6570906a-ac37-4859-b1f2-4bbacc48d3f3 {{(pid=67169) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1067.114753] env[67169]: DEBUG nova.scheduler.client.report [None req-efa53bcc-a9ef-46a2-ad67-84fb4c5f88c0 tempest-ServerDiagnosticsV248Test-1196521581 tempest-ServerDiagnosticsV248Test-1196521581-project-member] Inventory has not changed for provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 171, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67169) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1067.129716] env[67169]: DEBUG oslo_concurrency.lockutils [None req-efa53bcc-a9ef-46a2-ad67-84fb4c5f88c0 tempest-ServerDiagnosticsV248Test-1196521581 tempest-ServerDiagnosticsV248Test-1196521581-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.408s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1067.130240] 
env[67169]: ERROR nova.compute.manager [None req-efa53bcc-a9ef-46a2-ad67-84fb4c5f88c0 tempest-ServerDiagnosticsV248Test-1196521581 tempest-ServerDiagnosticsV248Test-1196521581-project-member] [instance: e2e52693-153a-43dd-b786-dd0758caabe2] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1067.130240] env[67169]: Faults: ['InvalidArgument'] [ 1067.130240] env[67169]: ERROR nova.compute.manager [instance: e2e52693-153a-43dd-b786-dd0758caabe2] Traceback (most recent call last): [ 1067.130240] env[67169]: ERROR nova.compute.manager [instance: e2e52693-153a-43dd-b786-dd0758caabe2] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1067.130240] env[67169]: ERROR nova.compute.manager [instance: e2e52693-153a-43dd-b786-dd0758caabe2] self.driver.spawn(context, instance, image_meta, [ 1067.130240] env[67169]: ERROR nova.compute.manager [instance: e2e52693-153a-43dd-b786-dd0758caabe2] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1067.130240] env[67169]: ERROR nova.compute.manager [instance: e2e52693-153a-43dd-b786-dd0758caabe2] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1067.130240] env[67169]: ERROR nova.compute.manager [instance: e2e52693-153a-43dd-b786-dd0758caabe2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1067.130240] env[67169]: ERROR nova.compute.manager [instance: e2e52693-153a-43dd-b786-dd0758caabe2] self._fetch_image_if_missing(context, vi) [ 1067.130240] env[67169]: ERROR nova.compute.manager [instance: e2e52693-153a-43dd-b786-dd0758caabe2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1067.130240] env[67169]: ERROR nova.compute.manager [instance: e2e52693-153a-43dd-b786-dd0758caabe2] image_cache(vi, tmp_image_ds_loc) [ 1067.130240] env[67169]: ERROR nova.compute.manager [instance: e2e52693-153a-43dd-b786-dd0758caabe2] 
File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1067.130240] env[67169]: ERROR nova.compute.manager [instance: e2e52693-153a-43dd-b786-dd0758caabe2] vm_util.copy_virtual_disk( [ 1067.130240] env[67169]: ERROR nova.compute.manager [instance: e2e52693-153a-43dd-b786-dd0758caabe2] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1067.130240] env[67169]: ERROR nova.compute.manager [instance: e2e52693-153a-43dd-b786-dd0758caabe2] session._wait_for_task(vmdk_copy_task) [ 1067.130240] env[67169]: ERROR nova.compute.manager [instance: e2e52693-153a-43dd-b786-dd0758caabe2] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1067.130240] env[67169]: ERROR nova.compute.manager [instance: e2e52693-153a-43dd-b786-dd0758caabe2] return self.wait_for_task(task_ref) [ 1067.130240] env[67169]: ERROR nova.compute.manager [instance: e2e52693-153a-43dd-b786-dd0758caabe2] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1067.130240] env[67169]: ERROR nova.compute.manager [instance: e2e52693-153a-43dd-b786-dd0758caabe2] return evt.wait() [ 1067.130240] env[67169]: ERROR nova.compute.manager [instance: e2e52693-153a-43dd-b786-dd0758caabe2] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1067.130240] env[67169]: ERROR nova.compute.manager [instance: e2e52693-153a-43dd-b786-dd0758caabe2] result = hub.switch() [ 1067.130240] env[67169]: ERROR nova.compute.manager [instance: e2e52693-153a-43dd-b786-dd0758caabe2] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1067.130240] env[67169]: ERROR nova.compute.manager [instance: e2e52693-153a-43dd-b786-dd0758caabe2] return self.greenlet.switch() [ 1067.130240] env[67169]: ERROR nova.compute.manager [instance: e2e52693-153a-43dd-b786-dd0758caabe2] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1067.130240] env[67169]: ERROR nova.compute.manager [instance: e2e52693-153a-43dd-b786-dd0758caabe2] self.f(*self.args, **self.kw) [ 1067.130240] env[67169]: ERROR nova.compute.manager [instance: e2e52693-153a-43dd-b786-dd0758caabe2] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1067.130240] env[67169]: ERROR nova.compute.manager [instance: e2e52693-153a-43dd-b786-dd0758caabe2] raise exceptions.translate_fault(task_info.error) [ 1067.130240] env[67169]: ERROR nova.compute.manager [instance: e2e52693-153a-43dd-b786-dd0758caabe2] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1067.130240] env[67169]: ERROR nova.compute.manager [instance: e2e52693-153a-43dd-b786-dd0758caabe2] Faults: ['InvalidArgument'] [ 1067.130240] env[67169]: ERROR nova.compute.manager [instance: e2e52693-153a-43dd-b786-dd0758caabe2] [ 1067.131194] env[67169]: DEBUG nova.compute.utils [None req-efa53bcc-a9ef-46a2-ad67-84fb4c5f88c0 tempest-ServerDiagnosticsV248Test-1196521581 tempest-ServerDiagnosticsV248Test-1196521581-project-member] [instance: e2e52693-153a-43dd-b786-dd0758caabe2] VimFaultException {{(pid=67169) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1067.132357] env[67169]: DEBUG nova.compute.manager [None req-efa53bcc-a9ef-46a2-ad67-84fb4c5f88c0 tempest-ServerDiagnosticsV248Test-1196521581 tempest-ServerDiagnosticsV248Test-1196521581-project-member] [instance: e2e52693-153a-43dd-b786-dd0758caabe2] Build of instance e2e52693-153a-43dd-b786-dd0758caabe2 was re-scheduled: A specified parameter was not correct: fileType [ 1067.132357] env[67169]: Faults: ['InvalidArgument'] {{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1067.132732] env[67169]: DEBUG nova.compute.manager [None 
req-efa53bcc-a9ef-46a2-ad67-84fb4c5f88c0 tempest-ServerDiagnosticsV248Test-1196521581 tempest-ServerDiagnosticsV248Test-1196521581-project-member] [instance: e2e52693-153a-43dd-b786-dd0758caabe2] Unplugging VIFs for instance {{(pid=67169) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1067.132954] env[67169]: DEBUG oslo_concurrency.lockutils [None req-efa53bcc-a9ef-46a2-ad67-84fb4c5f88c0 tempest-ServerDiagnosticsV248Test-1196521581 tempest-ServerDiagnosticsV248Test-1196521581-project-member] Acquiring lock "refresh_cache-e2e52693-153a-43dd-b786-dd0758caabe2" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1067.133115] env[67169]: DEBUG oslo_concurrency.lockutils [None req-efa53bcc-a9ef-46a2-ad67-84fb4c5f88c0 tempest-ServerDiagnosticsV248Test-1196521581 tempest-ServerDiagnosticsV248Test-1196521581-project-member] Acquired lock "refresh_cache-e2e52693-153a-43dd-b786-dd0758caabe2" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1067.133273] env[67169]: DEBUG nova.network.neutron [None req-efa53bcc-a9ef-46a2-ad67-84fb4c5f88c0 tempest-ServerDiagnosticsV248Test-1196521581 tempest-ServerDiagnosticsV248Test-1196521581-project-member] [instance: e2e52693-153a-43dd-b786-dd0758caabe2] Building network info cache for instance {{(pid=67169) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1067.157557] env[67169]: DEBUG nova.network.neutron [None req-efa53bcc-a9ef-46a2-ad67-84fb4c5f88c0 tempest-ServerDiagnosticsV248Test-1196521581 tempest-ServerDiagnosticsV248Test-1196521581-project-member] [instance: e2e52693-153a-43dd-b786-dd0758caabe2] Instance cache missing network info. 
{{(pid=67169) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1067.421498] env[67169]: DEBUG nova.network.neutron [None req-efa53bcc-a9ef-46a2-ad67-84fb4c5f88c0 tempest-ServerDiagnosticsV248Test-1196521581 tempest-ServerDiagnosticsV248Test-1196521581-project-member] [instance: e2e52693-153a-43dd-b786-dd0758caabe2] Updating instance_info_cache with network_info: [] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1067.431302] env[67169]: DEBUG oslo_concurrency.lockutils [None req-efa53bcc-a9ef-46a2-ad67-84fb4c5f88c0 tempest-ServerDiagnosticsV248Test-1196521581 tempest-ServerDiagnosticsV248Test-1196521581-project-member] Releasing lock "refresh_cache-e2e52693-153a-43dd-b786-dd0758caabe2" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1067.431627] env[67169]: DEBUG nova.compute.manager [None req-efa53bcc-a9ef-46a2-ad67-84fb4c5f88c0 tempest-ServerDiagnosticsV248Test-1196521581 tempest-ServerDiagnosticsV248Test-1196521581-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=67169) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1067.431845] env[67169]: DEBUG nova.compute.manager [None req-efa53bcc-a9ef-46a2-ad67-84fb4c5f88c0 tempest-ServerDiagnosticsV248Test-1196521581 tempest-ServerDiagnosticsV248Test-1196521581-project-member] [instance: e2e52693-153a-43dd-b786-dd0758caabe2] Skipping network deallocation for instance since networking was not requested. 
{{(pid=67169) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1067.533202] env[67169]: INFO nova.scheduler.client.report [None req-efa53bcc-a9ef-46a2-ad67-84fb4c5f88c0 tempest-ServerDiagnosticsV248Test-1196521581 tempest-ServerDiagnosticsV248Test-1196521581-project-member] Deleted allocations for instance e2e52693-153a-43dd-b786-dd0758caabe2 [ 1067.554429] env[67169]: DEBUG oslo_concurrency.lockutils [None req-efa53bcc-a9ef-46a2-ad67-84fb4c5f88c0 tempest-ServerDiagnosticsV248Test-1196521581 tempest-ServerDiagnosticsV248Test-1196521581-project-member] Lock "e2e52693-153a-43dd-b786-dd0758caabe2" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 473.387s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1067.555545] env[67169]: DEBUG oslo_concurrency.lockutils [None req-969df851-d50e-45d8-a0e2-4316328b6a67 tempest-ServerDiagnosticsV248Test-1196521581 tempest-ServerDiagnosticsV248Test-1196521581-project-member] Lock "e2e52693-153a-43dd-b786-dd0758caabe2" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 274.194s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1067.555766] env[67169]: DEBUG oslo_concurrency.lockutils [None req-969df851-d50e-45d8-a0e2-4316328b6a67 tempest-ServerDiagnosticsV248Test-1196521581 tempest-ServerDiagnosticsV248Test-1196521581-project-member] Acquiring lock "e2e52693-153a-43dd-b786-dd0758caabe2-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1067.555966] env[67169]: DEBUG oslo_concurrency.lockutils [None req-969df851-d50e-45d8-a0e2-4316328b6a67 tempest-ServerDiagnosticsV248Test-1196521581 
tempest-ServerDiagnosticsV248Test-1196521581-project-member] Lock "e2e52693-153a-43dd-b786-dd0758caabe2-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1067.556150] env[67169]: DEBUG oslo_concurrency.lockutils [None req-969df851-d50e-45d8-a0e2-4316328b6a67 tempest-ServerDiagnosticsV248Test-1196521581 tempest-ServerDiagnosticsV248Test-1196521581-project-member] Lock "e2e52693-153a-43dd-b786-dd0758caabe2-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1067.558071] env[67169]: INFO nova.compute.manager [None req-969df851-d50e-45d8-a0e2-4316328b6a67 tempest-ServerDiagnosticsV248Test-1196521581 tempest-ServerDiagnosticsV248Test-1196521581-project-member] [instance: e2e52693-153a-43dd-b786-dd0758caabe2] Terminating instance [ 1067.559611] env[67169]: DEBUG oslo_concurrency.lockutils [None req-969df851-d50e-45d8-a0e2-4316328b6a67 tempest-ServerDiagnosticsV248Test-1196521581 tempest-ServerDiagnosticsV248Test-1196521581-project-member] Acquiring lock "refresh_cache-e2e52693-153a-43dd-b786-dd0758caabe2" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1067.560066] env[67169]: DEBUG oslo_concurrency.lockutils [None req-969df851-d50e-45d8-a0e2-4316328b6a67 tempest-ServerDiagnosticsV248Test-1196521581 tempest-ServerDiagnosticsV248Test-1196521581-project-member] Acquired lock "refresh_cache-e2e52693-153a-43dd-b786-dd0758caabe2" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1067.560066] env[67169]: DEBUG nova.network.neutron [None req-969df851-d50e-45d8-a0e2-4316328b6a67 tempest-ServerDiagnosticsV248Test-1196521581 
tempest-ServerDiagnosticsV248Test-1196521581-project-member] [instance: e2e52693-153a-43dd-b786-dd0758caabe2] Building network info cache for instance {{(pid=67169) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1067.570210] env[67169]: DEBUG nova.compute.manager [None req-0dc9c73e-95b5-4ebc-a938-23f4adfd1752 tempest-SecurityGroupsTestJSON-292577190 tempest-SecurityGroupsTestJSON-292577190-project-member] [instance: 32412c58-a231-40f7-a248-3e46fad5f5b2] Starting instance... {{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1067.593416] env[67169]: DEBUG nova.network.neutron [None req-969df851-d50e-45d8-a0e2-4316328b6a67 tempest-ServerDiagnosticsV248Test-1196521581 tempest-ServerDiagnosticsV248Test-1196521581-project-member] [instance: e2e52693-153a-43dd-b786-dd0758caabe2] Instance cache missing network info. {{(pid=67169) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1067.624258] env[67169]: DEBUG nova.compute.manager [None req-0dc9c73e-95b5-4ebc-a938-23f4adfd1752 tempest-SecurityGroupsTestJSON-292577190 tempest-SecurityGroupsTestJSON-292577190-project-member] [instance: 32412c58-a231-40f7-a248-3e46fad5f5b2] Instance disappeared before build. 
{{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1067.646806] env[67169]: DEBUG oslo_concurrency.lockutils [None req-0dc9c73e-95b5-4ebc-a938-23f4adfd1752 tempest-SecurityGroupsTestJSON-292577190 tempest-SecurityGroupsTestJSON-292577190-project-member] Lock "32412c58-a231-40f7-a248-3e46fad5f5b2" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 230.156s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1067.657911] env[67169]: DEBUG nova.compute.manager [None req-9f97b7a6-ba6f-461c-81b3-2c7be85e3cf6 tempest-ServerAddressesTestJSON-1751036838 tempest-ServerAddressesTestJSON-1751036838-project-member] [instance: 86b2381b-676f-46fc-9317-81c0fd272069] Starting instance... {{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1067.682951] env[67169]: DEBUG nova.network.neutron [None req-969df851-d50e-45d8-a0e2-4316328b6a67 tempest-ServerDiagnosticsV248Test-1196521581 tempest-ServerDiagnosticsV248Test-1196521581-project-member] [instance: e2e52693-153a-43dd-b786-dd0758caabe2] Updating instance_info_cache with network_info: [] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1067.687267] env[67169]: DEBUG nova.compute.manager [None req-9f97b7a6-ba6f-461c-81b3-2c7be85e3cf6 tempest-ServerAddressesTestJSON-1751036838 tempest-ServerAddressesTestJSON-1751036838-project-member] [instance: 86b2381b-676f-46fc-9317-81c0fd272069] Instance disappeared before build. 
{{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1067.693499] env[67169]: DEBUG oslo_concurrency.lockutils [None req-969df851-d50e-45d8-a0e2-4316328b6a67 tempest-ServerDiagnosticsV248Test-1196521581 tempest-ServerDiagnosticsV248Test-1196521581-project-member] Releasing lock "refresh_cache-e2e52693-153a-43dd-b786-dd0758caabe2" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1067.693873] env[67169]: DEBUG nova.compute.manager [None req-969df851-d50e-45d8-a0e2-4316328b6a67 tempest-ServerDiagnosticsV248Test-1196521581 tempest-ServerDiagnosticsV248Test-1196521581-project-member] [instance: e2e52693-153a-43dd-b786-dd0758caabe2] Start destroying the instance on the hypervisor. {{(pid=67169) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1067.694142] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-969df851-d50e-45d8-a0e2-4316328b6a67 tempest-ServerDiagnosticsV248Test-1196521581 tempest-ServerDiagnosticsV248Test-1196521581-project-member] [instance: e2e52693-153a-43dd-b786-dd0758caabe2] Destroying instance {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1067.694557] env[67169]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-d7a6569d-ebad-4a1a-8109-d77a4e74dd8b {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1067.704560] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-37a736c9-287c-4697-9064-22f4aa9f6024 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1067.715823] env[67169]: DEBUG oslo_concurrency.lockutils [None req-9f97b7a6-ba6f-461c-81b3-2c7be85e3cf6 tempest-ServerAddressesTestJSON-1751036838 tempest-ServerAddressesTestJSON-1751036838-project-member] Lock "86b2381b-676f-46fc-9317-81c0fd272069" 
"released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 225.581s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1067.726845] env[67169]: DEBUG nova.compute.manager [None req-a4f964da-83fd-4c4e-9a9c-0a3b29f5dc8a tempest-VolumesAdminNegativeTest-1217570915 tempest-VolumesAdminNegativeTest-1217570915-project-member] [instance: dac3617f-32fd-43c5-b8b5-fddf42d94f88] Starting instance... {{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1067.738449] env[67169]: WARNING nova.virt.vmwareapi.vmops [None req-969df851-d50e-45d8-a0e2-4316328b6a67 tempest-ServerDiagnosticsV248Test-1196521581 tempest-ServerDiagnosticsV248Test-1196521581-project-member] [instance: e2e52693-153a-43dd-b786-dd0758caabe2] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance e2e52693-153a-43dd-b786-dd0758caabe2 could not be found. [ 1067.738661] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-969df851-d50e-45d8-a0e2-4316328b6a67 tempest-ServerDiagnosticsV248Test-1196521581 tempest-ServerDiagnosticsV248Test-1196521581-project-member] [instance: e2e52693-153a-43dd-b786-dd0758caabe2] Instance destroyed {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1067.738848] env[67169]: INFO nova.compute.manager [None req-969df851-d50e-45d8-a0e2-4316328b6a67 tempest-ServerDiagnosticsV248Test-1196521581 tempest-ServerDiagnosticsV248Test-1196521581-project-member] [instance: e2e52693-153a-43dd-b786-dd0758caabe2] Took 0.04 seconds to destroy the instance on the hypervisor. 
[ 1067.739083] env[67169]: DEBUG oslo.service.loopingcall [None req-969df851-d50e-45d8-a0e2-4316328b6a67 tempest-ServerDiagnosticsV248Test-1196521581 tempest-ServerDiagnosticsV248Test-1196521581-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=67169) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1067.739524] env[67169]: DEBUG nova.compute.manager [-] [instance: e2e52693-153a-43dd-b786-dd0758caabe2] Deallocating network for instance {{(pid=67169) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1067.739629] env[67169]: DEBUG nova.network.neutron [-] [instance: e2e52693-153a-43dd-b786-dd0758caabe2] deallocate_for_instance() {{(pid=67169) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1067.751921] env[67169]: DEBUG nova.compute.manager [None req-a4f964da-83fd-4c4e-9a9c-0a3b29f5dc8a tempest-VolumesAdminNegativeTest-1217570915 tempest-VolumesAdminNegativeTest-1217570915-project-member] [instance: dac3617f-32fd-43c5-b8b5-fddf42d94f88] Instance disappeared before build. {{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1067.766097] env[67169]: DEBUG nova.network.neutron [-] [instance: e2e52693-153a-43dd-b786-dd0758caabe2] Instance cache missing network info. {{(pid=67169) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1067.773388] env[67169]: DEBUG nova.network.neutron [-] [instance: e2e52693-153a-43dd-b786-dd0758caabe2] Updating instance_info_cache with network_info: [] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1067.790284] env[67169]: INFO nova.compute.manager [-] [instance: e2e52693-153a-43dd-b786-dd0758caabe2] Took 0.05 seconds to deallocate network for instance. 
[ 1067.795558] env[67169]: DEBUG oslo_concurrency.lockutils [None req-a4f964da-83fd-4c4e-9a9c-0a3b29f5dc8a tempest-VolumesAdminNegativeTest-1217570915 tempest-VolumesAdminNegativeTest-1217570915-project-member] Lock "dac3617f-32fd-43c5-b8b5-fddf42d94f88" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 221.342s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1067.806627] env[67169]: DEBUG nova.compute.manager [None req-7011d1c8-bd90-44bc-9571-5dca86d4a021 tempest-ServerTagsTestJSON-1585154186 tempest-ServerTagsTestJSON-1585154186-project-member] [instance: fc12247e-bcca-4635-ba27-be1c9aeaa368] Starting instance... {{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1067.844652] env[67169]: DEBUG nova.compute.manager [None req-7011d1c8-bd90-44bc-9571-5dca86d4a021 tempest-ServerTagsTestJSON-1585154186 tempest-ServerTagsTestJSON-1585154186-project-member] [instance: fc12247e-bcca-4635-ba27-be1c9aeaa368] Instance disappeared before build. {{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1067.869315] env[67169]: DEBUG oslo_concurrency.lockutils [None req-7011d1c8-bd90-44bc-9571-5dca86d4a021 tempest-ServerTagsTestJSON-1585154186 tempest-ServerTagsTestJSON-1585154186-project-member] Lock "fc12247e-bcca-4635-ba27-be1c9aeaa368" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 217.055s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1067.878985] env[67169]: DEBUG nova.compute.manager [None req-d696ce3a-62d6-4cfe-ab03-acc4f9294f1b tempest-ServerShowV257Test-257580876 tempest-ServerShowV257Test-257580876-project-member] [instance: c86c3850-39bb-4a08-8dbf-f69bd8ca21c9] Starting instance... 
{{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1067.908867] env[67169]: DEBUG oslo_concurrency.lockutils [None req-969df851-d50e-45d8-a0e2-4316328b6a67 tempest-ServerDiagnosticsV248Test-1196521581 tempest-ServerDiagnosticsV248Test-1196521581-project-member] Lock "e2e52693-153a-43dd-b786-dd0758caabe2" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.353s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1067.931564] env[67169]: DEBUG oslo_concurrency.lockutils [None req-d696ce3a-62d6-4cfe-ab03-acc4f9294f1b tempest-ServerShowV257Test-257580876 tempest-ServerShowV257Test-257580876-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1067.931818] env[67169]: DEBUG oslo_concurrency.lockutils [None req-d696ce3a-62d6-4cfe-ab03-acc4f9294f1b tempest-ServerShowV257Test-257580876 tempest-ServerShowV257Test-257580876-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1067.933324] env[67169]: INFO nova.compute.claims [None req-d696ce3a-62d6-4cfe-ab03-acc4f9294f1b tempest-ServerShowV257Test-257580876 tempest-ServerShowV257Test-257580876-project-member] [instance: c86c3850-39bb-4a08-8dbf-f69bd8ca21c9] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1068.248209] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d23c3eef-cb9d-40f0-ac23-c26a91a0b7ff {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1068.256164] env[67169]: DEBUG 
oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b3e35ac9-f44a-4e43-8438-08bebb2ea846 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1068.286385] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3e212781-77e7-4d5a-81fe-66d8d179f8f6 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1068.293112] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-24a4dcb0-f648-4f2b-a910-fbb710e99c03 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1068.305804] env[67169]: DEBUG nova.compute.provider_tree [None req-d696ce3a-62d6-4cfe-ab03-acc4f9294f1b tempest-ServerShowV257Test-257580876 tempest-ServerShowV257Test-257580876-project-member] Inventory has not changed in ProviderTree for provider: 6570906a-ac37-4859-b1f2-4bbacc48d3f3 {{(pid=67169) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1068.314635] env[67169]: DEBUG nova.scheduler.client.report [None req-d696ce3a-62d6-4cfe-ab03-acc4f9294f1b tempest-ServerShowV257Test-257580876 tempest-ServerShowV257Test-257580876-project-member] Inventory has not changed for provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 171, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67169) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1068.327933] env[67169]: DEBUG oslo_concurrency.lockutils [None 
req-d696ce3a-62d6-4cfe-ab03-acc4f9294f1b tempest-ServerShowV257Test-257580876 tempest-ServerShowV257Test-257580876-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.396s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1068.328418] env[67169]: DEBUG nova.compute.manager [None req-d696ce3a-62d6-4cfe-ab03-acc4f9294f1b tempest-ServerShowV257Test-257580876 tempest-ServerShowV257Test-257580876-project-member] [instance: c86c3850-39bb-4a08-8dbf-f69bd8ca21c9] Start building networks asynchronously for instance. {{(pid=67169) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1068.361400] env[67169]: DEBUG nova.compute.utils [None req-d696ce3a-62d6-4cfe-ab03-acc4f9294f1b tempest-ServerShowV257Test-257580876 tempest-ServerShowV257Test-257580876-project-member] Using /dev/sd instead of None {{(pid=67169) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1068.363074] env[67169]: DEBUG nova.compute.manager [None req-d696ce3a-62d6-4cfe-ab03-acc4f9294f1b tempest-ServerShowV257Test-257580876 tempest-ServerShowV257Test-257580876-project-member] [instance: c86c3850-39bb-4a08-8dbf-f69bd8ca21c9] Not allocating networking since 'none' was specified. {{(pid=67169) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 1068.371653] env[67169]: DEBUG nova.compute.manager [None req-d696ce3a-62d6-4cfe-ab03-acc4f9294f1b tempest-ServerShowV257Test-257580876 tempest-ServerShowV257Test-257580876-project-member] [instance: c86c3850-39bb-4a08-8dbf-f69bd8ca21c9] Start building block device mappings for instance. 
{{(pid=67169) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1068.433111] env[67169]: DEBUG nova.compute.manager [None req-d696ce3a-62d6-4cfe-ab03-acc4f9294f1b tempest-ServerShowV257Test-257580876 tempest-ServerShowV257Test-257580876-project-member] [instance: c86c3850-39bb-4a08-8dbf-f69bd8ca21c9] Start spawning the instance on the hypervisor. {{(pid=67169) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1068.459038] env[67169]: DEBUG nova.virt.hardware [None req-d696ce3a-62d6-4cfe-ab03-acc4f9294f1b tempest-ServerShowV257Test-257580876 tempest-ServerShowV257Test-257580876-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-03-06T19:55:10Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-03-06T19:54:55Z,direct_url=,disk_format='vmdk',id=285931c9-8b83-4997-8c4d-6a79005e36ba,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c31f6504bb73492890b262ff43fdf9bc',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-03-06T19:54:55Z,virtual_size=,visibility=), allow threads: False {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1068.459327] env[67169]: DEBUG nova.virt.hardware [None req-d696ce3a-62d6-4cfe-ab03-acc4f9294f1b tempest-ServerShowV257Test-257580876 tempest-ServerShowV257Test-257580876-project-member] Flavor limits 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1068.459494] env[67169]: DEBUG nova.virt.hardware [None req-d696ce3a-62d6-4cfe-ab03-acc4f9294f1b tempest-ServerShowV257Test-257580876 
tempest-ServerShowV257Test-257580876-project-member] Image limits 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1068.459675] env[67169]: DEBUG nova.virt.hardware [None req-d696ce3a-62d6-4cfe-ab03-acc4f9294f1b tempest-ServerShowV257Test-257580876 tempest-ServerShowV257Test-257580876-project-member] Flavor pref 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1068.459820] env[67169]: DEBUG nova.virt.hardware [None req-d696ce3a-62d6-4cfe-ab03-acc4f9294f1b tempest-ServerShowV257Test-257580876 tempest-ServerShowV257Test-257580876-project-member] Image pref 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1068.459964] env[67169]: DEBUG nova.virt.hardware [None req-d696ce3a-62d6-4cfe-ab03-acc4f9294f1b tempest-ServerShowV257Test-257580876 tempest-ServerShowV257Test-257580876-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1068.460182] env[67169]: DEBUG nova.virt.hardware [None req-d696ce3a-62d6-4cfe-ab03-acc4f9294f1b tempest-ServerShowV257Test-257580876 tempest-ServerShowV257Test-257580876-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1068.460379] env[67169]: DEBUG nova.virt.hardware [None req-d696ce3a-62d6-4cfe-ab03-acc4f9294f1b tempest-ServerShowV257Test-257580876 tempest-ServerShowV257Test-257580876-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67169) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1068.460566] env[67169]: DEBUG nova.virt.hardware [None req-d696ce3a-62d6-4cfe-ab03-acc4f9294f1b tempest-ServerShowV257Test-257580876 
tempest-ServerShowV257Test-257580876-project-member] Got 1 possible topologies {{(pid=67169) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1068.460729] env[67169]: DEBUG nova.virt.hardware [None req-d696ce3a-62d6-4cfe-ab03-acc4f9294f1b tempest-ServerShowV257Test-257580876 tempest-ServerShowV257Test-257580876-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1068.460900] env[67169]: DEBUG nova.virt.hardware [None req-d696ce3a-62d6-4cfe-ab03-acc4f9294f1b tempest-ServerShowV257Test-257580876 tempest-ServerShowV257Test-257580876-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1068.461793] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-738a02d1-2e7a-40f9-bdcf-460ea67b4eb7 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1068.469672] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9a800541-f141-48a5-809b-655a4545b064 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1068.482769] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-d696ce3a-62d6-4cfe-ab03-acc4f9294f1b tempest-ServerShowV257Test-257580876 tempest-ServerShowV257Test-257580876-project-member] [instance: c86c3850-39bb-4a08-8dbf-f69bd8ca21c9] Instance VIF info [] {{(pid=67169) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1068.488481] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [None req-d696ce3a-62d6-4cfe-ab03-acc4f9294f1b tempest-ServerShowV257Test-257580876 tempest-ServerShowV257Test-257580876-project-member] Creating folder: Project 
(8f97d537d42047a3bf761bdb01f5cd94). Parent ref: group-v566843. {{(pid=67169) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1068.488774] env[67169]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-ce093b7a-8343-4daa-aedf-db506d696b0d {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1068.498049] env[67169]: INFO nova.virt.vmwareapi.vm_util [None req-d696ce3a-62d6-4cfe-ab03-acc4f9294f1b tempest-ServerShowV257Test-257580876 tempest-ServerShowV257Test-257580876-project-member] Created folder: Project (8f97d537d42047a3bf761bdb01f5cd94) in parent group-v566843. [ 1068.498230] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [None req-d696ce3a-62d6-4cfe-ab03-acc4f9294f1b tempest-ServerShowV257Test-257580876 tempest-ServerShowV257Test-257580876-project-member] Creating folder: Instances. Parent ref: group-v566907. {{(pid=67169) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1068.498427] env[67169]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-18531aac-7499-4a67-99f0-645d634e9785 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1068.505947] env[67169]: INFO nova.virt.vmwareapi.vm_util [None req-d696ce3a-62d6-4cfe-ab03-acc4f9294f1b tempest-ServerShowV257Test-257580876 tempest-ServerShowV257Test-257580876-project-member] Created folder: Instances in parent group-v566907. [ 1068.506194] env[67169]: DEBUG oslo.service.loopingcall [None req-d696ce3a-62d6-4cfe-ab03-acc4f9294f1b tempest-ServerShowV257Test-257580876 tempest-ServerShowV257Test-257580876-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=67169) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1068.506402] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: c86c3850-39bb-4a08-8dbf-f69bd8ca21c9] Creating VM on the ESX host {{(pid=67169) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1068.506590] env[67169]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-9421e353-6339-4209-97bf-1cba3f8e9d99 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1068.521478] env[67169]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1068.521478] env[67169]: value = "task-2819154" [ 1068.521478] env[67169]: _type = "Task" [ 1068.521478] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1068.528283] env[67169]: DEBUG oslo_vmware.api [-] Task: {'id': task-2819154, 'name': CreateVM_Task} progress is 0%. 
{{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1068.658578] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1068.671388] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1068.671631] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1068.671802] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1068.671957] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67169) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1068.673075] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-545d855f-9691-4658-9fb8-63991f7dfb16 {{(pid=67169) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1068.681875] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6e9a09c6-1911-435e-b837-796b3e9f5c32 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1068.697532] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6d49072f-1ba6-4cc1-a5fc-666e114e7016 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1068.704227] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e5650f6b-a2a7-4c7c-bfb6-83dcf1984fdd {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1068.734300] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181042MB free_disk=171GB free_vcpus=48 pci_devices=None {{(pid=67169) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1068.734300] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1068.734793] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1068.802266] env[67169]: DEBUG 
nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 1068.802266] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 28552f70-695d-40cc-8dfa-bf40d6113220 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 1068.802266] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 7a42aeb9-0518-448d-a3a6-8e68d6497922 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 1068.802266] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 43b73a7c-eda8-4239-885f-d4fb8fa6f28a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 1068.802266] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 1068.802266] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 47ffcce9-3afc-41be-b38e-dacfeb535a2c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 1068.802266] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 1f0f1960-0c77-4e72-86ee-807819e75d2a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 1068.802266] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance ceec0dd3-097b-4ab4-8e16-420d40bbe3d5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 1068.802266] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 310ae1ce-4717-4807-901c-5674677682c3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 1068.802266] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance c86c3850-39bb-4a08-8dbf-f69bd8ca21c9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 1068.816613] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 7bf839c0-3ec8-4329-823d-de1fae4833cb has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 1068.818414] env[67169]: WARNING oslo_vmware.rw_handles [None req-fbe3d5d6-785f-47de-98ab-b4e9aae11143 tempest-InstanceActionsTestJSON-34793226 tempest-InstanceActionsTestJSON-34793226-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response
[ 1068.818414] env[67169]: ERROR oslo_vmware.rw_handles Traceback (most recent call last):
[ 1068.818414] env[67169]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close
[ 1068.818414] env[67169]: ERROR oslo_vmware.rw_handles self._conn.getresponse()
[ 1068.818414] env[67169]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse
[ 1068.818414] env[67169]: ERROR oslo_vmware.rw_handles response.begin()
[ 1068.818414] env[67169]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin
[ 1068.818414] env[67169]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status()
[ 1068.818414] env[67169]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status
[ 1068.818414] env[67169]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without"
[ 1068.818414] env[67169]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response
[ 1068.818414] env[67169]: ERROR oslo_vmware.rw_handles
[ 1068.818797] env[67169]: DEBUG nova.virt.vmwareapi.images [None req-fbe3d5d6-785f-47de-98ab-b4e9aae11143 tempest-InstanceActionsTestJSON-34793226 tempest-InstanceActionsTestJSON-34793226-project-member] [instance: 310ae1ce-4717-4807-901c-5674677682c3] Downloaded image file data 285931c9-8b83-4997-8c4d-6a79005e36ba to vmware_temp/64e80115-39a8-4b58-8367-786c1673ba1a/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk on the data store datastore1 {{(pid=67169) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}}
[ 1068.820302] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-fbe3d5d6-785f-47de-98ab-b4e9aae11143 tempest-InstanceActionsTestJSON-34793226 tempest-InstanceActionsTestJSON-34793226-project-member] [instance: 310ae1ce-4717-4807-901c-5674677682c3] Caching image {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}}
[ 1068.820569] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [None req-fbe3d5d6-785f-47de-98ab-b4e9aae11143 tempest-InstanceActionsTestJSON-34793226 tempest-InstanceActionsTestJSON-34793226-project-member] Copying Virtual Disk [datastore1] vmware_temp/64e80115-39a8-4b58-8367-786c1673ba1a/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk to [datastore1] vmware_temp/64e80115-39a8-4b58-8367-786c1673ba1a/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk {{(pid=67169) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}}
[ 1068.821082] env[67169]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-e976d6e1-81e1-4059-a124-ff7f425b039c {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1068.830028] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance bab5d630-fec0-44e5-8088-12c8855aad66 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 1068.831407] env[67169]: DEBUG oslo_vmware.api [None req-fbe3d5d6-785f-47de-98ab-b4e9aae11143 tempest-InstanceActionsTestJSON-34793226 tempest-InstanceActionsTestJSON-34793226-project-member] Waiting for the task: (returnval){
[ 1068.831407] env[67169]: value = "task-2819155"
[ 1068.831407] env[67169]: _type = "Task"
[ 1068.831407] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1068.840950] env[67169]: DEBUG oslo_vmware.api [None req-fbe3d5d6-785f-47de-98ab-b4e9aae11143 tempest-InstanceActionsTestJSON-34793226 tempest-InstanceActionsTestJSON-34793226-project-member] Task: {'id': task-2819155, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1068.841574] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance a86fa702-2040-4e22-9eaa-5d64bc16f036 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 1068.851980] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance ca657a42-3745-46e1-8fc9-61de31f661d8 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 1068.862170] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 3f90c9a4-650d-4280-b155-1315d2f0f281 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 1068.872677] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance d964ad35-8d3f-45f3-b799-aebddf295012 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 1068.882618] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 0b78afae-71e9-4ba9-903a-03c8a98cd91e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 1068.894610] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 54b1337f-4ac8-4718-b273-2f078782b491 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 1068.908502] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance bf6857fb-2088-4e2c-b1a4-4c4b631f0153 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 1068.923191] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance cdca51b4-b059-48b6-ae81-ced1a447f10d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 1068.935651] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 1a04a0fd-11d5-4fce-ba32-d90e39a13ff9 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 1068.946494] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 1068.957318] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 3966e1f7-2107-4ddb-8077-ab37ef1a9b92 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 1068.968046] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance c57df23b-3348-41fa-a976-421f98cab569 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 1068.978280] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance c769a8f3-6f9f-4e5b-bfec-345c97da5d83 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 1068.978506] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67169) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}}
[ 1068.978651] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67169) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}}
[ 1069.031063] env[67169]: DEBUG oslo_vmware.api [-] Task: {'id': task-2819154, 'name': CreateVM_Task, 'duration_secs': 0.248206} completed successfully. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 1069.031300] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: c86c3850-39bb-4a08-8dbf-f69bd8ca21c9] Created VM on the ESX host {{(pid=67169) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}}
[ 1069.033787] env[67169]: DEBUG oslo_concurrency.lockutils [None req-d696ce3a-62d6-4cfe-ab03-acc4f9294f1b tempest-ServerShowV257Test-257580876 tempest-ServerShowV257Test-257580876-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1069.033950] env[67169]: DEBUG oslo_concurrency.lockutils [None req-d696ce3a-62d6-4cfe-ab03-acc4f9294f1b tempest-ServerShowV257Test-257580876 tempest-ServerShowV257Test-257580876-project-member] Acquired lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1069.034273] env[67169]: DEBUG oslo_concurrency.lockutils [None req-d696ce3a-62d6-4cfe-ab03-acc4f9294f1b tempest-ServerShowV257Test-257580876 tempest-ServerShowV257Test-257580876-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}}
[ 1069.034711] env[67169]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-50c12f2e-4335-4456-b9cc-758c168726de {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1069.038994] env[67169]: DEBUG oslo_vmware.api [None req-d696ce3a-62d6-4cfe-ab03-acc4f9294f1b tempest-ServerShowV257Test-257580876 tempest-ServerShowV257Test-257580876-project-member] Waiting for the task: (returnval){
[ 1069.038994] env[67169]: value = "session[52518c09-e5f2-035d-4dc1-3e29ddf94015]52274788-a693-4c10-378a-26cd6f83547a"
[ 1069.038994] env[67169]: _type = "Task"
[ 1069.038994] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1069.046926] env[67169]: DEBUG oslo_vmware.api [None req-d696ce3a-62d6-4cfe-ab03-acc4f9294f1b tempest-ServerShowV257Test-257580876 tempest-ServerShowV257Test-257580876-project-member] Task: {'id': session[52518c09-e5f2-035d-4dc1-3e29ddf94015]52274788-a693-4c10-378a-26cd6f83547a, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1069.260660] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-02c104f0-31c2-4daf-a60f-78224717cefa {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1069.268178] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b47bfc0c-509d-4edd-bfb1-c3591003b52f {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1069.296882] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9643fd92-cc8f-485e-8475-b48f30b16929 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1069.303483] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-87b3e031-cf3c-4539-a36f-942f980bf013 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1069.315903] env[67169]: DEBUG nova.compute.provider_tree [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Inventory has not changed in ProviderTree for provider: 6570906a-ac37-4859-b1f2-4bbacc48d3f3 {{(pid=67169) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1069.325494] env[67169]: DEBUG nova.scheduler.client.report [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Inventory has not changed for provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 171, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67169) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1069.340685] env[67169]: DEBUG oslo_vmware.exceptions [None req-fbe3d5d6-785f-47de-98ab-b4e9aae11143 tempest-InstanceActionsTestJSON-34793226 tempest-InstanceActionsTestJSON-34793226-project-member] Fault InvalidArgument not matched. {{(pid=67169) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}}
[ 1069.340946] env[67169]: DEBUG oslo_concurrency.lockutils [None req-fbe3d5d6-785f-47de-98ab-b4e9aae11143 tempest-InstanceActionsTestJSON-34793226 tempest-InstanceActionsTestJSON-34793226-project-member] Releasing lock "[datastore1] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1069.341515] env[67169]: ERROR nova.compute.manager [None req-fbe3d5d6-785f-47de-98ab-b4e9aae11143 tempest-InstanceActionsTestJSON-34793226 tempest-InstanceActionsTestJSON-34793226-project-member] [instance: 310ae1ce-4717-4807-901c-5674677682c3] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1069.341515] env[67169]: Faults: ['InvalidArgument']
[ 1069.341515] env[67169]: ERROR nova.compute.manager [instance: 310ae1ce-4717-4807-901c-5674677682c3] Traceback (most recent call last):
[ 1069.341515] env[67169]: ERROR nova.compute.manager [instance: 310ae1ce-4717-4807-901c-5674677682c3] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources
[ 1069.341515] env[67169]: ERROR nova.compute.manager [instance: 310ae1ce-4717-4807-901c-5674677682c3] yield resources
[ 1069.341515] env[67169]: ERROR nova.compute.manager [instance: 310ae1ce-4717-4807-901c-5674677682c3] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 1069.341515] env[67169]: ERROR nova.compute.manager [instance: 310ae1ce-4717-4807-901c-5674677682c3] self.driver.spawn(context, instance, image_meta,
[ 1069.341515] env[67169]: ERROR nova.compute.manager [instance: 310ae1ce-4717-4807-901c-5674677682c3] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 1069.341515] env[67169]: ERROR nova.compute.manager [instance: 310ae1ce-4717-4807-901c-5674677682c3] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 1069.341515] env[67169]: ERROR nova.compute.manager [instance: 310ae1ce-4717-4807-901c-5674677682c3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 1069.341515] env[67169]: ERROR nova.compute.manager [instance: 310ae1ce-4717-4807-901c-5674677682c3] self._fetch_image_if_missing(context, vi)
[ 1069.341515] env[67169]: ERROR nova.compute.manager [instance: 310ae1ce-4717-4807-901c-5674677682c3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 1069.341515] env[67169]: ERROR nova.compute.manager [instance: 310ae1ce-4717-4807-901c-5674677682c3] image_cache(vi, tmp_image_ds_loc)
[ 1069.341515] env[67169]: ERROR nova.compute.manager [instance: 310ae1ce-4717-4807-901c-5674677682c3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 1069.341515] env[67169]: ERROR nova.compute.manager [instance: 310ae1ce-4717-4807-901c-5674677682c3] vm_util.copy_virtual_disk(
[ 1069.341515] env[67169]: ERROR nova.compute.manager [instance: 310ae1ce-4717-4807-901c-5674677682c3] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 1069.341515] env[67169]: ERROR nova.compute.manager [instance: 310ae1ce-4717-4807-901c-5674677682c3] session._wait_for_task(vmdk_copy_task)
[ 1069.341515] env[67169]: ERROR nova.compute.manager [instance: 310ae1ce-4717-4807-901c-5674677682c3] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 1069.341515] env[67169]: ERROR nova.compute.manager [instance: 310ae1ce-4717-4807-901c-5674677682c3] return self.wait_for_task(task_ref)
[ 1069.341515] env[67169]: ERROR nova.compute.manager [instance: 310ae1ce-4717-4807-901c-5674677682c3] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 1069.341515] env[67169]: ERROR nova.compute.manager [instance: 310ae1ce-4717-4807-901c-5674677682c3] return evt.wait()
[ 1069.341515] env[67169]: ERROR nova.compute.manager [instance: 310ae1ce-4717-4807-901c-5674677682c3] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 1069.341515] env[67169]: ERROR nova.compute.manager [instance: 310ae1ce-4717-4807-901c-5674677682c3] result = hub.switch()
[ 1069.341515] env[67169]: ERROR nova.compute.manager [instance: 310ae1ce-4717-4807-901c-5674677682c3] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 1069.341515] env[67169]: ERROR nova.compute.manager [instance: 310ae1ce-4717-4807-901c-5674677682c3] return self.greenlet.switch()
[ 1069.341515] env[67169]: ERROR nova.compute.manager [instance: 310ae1ce-4717-4807-901c-5674677682c3] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 1069.341515] env[67169]: ERROR nova.compute.manager [instance: 310ae1ce-4717-4807-901c-5674677682c3] self.f(*self.args, **self.kw)
[ 1069.341515] env[67169]: ERROR nova.compute.manager [instance: 310ae1ce-4717-4807-901c-5674677682c3] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 1069.341515] env[67169]: ERROR nova.compute.manager [instance: 310ae1ce-4717-4807-901c-5674677682c3] raise exceptions.translate_fault(task_info.error)
[ 1069.341515] env[67169]: ERROR nova.compute.manager [instance: 310ae1ce-4717-4807-901c-5674677682c3] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1069.341515] env[67169]: ERROR nova.compute.manager [instance: 310ae1ce-4717-4807-901c-5674677682c3] Faults: ['InvalidArgument']
[ 1069.341515] env[67169]: ERROR nova.compute.manager [instance: 310ae1ce-4717-4807-901c-5674677682c3]
[ 1069.342482] env[67169]: INFO nova.compute.manager [None req-fbe3d5d6-785f-47de-98ab-b4e9aae11143 tempest-InstanceActionsTestJSON-34793226 tempest-InstanceActionsTestJSON-34793226-project-member] [instance: 310ae1ce-4717-4807-901c-5674677682c3] Terminating instance
[ 1069.343827] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67169) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}}
[ 1069.343997] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.610s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1069.344540] env[67169]: DEBUG nova.compute.manager [None req-fbe3d5d6-785f-47de-98ab-b4e9aae11143 tempest-InstanceActionsTestJSON-34793226 tempest-InstanceActionsTestJSON-34793226-project-member] [instance: 310ae1ce-4717-4807-901c-5674677682c3] Start destroying the instance on the hypervisor. {{(pid=67169) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}}
[ 1069.344728] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-fbe3d5d6-785f-47de-98ab-b4e9aae11143 tempest-InstanceActionsTestJSON-34793226 tempest-InstanceActionsTestJSON-34793226-project-member] [instance: 310ae1ce-4717-4807-901c-5674677682c3] Destroying instance {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 1069.345688] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c186d9a2-1007-4cd8-933e-50ce02d05b40 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1069.351880] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-fbe3d5d6-785f-47de-98ab-b4e9aae11143 tempest-InstanceActionsTestJSON-34793226 tempest-InstanceActionsTestJSON-34793226-project-member] [instance: 310ae1ce-4717-4807-901c-5674677682c3] Unregistering the VM {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}}
[ 1069.352097] env[67169]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-68508df2-3de5-4428-9041-f75eea78c298 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1069.420492] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-fbe3d5d6-785f-47de-98ab-b4e9aae11143 tempest-InstanceActionsTestJSON-34793226 tempest-InstanceActionsTestJSON-34793226-project-member] [instance: 310ae1ce-4717-4807-901c-5674677682c3] Unregistered the VM {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}}
[ 1069.420707] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-fbe3d5d6-785f-47de-98ab-b4e9aae11143 tempest-InstanceActionsTestJSON-34793226 tempest-InstanceActionsTestJSON-34793226-project-member] [instance: 310ae1ce-4717-4807-901c-5674677682c3] Deleting contents of the VM from datastore datastore1 {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}}
[ 1069.420882] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-fbe3d5d6-785f-47de-98ab-b4e9aae11143 tempest-InstanceActionsTestJSON-34793226 tempest-InstanceActionsTestJSON-34793226-project-member] Deleting the datastore file [datastore1] 310ae1ce-4717-4807-901c-5674677682c3 {{(pid=67169) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}}
[ 1069.421248] env[67169]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-2b4e972b-a74d-4856-8b3f-250979b37a01 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1069.427153] env[67169]: DEBUG oslo_vmware.api [None req-fbe3d5d6-785f-47de-98ab-b4e9aae11143 tempest-InstanceActionsTestJSON-34793226 tempest-InstanceActionsTestJSON-34793226-project-member] Waiting for the task: (returnval){
[ 1069.427153] env[67169]: value = "task-2819157"
[ 1069.427153] env[67169]: _type = "Task"
[ 1069.427153] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1069.434580] env[67169]: DEBUG oslo_vmware.api [None req-fbe3d5d6-785f-47de-98ab-b4e9aae11143 tempest-InstanceActionsTestJSON-34793226 tempest-InstanceActionsTestJSON-34793226-project-member] Task: {'id': task-2819157, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1069.548534] env[67169]: DEBUG oslo_concurrency.lockutils [None req-d696ce3a-62d6-4cfe-ab03-acc4f9294f1b tempest-ServerShowV257Test-257580876 tempest-ServerShowV257Test-257580876-project-member] Releasing lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1069.548810] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-d696ce3a-62d6-4cfe-ab03-acc4f9294f1b tempest-ServerShowV257Test-257580876 tempest-ServerShowV257Test-257580876-project-member] [instance: c86c3850-39bb-4a08-8dbf-f69bd8ca21c9] Processing image 285931c9-8b83-4997-8c4d-6a79005e36ba {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}}
[ 1069.549055] env[67169]: DEBUG oslo_concurrency.lockutils [None req-d696ce3a-62d6-4cfe-ab03-acc4f9294f1b tempest-ServerShowV257Test-257580876 tempest-ServerShowV257Test-257580876-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1069.937402] env[67169]: DEBUG oslo_vmware.api [None req-fbe3d5d6-785f-47de-98ab-b4e9aae11143 tempest-InstanceActionsTestJSON-34793226 tempest-InstanceActionsTestJSON-34793226-project-member] Task: {'id': task-2819157, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.064042} completed successfully. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 1069.937809] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-fbe3d5d6-785f-47de-98ab-b4e9aae11143 tempest-InstanceActionsTestJSON-34793226 tempest-InstanceActionsTestJSON-34793226-project-member] Deleted the datastore file {{(pid=67169) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}}
[ 1069.937884] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-fbe3d5d6-785f-47de-98ab-b4e9aae11143 tempest-InstanceActionsTestJSON-34793226 tempest-InstanceActionsTestJSON-34793226-project-member] [instance: 310ae1ce-4717-4807-901c-5674677682c3] Deleted contents of the VM from datastore datastore1 {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}}
[ 1069.938081] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-fbe3d5d6-785f-47de-98ab-b4e9aae11143 tempest-InstanceActionsTestJSON-34793226 tempest-InstanceActionsTestJSON-34793226-project-member] [instance: 310ae1ce-4717-4807-901c-5674677682c3] Instance destroyed {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 1069.938208] env[67169]: INFO nova.compute.manager [None req-fbe3d5d6-785f-47de-98ab-b4e9aae11143 tempest-InstanceActionsTestJSON-34793226 tempest-InstanceActionsTestJSON-34793226-project-member] [instance: 310ae1ce-4717-4807-901c-5674677682c3] Took 0.59 seconds to destroy the instance on the hypervisor.
[ 1069.940736] env[67169]: DEBUG nova.compute.claims [None req-fbe3d5d6-785f-47de-98ab-b4e9aae11143 tempest-InstanceActionsTestJSON-34793226 tempest-InstanceActionsTestJSON-34793226-project-member] [instance: 310ae1ce-4717-4807-901c-5674677682c3] Aborting claim: {{(pid=67169) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 1069.940917] env[67169]: DEBUG oslo_concurrency.lockutils [None req-fbe3d5d6-785f-47de-98ab-b4e9aae11143 tempest-InstanceActionsTestJSON-34793226 tempest-InstanceActionsTestJSON-34793226-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1069.941145] env[67169]: DEBUG oslo_concurrency.lockutils [None req-fbe3d5d6-785f-47de-98ab-b4e9aae11143 tempest-InstanceActionsTestJSON-34793226 tempest-InstanceActionsTestJSON-34793226-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1070.258999] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4bf25187-5b83-48f8-9bdb-7437b8e7d68f {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1070.266917] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d292300e-c5e5-4efc-9964-3535f87c74b9 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1070.295877] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9ced913c-40bb-4ab2-b699-87090f6c97ad {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1070.302905] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-13ee48a1-8848-45c3-8a06-a1d6b5d479d5 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1070.317021] env[67169]: DEBUG nova.compute.provider_tree [None req-fbe3d5d6-785f-47de-98ab-b4e9aae11143 tempest-InstanceActionsTestJSON-34793226 tempest-InstanceActionsTestJSON-34793226-project-member] Inventory has not changed in ProviderTree for provider: 6570906a-ac37-4859-b1f2-4bbacc48d3f3 {{(pid=67169) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1070.324706] env[67169]: DEBUG nova.scheduler.client.report [None req-fbe3d5d6-785f-47de-98ab-b4e9aae11143 tempest-InstanceActionsTestJSON-34793226 tempest-InstanceActionsTestJSON-34793226-project-member] Inventory has not changed for provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 171, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67169) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1070.337615] env[67169]: DEBUG oslo_concurrency.lockutils [None req-fbe3d5d6-785f-47de-98ab-b4e9aae11143 tempest-InstanceActionsTestJSON-34793226 tempest-InstanceActionsTestJSON-34793226-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.396s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1070.338130] env[67169]: ERROR nova.compute.manager [None req-fbe3d5d6-785f-47de-98ab-b4e9aae11143 tempest-InstanceActionsTestJSON-34793226 tempest-InstanceActionsTestJSON-34793226-project-member] [instance: 310ae1ce-4717-4807-901c-5674677682c3] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1070.338130] env[67169]: Faults: ['InvalidArgument']
[ 1070.338130] env[67169]: ERROR nova.compute.manager [instance: 310ae1ce-4717-4807-901c-5674677682c3] Traceback (most recent call last):
[ 1070.338130] env[67169]: ERROR nova.compute.manager [instance: 310ae1ce-4717-4807-901c-5674677682c3] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 1070.338130] env[67169]: ERROR nova.compute.manager [instance: 310ae1ce-4717-4807-901c-5674677682c3] self.driver.spawn(context, instance, image_meta,
[ 1070.338130] env[67169]: ERROR nova.compute.manager [instance: 310ae1ce-4717-4807-901c-5674677682c3] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 1070.338130] env[67169]: ERROR nova.compute.manager [instance: 310ae1ce-4717-4807-901c-5674677682c3] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 1070.338130] env[67169]: ERROR nova.compute.manager [instance: 310ae1ce-4717-4807-901c-5674677682c3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 1070.338130] env[67169]: ERROR nova.compute.manager [instance: 310ae1ce-4717-4807-901c-5674677682c3] self._fetch_image_if_missing(context, vi)
[ 1070.338130] env[67169]: ERROR nova.compute.manager [instance: 310ae1ce-4717-4807-901c-5674677682c3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 1070.338130] env[67169]: ERROR nova.compute.manager [instance: 310ae1ce-4717-4807-901c-5674677682c3] image_cache(vi, tmp_image_ds_loc)
[ 1070.338130] env[67169]: ERROR nova.compute.manager [instance: 310ae1ce-4717-4807-901c-5674677682c3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 1070.338130] env[67169]: ERROR
nova.compute.manager [instance: 310ae1ce-4717-4807-901c-5674677682c3] vm_util.copy_virtual_disk( [ 1070.338130] env[67169]: ERROR nova.compute.manager [instance: 310ae1ce-4717-4807-901c-5674677682c3] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1070.338130] env[67169]: ERROR nova.compute.manager [instance: 310ae1ce-4717-4807-901c-5674677682c3] session._wait_for_task(vmdk_copy_task) [ 1070.338130] env[67169]: ERROR nova.compute.manager [instance: 310ae1ce-4717-4807-901c-5674677682c3] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1070.338130] env[67169]: ERROR nova.compute.manager [instance: 310ae1ce-4717-4807-901c-5674677682c3] return self.wait_for_task(task_ref) [ 1070.338130] env[67169]: ERROR nova.compute.manager [instance: 310ae1ce-4717-4807-901c-5674677682c3] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1070.338130] env[67169]: ERROR nova.compute.manager [instance: 310ae1ce-4717-4807-901c-5674677682c3] return evt.wait() [ 1070.338130] env[67169]: ERROR nova.compute.manager [instance: 310ae1ce-4717-4807-901c-5674677682c3] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1070.338130] env[67169]: ERROR nova.compute.manager [instance: 310ae1ce-4717-4807-901c-5674677682c3] result = hub.switch() [ 1070.338130] env[67169]: ERROR nova.compute.manager [instance: 310ae1ce-4717-4807-901c-5674677682c3] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1070.338130] env[67169]: ERROR nova.compute.manager [instance: 310ae1ce-4717-4807-901c-5674677682c3] return self.greenlet.switch() [ 1070.338130] env[67169]: ERROR nova.compute.manager [instance: 310ae1ce-4717-4807-901c-5674677682c3] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1070.338130] env[67169]: ERROR 
nova.compute.manager [instance: 310ae1ce-4717-4807-901c-5674677682c3] self.f(*self.args, **self.kw) [ 1070.338130] env[67169]: ERROR nova.compute.manager [instance: 310ae1ce-4717-4807-901c-5674677682c3] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1070.338130] env[67169]: ERROR nova.compute.manager [instance: 310ae1ce-4717-4807-901c-5674677682c3] raise exceptions.translate_fault(task_info.error) [ 1070.338130] env[67169]: ERROR nova.compute.manager [instance: 310ae1ce-4717-4807-901c-5674677682c3] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1070.338130] env[67169]: ERROR nova.compute.manager [instance: 310ae1ce-4717-4807-901c-5674677682c3] Faults: ['InvalidArgument'] [ 1070.338130] env[67169]: ERROR nova.compute.manager [instance: 310ae1ce-4717-4807-901c-5674677682c3] [ 1070.338980] env[67169]: DEBUG nova.compute.utils [None req-fbe3d5d6-785f-47de-98ab-b4e9aae11143 tempest-InstanceActionsTestJSON-34793226 tempest-InstanceActionsTestJSON-34793226-project-member] [instance: 310ae1ce-4717-4807-901c-5674677682c3] VimFaultException {{(pid=67169) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1070.340113] env[67169]: DEBUG nova.compute.manager [None req-fbe3d5d6-785f-47de-98ab-b4e9aae11143 tempest-InstanceActionsTestJSON-34793226 tempest-InstanceActionsTestJSON-34793226-project-member] [instance: 310ae1ce-4717-4807-901c-5674677682c3] Build of instance 310ae1ce-4717-4807-901c-5674677682c3 was re-scheduled: A specified parameter was not correct: fileType [ 1070.340113] env[67169]: Faults: ['InvalidArgument'] {{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1070.340481] env[67169]: DEBUG nova.compute.manager [None req-fbe3d5d6-785f-47de-98ab-b4e9aae11143 tempest-InstanceActionsTestJSON-34793226 tempest-InstanceActionsTestJSON-34793226-project-member] [instance: 310ae1ce-4717-4807-901c-5674677682c3] 
Unplugging VIFs for instance {{(pid=67169) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1070.340653] env[67169]: DEBUG nova.compute.manager [None req-fbe3d5d6-785f-47de-98ab-b4e9aae11143 tempest-InstanceActionsTestJSON-34793226 tempest-InstanceActionsTestJSON-34793226-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=67169) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1070.340820] env[67169]: DEBUG nova.compute.manager [None req-fbe3d5d6-785f-47de-98ab-b4e9aae11143 tempest-InstanceActionsTestJSON-34793226 tempest-InstanceActionsTestJSON-34793226-project-member] [instance: 310ae1ce-4717-4807-901c-5674677682c3] Deallocating network for instance {{(pid=67169) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1070.340981] env[67169]: DEBUG nova.network.neutron [None req-fbe3d5d6-785f-47de-98ab-b4e9aae11143 tempest-InstanceActionsTestJSON-34793226 tempest-InstanceActionsTestJSON-34793226-project-member] [instance: 310ae1ce-4717-4807-901c-5674677682c3] deallocate_for_instance() {{(pid=67169) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1070.345209] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1070.345399] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1070.345534] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=67169) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1070.627684] env[67169]: DEBUG nova.network.neutron [None req-fbe3d5d6-785f-47de-98ab-b4e9aae11143 tempest-InstanceActionsTestJSON-34793226 tempest-InstanceActionsTestJSON-34793226-project-member] [instance: 310ae1ce-4717-4807-901c-5674677682c3] Updating instance_info_cache with network_info: [] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1070.641051] env[67169]: INFO nova.compute.manager [None req-fbe3d5d6-785f-47de-98ab-b4e9aae11143 tempest-InstanceActionsTestJSON-34793226 tempest-InstanceActionsTestJSON-34793226-project-member] [instance: 310ae1ce-4717-4807-901c-5674677682c3] Took 0.30 seconds to deallocate network for instance. [ 1070.653091] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1070.658710] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1070.658863] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Starting heal instance info cache {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1070.659019] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Rebuilding the list of instances to heal {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1070.679520] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 
84577fe1-6a7f-4f1e-a262-0ea7c0576cc4] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1070.679669] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 28552f70-695d-40cc-8dfa-bf40d6113220] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1070.679803] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1070.679932] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 43b73a7c-eda8-4239-885f-d4fb8fa6f28a] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1070.680071] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1070.680198] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 47ffcce9-3afc-41be-b38e-dacfeb535a2c] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1070.680329] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] Skipping network cache update for instance because it is Building. 
{{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1070.680443] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: ceec0dd3-097b-4ab4-8e16-420d40bbe3d5] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1070.680564] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: c86c3850-39bb-4a08-8dbf-f69bd8ca21c9] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1070.680684] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Didn't find any instances for network info cache update. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1070.723952] env[67169]: INFO nova.scheduler.client.report [None req-fbe3d5d6-785f-47de-98ab-b4e9aae11143 tempest-InstanceActionsTestJSON-34793226 tempest-InstanceActionsTestJSON-34793226-project-member] Deleted allocations for instance 310ae1ce-4717-4807-901c-5674677682c3 [ 1070.743852] env[67169]: DEBUG oslo_concurrency.lockutils [None req-fbe3d5d6-785f-47de-98ab-b4e9aae11143 tempest-InstanceActionsTestJSON-34793226 tempest-InstanceActionsTestJSON-34793226-project-member] Lock "310ae1ce-4717-4807-901c-5674677682c3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 240.241s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1070.745056] env[67169]: DEBUG oslo_concurrency.lockutils [None req-c59878d0-f22c-4587-867a-88ea565f4462 tempest-InstanceActionsTestJSON-34793226 tempest-InstanceActionsTestJSON-34793226-project-member] Lock "310ae1ce-4717-4807-901c-5674677682c3" acquired by 
"nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 43.935s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1070.745295] env[67169]: DEBUG oslo_concurrency.lockutils [None req-c59878d0-f22c-4587-867a-88ea565f4462 tempest-InstanceActionsTestJSON-34793226 tempest-InstanceActionsTestJSON-34793226-project-member] Acquiring lock "310ae1ce-4717-4807-901c-5674677682c3-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1070.745521] env[67169]: DEBUG oslo_concurrency.lockutils [None req-c59878d0-f22c-4587-867a-88ea565f4462 tempest-InstanceActionsTestJSON-34793226 tempest-InstanceActionsTestJSON-34793226-project-member] Lock "310ae1ce-4717-4807-901c-5674677682c3-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1070.746188] env[67169]: DEBUG oslo_concurrency.lockutils [None req-c59878d0-f22c-4587-867a-88ea565f4462 tempest-InstanceActionsTestJSON-34793226 tempest-InstanceActionsTestJSON-34793226-project-member] Lock "310ae1ce-4717-4807-901c-5674677682c3-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1070.748265] env[67169]: INFO nova.compute.manager [None req-c59878d0-f22c-4587-867a-88ea565f4462 tempest-InstanceActionsTestJSON-34793226 tempest-InstanceActionsTestJSON-34793226-project-member] [instance: 310ae1ce-4717-4807-901c-5674677682c3] Terminating instance [ 1070.750656] env[67169]: DEBUG nova.compute.manager [None req-c59878d0-f22c-4587-867a-88ea565f4462 
tempest-InstanceActionsTestJSON-34793226 tempest-InstanceActionsTestJSON-34793226-project-member] [instance: 310ae1ce-4717-4807-901c-5674677682c3] Start destroying the instance on the hypervisor. {{(pid=67169) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1070.750969] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-c59878d0-f22c-4587-867a-88ea565f4462 tempest-InstanceActionsTestJSON-34793226 tempest-InstanceActionsTestJSON-34793226-project-member] [instance: 310ae1ce-4717-4807-901c-5674677682c3] Destroying instance {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1070.751130] env[67169]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-e65f446e-a386-4c65-ac41-03afa938d161 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1070.762530] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e0838b45-edd1-4366-a723-0b9dfad78872 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1070.773792] env[67169]: DEBUG nova.compute.manager [None req-0e91d283-0c2c-4495-ad84-ebaaebe9122d tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 7bf839c0-3ec8-4329-823d-de1fae4833cb] Starting instance... {{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1070.794992] env[67169]: WARNING nova.virt.vmwareapi.vmops [None req-c59878d0-f22c-4587-867a-88ea565f4462 tempest-InstanceActionsTestJSON-34793226 tempest-InstanceActionsTestJSON-34793226-project-member] [instance: 310ae1ce-4717-4807-901c-5674677682c3] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 310ae1ce-4717-4807-901c-5674677682c3 could not be found. 
[ 1070.795233] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-c59878d0-f22c-4587-867a-88ea565f4462 tempest-InstanceActionsTestJSON-34793226 tempest-InstanceActionsTestJSON-34793226-project-member] [instance: 310ae1ce-4717-4807-901c-5674677682c3] Instance destroyed {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1070.795413] env[67169]: INFO nova.compute.manager [None req-c59878d0-f22c-4587-867a-88ea565f4462 tempest-InstanceActionsTestJSON-34793226 tempest-InstanceActionsTestJSON-34793226-project-member] [instance: 310ae1ce-4717-4807-901c-5674677682c3] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1070.795656] env[67169]: DEBUG oslo.service.loopingcall [None req-c59878d0-f22c-4587-867a-88ea565f4462 tempest-InstanceActionsTestJSON-34793226 tempest-InstanceActionsTestJSON-34793226-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=67169) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1070.795880] env[67169]: DEBUG nova.compute.manager [-] [instance: 310ae1ce-4717-4807-901c-5674677682c3] Deallocating network for instance {{(pid=67169) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1070.795978] env[67169]: DEBUG nova.network.neutron [-] [instance: 310ae1ce-4717-4807-901c-5674677682c3] deallocate_for_instance() {{(pid=67169) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1070.826854] env[67169]: DEBUG oslo_concurrency.lockutils [None req-0e91d283-0c2c-4495-ad84-ebaaebe9122d tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1070.827077] env[67169]: DEBUG 
oslo_concurrency.lockutils [None req-0e91d283-0c2c-4495-ad84-ebaaebe9122d tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1070.828587] env[67169]: INFO nova.compute.claims [None req-0e91d283-0c2c-4495-ad84-ebaaebe9122d tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 7bf839c0-3ec8-4329-823d-de1fae4833cb] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1070.831717] env[67169]: DEBUG nova.network.neutron [-] [instance: 310ae1ce-4717-4807-901c-5674677682c3] Updating instance_info_cache with network_info: [] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1070.840548] env[67169]: INFO nova.compute.manager [-] [instance: 310ae1ce-4717-4807-901c-5674677682c3] Took 0.04 seconds to deallocate network for instance. 
[ 1070.946125] env[67169]: DEBUG oslo_concurrency.lockutils [None req-c59878d0-f22c-4587-867a-88ea565f4462 tempest-InstanceActionsTestJSON-34793226 tempest-InstanceActionsTestJSON-34793226-project-member] Lock "310ae1ce-4717-4807-901c-5674677682c3" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.201s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1071.177985] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0047aab6-5663-4e25-8ac5-7c3594837f63 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1071.185626] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a4fea1a4-f3dd-469c-ba09-2b558ad25835 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1071.216288] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f205568a-f571-4f22-baec-95b42cec4646 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1071.224878] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-db660d8b-d716-4960-94ea-c8936ed5e2a7 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1071.239341] env[67169]: DEBUG nova.compute.provider_tree [None req-0e91d283-0c2c-4495-ad84-ebaaebe9122d tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Inventory has not changed in ProviderTree for provider: 6570906a-ac37-4859-b1f2-4bbacc48d3f3 {{(pid=67169) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1071.248273] env[67169]: DEBUG 
nova.scheduler.client.report [None req-0e91d283-0c2c-4495-ad84-ebaaebe9122d tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Inventory has not changed for provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 171, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67169) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1071.283788] env[67169]: DEBUG oslo_concurrency.lockutils [None req-0e91d283-0c2c-4495-ad84-ebaaebe9122d tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.457s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1071.284329] env[67169]: DEBUG nova.compute.manager [None req-0e91d283-0c2c-4495-ad84-ebaaebe9122d tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 7bf839c0-3ec8-4329-823d-de1fae4833cb] Start building networks asynchronously for instance. 
{{(pid=67169) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1071.317996] env[67169]: DEBUG nova.compute.utils [None req-0e91d283-0c2c-4495-ad84-ebaaebe9122d tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Using /dev/sd instead of None {{(pid=67169) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1071.319475] env[67169]: DEBUG nova.compute.manager [None req-0e91d283-0c2c-4495-ad84-ebaaebe9122d tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 7bf839c0-3ec8-4329-823d-de1fae4833cb] Allocating IP information in the background. {{(pid=67169) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1071.319671] env[67169]: DEBUG nova.network.neutron [None req-0e91d283-0c2c-4495-ad84-ebaaebe9122d tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 7bf839c0-3ec8-4329-823d-de1fae4833cb] allocate_for_instance() {{(pid=67169) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1071.327866] env[67169]: DEBUG nova.compute.manager [None req-0e91d283-0c2c-4495-ad84-ebaaebe9122d tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 7bf839c0-3ec8-4329-823d-de1fae4833cb] Start building block device mappings for instance. 
{{(pid=67169) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1071.395871] env[67169]: DEBUG nova.policy [None req-0e91d283-0c2c-4495-ad84-ebaaebe9122d tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'dc8f12a2682c4b79aabc2f87ed8678e6', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'a5d2ec974f664a3a9407f7f3e86b4982', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67169) authorize /opt/stack/nova/nova/policy.py:203}} [ 1071.398911] env[67169]: DEBUG nova.compute.manager [None req-0e91d283-0c2c-4495-ad84-ebaaebe9122d tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 7bf839c0-3ec8-4329-823d-de1fae4833cb] Start spawning the instance on the hypervisor. 
{{(pid=67169) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1071.424592] env[67169]: DEBUG nova.virt.hardware [None req-0e91d283-0c2c-4495-ad84-ebaaebe9122d tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-03-06T19:55:10Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-03-06T19:54:55Z,direct_url=,disk_format='vmdk',id=285931c9-8b83-4997-8c4d-6a79005e36ba,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c31f6504bb73492890b262ff43fdf9bc',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-03-06T19:54:55Z,virtual_size=,visibility=), allow threads: False {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1071.424949] env[67169]: DEBUG nova.virt.hardware [None req-0e91d283-0c2c-4495-ad84-ebaaebe9122d tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Flavor limits 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1071.425130] env[67169]: DEBUG nova.virt.hardware [None req-0e91d283-0c2c-4495-ad84-ebaaebe9122d tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Image limits 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1071.425571] env[67169]: DEBUG nova.virt.hardware [None req-0e91d283-0c2c-4495-ad84-ebaaebe9122d tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Flavor 
pref 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1071.425763] env[67169]: DEBUG nova.virt.hardware [None req-0e91d283-0c2c-4495-ad84-ebaaebe9122d tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Image pref 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1071.425920] env[67169]: DEBUG nova.virt.hardware [None req-0e91d283-0c2c-4495-ad84-ebaaebe9122d tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1071.426151] env[67169]: DEBUG nova.virt.hardware [None req-0e91d283-0c2c-4495-ad84-ebaaebe9122d tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1071.426315] env[67169]: DEBUG nova.virt.hardware [None req-0e91d283-0c2c-4495-ad84-ebaaebe9122d tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67169) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1071.426483] env[67169]: DEBUG nova.virt.hardware [None req-0e91d283-0c2c-4495-ad84-ebaaebe9122d tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Got 1 possible topologies {{(pid=67169) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1071.426644] env[67169]: DEBUG nova.virt.hardware [None req-0e91d283-0c2c-4495-ad84-ebaaebe9122d tempest-ServerDiskConfigTestJSON-907081631 
tempest-ServerDiskConfigTestJSON-907081631-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1071.426815] env[67169]: DEBUG nova.virt.hardware [None req-0e91d283-0c2c-4495-ad84-ebaaebe9122d tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1071.427686] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-09bd45e8-cd28-41cf-a902-7b714cc37b0c {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1071.436114] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6e86e26a-7414-4149-bd70-c9819aa9d119 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1071.467802] env[67169]: DEBUG oslo_concurrency.lockutils [None req-4a5fd5d9-ec2e-42b6-b492-4d9ab4d97914 tempest-ServerShowV257Test-257580876 tempest-ServerShowV257Test-257580876-project-member] Acquiring lock "c86c3850-39bb-4a08-8dbf-f69bd8ca21c9" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1071.658980] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1071.737147] env[67169]: DEBUG nova.network.neutron [None req-0e91d283-0c2c-4495-ad84-ebaaebe9122d 
tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 7bf839c0-3ec8-4329-823d-de1fae4833cb] Successfully created port: a834f5ab-8635-4378-a48d-7da141168b71 {{(pid=67169) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1072.658941] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1072.826882] env[67169]: DEBUG nova.compute.manager [req-7c2decba-4716-4cb0-aba7-2a23033c2ba9 req-2a9f84cd-ec80-4fbe-9295-c0810fa1f030 service nova] [instance: 7bf839c0-3ec8-4329-823d-de1fae4833cb] Received event network-vif-plugged-a834f5ab-8635-4378-a48d-7da141168b71 {{(pid=67169) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1072.827118] env[67169]: DEBUG oslo_concurrency.lockutils [req-7c2decba-4716-4cb0-aba7-2a23033c2ba9 req-2a9f84cd-ec80-4fbe-9295-c0810fa1f030 service nova] Acquiring lock "7bf839c0-3ec8-4329-823d-de1fae4833cb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1072.827407] env[67169]: DEBUG oslo_concurrency.lockutils [req-7c2decba-4716-4cb0-aba7-2a23033c2ba9 req-2a9f84cd-ec80-4fbe-9295-c0810fa1f030 service nova] Lock "7bf839c0-3ec8-4329-823d-de1fae4833cb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1072.827492] env[67169]: DEBUG oslo_concurrency.lockutils [req-7c2decba-4716-4cb0-aba7-2a23033c2ba9 req-2a9f84cd-ec80-4fbe-9295-c0810fa1f030 service nova] Lock "7bf839c0-3ec8-4329-823d-de1fae4833cb-events" "released" by 
"nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1072.827655] env[67169]: DEBUG nova.compute.manager [req-7c2decba-4716-4cb0-aba7-2a23033c2ba9 req-2a9f84cd-ec80-4fbe-9295-c0810fa1f030 service nova] [instance: 7bf839c0-3ec8-4329-823d-de1fae4833cb] No waiting events found dispatching network-vif-plugged-a834f5ab-8635-4378-a48d-7da141168b71 {{(pid=67169) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1072.827812] env[67169]: WARNING nova.compute.manager [req-7c2decba-4716-4cb0-aba7-2a23033c2ba9 req-2a9f84cd-ec80-4fbe-9295-c0810fa1f030 service nova] [instance: 7bf839c0-3ec8-4329-823d-de1fae4833cb] Received unexpected event network-vif-plugged-a834f5ab-8635-4378-a48d-7da141168b71 for instance with vm_state building and task_state spawning. [ 1072.925146] env[67169]: DEBUG nova.network.neutron [None req-0e91d283-0c2c-4495-ad84-ebaaebe9122d tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 7bf839c0-3ec8-4329-823d-de1fae4833cb] Successfully updated port: a834f5ab-8635-4378-a48d-7da141168b71 {{(pid=67169) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1072.937831] env[67169]: DEBUG oslo_concurrency.lockutils [None req-0e91d283-0c2c-4495-ad84-ebaaebe9122d tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Acquiring lock "refresh_cache-7bf839c0-3ec8-4329-823d-de1fae4833cb" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1072.937987] env[67169]: DEBUG oslo_concurrency.lockutils [None req-0e91d283-0c2c-4495-ad84-ebaaebe9122d tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Acquired lock "refresh_cache-7bf839c0-3ec8-4329-823d-de1fae4833cb" {{(pid=67169) lock 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1072.938187] env[67169]: DEBUG nova.network.neutron [None req-0e91d283-0c2c-4495-ad84-ebaaebe9122d tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 7bf839c0-3ec8-4329-823d-de1fae4833cb] Building network info cache for instance {{(pid=67169) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1073.002491] env[67169]: DEBUG nova.network.neutron [None req-0e91d283-0c2c-4495-ad84-ebaaebe9122d tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 7bf839c0-3ec8-4329-823d-de1fae4833cb] Instance cache missing network info. {{(pid=67169) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1073.198008] env[67169]: DEBUG nova.network.neutron [None req-0e91d283-0c2c-4495-ad84-ebaaebe9122d tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 7bf839c0-3ec8-4329-823d-de1fae4833cb] Updating instance_info_cache with network_info: [{"id": "a834f5ab-8635-4378-a48d-7da141168b71", "address": "fa:16:3e:03:75:ae", "network": {"id": "e1c693aa-d783-44b4-bbb3-c6efc6ccfa95", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1841152718-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "a5d2ec974f664a3a9407f7f3e86b4982", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "56398cc0-e39f-410f-8036-8c2a6870e26f", "external-id": 
"nsx-vlan-transportzone-612", "segmentation_id": 612, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa834f5ab-86", "ovs_interfaceid": "a834f5ab-8635-4378-a48d-7da141168b71", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1073.217421] env[67169]: DEBUG oslo_concurrency.lockutils [None req-0e91d283-0c2c-4495-ad84-ebaaebe9122d tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Releasing lock "refresh_cache-7bf839c0-3ec8-4329-823d-de1fae4833cb" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1073.217421] env[67169]: DEBUG nova.compute.manager [None req-0e91d283-0c2c-4495-ad84-ebaaebe9122d tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 7bf839c0-3ec8-4329-823d-de1fae4833cb] Instance network_info: |[{"id": "a834f5ab-8635-4378-a48d-7da141168b71", "address": "fa:16:3e:03:75:ae", "network": {"id": "e1c693aa-d783-44b4-bbb3-c6efc6ccfa95", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1841152718-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "a5d2ec974f664a3a9407f7f3e86b4982", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "56398cc0-e39f-410f-8036-8c2a6870e26f", "external-id": "nsx-vlan-transportzone-612", "segmentation_id": 612, 
"bound_drivers": {"0": "nsxv3"}}, "devname": "tapa834f5ab-86", "ovs_interfaceid": "a834f5ab-8635-4378-a48d-7da141168b71", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67169) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1073.217983] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-0e91d283-0c2c-4495-ad84-ebaaebe9122d tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 7bf839c0-3ec8-4329-823d-de1fae4833cb] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:03:75:ae', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '56398cc0-e39f-410f-8036-8c2a6870e26f', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'a834f5ab-8635-4378-a48d-7da141168b71', 'vif_model': 'vmxnet3'}] {{(pid=67169) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1073.225596] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [None req-0e91d283-0c2c-4495-ad84-ebaaebe9122d tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Creating folder: Project (a5d2ec974f664a3a9407f7f3e86b4982). Parent ref: group-v566843. {{(pid=67169) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1073.226273] env[67169]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-202d568d-c8f9-4911-8f50-8a9c224bd64f {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1073.237665] env[67169]: INFO nova.virt.vmwareapi.vm_util [None req-0e91d283-0c2c-4495-ad84-ebaaebe9122d tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Created folder: Project (a5d2ec974f664a3a9407f7f3e86b4982) in parent group-v566843. 
[ 1073.237862] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [None req-0e91d283-0c2c-4495-ad84-ebaaebe9122d tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Creating folder: Instances. Parent ref: group-v566910. {{(pid=67169) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1073.238094] env[67169]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-7c446f21-c6ff-40d1-b416-39400157420e {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1073.250132] env[67169]: INFO nova.virt.vmwareapi.vm_util [None req-0e91d283-0c2c-4495-ad84-ebaaebe9122d tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Created folder: Instances in parent group-v566910. [ 1073.250132] env[67169]: DEBUG oslo.service.loopingcall [None req-0e91d283-0c2c-4495-ad84-ebaaebe9122d tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67169) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1073.250132] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 7bf839c0-3ec8-4329-823d-de1fae4833cb] Creating VM on the ESX host {{(pid=67169) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1073.250132] env[67169]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-34e5c20d-6603-4d45-aa86-15e343f30447 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1073.270375] env[67169]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1073.270375] env[67169]: value = "task-2819160" [ 1073.270375] env[67169]: _type = "Task" [ 1073.270375] env[67169]: } to complete. 
{{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1073.279626] env[67169]: DEBUG oslo_vmware.api [-] Task: {'id': task-2819160, 'name': CreateVM_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1073.659310] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1073.780611] env[67169]: DEBUG oslo_vmware.api [-] Task: {'id': task-2819160, 'name': CreateVM_Task, 'duration_secs': 0.281406} completed successfully. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1073.780793] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 7bf839c0-3ec8-4329-823d-de1fae4833cb] Created VM on the ESX host {{(pid=67169) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1073.781694] env[67169]: DEBUG oslo_concurrency.lockutils [None req-0e91d283-0c2c-4495-ad84-ebaaebe9122d tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1073.781865] env[67169]: DEBUG oslo_concurrency.lockutils [None req-0e91d283-0c2c-4495-ad84-ebaaebe9122d tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Acquired lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1073.782248] env[67169]: DEBUG oslo_concurrency.lockutils [None 
req-0e91d283-0c2c-4495-ad84-ebaaebe9122d tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1073.782547] env[67169]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-49efd404-29fd-4807-b5c5-51d89583a0af {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1073.787516] env[67169]: DEBUG oslo_vmware.api [None req-0e91d283-0c2c-4495-ad84-ebaaebe9122d tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Waiting for the task: (returnval){ [ 1073.787516] env[67169]: value = "session[52518c09-e5f2-035d-4dc1-3e29ddf94015]52663418-4929-5b33-552f-c07e50c4ab52" [ 1073.787516] env[67169]: _type = "Task" [ 1073.787516] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1073.796543] env[67169]: DEBUG oslo_vmware.api [None req-0e91d283-0c2c-4495-ad84-ebaaebe9122d tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Task: {'id': session[52518c09-e5f2-035d-4dc1-3e29ddf94015]52663418-4929-5b33-552f-c07e50c4ab52, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1074.298663] env[67169]: DEBUG oslo_concurrency.lockutils [None req-0e91d283-0c2c-4495-ad84-ebaaebe9122d tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Releasing lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1074.298663] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-0e91d283-0c2c-4495-ad84-ebaaebe9122d tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 7bf839c0-3ec8-4329-823d-de1fae4833cb] Processing image 285931c9-8b83-4997-8c4d-6a79005e36ba {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1074.298663] env[67169]: DEBUG oslo_concurrency.lockutils [None req-0e91d283-0c2c-4495-ad84-ebaaebe9122d tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1074.924698] env[67169]: DEBUG nova.compute.manager [req-e1c6a8ee-0eba-4fce-aea0-0966918907f8 req-90c5d89f-0011-43c6-a99d-4524f548cf6c service nova] [instance: 7bf839c0-3ec8-4329-823d-de1fae4833cb] Received event network-changed-a834f5ab-8635-4378-a48d-7da141168b71 {{(pid=67169) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1074.924934] env[67169]: DEBUG nova.compute.manager [req-e1c6a8ee-0eba-4fce-aea0-0966918907f8 req-90c5d89f-0011-43c6-a99d-4524f548cf6c service nova] [instance: 7bf839c0-3ec8-4329-823d-de1fae4833cb] Refreshing instance network info cache due to event 
network-changed-a834f5ab-8635-4378-a48d-7da141168b71. {{(pid=67169) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1074.925139] env[67169]: DEBUG oslo_concurrency.lockutils [req-e1c6a8ee-0eba-4fce-aea0-0966918907f8 req-90c5d89f-0011-43c6-a99d-4524f548cf6c service nova] Acquiring lock "refresh_cache-7bf839c0-3ec8-4329-823d-de1fae4833cb" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1074.925284] env[67169]: DEBUG oslo_concurrency.lockutils [req-e1c6a8ee-0eba-4fce-aea0-0966918907f8 req-90c5d89f-0011-43c6-a99d-4524f548cf6c service nova] Acquired lock "refresh_cache-7bf839c0-3ec8-4329-823d-de1fae4833cb" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1074.925449] env[67169]: DEBUG nova.network.neutron [req-e1c6a8ee-0eba-4fce-aea0-0966918907f8 req-90c5d89f-0011-43c6-a99d-4524f548cf6c service nova] [instance: 7bf839c0-3ec8-4329-823d-de1fae4833cb] Refreshing network info cache for port a834f5ab-8635-4378-a48d-7da141168b71 {{(pid=67169) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1075.320530] env[67169]: DEBUG nova.network.neutron [req-e1c6a8ee-0eba-4fce-aea0-0966918907f8 req-90c5d89f-0011-43c6-a99d-4524f548cf6c service nova] [instance: 7bf839c0-3ec8-4329-823d-de1fae4833cb] Updated VIF entry in instance network info cache for port a834f5ab-8635-4378-a48d-7da141168b71. 
{{(pid=67169) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1075.322380] env[67169]: DEBUG nova.network.neutron [req-e1c6a8ee-0eba-4fce-aea0-0966918907f8 req-90c5d89f-0011-43c6-a99d-4524f548cf6c service nova] [instance: 7bf839c0-3ec8-4329-823d-de1fae4833cb] Updating instance_info_cache with network_info: [{"id": "a834f5ab-8635-4378-a48d-7da141168b71", "address": "fa:16:3e:03:75:ae", "network": {"id": "e1c693aa-d783-44b4-bbb3-c6efc6ccfa95", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1841152718-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "a5d2ec974f664a3a9407f7f3e86b4982", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "56398cc0-e39f-410f-8036-8c2a6870e26f", "external-id": "nsx-vlan-transportzone-612", "segmentation_id": 612, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa834f5ab-86", "ovs_interfaceid": "a834f5ab-8635-4378-a48d-7da141168b71", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1075.330938] env[67169]: DEBUG oslo_concurrency.lockutils [req-e1c6a8ee-0eba-4fce-aea0-0966918907f8 req-90c5d89f-0011-43c6-a99d-4524f548cf6c service nova] Releasing lock "refresh_cache-7bf839c0-3ec8-4329-823d-de1fae4833cb" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1078.192199] env[67169]: DEBUG 
oslo_concurrency.lockutils [None req-32caf906-c820-4c4a-a535-1175960d4c45 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Acquiring lock "7817b417-599c-4619-8bd3-28d2e8236b9f" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1078.193031] env[67169]: DEBUG oslo_concurrency.lockutils [None req-32caf906-c820-4c4a-a535-1175960d4c45 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Lock "7817b417-599c-4619-8bd3-28d2e8236b9f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1078.220228] env[67169]: DEBUG oslo_concurrency.lockutils [None req-32caf906-c820-4c4a-a535-1175960d4c45 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Acquiring lock "4f57a0db-fe0b-4983-9e07-62485a53f918" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1078.220484] env[67169]: DEBUG oslo_concurrency.lockutils [None req-32caf906-c820-4c4a-a535-1175960d4c45 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Lock "4f57a0db-fe0b-4983-9e07-62485a53f918" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1081.571402] env[67169]: DEBUG oslo_concurrency.lockutils [None req-4a549810-e7cd-467a-a3e5-26aa35a98b4a 
tempest-ServerRescueTestJSONUnderV235-1040427181 tempest-ServerRescueTestJSONUnderV235-1040427181-project-member] Acquiring lock "3930edcb-c8ce-44f4-84ae-a2b59f99bc82" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1081.572032] env[67169]: DEBUG oslo_concurrency.lockutils [None req-4a549810-e7cd-467a-a3e5-26aa35a98b4a tempest-ServerRescueTestJSONUnderV235-1040427181 tempest-ServerRescueTestJSONUnderV235-1040427181-project-member] Lock "3930edcb-c8ce-44f4-84ae-a2b59f99bc82" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1114.211833] env[67169]: WARNING oslo_vmware.rw_handles [None req-1d27fda4-25e5-4bf7-a655-5b02544fb343 tempest-ServersAdminTestJSON-608063042 tempest-ServersAdminTestJSON-608063042-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1114.211833] env[67169]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1114.211833] env[67169]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1114.211833] env[67169]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1114.211833] env[67169]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1114.211833] env[67169]: ERROR oslo_vmware.rw_handles response.begin() [ 1114.211833] env[67169]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1114.211833] env[67169]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1114.211833] env[67169]: ERROR oslo_vmware.rw_handles 
File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1114.211833] env[67169]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1114.211833] env[67169]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1114.211833] env[67169]: ERROR oslo_vmware.rw_handles [ 1114.212439] env[67169]: DEBUG nova.virt.vmwareapi.images [None req-1d27fda4-25e5-4bf7-a655-5b02544fb343 tempest-ServersAdminTestJSON-608063042 tempest-ServersAdminTestJSON-608063042-project-member] [instance: 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4] Downloaded image file data 285931c9-8b83-4997-8c4d-6a79005e36ba to vmware_temp/028de748-8cfe-485d-ae8e-8db51841361b/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk on the data store datastore2 {{(pid=67169) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1114.214537] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-1d27fda4-25e5-4bf7-a655-5b02544fb343 tempest-ServersAdminTestJSON-608063042 tempest-ServersAdminTestJSON-608063042-project-member] [instance: 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4] Caching image {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1114.214817] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [None req-1d27fda4-25e5-4bf7-a655-5b02544fb343 tempest-ServersAdminTestJSON-608063042 tempest-ServersAdminTestJSON-608063042-project-member] Copying Virtual Disk [datastore2] vmware_temp/028de748-8cfe-485d-ae8e-8db51841361b/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk to [datastore2] vmware_temp/028de748-8cfe-485d-ae8e-8db51841361b/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk {{(pid=67169) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1114.215144] env[67169]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-1f313878-51af-48b3-8e74-939ccb274d37 
{{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1114.222910] env[67169]: DEBUG oslo_vmware.api [None req-1d27fda4-25e5-4bf7-a655-5b02544fb343 tempest-ServersAdminTestJSON-608063042 tempest-ServersAdminTestJSON-608063042-project-member] Waiting for the task: (returnval){ [ 1114.222910] env[67169]: value = "task-2819161" [ 1114.222910] env[67169]: _type = "Task" [ 1114.222910] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1114.230604] env[67169]: DEBUG oslo_vmware.api [None req-1d27fda4-25e5-4bf7-a655-5b02544fb343 tempest-ServersAdminTestJSON-608063042 tempest-ServersAdminTestJSON-608063042-project-member] Task: {'id': task-2819161, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1114.733823] env[67169]: DEBUG oslo_vmware.exceptions [None req-1d27fda4-25e5-4bf7-a655-5b02544fb343 tempest-ServersAdminTestJSON-608063042 tempest-ServersAdminTestJSON-608063042-project-member] Fault InvalidArgument not matched. 
{{(pid=67169) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1114.736031] env[67169]: DEBUG oslo_concurrency.lockutils [None req-1d27fda4-25e5-4bf7-a655-5b02544fb343 tempest-ServersAdminTestJSON-608063042 tempest-ServersAdminTestJSON-608063042-project-member] Releasing lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1114.736031] env[67169]: ERROR nova.compute.manager [None req-1d27fda4-25e5-4bf7-a655-5b02544fb343 tempest-ServersAdminTestJSON-608063042 tempest-ServersAdminTestJSON-608063042-project-member] [instance: 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1114.736031] env[67169]: Faults: ['InvalidArgument'] [ 1114.736031] env[67169]: ERROR nova.compute.manager [instance: 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4] Traceback (most recent call last): [ 1114.736031] env[67169]: ERROR nova.compute.manager [instance: 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1114.736031] env[67169]: ERROR nova.compute.manager [instance: 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4] yield resources [ 1114.736031] env[67169]: ERROR nova.compute.manager [instance: 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1114.736031] env[67169]: ERROR nova.compute.manager [instance: 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4] self.driver.spawn(context, instance, image_meta, [ 1114.736031] env[67169]: ERROR nova.compute.manager [instance: 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1114.736031] env[67169]: ERROR nova.compute.manager [instance: 
84577fe1-6a7f-4f1e-a262-0ea7c0576cc4] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1114.736031] env[67169]: ERROR nova.compute.manager [instance: 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1114.736031] env[67169]: ERROR nova.compute.manager [instance: 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4] self._fetch_image_if_missing(context, vi) [ 1114.736031] env[67169]: ERROR nova.compute.manager [instance: 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1114.736031] env[67169]: ERROR nova.compute.manager [instance: 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4] image_cache(vi, tmp_image_ds_loc) [ 1114.736031] env[67169]: ERROR nova.compute.manager [instance: 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1114.736031] env[67169]: ERROR nova.compute.manager [instance: 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4] vm_util.copy_virtual_disk( [ 1114.736031] env[67169]: ERROR nova.compute.manager [instance: 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1114.736031] env[67169]: ERROR nova.compute.manager [instance: 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4] session._wait_for_task(vmdk_copy_task) [ 1114.736031] env[67169]: ERROR nova.compute.manager [instance: 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1114.736031] env[67169]: ERROR nova.compute.manager [instance: 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4] return self.wait_for_task(task_ref) [ 1114.736031] env[67169]: ERROR nova.compute.manager [instance: 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1114.736031] env[67169]: ERROR nova.compute.manager 
[instance: 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4] return evt.wait() [ 1114.736031] env[67169]: ERROR nova.compute.manager [instance: 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1114.736031] env[67169]: ERROR nova.compute.manager [instance: 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4] result = hub.switch() [ 1114.736031] env[67169]: ERROR nova.compute.manager [instance: 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1114.736031] env[67169]: ERROR nova.compute.manager [instance: 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4] return self.greenlet.switch() [ 1114.736031] env[67169]: ERROR nova.compute.manager [instance: 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1114.736031] env[67169]: ERROR nova.compute.manager [instance: 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4] self.f(*self.args, **self.kw) [ 1114.736031] env[67169]: ERROR nova.compute.manager [instance: 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1114.736031] env[67169]: ERROR nova.compute.manager [instance: 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4] raise exceptions.translate_fault(task_info.error) [ 1114.736031] env[67169]: ERROR nova.compute.manager [instance: 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1114.737089] env[67169]: ERROR nova.compute.manager [instance: 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4] Faults: ['InvalidArgument'] [ 1114.737089] env[67169]: ERROR nova.compute.manager [instance: 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4] [ 1114.737089] env[67169]: INFO nova.compute.manager [None req-1d27fda4-25e5-4bf7-a655-5b02544fb343 tempest-ServersAdminTestJSON-608063042 
tempest-ServersAdminTestJSON-608063042-project-member] [instance: 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4] Terminating instance [ 1114.737089] env[67169]: DEBUG oslo_concurrency.lockutils [None req-f4bfbd84-193f-4b41-bf8e-c81aff254f89 tempest-VolumesAssistedSnapshotsTest-1676378715 tempest-VolumesAssistedSnapshotsTest-1676378715-project-member] Acquired lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1114.737089] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-f4bfbd84-193f-4b41-bf8e-c81aff254f89 tempest-VolumesAssistedSnapshotsTest-1676378715 tempest-VolumesAssistedSnapshotsTest-1676378715-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1114.737324] env[67169]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-61dc3edd-4884-4f34-a0ed-3962cebe429b {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1114.739441] env[67169]: DEBUG nova.compute.manager [None req-1d27fda4-25e5-4bf7-a655-5b02544fb343 tempest-ServersAdminTestJSON-608063042 tempest-ServersAdminTestJSON-608063042-project-member] [instance: 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4] Start destroying the instance on the hypervisor. 
{{(pid=67169) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1114.739630] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-1d27fda4-25e5-4bf7-a655-5b02544fb343 tempest-ServersAdminTestJSON-608063042 tempest-ServersAdminTestJSON-608063042-project-member] [instance: 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4] Destroying instance {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1114.740371] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8bccaf9c-fc34-4a99-9595-b34bf7c93635 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1114.747076] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-1d27fda4-25e5-4bf7-a655-5b02544fb343 tempest-ServersAdminTestJSON-608063042 tempest-ServersAdminTestJSON-608063042-project-member] [instance: 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4] Unregistering the VM {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1114.747278] env[67169]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-b09429d4-6dbc-4178-ae3c-2d22d9863208 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1114.749325] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-f4bfbd84-193f-4b41-bf8e-c81aff254f89 tempest-VolumesAssistedSnapshotsTest-1676378715 tempest-VolumesAssistedSnapshotsTest-1676378715-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1114.749537] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-f4bfbd84-193f-4b41-bf8e-c81aff254f89 tempest-VolumesAssistedSnapshotsTest-1676378715 tempest-VolumesAssistedSnapshotsTest-1676378715-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=67169) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1114.750444] env[67169]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-11b4ba61-5a6c-4d96-8977-2f6719b65515 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1114.754890] env[67169]: DEBUG oslo_vmware.api [None req-f4bfbd84-193f-4b41-bf8e-c81aff254f89 tempest-VolumesAssistedSnapshotsTest-1676378715 tempest-VolumesAssistedSnapshotsTest-1676378715-project-member] Waiting for the task: (returnval){ [ 1114.754890] env[67169]: value = "session[52518c09-e5f2-035d-4dc1-3e29ddf94015]522137ad-3081-6358-c8b6-494b02542997" [ 1114.754890] env[67169]: _type = "Task" [ 1114.754890] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1114.761729] env[67169]: DEBUG oslo_vmware.api [None req-f4bfbd84-193f-4b41-bf8e-c81aff254f89 tempest-VolumesAssistedSnapshotsTest-1676378715 tempest-VolumesAssistedSnapshotsTest-1676378715-project-member] Task: {'id': session[52518c09-e5f2-035d-4dc1-3e29ddf94015]522137ad-3081-6358-c8b6-494b02542997, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1114.811445] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-1d27fda4-25e5-4bf7-a655-5b02544fb343 tempest-ServersAdminTestJSON-608063042 tempest-ServersAdminTestJSON-608063042-project-member] [instance: 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4] Unregistered the VM {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1114.811668] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-1d27fda4-25e5-4bf7-a655-5b02544fb343 tempest-ServersAdminTestJSON-608063042 tempest-ServersAdminTestJSON-608063042-project-member] [instance: 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4] Deleting contents of the VM from datastore datastore2 {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1114.811873] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-1d27fda4-25e5-4bf7-a655-5b02544fb343 tempest-ServersAdminTestJSON-608063042 tempest-ServersAdminTestJSON-608063042-project-member] Deleting the datastore file [datastore2] 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4 {{(pid=67169) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1114.812149] env[67169]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-d2371ad1-92b1-4b4d-8b44-79389516af22 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1114.817706] env[67169]: DEBUG oslo_vmware.api [None req-1d27fda4-25e5-4bf7-a655-5b02544fb343 tempest-ServersAdminTestJSON-608063042 tempest-ServersAdminTestJSON-608063042-project-member] Waiting for the task: (returnval){ [ 1114.817706] env[67169]: value = "task-2819163" [ 1114.817706] env[67169]: _type = "Task" [ 1114.817706] env[67169]: } to complete. 
{{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1114.825349] env[67169]: DEBUG oslo_vmware.api [None req-1d27fda4-25e5-4bf7-a655-5b02544fb343 tempest-ServersAdminTestJSON-608063042 tempest-ServersAdminTestJSON-608063042-project-member] Task: {'id': task-2819163, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1115.265455] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-f4bfbd84-193f-4b41-bf8e-c81aff254f89 tempest-VolumesAssistedSnapshotsTest-1676378715 tempest-VolumesAssistedSnapshotsTest-1676378715-project-member] [instance: 28552f70-695d-40cc-8dfa-bf40d6113220] Preparing fetch location {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1115.265719] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-f4bfbd84-193f-4b41-bf8e-c81aff254f89 tempest-VolumesAssistedSnapshotsTest-1676378715 tempest-VolumesAssistedSnapshotsTest-1676378715-project-member] Creating directory with path [datastore2] vmware_temp/7d7f33fc-cae7-4ca4-9381-1d5b59dcbf75/285931c9-8b83-4997-8c4d-6a79005e36ba {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1115.265927] env[67169]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-fa307fba-e064-45e5-a864-e021af148962 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1115.277256] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-f4bfbd84-193f-4b41-bf8e-c81aff254f89 tempest-VolumesAssistedSnapshotsTest-1676378715 tempest-VolumesAssistedSnapshotsTest-1676378715-project-member] Created directory with path [datastore2] vmware_temp/7d7f33fc-cae7-4ca4-9381-1d5b59dcbf75/285931c9-8b83-4997-8c4d-6a79005e36ba {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1115.277451] env[67169]: DEBUG 
nova.virt.vmwareapi.vmops [None req-f4bfbd84-193f-4b41-bf8e-c81aff254f89 tempest-VolumesAssistedSnapshotsTest-1676378715 tempest-VolumesAssistedSnapshotsTest-1676378715-project-member] [instance: 28552f70-695d-40cc-8dfa-bf40d6113220] Fetch image to [datastore2] vmware_temp/7d7f33fc-cae7-4ca4-9381-1d5b59dcbf75/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1115.277619] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-f4bfbd84-193f-4b41-bf8e-c81aff254f89 tempest-VolumesAssistedSnapshotsTest-1676378715 tempest-VolumesAssistedSnapshotsTest-1676378715-project-member] [instance: 28552f70-695d-40cc-8dfa-bf40d6113220] Downloading image file data 285931c9-8b83-4997-8c4d-6a79005e36ba to [datastore2] vmware_temp/7d7f33fc-cae7-4ca4-9381-1d5b59dcbf75/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk on the data store datastore2 {{(pid=67169) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1115.278389] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-83462ea9-cf54-4f0c-be2b-e3d9151931db {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1115.284900] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3e472543-4fad-4880-b721-add91e4cc49e {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1115.295172] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6b4982f5-e65e-49d5-8e91-1b124b47660a {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1115.328757] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-384864d1-632d-45d3-a72e-1e9a177fb012 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1115.336029] env[67169]: DEBUG oslo_vmware.api [None req-1d27fda4-25e5-4bf7-a655-5b02544fb343 tempest-ServersAdminTestJSON-608063042 tempest-ServersAdminTestJSON-608063042-project-member] Task: {'id': task-2819163, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.072558} completed successfully. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1115.337498] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-1d27fda4-25e5-4bf7-a655-5b02544fb343 tempest-ServersAdminTestJSON-608063042 tempest-ServersAdminTestJSON-608063042-project-member] Deleted the datastore file {{(pid=67169) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1115.337694] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-1d27fda4-25e5-4bf7-a655-5b02544fb343 tempest-ServersAdminTestJSON-608063042 tempest-ServersAdminTestJSON-608063042-project-member] [instance: 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4] Deleted contents of the VM from datastore datastore2 {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1115.337865] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-1d27fda4-25e5-4bf7-a655-5b02544fb343 tempest-ServersAdminTestJSON-608063042 tempest-ServersAdminTestJSON-608063042-project-member] [instance: 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4] Instance destroyed {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1115.338060] env[67169]: INFO nova.compute.manager [None req-1d27fda4-25e5-4bf7-a655-5b02544fb343 tempest-ServersAdminTestJSON-608063042 tempest-ServersAdminTestJSON-608063042-project-member] [instance: 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 1115.339848] env[67169]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-a254c00e-3ab0-4b63-b525-a59a196978ed {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1115.342034] env[67169]: DEBUG nova.compute.claims [None req-1d27fda4-25e5-4bf7-a655-5b02544fb343 tempest-ServersAdminTestJSON-608063042 tempest-ServersAdminTestJSON-608063042-project-member] [instance: 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4] Aborting claim: {{(pid=67169) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1115.342230] env[67169]: DEBUG oslo_concurrency.lockutils [None req-1d27fda4-25e5-4bf7-a655-5b02544fb343 tempest-ServersAdminTestJSON-608063042 tempest-ServersAdminTestJSON-608063042-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1115.342455] env[67169]: DEBUG oslo_concurrency.lockutils [None req-1d27fda4-25e5-4bf7-a655-5b02544fb343 tempest-ServersAdminTestJSON-608063042 tempest-ServersAdminTestJSON-608063042-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1115.363947] env[67169]: DEBUG nova.virt.vmwareapi.images [None req-f4bfbd84-193f-4b41-bf8e-c81aff254f89 tempest-VolumesAssistedSnapshotsTest-1676378715 tempest-VolumesAssistedSnapshotsTest-1676378715-project-member] [instance: 28552f70-695d-40cc-8dfa-bf40d6113220] Downloading image file data 285931c9-8b83-4997-8c4d-6a79005e36ba to the data store datastore2 {{(pid=67169) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1115.426520] env[67169]: DEBUG oslo_vmware.rw_handles [None req-f4bfbd84-193f-4b41-bf8e-c81aff254f89 
tempest-VolumesAssistedSnapshotsTest-1676378715 tempest-VolumesAssistedSnapshotsTest-1676378715-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/7d7f33fc-cae7-4ca4-9381-1d5b59dcbf75/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=67169) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1115.485101] env[67169]: DEBUG oslo_vmware.rw_handles [None req-f4bfbd84-193f-4b41-bf8e-c81aff254f89 tempest-VolumesAssistedSnapshotsTest-1676378715 tempest-VolumesAssistedSnapshotsTest-1676378715-project-member] Completed reading data from the image iterator. {{(pid=67169) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1115.485348] env[67169]: DEBUG oslo_vmware.rw_handles [None req-f4bfbd84-193f-4b41-bf8e-c81aff254f89 tempest-VolumesAssistedSnapshotsTest-1676378715 tempest-VolumesAssistedSnapshotsTest-1676378715-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/7d7f33fc-cae7-4ca4-9381-1d5b59dcbf75/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=67169) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1115.722656] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1599c43e-75f6-4b88-adde-dc6c87f41b33 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1115.730231] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f0cb8b41-4019-4fcb-880a-9ee3a7a0fbce {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1115.760726] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4150419f-b12d-4aad-877e-baba06aa69dd {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1115.767503] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5c5045fa-ae7c-4aaf-811e-c7f4880cf580 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1115.780021] env[67169]: DEBUG nova.compute.provider_tree [None req-1d27fda4-25e5-4bf7-a655-5b02544fb343 tempest-ServersAdminTestJSON-608063042 tempest-ServersAdminTestJSON-608063042-project-member] Inventory has not changed in ProviderTree for provider: 6570906a-ac37-4859-b1f2-4bbacc48d3f3 {{(pid=67169) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1115.788742] env[67169]: DEBUG nova.scheduler.client.report [None req-1d27fda4-25e5-4bf7-a655-5b02544fb343 tempest-ServersAdminTestJSON-608063042 tempest-ServersAdminTestJSON-608063042-project-member] Inventory has not changed for provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': 
{'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 171, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67169) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1115.806336] env[67169]: DEBUG oslo_concurrency.lockutils [None req-1d27fda4-25e5-4bf7-a655-5b02544fb343 tempest-ServersAdminTestJSON-608063042 tempest-ServersAdminTestJSON-608063042-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.464s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1115.806858] env[67169]: ERROR nova.compute.manager [None req-1d27fda4-25e5-4bf7-a655-5b02544fb343 tempest-ServersAdminTestJSON-608063042 tempest-ServersAdminTestJSON-608063042-project-member] [instance: 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1115.806858] env[67169]: Faults: ['InvalidArgument'] [ 1115.806858] env[67169]: ERROR nova.compute.manager [instance: 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4] Traceback (most recent call last): [ 1115.806858] env[67169]: ERROR nova.compute.manager [instance: 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1115.806858] env[67169]: ERROR nova.compute.manager [instance: 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4] self.driver.spawn(context, instance, image_meta, [ 1115.806858] env[67169]: ERROR nova.compute.manager [instance: 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1115.806858] env[67169]: ERROR nova.compute.manager [instance: 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4] self._vmops.spawn(context, instance, image_meta, injected_files, [ 
1115.806858] env[67169]: ERROR nova.compute.manager [instance: 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1115.806858] env[67169]: ERROR nova.compute.manager [instance: 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4] self._fetch_image_if_missing(context, vi) [ 1115.806858] env[67169]: ERROR nova.compute.manager [instance: 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1115.806858] env[67169]: ERROR nova.compute.manager [instance: 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4] image_cache(vi, tmp_image_ds_loc) [ 1115.806858] env[67169]: ERROR nova.compute.manager [instance: 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1115.806858] env[67169]: ERROR nova.compute.manager [instance: 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4] vm_util.copy_virtual_disk( [ 1115.806858] env[67169]: ERROR nova.compute.manager [instance: 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1115.806858] env[67169]: ERROR nova.compute.manager [instance: 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4] session._wait_for_task(vmdk_copy_task) [ 1115.806858] env[67169]: ERROR nova.compute.manager [instance: 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1115.806858] env[67169]: ERROR nova.compute.manager [instance: 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4] return self.wait_for_task(task_ref) [ 1115.806858] env[67169]: ERROR nova.compute.manager [instance: 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1115.806858] env[67169]: ERROR nova.compute.manager [instance: 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4] return evt.wait() [ 1115.806858] env[67169]: ERROR 
nova.compute.manager [instance: 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1115.806858] env[67169]: ERROR nova.compute.manager [instance: 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4] result = hub.switch() [ 1115.806858] env[67169]: ERROR nova.compute.manager [instance: 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1115.806858] env[67169]: ERROR nova.compute.manager [instance: 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4] return self.greenlet.switch() [ 1115.806858] env[67169]: ERROR nova.compute.manager [instance: 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1115.806858] env[67169]: ERROR nova.compute.manager [instance: 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4] self.f(*self.args, **self.kw) [ 1115.806858] env[67169]: ERROR nova.compute.manager [instance: 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1115.806858] env[67169]: ERROR nova.compute.manager [instance: 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4] raise exceptions.translate_fault(task_info.error) [ 1115.806858] env[67169]: ERROR nova.compute.manager [instance: 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1115.806858] env[67169]: ERROR nova.compute.manager [instance: 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4] Faults: ['InvalidArgument'] [ 1115.806858] env[67169]: ERROR nova.compute.manager [instance: 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4] [ 1115.807765] env[67169]: DEBUG nova.compute.utils [None req-1d27fda4-25e5-4bf7-a655-5b02544fb343 tempest-ServersAdminTestJSON-608063042 tempest-ServersAdminTestJSON-608063042-project-member] [instance: 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4] 
VimFaultException {{(pid=67169) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1115.808843] env[67169]: DEBUG nova.compute.manager [None req-1d27fda4-25e5-4bf7-a655-5b02544fb343 tempest-ServersAdminTestJSON-608063042 tempest-ServersAdminTestJSON-608063042-project-member] [instance: 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4] Build of instance 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4 was re-scheduled: A specified parameter was not correct: fileType [ 1115.808843] env[67169]: Faults: ['InvalidArgument'] {{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1115.809232] env[67169]: DEBUG nova.compute.manager [None req-1d27fda4-25e5-4bf7-a655-5b02544fb343 tempest-ServersAdminTestJSON-608063042 tempest-ServersAdminTestJSON-608063042-project-member] [instance: 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4] Unplugging VIFs for instance {{(pid=67169) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1115.809402] env[67169]: DEBUG nova.compute.manager [None req-1d27fda4-25e5-4bf7-a655-5b02544fb343 tempest-ServersAdminTestJSON-608063042 tempest-ServersAdminTestJSON-608063042-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67169) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1115.809576] env[67169]: DEBUG nova.compute.manager [None req-1d27fda4-25e5-4bf7-a655-5b02544fb343 tempest-ServersAdminTestJSON-608063042 tempest-ServersAdminTestJSON-608063042-project-member] [instance: 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4] Deallocating network for instance {{(pid=67169) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1115.809737] env[67169]: DEBUG nova.network.neutron [None req-1d27fda4-25e5-4bf7-a655-5b02544fb343 tempest-ServersAdminTestJSON-608063042 tempest-ServersAdminTestJSON-608063042-project-member] [instance: 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4] deallocate_for_instance() {{(pid=67169) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1116.096804] env[67169]: DEBUG nova.network.neutron [None req-1d27fda4-25e5-4bf7-a655-5b02544fb343 tempest-ServersAdminTestJSON-608063042 tempest-ServersAdminTestJSON-608063042-project-member] [instance: 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4] Updating instance_info_cache with network_info: [] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1116.113027] env[67169]: INFO nova.compute.manager [None req-1d27fda4-25e5-4bf7-a655-5b02544fb343 tempest-ServersAdminTestJSON-608063042 tempest-ServersAdminTestJSON-608063042-project-member] [instance: 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4] Took 0.30 seconds to deallocate network for instance. 
[ 1116.206017] env[67169]: INFO nova.scheduler.client.report [None req-1d27fda4-25e5-4bf7-a655-5b02544fb343 tempest-ServersAdminTestJSON-608063042 tempest-ServersAdminTestJSON-608063042-project-member] Deleted allocations for instance 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4 [ 1116.227681] env[67169]: DEBUG oslo_concurrency.lockutils [None req-1d27fda4-25e5-4bf7-a655-5b02544fb343 tempest-ServersAdminTestJSON-608063042 tempest-ServersAdminTestJSON-608063042-project-member] Lock "84577fe1-6a7f-4f1e-a262-0ea7c0576cc4" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 521.181s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1116.229017] env[67169]: DEBUG oslo_concurrency.lockutils [None req-107e28cc-83b9-4df1-8569-844ba6d1e041 tempest-ServersAdminTestJSON-608063042 tempest-ServersAdminTestJSON-608063042-project-member] Lock "84577fe1-6a7f-4f1e-a262-0ea7c0576cc4" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 317.334s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1116.229277] env[67169]: DEBUG oslo_concurrency.lockutils [None req-107e28cc-83b9-4df1-8569-844ba6d1e041 tempest-ServersAdminTestJSON-608063042 tempest-ServersAdminTestJSON-608063042-project-member] Acquiring lock "84577fe1-6a7f-4f1e-a262-0ea7c0576cc4-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1116.229482] env[67169]: DEBUG oslo_concurrency.lockutils [None req-107e28cc-83b9-4df1-8569-844ba6d1e041 tempest-ServersAdminTestJSON-608063042 tempest-ServersAdminTestJSON-608063042-project-member] Lock "84577fe1-6a7f-4f1e-a262-0ea7c0576cc4-events" acquired by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1116.231811] env[67169]: DEBUG oslo_concurrency.lockutils [None req-107e28cc-83b9-4df1-8569-844ba6d1e041 tempest-ServersAdminTestJSON-608063042 tempest-ServersAdminTestJSON-608063042-project-member] Lock "84577fe1-6a7f-4f1e-a262-0ea7c0576cc4-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1116.231811] env[67169]: INFO nova.compute.manager [None req-107e28cc-83b9-4df1-8569-844ba6d1e041 tempest-ServersAdminTestJSON-608063042 tempest-ServersAdminTestJSON-608063042-project-member] [instance: 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4] Terminating instance [ 1116.235018] env[67169]: DEBUG nova.compute.manager [None req-107e28cc-83b9-4df1-8569-844ba6d1e041 tempest-ServersAdminTestJSON-608063042 tempest-ServersAdminTestJSON-608063042-project-member] [instance: 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4] Start destroying the instance on the hypervisor. 
{{(pid=67169) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1116.235018] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-107e28cc-83b9-4df1-8569-844ba6d1e041 tempest-ServersAdminTestJSON-608063042 tempest-ServersAdminTestJSON-608063042-project-member] [instance: 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4] Destroying instance {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1116.235018] env[67169]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-fbbaa31a-a912-4d8a-9178-7e29cad44385 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1116.243390] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a21476b4-1b3a-4a3c-bab3-7ffb273b2dc8 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1116.254649] env[67169]: DEBUG nova.compute.manager [None req-bc4f14a6-29e8-4fae-84bb-02f11896dc72 tempest-AttachInterfacesV270Test-2005290711 tempest-AttachInterfacesV270Test-2005290711-project-member] [instance: bab5d630-fec0-44e5-8088-12c8855aad66] Starting instance... {{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1116.276229] env[67169]: WARNING nova.virt.vmwareapi.vmops [None req-107e28cc-83b9-4df1-8569-844ba6d1e041 tempest-ServersAdminTestJSON-608063042 tempest-ServersAdminTestJSON-608063042-project-member] [instance: 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4 could not be found. 
[ 1116.276561] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-107e28cc-83b9-4df1-8569-844ba6d1e041 tempest-ServersAdminTestJSON-608063042 tempest-ServersAdminTestJSON-608063042-project-member] [instance: 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4] Instance destroyed {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 1116.276654] env[67169]: INFO nova.compute.manager [None req-107e28cc-83b9-4df1-8569-844ba6d1e041 tempest-ServersAdminTestJSON-608063042 tempest-ServersAdminTestJSON-608063042-project-member] [instance: 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4] Took 0.04 seconds to destroy the instance on the hypervisor.
[ 1116.276902] env[67169]: DEBUG oslo.service.loopingcall [None req-107e28cc-83b9-4df1-8569-844ba6d1e041 tempest-ServersAdminTestJSON-608063042 tempest-ServersAdminTestJSON-608063042-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=67169) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 1116.277147] env[67169]: DEBUG nova.compute.manager [-] [instance: 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4] Deallocating network for instance {{(pid=67169) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}}
[ 1116.277247] env[67169]: DEBUG nova.network.neutron [-] [instance: 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4] deallocate_for_instance() {{(pid=67169) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 1116.308237] env[67169]: DEBUG oslo_concurrency.lockutils [None req-bc4f14a6-29e8-4fae-84bb-02f11896dc72 tempest-AttachInterfacesV270Test-2005290711 tempest-AttachInterfacesV270Test-2005290711-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1116.308471] env[67169]: DEBUG oslo_concurrency.lockutils [None req-bc4f14a6-29e8-4fae-84bb-02f11896dc72 tempest-AttachInterfacesV270Test-2005290711 tempest-AttachInterfacesV270Test-2005290711-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1116.310031] env[67169]: INFO nova.compute.claims [None req-bc4f14a6-29e8-4fae-84bb-02f11896dc72 tempest-AttachInterfacesV270Test-2005290711 tempest-AttachInterfacesV270Test-2005290711-project-member] [instance: bab5d630-fec0-44e5-8088-12c8855aad66] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 1116.420814] env[67169]: DEBUG nova.network.neutron [-] [instance: 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4] Updating instance_info_cache with network_info: [] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1116.429535] env[67169]: INFO nova.compute.manager [-] [instance: 84577fe1-6a7f-4f1e-a262-0ea7c0576cc4] Took 0.15 seconds to deallocate network for instance.
[ 1116.523728] env[67169]: DEBUG oslo_concurrency.lockutils [None req-107e28cc-83b9-4df1-8569-844ba6d1e041 tempest-ServersAdminTestJSON-608063042 tempest-ServersAdminTestJSON-608063042-project-member] Lock "84577fe1-6a7f-4f1e-a262-0ea7c0576cc4" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.295s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1116.691284] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b43bc5ad-32d1-43b2-98f7-1e744613c331 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1116.698367] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-301f8ea9-63ef-4f28-9d4b-47afb8672c35 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1116.728889] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-14c1c722-3f90-4065-b630-65ba2b4f96e0 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1116.736561] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8ee96a9e-3dd9-412d-b15f-072eda366747 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1116.749248] env[67169]: DEBUG nova.compute.provider_tree [None req-bc4f14a6-29e8-4fae-84bb-02f11896dc72 tempest-AttachInterfacesV270Test-2005290711 tempest-AttachInterfacesV270Test-2005290711-project-member] Inventory has not changed in ProviderTree for provider: 6570906a-ac37-4859-b1f2-4bbacc48d3f3 {{(pid=67169) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1116.758093] env[67169]: DEBUG nova.scheduler.client.report [None req-bc4f14a6-29e8-4fae-84bb-02f11896dc72 tempest-AttachInterfacesV270Test-2005290711 tempest-AttachInterfacesV270Test-2005290711-project-member] Inventory has not changed for provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 171, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67169) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1116.773816] env[67169]: DEBUG oslo_concurrency.lockutils [None req-bc4f14a6-29e8-4fae-84bb-02f11896dc72 tempest-AttachInterfacesV270Test-2005290711 tempest-AttachInterfacesV270Test-2005290711-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.465s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1116.774313] env[67169]: DEBUG nova.compute.manager [None req-bc4f14a6-29e8-4fae-84bb-02f11896dc72 tempest-AttachInterfacesV270Test-2005290711 tempest-AttachInterfacesV270Test-2005290711-project-member] [instance: bab5d630-fec0-44e5-8088-12c8855aad66] Start building networks asynchronously for instance. {{(pid=67169) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}}
[ 1116.807884] env[67169]: DEBUG nova.compute.utils [None req-bc4f14a6-29e8-4fae-84bb-02f11896dc72 tempest-AttachInterfacesV270Test-2005290711 tempest-AttachInterfacesV270Test-2005290711-project-member] Using /dev/sd instead of None {{(pid=67169) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 1116.810388] env[67169]: DEBUG nova.compute.manager [None req-bc4f14a6-29e8-4fae-84bb-02f11896dc72 tempest-AttachInterfacesV270Test-2005290711 tempest-AttachInterfacesV270Test-2005290711-project-member] [instance: bab5d630-fec0-44e5-8088-12c8855aad66] Allocating IP information in the background. {{(pid=67169) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}}
[ 1116.810568] env[67169]: DEBUG nova.network.neutron [None req-bc4f14a6-29e8-4fae-84bb-02f11896dc72 tempest-AttachInterfacesV270Test-2005290711 tempest-AttachInterfacesV270Test-2005290711-project-member] [instance: bab5d630-fec0-44e5-8088-12c8855aad66] allocate_for_instance() {{(pid=67169) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 1116.819480] env[67169]: DEBUG nova.compute.manager [None req-bc4f14a6-29e8-4fae-84bb-02f11896dc72 tempest-AttachInterfacesV270Test-2005290711 tempest-AttachInterfacesV270Test-2005290711-project-member] [instance: bab5d630-fec0-44e5-8088-12c8855aad66] Start building block device mappings for instance. {{(pid=67169) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}}
[ 1116.884689] env[67169]: DEBUG nova.compute.manager [None req-bc4f14a6-29e8-4fae-84bb-02f11896dc72 tempest-AttachInterfacesV270Test-2005290711 tempest-AttachInterfacesV270Test-2005290711-project-member] [instance: bab5d630-fec0-44e5-8088-12c8855aad66] Start spawning the instance on the hypervisor. {{(pid=67169) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}}
[ 1116.900844] env[67169]: DEBUG nova.policy [None req-bc4f14a6-29e8-4fae-84bb-02f11896dc72 tempest-AttachInterfacesV270Test-2005290711 tempest-AttachInterfacesV270Test-2005290711-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e9c2197f3999457fb888187ba3b5b85b', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'a38873ee6c9442fb8d93a3abbe4d170f', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67169) authorize /opt/stack/nova/nova/policy.py:203}}
[ 1116.911088] env[67169]: DEBUG nova.virt.hardware [None req-bc4f14a6-29e8-4fae-84bb-02f11896dc72 tempest-AttachInterfacesV270Test-2005290711 tempest-AttachInterfacesV270Test-2005290711-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-03-06T19:55:10Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-03-06T19:54:55Z,direct_url=,disk_format='vmdk',id=285931c9-8b83-4997-8c4d-6a79005e36ba,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c31f6504bb73492890b262ff43fdf9bc',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-03-06T19:54:55Z,virtual_size=,visibility=), allow threads: False {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 1116.911339] env[67169]: DEBUG nova.virt.hardware [None req-bc4f14a6-29e8-4fae-84bb-02f11896dc72 tempest-AttachInterfacesV270Test-2005290711 tempest-AttachInterfacesV270Test-2005290711-project-member] Flavor limits 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 1116.911497] env[67169]: DEBUG nova.virt.hardware [None req-bc4f14a6-29e8-4fae-84bb-02f11896dc72 tempest-AttachInterfacesV270Test-2005290711 tempest-AttachInterfacesV270Test-2005290711-project-member] Image limits 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 1116.911679] env[67169]: DEBUG nova.virt.hardware [None req-bc4f14a6-29e8-4fae-84bb-02f11896dc72 tempest-AttachInterfacesV270Test-2005290711 tempest-AttachInterfacesV270Test-2005290711-project-member] Flavor pref 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 1116.911852] env[67169]: DEBUG nova.virt.hardware [None req-bc4f14a6-29e8-4fae-84bb-02f11896dc72 tempest-AttachInterfacesV270Test-2005290711 tempest-AttachInterfacesV270Test-2005290711-project-member] Image pref 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 1116.912017] env[67169]: DEBUG nova.virt.hardware [None req-bc4f14a6-29e8-4fae-84bb-02f11896dc72 tempest-AttachInterfacesV270Test-2005290711 tempest-AttachInterfacesV270Test-2005290711-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 1116.912229] env[67169]: DEBUG nova.virt.hardware [None req-bc4f14a6-29e8-4fae-84bb-02f11896dc72 tempest-AttachInterfacesV270Test-2005290711 tempest-AttachInterfacesV270Test-2005290711-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 1116.912385] env[67169]: DEBUG nova.virt.hardware [None req-bc4f14a6-29e8-4fae-84bb-02f11896dc72 tempest-AttachInterfacesV270Test-2005290711 tempest-AttachInterfacesV270Test-2005290711-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67169) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 1116.912544] env[67169]: DEBUG nova.virt.hardware [None req-bc4f14a6-29e8-4fae-84bb-02f11896dc72 tempest-AttachInterfacesV270Test-2005290711 tempest-AttachInterfacesV270Test-2005290711-project-member] Got 1 possible topologies {{(pid=67169) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 1116.912701] env[67169]: DEBUG nova.virt.hardware [None req-bc4f14a6-29e8-4fae-84bb-02f11896dc72 tempest-AttachInterfacesV270Test-2005290711 tempest-AttachInterfacesV270Test-2005290711-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 1116.912870] env[67169]: DEBUG nova.virt.hardware [None req-bc4f14a6-29e8-4fae-84bb-02f11896dc72 tempest-AttachInterfacesV270Test-2005290711 tempest-AttachInterfacesV270Test-2005290711-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 1116.913724] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4ba348de-ae45-4141-82a2-9e0e863abca5 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1116.921696] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-965c848a-36e5-4f73-91cf-d94f4f53f3d2 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1117.223224] env[67169]: DEBUG nova.network.neutron [None req-bc4f14a6-29e8-4fae-84bb-02f11896dc72 tempest-AttachInterfacesV270Test-2005290711 tempest-AttachInterfacesV270Test-2005290711-project-member] [instance: bab5d630-fec0-44e5-8088-12c8855aad66] Successfully created port: d83b3959-4c29-4790-b4ae-e983b6aa29f0 {{(pid=67169) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 1118.029295] env[67169]: DEBUG nova.network.neutron [None req-bc4f14a6-29e8-4fae-84bb-02f11896dc72 tempest-AttachInterfacesV270Test-2005290711 tempest-AttachInterfacesV270Test-2005290711-project-member] [instance: bab5d630-fec0-44e5-8088-12c8855aad66] Successfully updated port: d83b3959-4c29-4790-b4ae-e983b6aa29f0 {{(pid=67169) _update_port /opt/stack/nova/nova/network/neutron.py:586}}
[ 1118.039988] env[67169]: DEBUG oslo_concurrency.lockutils [None req-bc4f14a6-29e8-4fae-84bb-02f11896dc72 tempest-AttachInterfacesV270Test-2005290711 tempest-AttachInterfacesV270Test-2005290711-project-member] Acquiring lock "refresh_cache-bab5d630-fec0-44e5-8088-12c8855aad66" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1118.040153] env[67169]: DEBUG oslo_concurrency.lockutils [None req-bc4f14a6-29e8-4fae-84bb-02f11896dc72 tempest-AttachInterfacesV270Test-2005290711 tempest-AttachInterfacesV270Test-2005290711-project-member] Acquired lock "refresh_cache-bab5d630-fec0-44e5-8088-12c8855aad66" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1118.040324] env[67169]: DEBUG nova.network.neutron [None req-bc4f14a6-29e8-4fae-84bb-02f11896dc72 tempest-AttachInterfacesV270Test-2005290711 tempest-AttachInterfacesV270Test-2005290711-project-member] [instance: bab5d630-fec0-44e5-8088-12c8855aad66] Building network info cache for instance {{(pid=67169) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 1118.109173] env[67169]: DEBUG nova.network.neutron [None req-bc4f14a6-29e8-4fae-84bb-02f11896dc72 tempest-AttachInterfacesV270Test-2005290711 tempest-AttachInterfacesV270Test-2005290711-project-member] [instance: bab5d630-fec0-44e5-8088-12c8855aad66] Instance cache missing network info. {{(pid=67169) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 1118.314998] env[67169]: DEBUG nova.network.neutron [None req-bc4f14a6-29e8-4fae-84bb-02f11896dc72 tempest-AttachInterfacesV270Test-2005290711 tempest-AttachInterfacesV270Test-2005290711-project-member] [instance: bab5d630-fec0-44e5-8088-12c8855aad66] Updating instance_info_cache with network_info: [{"id": "d83b3959-4c29-4790-b4ae-e983b6aa29f0", "address": "fa:16:3e:dd:1d:78", "network": {"id": "db790e9a-fab5-4121-933c-abb8144efde0", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1685163003-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "a38873ee6c9442fb8d93a3abbe4d170f", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "62237242-7ce2-4664-a1c5-6783b516b507", "external-id": "nsx-vlan-transportzone-295", "segmentation_id": 295, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd83b3959-4c", "ovs_interfaceid": "d83b3959-4c29-4790-b4ae-e983b6aa29f0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1118.327592] env[67169]: DEBUG oslo_concurrency.lockutils [None req-bc4f14a6-29e8-4fae-84bb-02f11896dc72 tempest-AttachInterfacesV270Test-2005290711 tempest-AttachInterfacesV270Test-2005290711-project-member] Releasing lock "refresh_cache-bab5d630-fec0-44e5-8088-12c8855aad66" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1118.327895] env[67169]: DEBUG nova.compute.manager [None req-bc4f14a6-29e8-4fae-84bb-02f11896dc72 tempest-AttachInterfacesV270Test-2005290711 tempest-AttachInterfacesV270Test-2005290711-project-member] [instance: bab5d630-fec0-44e5-8088-12c8855aad66] Instance network_info: |[{"id": "d83b3959-4c29-4790-b4ae-e983b6aa29f0", "address": "fa:16:3e:dd:1d:78", "network": {"id": "db790e9a-fab5-4121-933c-abb8144efde0", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1685163003-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "a38873ee6c9442fb8d93a3abbe4d170f", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "62237242-7ce2-4664-a1c5-6783b516b507", "external-id": "nsx-vlan-transportzone-295", "segmentation_id": 295, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd83b3959-4c", "ovs_interfaceid": "d83b3959-4c29-4790-b4ae-e983b6aa29f0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67169) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}}
[ 1118.328295] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-bc4f14a6-29e8-4fae-84bb-02f11896dc72 tempest-AttachInterfacesV270Test-2005290711 tempest-AttachInterfacesV270Test-2005290711-project-member] [instance: bab5d630-fec0-44e5-8088-12c8855aad66] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:dd:1d:78', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '62237242-7ce2-4664-a1c5-6783b516b507', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'd83b3959-4c29-4790-b4ae-e983b6aa29f0', 'vif_model': 'vmxnet3'}] {{(pid=67169) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}}
[ 1118.335834] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [None req-bc4f14a6-29e8-4fae-84bb-02f11896dc72 tempest-AttachInterfacesV270Test-2005290711 tempest-AttachInterfacesV270Test-2005290711-project-member] Creating folder: Project (a38873ee6c9442fb8d93a3abbe4d170f). Parent ref: group-v566843. {{(pid=67169) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 1118.336346] env[67169]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-3ff9a75f-9b71-4e85-acb1-b628173d2ffc {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1118.347365] env[67169]: INFO nova.virt.vmwareapi.vm_util [None req-bc4f14a6-29e8-4fae-84bb-02f11896dc72 tempest-AttachInterfacesV270Test-2005290711 tempest-AttachInterfacesV270Test-2005290711-project-member] Created folder: Project (a38873ee6c9442fb8d93a3abbe4d170f) in parent group-v566843.
[ 1118.347511] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [None req-bc4f14a6-29e8-4fae-84bb-02f11896dc72 tempest-AttachInterfacesV270Test-2005290711 tempest-AttachInterfacesV270Test-2005290711-project-member] Creating folder: Instances. Parent ref: group-v566913. {{(pid=67169) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 1118.347723] env[67169]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-68161a56-0287-4d5c-9528-184a5852e84b {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1118.355182] env[67169]: INFO nova.virt.vmwareapi.vm_util [None req-bc4f14a6-29e8-4fae-84bb-02f11896dc72 tempest-AttachInterfacesV270Test-2005290711 tempest-AttachInterfacesV270Test-2005290711-project-member] Created folder: Instances in parent group-v566913.
[ 1118.355400] env[67169]: DEBUG oslo.service.loopingcall [None req-bc4f14a6-29e8-4fae-84bb-02f11896dc72 tempest-AttachInterfacesV270Test-2005290711 tempest-AttachInterfacesV270Test-2005290711-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67169) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 1118.355571] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: bab5d630-fec0-44e5-8088-12c8855aad66] Creating VM on the ESX host {{(pid=67169) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}}
[ 1118.355755] env[67169]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-067c457b-5f30-4b64-96e4-72bb85535367 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1118.374310] env[67169]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){
[ 1118.374310] env[67169]: value = "task-2819166"
[ 1118.374310] env[67169]: _type = "Task"
[ 1118.374310] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1118.385803] env[67169]: DEBUG oslo_vmware.api [-] Task: {'id': task-2819166, 'name': CreateVM_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1118.479803] env[67169]: DEBUG nova.compute.manager [req-91f0d592-f201-427b-88b4-681a2954b2d3 req-81cc9d61-4ceb-4af8-b6ae-78de60f844ae service nova] [instance: bab5d630-fec0-44e5-8088-12c8855aad66] Received event network-vif-plugged-d83b3959-4c29-4790-b4ae-e983b6aa29f0 {{(pid=67169) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}}
[ 1118.479975] env[67169]: DEBUG oslo_concurrency.lockutils [req-91f0d592-f201-427b-88b4-681a2954b2d3 req-81cc9d61-4ceb-4af8-b6ae-78de60f844ae service nova] Acquiring lock "bab5d630-fec0-44e5-8088-12c8855aad66-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1118.480214] env[67169]: DEBUG oslo_concurrency.lockutils [req-91f0d592-f201-427b-88b4-681a2954b2d3 req-81cc9d61-4ceb-4af8-b6ae-78de60f844ae service nova] Lock "bab5d630-fec0-44e5-8088-12c8855aad66-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1118.480373] env[67169]: DEBUG oslo_concurrency.lockutils [req-91f0d592-f201-427b-88b4-681a2954b2d3 req-81cc9d61-4ceb-4af8-b6ae-78de60f844ae service nova] Lock "bab5d630-fec0-44e5-8088-12c8855aad66-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1118.480538] env[67169]: DEBUG nova.compute.manager [req-91f0d592-f201-427b-88b4-681a2954b2d3 req-81cc9d61-4ceb-4af8-b6ae-78de60f844ae service nova] [instance: bab5d630-fec0-44e5-8088-12c8855aad66] No waiting events found dispatching network-vif-plugged-d83b3959-4c29-4790-b4ae-e983b6aa29f0 {{(pid=67169) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}}
[ 1118.480699] env[67169]: WARNING nova.compute.manager [req-91f0d592-f201-427b-88b4-681a2954b2d3 req-81cc9d61-4ceb-4af8-b6ae-78de60f844ae service nova] [instance: bab5d630-fec0-44e5-8088-12c8855aad66] Received unexpected event network-vif-plugged-d83b3959-4c29-4790-b4ae-e983b6aa29f0 for instance with vm_state building and task_state spawning.
[ 1118.480854] env[67169]: DEBUG nova.compute.manager [req-91f0d592-f201-427b-88b4-681a2954b2d3 req-81cc9d61-4ceb-4af8-b6ae-78de60f844ae service nova] [instance: bab5d630-fec0-44e5-8088-12c8855aad66] Received event network-changed-d83b3959-4c29-4790-b4ae-e983b6aa29f0 {{(pid=67169) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}}
[ 1118.481055] env[67169]: DEBUG nova.compute.manager [req-91f0d592-f201-427b-88b4-681a2954b2d3 req-81cc9d61-4ceb-4af8-b6ae-78de60f844ae service nova] [instance: bab5d630-fec0-44e5-8088-12c8855aad66] Refreshing instance network info cache due to event network-changed-d83b3959-4c29-4790-b4ae-e983b6aa29f0. {{(pid=67169) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}}
[ 1118.481274] env[67169]: DEBUG oslo_concurrency.lockutils [req-91f0d592-f201-427b-88b4-681a2954b2d3 req-81cc9d61-4ceb-4af8-b6ae-78de60f844ae service nova] Acquiring lock "refresh_cache-bab5d630-fec0-44e5-8088-12c8855aad66" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1118.481414] env[67169]: DEBUG oslo_concurrency.lockutils [req-91f0d592-f201-427b-88b4-681a2954b2d3 req-81cc9d61-4ceb-4af8-b6ae-78de60f844ae service nova] Acquired lock "refresh_cache-bab5d630-fec0-44e5-8088-12c8855aad66" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1118.481573] env[67169]: DEBUG nova.network.neutron [req-91f0d592-f201-427b-88b4-681a2954b2d3 req-81cc9d61-4ceb-4af8-b6ae-78de60f844ae service nova] [instance: bab5d630-fec0-44e5-8088-12c8855aad66] Refreshing network info cache for port d83b3959-4c29-4790-b4ae-e983b6aa29f0 {{(pid=67169) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}}
[ 1118.884191] env[67169]: DEBUG oslo_vmware.api [-] Task: {'id': task-2819166, 'name': CreateVM_Task, 'duration_secs': 0.270816} completed successfully. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 1118.884377] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: bab5d630-fec0-44e5-8088-12c8855aad66] Created VM on the ESX host {{(pid=67169) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}}
[ 1118.885024] env[67169]: DEBUG oslo_concurrency.lockutils [None req-bc4f14a6-29e8-4fae-84bb-02f11896dc72 tempest-AttachInterfacesV270Test-2005290711 tempest-AttachInterfacesV270Test-2005290711-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1118.885192] env[67169]: DEBUG oslo_concurrency.lockutils [None req-bc4f14a6-29e8-4fae-84bb-02f11896dc72 tempest-AttachInterfacesV270Test-2005290711 tempest-AttachInterfacesV270Test-2005290711-project-member] Acquired lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1118.885509] env[67169]: DEBUG oslo_concurrency.lockutils [None req-bc4f14a6-29e8-4fae-84bb-02f11896dc72 tempest-AttachInterfacesV270Test-2005290711 tempest-AttachInterfacesV270Test-2005290711-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}}
[ 1118.885768] env[67169]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-0c1891a4-139a-4524-9587-80e4099bef0b {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1118.890281] env[67169]: DEBUG oslo_vmware.api [None req-bc4f14a6-29e8-4fae-84bb-02f11896dc72 tempest-AttachInterfacesV270Test-2005290711 tempest-AttachInterfacesV270Test-2005290711-project-member] Waiting for the task: (returnval){
[ 1118.890281] env[67169]: value = "session[52518c09-e5f2-035d-4dc1-3e29ddf94015]5274988a-63fe-b945-a651-1d0d011dc04e"
[ 1118.890281] env[67169]: _type = "Task"
[ 1118.890281] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1118.899074] env[67169]: DEBUG oslo_vmware.api [None req-bc4f14a6-29e8-4fae-84bb-02f11896dc72 tempest-AttachInterfacesV270Test-2005290711 tempest-AttachInterfacesV270Test-2005290711-project-member] Task: {'id': session[52518c09-e5f2-035d-4dc1-3e29ddf94015]5274988a-63fe-b945-a651-1d0d011dc04e, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1119.034944] env[67169]: DEBUG nova.network.neutron [req-91f0d592-f201-427b-88b4-681a2954b2d3 req-81cc9d61-4ceb-4af8-b6ae-78de60f844ae service nova] [instance: bab5d630-fec0-44e5-8088-12c8855aad66] Updated VIF entry in instance network info cache for port d83b3959-4c29-4790-b4ae-e983b6aa29f0. {{(pid=67169) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}}
[ 1119.035350] env[67169]: DEBUG nova.network.neutron [req-91f0d592-f201-427b-88b4-681a2954b2d3 req-81cc9d61-4ceb-4af8-b6ae-78de60f844ae service nova] [instance: bab5d630-fec0-44e5-8088-12c8855aad66] Updating instance_info_cache with network_info: [{"id": "d83b3959-4c29-4790-b4ae-e983b6aa29f0", "address": "fa:16:3e:dd:1d:78", "network": {"id": "db790e9a-fab5-4121-933c-abb8144efde0", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1685163003-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "a38873ee6c9442fb8d93a3abbe4d170f", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "62237242-7ce2-4664-a1c5-6783b516b507", "external-id": "nsx-vlan-transportzone-295", "segmentation_id": 295, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd83b3959-4c", "ovs_interfaceid": "d83b3959-4c29-4790-b4ae-e983b6aa29f0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1119.047849] env[67169]: DEBUG oslo_concurrency.lockutils [req-91f0d592-f201-427b-88b4-681a2954b2d3 req-81cc9d61-4ceb-4af8-b6ae-78de60f844ae service nova] Releasing lock "refresh_cache-bab5d630-fec0-44e5-8088-12c8855aad66" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1119.401051] env[67169]: DEBUG oslo_concurrency.lockutils [None req-bc4f14a6-29e8-4fae-84bb-02f11896dc72 tempest-AttachInterfacesV270Test-2005290711 tempest-AttachInterfacesV270Test-2005290711-project-member] Releasing lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1119.401322] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-bc4f14a6-29e8-4fae-84bb-02f11896dc72 tempest-AttachInterfacesV270Test-2005290711 tempest-AttachInterfacesV270Test-2005290711-project-member] [instance: bab5d630-fec0-44e5-8088-12c8855aad66] Processing image 285931c9-8b83-4997-8c4d-6a79005e36ba {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}}
[ 1119.401539] env[67169]: DEBUG oslo_concurrency.lockutils [None req-bc4f14a6-29e8-4fae-84bb-02f11896dc72 tempest-AttachInterfacesV270Test-2005290711 tempest-AttachInterfacesV270Test-2005290711-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1123.829651] env[67169]: DEBUG oslo_concurrency.lockutils [None req-0ae68f49-9f1f-49b1-83c6-d14aaf3be879 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Acquiring lock "7bf839c0-3ec8-4329-823d-de1fae4833cb" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1124.659273] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1126.243435] env[67169]: DEBUG oslo_concurrency.lockutils [None req-4835c22e-883d-43d2-82df-7b3fc294d6b4 tempest-ImagesOneServerNegativeTestJSON-20346663 tempest-ImagesOneServerNegativeTestJSON-20346663-project-member] Acquiring lock "883a792f-ae72-4475-8592-3076c2c2c2ae" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1126.243781] env[67169]: DEBUG oslo_concurrency.lockutils [None req-4835c22e-883d-43d2-82df-7b3fc294d6b4 tempest-ImagesOneServerNegativeTestJSON-20346663 tempest-ImagesOneServerNegativeTestJSON-20346663-project-member] Lock "883a792f-ae72-4475-8592-3076c2c2c2ae" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1129.658696] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1129.659118] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] CONF.reclaim_instance_interval <= 0, skipping...
{{(pid=67169) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1130.654337] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1130.678247] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1130.678247] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Starting heal instance info cache {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1130.678247] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Rebuilding the list of instances to heal {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1130.698025] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 28552f70-695d-40cc-8dfa-bf40d6113220] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1130.698025] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] Skipping network cache update for instance because it is Building. 
{{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1130.698025] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 43b73a7c-eda8-4239-885f-d4fb8fa6f28a] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1130.698025] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1130.698025] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 47ffcce9-3afc-41be-b38e-dacfeb535a2c] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1130.698025] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1130.698301] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: ceec0dd3-097b-4ab4-8e16-420d40bbe3d5] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1130.698301] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: c86c3850-39bb-4a08-8dbf-f69bd8ca21c9] Skipping network cache update for instance because it is Building. 
{{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1130.698373] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 7bf839c0-3ec8-4329-823d-de1fae4833cb] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1130.698483] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: bab5d630-fec0-44e5-8088-12c8855aad66] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1130.698598] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Didn't find any instances for network info cache update. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1130.699082] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1130.709600] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1130.709806] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1130.710014] env[67169]: DEBUG 
oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1130.710174] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67169) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1130.711308] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2fbd3245-d95d-4c1e-b747-f7ee34db2ff5 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1130.720360] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a15b4206-439e-4dad-aba3-3f9513f87616 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1130.734656] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-75bf8e6c-be2a-4b83-a3e0-1064cf4cbfa2 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1130.740916] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aaa7815e-0d82-4a4c-8b4c-0154903a5556 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1130.771297] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180997MB free_disk=171GB free_vcpus=48 
pci_devices=None {{(pid=67169) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1130.771407] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1130.771587] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1130.846237] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 28552f70-695d-40cc-8dfa-bf40d6113220 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1130.846408] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 7a42aeb9-0518-448d-a3a6-8e68d6497922 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1130.846536] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 43b73a7c-eda8-4239-885f-d4fb8fa6f28a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1130.846657] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1130.846775] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 47ffcce9-3afc-41be-b38e-dacfeb535a2c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1130.846903] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 1f0f1960-0c77-4e72-86ee-807819e75d2a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1130.847031] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance ceec0dd3-097b-4ab4-8e16-420d40bbe3d5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1130.847160] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance c86c3850-39bb-4a08-8dbf-f69bd8ca21c9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1130.847277] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 7bf839c0-3ec8-4329-823d-de1fae4833cb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1130.847389] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance bab5d630-fec0-44e5-8088-12c8855aad66 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1130.858445] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance a86fa702-2040-4e22-9eaa-5d64bc16f036 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1130.868632] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance ca657a42-3745-46e1-8fc9-61de31f661d8 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1130.877848] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 3f90c9a4-650d-4280-b155-1315d2f0f281 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1130.887396] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance d964ad35-8d3f-45f3-b799-aebddf295012 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1130.896115] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 0b78afae-71e9-4ba9-903a-03c8a98cd91e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1130.905246] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 54b1337f-4ac8-4718-b273-2f078782b491 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1130.914354] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance bf6857fb-2088-4e2c-b1a4-4c4b631f0153 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1130.922940] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance cdca51b4-b059-48b6-ae81-ced1a447f10d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1130.931404] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 1a04a0fd-11d5-4fce-ba32-d90e39a13ff9 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1130.940271] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1130.949812] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 3966e1f7-2107-4ddb-8077-ab37ef1a9b92 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1130.959355] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance c57df23b-3348-41fa-a976-421f98cab569 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1130.968651] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance c769a8f3-6f9f-4e5b-bfec-345c97da5d83 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1130.977276] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 7817b417-599c-4619-8bd3-28d2e8236b9f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1130.986287] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 4f57a0db-fe0b-4983-9e07-62485a53f918 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1130.995136] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 3930edcb-c8ce-44f4-84ae-a2b59f99bc82 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1131.003726] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 883a792f-ae72-4475-8592-3076c2c2c2ae has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1131.003949] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67169) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1131.004114] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67169) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1131.303803] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f1a21812-ac1e-4119-9a91-544b6521d0e3 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1131.311239] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4341fb60-4f61-4401-beb3-4398660d181c {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1131.341089] env[67169]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-97fb6412-b969-4914-8506-dc2fd16e1cd4 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1131.348196] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7dd04a9b-27df-4296-9582-6d0f00c895aa {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1131.362033] env[67169]: DEBUG nova.compute.provider_tree [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Inventory has not changed in ProviderTree for provider: 6570906a-ac37-4859-b1f2-4bbacc48d3f3 {{(pid=67169) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1131.369859] env[67169]: DEBUG nova.scheduler.client.report [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Inventory has not changed for provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 171, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67169) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1131.383282] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67169) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1131.383465] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.612s 
{{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1132.343886] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1132.344215] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1132.659075] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1133.658473] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1134.658601] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1135.659622] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1135.659978] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None 
None] Cleaning up deleted instances {{(pid=67169) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11198}} [ 1135.670807] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] There are 0 instances to clean {{(pid=67169) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11207}} [ 1135.671096] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1135.671258] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Cleaning up deleted instances with incomplete migration {{(pid=67169) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11236}} [ 1135.679402] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1165.478408] env[67169]: WARNING oslo_vmware.rw_handles [None req-f4bfbd84-193f-4b41-bf8e-c81aff254f89 tempest-VolumesAssistedSnapshotsTest-1676378715 tempest-VolumesAssistedSnapshotsTest-1676378715-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1165.478408] env[67169]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1165.478408] env[67169]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1165.478408] env[67169]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1165.478408] env[67169]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", 
line 1375, in getresponse [ 1165.478408] env[67169]: ERROR oslo_vmware.rw_handles response.begin() [ 1165.478408] env[67169]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1165.478408] env[67169]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1165.478408] env[67169]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1165.478408] env[67169]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1165.478408] env[67169]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1165.478408] env[67169]: ERROR oslo_vmware.rw_handles [ 1165.479252] env[67169]: DEBUG nova.virt.vmwareapi.images [None req-f4bfbd84-193f-4b41-bf8e-c81aff254f89 tempest-VolumesAssistedSnapshotsTest-1676378715 tempest-VolumesAssistedSnapshotsTest-1676378715-project-member] [instance: 28552f70-695d-40cc-8dfa-bf40d6113220] Downloaded image file data 285931c9-8b83-4997-8c4d-6a79005e36ba to vmware_temp/7d7f33fc-cae7-4ca4-9381-1d5b59dcbf75/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk on the data store datastore2 {{(pid=67169) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1165.480967] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-f4bfbd84-193f-4b41-bf8e-c81aff254f89 tempest-VolumesAssistedSnapshotsTest-1676378715 tempest-VolumesAssistedSnapshotsTest-1676378715-project-member] [instance: 28552f70-695d-40cc-8dfa-bf40d6113220] Caching image {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1165.481232] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [None req-f4bfbd84-193f-4b41-bf8e-c81aff254f89 tempest-VolumesAssistedSnapshotsTest-1676378715 tempest-VolumesAssistedSnapshotsTest-1676378715-project-member] Copying Virtual Disk [datastore2] 
vmware_temp/7d7f33fc-cae7-4ca4-9381-1d5b59dcbf75/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk to [datastore2] vmware_temp/7d7f33fc-cae7-4ca4-9381-1d5b59dcbf75/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk {{(pid=67169) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1165.481942] env[67169]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-e5d5a23f-1ffd-4367-8f24-c9d8ee1fab95 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1165.490284] env[67169]: DEBUG oslo_vmware.api [None req-f4bfbd84-193f-4b41-bf8e-c81aff254f89 tempest-VolumesAssistedSnapshotsTest-1676378715 tempest-VolumesAssistedSnapshotsTest-1676378715-project-member] Waiting for the task: (returnval){ [ 1165.490284] env[67169]: value = "task-2819167" [ 1165.490284] env[67169]: _type = "Task" [ 1165.490284] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1165.499362] env[67169]: DEBUG oslo_vmware.api [None req-f4bfbd84-193f-4b41-bf8e-c81aff254f89 tempest-VolumesAssistedSnapshotsTest-1676378715 tempest-VolumesAssistedSnapshotsTest-1676378715-project-member] Task: {'id': task-2819167, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1166.000843] env[67169]: DEBUG oslo_vmware.exceptions [None req-f4bfbd84-193f-4b41-bf8e-c81aff254f89 tempest-VolumesAssistedSnapshotsTest-1676378715 tempest-VolumesAssistedSnapshotsTest-1676378715-project-member] Fault InvalidArgument not matched. 
{{(pid=67169) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1166.001181] env[67169]: DEBUG oslo_concurrency.lockutils [None req-f4bfbd84-193f-4b41-bf8e-c81aff254f89 tempest-VolumesAssistedSnapshotsTest-1676378715 tempest-VolumesAssistedSnapshotsTest-1676378715-project-member] Releasing lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1166.001817] env[67169]: ERROR nova.compute.manager [None req-f4bfbd84-193f-4b41-bf8e-c81aff254f89 tempest-VolumesAssistedSnapshotsTest-1676378715 tempest-VolumesAssistedSnapshotsTest-1676378715-project-member] [instance: 28552f70-695d-40cc-8dfa-bf40d6113220] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1166.001817] env[67169]: Faults: ['InvalidArgument'] [ 1166.001817] env[67169]: ERROR nova.compute.manager [instance: 28552f70-695d-40cc-8dfa-bf40d6113220] Traceback (most recent call last): [ 1166.001817] env[67169]: ERROR nova.compute.manager [instance: 28552f70-695d-40cc-8dfa-bf40d6113220] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1166.001817] env[67169]: ERROR nova.compute.manager [instance: 28552f70-695d-40cc-8dfa-bf40d6113220] yield resources [ 1166.001817] env[67169]: ERROR nova.compute.manager [instance: 28552f70-695d-40cc-8dfa-bf40d6113220] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1166.001817] env[67169]: ERROR nova.compute.manager [instance: 28552f70-695d-40cc-8dfa-bf40d6113220] self.driver.spawn(context, instance, image_meta, [ 1166.001817] env[67169]: ERROR nova.compute.manager [instance: 28552f70-695d-40cc-8dfa-bf40d6113220] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1166.001817] env[67169]: ERROR 
nova.compute.manager [instance: 28552f70-695d-40cc-8dfa-bf40d6113220] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1166.001817] env[67169]: ERROR nova.compute.manager [instance: 28552f70-695d-40cc-8dfa-bf40d6113220] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1166.001817] env[67169]: ERROR nova.compute.manager [instance: 28552f70-695d-40cc-8dfa-bf40d6113220] self._fetch_image_if_missing(context, vi) [ 1166.001817] env[67169]: ERROR nova.compute.manager [instance: 28552f70-695d-40cc-8dfa-bf40d6113220] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1166.001817] env[67169]: ERROR nova.compute.manager [instance: 28552f70-695d-40cc-8dfa-bf40d6113220] image_cache(vi, tmp_image_ds_loc) [ 1166.001817] env[67169]: ERROR nova.compute.manager [instance: 28552f70-695d-40cc-8dfa-bf40d6113220] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1166.001817] env[67169]: ERROR nova.compute.manager [instance: 28552f70-695d-40cc-8dfa-bf40d6113220] vm_util.copy_virtual_disk( [ 1166.001817] env[67169]: ERROR nova.compute.manager [instance: 28552f70-695d-40cc-8dfa-bf40d6113220] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1166.001817] env[67169]: ERROR nova.compute.manager [instance: 28552f70-695d-40cc-8dfa-bf40d6113220] session._wait_for_task(vmdk_copy_task) [ 1166.001817] env[67169]: ERROR nova.compute.manager [instance: 28552f70-695d-40cc-8dfa-bf40d6113220] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1166.001817] env[67169]: ERROR nova.compute.manager [instance: 28552f70-695d-40cc-8dfa-bf40d6113220] return self.wait_for_task(task_ref) [ 1166.001817] env[67169]: ERROR nova.compute.manager [instance: 28552f70-695d-40cc-8dfa-bf40d6113220] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1166.001817] env[67169]: 
ERROR nova.compute.manager [instance: 28552f70-695d-40cc-8dfa-bf40d6113220] return evt.wait() [ 1166.001817] env[67169]: ERROR nova.compute.manager [instance: 28552f70-695d-40cc-8dfa-bf40d6113220] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1166.001817] env[67169]: ERROR nova.compute.manager [instance: 28552f70-695d-40cc-8dfa-bf40d6113220] result = hub.switch() [ 1166.001817] env[67169]: ERROR nova.compute.manager [instance: 28552f70-695d-40cc-8dfa-bf40d6113220] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1166.001817] env[67169]: ERROR nova.compute.manager [instance: 28552f70-695d-40cc-8dfa-bf40d6113220] return self.greenlet.switch() [ 1166.001817] env[67169]: ERROR nova.compute.manager [instance: 28552f70-695d-40cc-8dfa-bf40d6113220] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1166.001817] env[67169]: ERROR nova.compute.manager [instance: 28552f70-695d-40cc-8dfa-bf40d6113220] self.f(*self.args, **self.kw) [ 1166.001817] env[67169]: ERROR nova.compute.manager [instance: 28552f70-695d-40cc-8dfa-bf40d6113220] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1166.001817] env[67169]: ERROR nova.compute.manager [instance: 28552f70-695d-40cc-8dfa-bf40d6113220] raise exceptions.translate_fault(task_info.error) [ 1166.001817] env[67169]: ERROR nova.compute.manager [instance: 28552f70-695d-40cc-8dfa-bf40d6113220] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1166.001817] env[67169]: ERROR nova.compute.manager [instance: 28552f70-695d-40cc-8dfa-bf40d6113220] Faults: ['InvalidArgument'] [ 1166.001817] env[67169]: ERROR nova.compute.manager [instance: 28552f70-695d-40cc-8dfa-bf40d6113220] [ 1166.003047] env[67169]: INFO nova.compute.manager [None req-f4bfbd84-193f-4b41-bf8e-c81aff254f89 
tempest-VolumesAssistedSnapshotsTest-1676378715 tempest-VolumesAssistedSnapshotsTest-1676378715-project-member] [instance: 28552f70-695d-40cc-8dfa-bf40d6113220] Terminating instance [ 1166.003752] env[67169]: DEBUG oslo_concurrency.lockutils [None req-97f5b735-959c-46f5-9133-f471ad4178d8 tempest-MigrationsAdminTest-2069277652 tempest-MigrationsAdminTest-2069277652-project-member] Acquired lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1166.004015] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-97f5b735-959c-46f5-9133-f471ad4178d8 tempest-MigrationsAdminTest-2069277652 tempest-MigrationsAdminTest-2069277652-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1166.004277] env[67169]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-0a8e561d-dc5a-4901-8fdd-dffd0e90141b {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1166.006693] env[67169]: DEBUG nova.compute.manager [None req-f4bfbd84-193f-4b41-bf8e-c81aff254f89 tempest-VolumesAssistedSnapshotsTest-1676378715 tempest-VolumesAssistedSnapshotsTest-1676378715-project-member] [instance: 28552f70-695d-40cc-8dfa-bf40d6113220] Start destroying the instance on the hypervisor. 
{{(pid=67169) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1166.006884] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-f4bfbd84-193f-4b41-bf8e-c81aff254f89 tempest-VolumesAssistedSnapshotsTest-1676378715 tempest-VolumesAssistedSnapshotsTest-1676378715-project-member] [instance: 28552f70-695d-40cc-8dfa-bf40d6113220] Destroying instance {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1166.007598] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9d7b8d7c-b7ae-44fa-a130-369207ef82af {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1166.014078] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-f4bfbd84-193f-4b41-bf8e-c81aff254f89 tempest-VolumesAssistedSnapshotsTest-1676378715 tempest-VolumesAssistedSnapshotsTest-1676378715-project-member] [instance: 28552f70-695d-40cc-8dfa-bf40d6113220] Unregistering the VM {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1166.014315] env[67169]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-2711da21-f682-485d-a68a-717cc01268cc {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1166.016631] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-97f5b735-959c-46f5-9133-f471ad4178d8 tempest-MigrationsAdminTest-2069277652 tempest-MigrationsAdminTest-2069277652-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1166.016809] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-97f5b735-959c-46f5-9133-f471ad4178d8 tempest-MigrationsAdminTest-2069277652 tempest-MigrationsAdminTest-2069277652-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=67169) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1166.017821] env[67169]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-95e73020-4a94-42b8-abb4-142795267822 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1166.022587] env[67169]: DEBUG oslo_vmware.api [None req-97f5b735-959c-46f5-9133-f471ad4178d8 tempest-MigrationsAdminTest-2069277652 tempest-MigrationsAdminTest-2069277652-project-member] Waiting for the task: (returnval){ [ 1166.022587] env[67169]: value = "session[52518c09-e5f2-035d-4dc1-3e29ddf94015]5274f12a-9fb9-f55c-c44b-16d18c9930fd" [ 1166.022587] env[67169]: _type = "Task" [ 1166.022587] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1166.031278] env[67169]: DEBUG oslo_vmware.api [None req-97f5b735-959c-46f5-9133-f471ad4178d8 tempest-MigrationsAdminTest-2069277652 tempest-MigrationsAdminTest-2069277652-project-member] Task: {'id': session[52518c09-e5f2-035d-4dc1-3e29ddf94015]5274f12a-9fb9-f55c-c44b-16d18c9930fd, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1166.088443] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-f4bfbd84-193f-4b41-bf8e-c81aff254f89 tempest-VolumesAssistedSnapshotsTest-1676378715 tempest-VolumesAssistedSnapshotsTest-1676378715-project-member] [instance: 28552f70-695d-40cc-8dfa-bf40d6113220] Unregistered the VM {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1166.088705] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-f4bfbd84-193f-4b41-bf8e-c81aff254f89 tempest-VolumesAssistedSnapshotsTest-1676378715 tempest-VolumesAssistedSnapshotsTest-1676378715-project-member] [instance: 28552f70-695d-40cc-8dfa-bf40d6113220] Deleting contents of the VM from datastore datastore2 {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1166.088896] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-f4bfbd84-193f-4b41-bf8e-c81aff254f89 tempest-VolumesAssistedSnapshotsTest-1676378715 tempest-VolumesAssistedSnapshotsTest-1676378715-project-member] Deleting the datastore file [datastore2] 28552f70-695d-40cc-8dfa-bf40d6113220 {{(pid=67169) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1166.089185] env[67169]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-057aa589-b65d-4828-a27d-588aed888f73 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1166.095517] env[67169]: DEBUG oslo_vmware.api [None req-f4bfbd84-193f-4b41-bf8e-c81aff254f89 tempest-VolumesAssistedSnapshotsTest-1676378715 tempest-VolumesAssistedSnapshotsTest-1676378715-project-member] Waiting for the task: (returnval){ [ 1166.095517] env[67169]: value = "task-2819169" [ 1166.095517] env[67169]: _type = "Task" [ 1166.095517] env[67169]: } to complete. 
{{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1166.103213] env[67169]: DEBUG oslo_vmware.api [None req-f4bfbd84-193f-4b41-bf8e-c81aff254f89 tempest-VolumesAssistedSnapshotsTest-1676378715 tempest-VolumesAssistedSnapshotsTest-1676378715-project-member] Task: {'id': task-2819169, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1166.533320] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-97f5b735-959c-46f5-9133-f471ad4178d8 tempest-MigrationsAdminTest-2069277652 tempest-MigrationsAdminTest-2069277652-project-member] [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] Preparing fetch location {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1166.533700] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-97f5b735-959c-46f5-9133-f471ad4178d8 tempest-MigrationsAdminTest-2069277652 tempest-MigrationsAdminTest-2069277652-project-member] Creating directory with path [datastore2] vmware_temp/91409eea-5e08-4f1f-8129-3e152c269647/285931c9-8b83-4997-8c4d-6a79005e36ba {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1166.533861] env[67169]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-58bacbfd-2b8c-43e3-b2c9-b20ec85d9863 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1166.544883] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-97f5b735-959c-46f5-9133-f471ad4178d8 tempest-MigrationsAdminTest-2069277652 tempest-MigrationsAdminTest-2069277652-project-member] Created directory with path [datastore2] vmware_temp/91409eea-5e08-4f1f-8129-3e152c269647/285931c9-8b83-4997-8c4d-6a79005e36ba {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1166.545078] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None 
req-97f5b735-959c-46f5-9133-f471ad4178d8 tempest-MigrationsAdminTest-2069277652 tempest-MigrationsAdminTest-2069277652-project-member] [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] Fetch image to [datastore2] vmware_temp/91409eea-5e08-4f1f-8129-3e152c269647/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1166.545251] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-97f5b735-959c-46f5-9133-f471ad4178d8 tempest-MigrationsAdminTest-2069277652 tempest-MigrationsAdminTest-2069277652-project-member] [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] Downloading image file data 285931c9-8b83-4997-8c4d-6a79005e36ba to [datastore2] vmware_temp/91409eea-5e08-4f1f-8129-3e152c269647/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk on the data store datastore2 {{(pid=67169) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1166.546041] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3528672f-74ae-4ad0-b5c9-e9e10306613d {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1166.552301] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0fd6e417-cdcf-496a-b772-d733dcd090ca {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1166.560946] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a924f8f7-c72c-457f-8003-e1a7a062109e {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1166.590801] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-14c08093-11bd-4aae-9ff9-ca5c085557d0 {{(pid=67169) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1166.595965] env[67169]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-ef37b25a-56fa-43f3-9c97-b27fe625853a {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1166.605709] env[67169]: DEBUG oslo_vmware.api [None req-f4bfbd84-193f-4b41-bf8e-c81aff254f89 tempest-VolumesAssistedSnapshotsTest-1676378715 tempest-VolumesAssistedSnapshotsTest-1676378715-project-member] Task: {'id': task-2819169, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.076792} completed successfully. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1166.605926] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-f4bfbd84-193f-4b41-bf8e-c81aff254f89 tempest-VolumesAssistedSnapshotsTest-1676378715 tempest-VolumesAssistedSnapshotsTest-1676378715-project-member] Deleted the datastore file {{(pid=67169) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1166.606115] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-f4bfbd84-193f-4b41-bf8e-c81aff254f89 tempest-VolumesAssistedSnapshotsTest-1676378715 tempest-VolumesAssistedSnapshotsTest-1676378715-project-member] [instance: 28552f70-695d-40cc-8dfa-bf40d6113220] Deleted contents of the VM from datastore datastore2 {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1166.606286] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-f4bfbd84-193f-4b41-bf8e-c81aff254f89 tempest-VolumesAssistedSnapshotsTest-1676378715 tempest-VolumesAssistedSnapshotsTest-1676378715-project-member] [instance: 28552f70-695d-40cc-8dfa-bf40d6113220] Instance destroyed {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1166.606668] env[67169]: INFO nova.compute.manager [None req-f4bfbd84-193f-4b41-bf8e-c81aff254f89 
tempest-VolumesAssistedSnapshotsTest-1676378715 tempest-VolumesAssistedSnapshotsTest-1676378715-project-member] [instance: 28552f70-695d-40cc-8dfa-bf40d6113220] Took 0.60 seconds to destroy the instance on the hypervisor. [ 1166.608618] env[67169]: DEBUG nova.compute.claims [None req-f4bfbd84-193f-4b41-bf8e-c81aff254f89 tempest-VolumesAssistedSnapshotsTest-1676378715 tempest-VolumesAssistedSnapshotsTest-1676378715-project-member] [instance: 28552f70-695d-40cc-8dfa-bf40d6113220] Aborting claim: {{(pid=67169) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1166.608783] env[67169]: DEBUG oslo_concurrency.lockutils [None req-f4bfbd84-193f-4b41-bf8e-c81aff254f89 tempest-VolumesAssistedSnapshotsTest-1676378715 tempest-VolumesAssistedSnapshotsTest-1676378715-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1166.609067] env[67169]: DEBUG oslo_concurrency.lockutils [None req-f4bfbd84-193f-4b41-bf8e-c81aff254f89 tempest-VolumesAssistedSnapshotsTest-1676378715 tempest-VolumesAssistedSnapshotsTest-1676378715-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1166.620045] env[67169]: DEBUG nova.virt.vmwareapi.images [None req-97f5b735-959c-46f5-9133-f471ad4178d8 tempest-MigrationsAdminTest-2069277652 tempest-MigrationsAdminTest-2069277652-project-member] [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] Downloading image file data 285931c9-8b83-4997-8c4d-6a79005e36ba to the data store datastore2 {{(pid=67169) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1166.673218] env[67169]: DEBUG oslo_vmware.rw_handles [None req-97f5b735-959c-46f5-9133-f471ad4178d8 
tempest-MigrationsAdminTest-2069277652 tempest-MigrationsAdminTest-2069277652-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/91409eea-5e08-4f1f-8129-3e152c269647/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=67169) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1166.732549] env[67169]: DEBUG oslo_vmware.rw_handles [None req-97f5b735-959c-46f5-9133-f471ad4178d8 tempest-MigrationsAdminTest-2069277652 tempest-MigrationsAdminTest-2069277652-project-member] Completed reading data from the image iterator. {{(pid=67169) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1166.732770] env[67169]: DEBUG oslo_vmware.rw_handles [None req-97f5b735-959c-46f5-9133-f471ad4178d8 tempest-MigrationsAdminTest-2069277652 tempest-MigrationsAdminTest-2069277652-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/91409eea-5e08-4f1f-8129-3e152c269647/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=67169) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1167.096856] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-adc5be1e-0ebb-4676-9b56-a21ae14ce69b {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1167.104703] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-76bf49a2-f642-4757-a926-588d16d5482e {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1167.134132] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-be5c29e4-fccd-49c0-9747-75cd20578e7d {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1167.141241] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-154e950f-583c-4fca-9076-12bf97301983 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1167.154765] env[67169]: DEBUG nova.compute.provider_tree [None req-f4bfbd84-193f-4b41-bf8e-c81aff254f89 tempest-VolumesAssistedSnapshotsTest-1676378715 tempest-VolumesAssistedSnapshotsTest-1676378715-project-member] Inventory has not changed in ProviderTree for provider: 6570906a-ac37-4859-b1f2-4bbacc48d3f3 {{(pid=67169) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1167.163574] env[67169]: DEBUG nova.scheduler.client.report [None req-f4bfbd84-193f-4b41-bf8e-c81aff254f89 tempest-VolumesAssistedSnapshotsTest-1676378715 tempest-VolumesAssistedSnapshotsTest-1676378715-project-member] Inventory has not changed for provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 
'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 171, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67169) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1167.180130] env[67169]: DEBUG oslo_concurrency.lockutils [None req-f4bfbd84-193f-4b41-bf8e-c81aff254f89 tempest-VolumesAssistedSnapshotsTest-1676378715 tempest-VolumesAssistedSnapshotsTest-1676378715-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.571s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1167.180684] env[67169]: ERROR nova.compute.manager [None req-f4bfbd84-193f-4b41-bf8e-c81aff254f89 tempest-VolumesAssistedSnapshotsTest-1676378715 tempest-VolumesAssistedSnapshotsTest-1676378715-project-member] [instance: 28552f70-695d-40cc-8dfa-bf40d6113220] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1167.180684] env[67169]: Faults: ['InvalidArgument'] [ 1167.180684] env[67169]: ERROR nova.compute.manager [instance: 28552f70-695d-40cc-8dfa-bf40d6113220] Traceback (most recent call last): [ 1167.180684] env[67169]: ERROR nova.compute.manager [instance: 28552f70-695d-40cc-8dfa-bf40d6113220] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1167.180684] env[67169]: ERROR nova.compute.manager [instance: 28552f70-695d-40cc-8dfa-bf40d6113220] self.driver.spawn(context, instance, image_meta, [ 1167.180684] env[67169]: ERROR nova.compute.manager [instance: 28552f70-695d-40cc-8dfa-bf40d6113220] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1167.180684] env[67169]: ERROR nova.compute.manager [instance: 
28552f70-695d-40cc-8dfa-bf40d6113220] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1167.180684] env[67169]: ERROR nova.compute.manager [instance: 28552f70-695d-40cc-8dfa-bf40d6113220] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1167.180684] env[67169]: ERROR nova.compute.manager [instance: 28552f70-695d-40cc-8dfa-bf40d6113220] self._fetch_image_if_missing(context, vi) [ 1167.180684] env[67169]: ERROR nova.compute.manager [instance: 28552f70-695d-40cc-8dfa-bf40d6113220] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1167.180684] env[67169]: ERROR nova.compute.manager [instance: 28552f70-695d-40cc-8dfa-bf40d6113220] image_cache(vi, tmp_image_ds_loc) [ 1167.180684] env[67169]: ERROR nova.compute.manager [instance: 28552f70-695d-40cc-8dfa-bf40d6113220] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1167.180684] env[67169]: ERROR nova.compute.manager [instance: 28552f70-695d-40cc-8dfa-bf40d6113220] vm_util.copy_virtual_disk( [ 1167.180684] env[67169]: ERROR nova.compute.manager [instance: 28552f70-695d-40cc-8dfa-bf40d6113220] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1167.180684] env[67169]: ERROR nova.compute.manager [instance: 28552f70-695d-40cc-8dfa-bf40d6113220] session._wait_for_task(vmdk_copy_task) [ 1167.180684] env[67169]: ERROR nova.compute.manager [instance: 28552f70-695d-40cc-8dfa-bf40d6113220] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1167.180684] env[67169]: ERROR nova.compute.manager [instance: 28552f70-695d-40cc-8dfa-bf40d6113220] return self.wait_for_task(task_ref) [ 1167.180684] env[67169]: ERROR nova.compute.manager [instance: 28552f70-695d-40cc-8dfa-bf40d6113220] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1167.180684] env[67169]: ERROR nova.compute.manager 
[instance: 28552f70-695d-40cc-8dfa-bf40d6113220] return evt.wait() [ 1167.180684] env[67169]: ERROR nova.compute.manager [instance: 28552f70-695d-40cc-8dfa-bf40d6113220] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1167.180684] env[67169]: ERROR nova.compute.manager [instance: 28552f70-695d-40cc-8dfa-bf40d6113220] result = hub.switch() [ 1167.180684] env[67169]: ERROR nova.compute.manager [instance: 28552f70-695d-40cc-8dfa-bf40d6113220] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1167.180684] env[67169]: ERROR nova.compute.manager [instance: 28552f70-695d-40cc-8dfa-bf40d6113220] return self.greenlet.switch() [ 1167.180684] env[67169]: ERROR nova.compute.manager [instance: 28552f70-695d-40cc-8dfa-bf40d6113220] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1167.180684] env[67169]: ERROR nova.compute.manager [instance: 28552f70-695d-40cc-8dfa-bf40d6113220] self.f(*self.args, **self.kw) [ 1167.180684] env[67169]: ERROR nova.compute.manager [instance: 28552f70-695d-40cc-8dfa-bf40d6113220] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1167.180684] env[67169]: ERROR nova.compute.manager [instance: 28552f70-695d-40cc-8dfa-bf40d6113220] raise exceptions.translate_fault(task_info.error) [ 1167.180684] env[67169]: ERROR nova.compute.manager [instance: 28552f70-695d-40cc-8dfa-bf40d6113220] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1167.180684] env[67169]: ERROR nova.compute.manager [instance: 28552f70-695d-40cc-8dfa-bf40d6113220] Faults: ['InvalidArgument'] [ 1167.180684] env[67169]: ERROR nova.compute.manager [instance: 28552f70-695d-40cc-8dfa-bf40d6113220] [ 1167.181502] env[67169]: DEBUG nova.compute.utils [None req-f4bfbd84-193f-4b41-bf8e-c81aff254f89 
tempest-VolumesAssistedSnapshotsTest-1676378715 tempest-VolumesAssistedSnapshotsTest-1676378715-project-member] [instance: 28552f70-695d-40cc-8dfa-bf40d6113220] VimFaultException {{(pid=67169) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1167.182835] env[67169]: DEBUG nova.compute.manager [None req-f4bfbd84-193f-4b41-bf8e-c81aff254f89 tempest-VolumesAssistedSnapshotsTest-1676378715 tempest-VolumesAssistedSnapshotsTest-1676378715-project-member] [instance: 28552f70-695d-40cc-8dfa-bf40d6113220] Build of instance 28552f70-695d-40cc-8dfa-bf40d6113220 was re-scheduled: A specified parameter was not correct: fileType [ 1167.182835] env[67169]: Faults: ['InvalidArgument'] {{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1167.183210] env[67169]: DEBUG nova.compute.manager [None req-f4bfbd84-193f-4b41-bf8e-c81aff254f89 tempest-VolumesAssistedSnapshotsTest-1676378715 tempest-VolumesAssistedSnapshotsTest-1676378715-project-member] [instance: 28552f70-695d-40cc-8dfa-bf40d6113220] Unplugging VIFs for instance {{(pid=67169) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1167.183390] env[67169]: DEBUG nova.compute.manager [None req-f4bfbd84-193f-4b41-bf8e-c81aff254f89 tempest-VolumesAssistedSnapshotsTest-1676378715 tempest-VolumesAssistedSnapshotsTest-1676378715-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67169) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1167.183586] env[67169]: DEBUG nova.compute.manager [None req-f4bfbd84-193f-4b41-bf8e-c81aff254f89 tempest-VolumesAssistedSnapshotsTest-1676378715 tempest-VolumesAssistedSnapshotsTest-1676378715-project-member] [instance: 28552f70-695d-40cc-8dfa-bf40d6113220] Deallocating network for instance {{(pid=67169) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1167.183760] env[67169]: DEBUG nova.network.neutron [None req-f4bfbd84-193f-4b41-bf8e-c81aff254f89 tempest-VolumesAssistedSnapshotsTest-1676378715 tempest-VolumesAssistedSnapshotsTest-1676378715-project-member] [instance: 28552f70-695d-40cc-8dfa-bf40d6113220] deallocate_for_instance() {{(pid=67169) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1167.553620] env[67169]: DEBUG nova.network.neutron [None req-f4bfbd84-193f-4b41-bf8e-c81aff254f89 tempest-VolumesAssistedSnapshotsTest-1676378715 tempest-VolumesAssistedSnapshotsTest-1676378715-project-member] [instance: 28552f70-695d-40cc-8dfa-bf40d6113220] Updating instance_info_cache with network_info: [] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1167.566209] env[67169]: INFO nova.compute.manager [None req-f4bfbd84-193f-4b41-bf8e-c81aff254f89 tempest-VolumesAssistedSnapshotsTest-1676378715 tempest-VolumesAssistedSnapshotsTest-1676378715-project-member] [instance: 28552f70-695d-40cc-8dfa-bf40d6113220] Took 0.38 seconds to deallocate network for instance. 
[ 1167.658742] env[67169]: INFO nova.scheduler.client.report [None req-f4bfbd84-193f-4b41-bf8e-c81aff254f89 tempest-VolumesAssistedSnapshotsTest-1676378715 tempest-VolumesAssistedSnapshotsTest-1676378715-project-member] Deleted allocations for instance 28552f70-695d-40cc-8dfa-bf40d6113220 [ 1167.678389] env[67169]: DEBUG oslo_concurrency.lockutils [None req-f4bfbd84-193f-4b41-bf8e-c81aff254f89 tempest-VolumesAssistedSnapshotsTest-1676378715 tempest-VolumesAssistedSnapshotsTest-1676378715-project-member] Lock "28552f70-695d-40cc-8dfa-bf40d6113220" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 570.071s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1167.679792] env[67169]: DEBUG oslo_concurrency.lockutils [None req-3156c49f-9684-4f28-902b-e6a3eae3b1f5 tempest-VolumesAssistedSnapshotsTest-1676378715 tempest-VolumesAssistedSnapshotsTest-1676378715-project-member] Lock "28552f70-695d-40cc-8dfa-bf40d6113220" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 370.303s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1167.679792] env[67169]: DEBUG oslo_concurrency.lockutils [None req-3156c49f-9684-4f28-902b-e6a3eae3b1f5 tempest-VolumesAssistedSnapshotsTest-1676378715 tempest-VolumesAssistedSnapshotsTest-1676378715-project-member] Acquiring lock "28552f70-695d-40cc-8dfa-bf40d6113220-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1167.679792] env[67169]: DEBUG oslo_concurrency.lockutils [None req-3156c49f-9684-4f28-902b-e6a3eae3b1f5 tempest-VolumesAssistedSnapshotsTest-1676378715 tempest-VolumesAssistedSnapshotsTest-1676378715-project-member] Lock 
"28552f70-695d-40cc-8dfa-bf40d6113220-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1167.680037] env[67169]: DEBUG oslo_concurrency.lockutils [None req-3156c49f-9684-4f28-902b-e6a3eae3b1f5 tempest-VolumesAssistedSnapshotsTest-1676378715 tempest-VolumesAssistedSnapshotsTest-1676378715-project-member] Lock "28552f70-695d-40cc-8dfa-bf40d6113220-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1167.681732] env[67169]: INFO nova.compute.manager [None req-3156c49f-9684-4f28-902b-e6a3eae3b1f5 tempest-VolumesAssistedSnapshotsTest-1676378715 tempest-VolumesAssistedSnapshotsTest-1676378715-project-member] [instance: 28552f70-695d-40cc-8dfa-bf40d6113220] Terminating instance [ 1167.683319] env[67169]: DEBUG nova.compute.manager [None req-3156c49f-9684-4f28-902b-e6a3eae3b1f5 tempest-VolumesAssistedSnapshotsTest-1676378715 tempest-VolumesAssistedSnapshotsTest-1676378715-project-member] [instance: 28552f70-695d-40cc-8dfa-bf40d6113220] Start destroying the instance on the hypervisor. 
{{(pid=67169) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1167.684428] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-3156c49f-9684-4f28-902b-e6a3eae3b1f5 tempest-VolumesAssistedSnapshotsTest-1676378715 tempest-VolumesAssistedSnapshotsTest-1676378715-project-member] [instance: 28552f70-695d-40cc-8dfa-bf40d6113220] Destroying instance {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1167.684428] env[67169]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-0e8b9131-f8d3-4868-920a-7fe6a9e0df74 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1167.693496] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d3f1090b-3a50-4214-9dbc-a95237fb0623 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1167.704399] env[67169]: DEBUG nova.compute.manager [None req-e3b2eed0-520e-4d7c-949f-bbfa16b0198a tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: a86fa702-2040-4e22-9eaa-5d64bc16f036] Starting instance... {{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1167.724709] env[67169]: WARNING nova.virt.vmwareapi.vmops [None req-3156c49f-9684-4f28-902b-e6a3eae3b1f5 tempest-VolumesAssistedSnapshotsTest-1676378715 tempest-VolumesAssistedSnapshotsTest-1676378715-project-member] [instance: 28552f70-695d-40cc-8dfa-bf40d6113220] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 28552f70-695d-40cc-8dfa-bf40d6113220 could not be found. 
[ 1167.724919] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-3156c49f-9684-4f28-902b-e6a3eae3b1f5 tempest-VolumesAssistedSnapshotsTest-1676378715 tempest-VolumesAssistedSnapshotsTest-1676378715-project-member] [instance: 28552f70-695d-40cc-8dfa-bf40d6113220] Instance destroyed {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1167.725109] env[67169]: INFO nova.compute.manager [None req-3156c49f-9684-4f28-902b-e6a3eae3b1f5 tempest-VolumesAssistedSnapshotsTest-1676378715 tempest-VolumesAssistedSnapshotsTest-1676378715-project-member] [instance: 28552f70-695d-40cc-8dfa-bf40d6113220] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1167.725351] env[67169]: DEBUG oslo.service.loopingcall [None req-3156c49f-9684-4f28-902b-e6a3eae3b1f5 tempest-VolumesAssistedSnapshotsTest-1676378715 tempest-VolumesAssistedSnapshotsTest-1676378715-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=67169) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1167.725567] env[67169]: DEBUG nova.compute.manager [-] [instance: 28552f70-695d-40cc-8dfa-bf40d6113220] Deallocating network for instance {{(pid=67169) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1167.725662] env[67169]: DEBUG nova.network.neutron [-] [instance: 28552f70-695d-40cc-8dfa-bf40d6113220] deallocate_for_instance() {{(pid=67169) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1167.756413] env[67169]: DEBUG nova.network.neutron [-] [instance: 28552f70-695d-40cc-8dfa-bf40d6113220] Updating instance_info_cache with network_info: [] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1167.758059] env[67169]: DEBUG oslo_concurrency.lockutils [None req-e3b2eed0-520e-4d7c-949f-bbfa16b0198a tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1167.758292] env[67169]: DEBUG oslo_concurrency.lockutils [None req-e3b2eed0-520e-4d7c-949f-bbfa16b0198a tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1167.759688] env[67169]: INFO nova.compute.claims [None req-e3b2eed0-520e-4d7c-949f-bbfa16b0198a tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: a86fa702-2040-4e22-9eaa-5d64bc16f036] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1167.764167] env[67169]: INFO 
nova.compute.manager [-] [instance: 28552f70-695d-40cc-8dfa-bf40d6113220] Took 0.04 seconds to deallocate network for instance. [ 1167.850533] env[67169]: DEBUG oslo_concurrency.lockutils [None req-3156c49f-9684-4f28-902b-e6a3eae3b1f5 tempest-VolumesAssistedSnapshotsTest-1676378715 tempest-VolumesAssistedSnapshotsTest-1676378715-project-member] Lock "28552f70-695d-40cc-8dfa-bf40d6113220" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.171s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1168.194207] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c00fd287-fbfc-4572-8948-ec705e400e85 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1168.201819] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9af22615-63d8-42c7-a769-3526df8d5dc2 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1168.233059] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-78fb9046-230f-423e-93e0-c08a8eebde87 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1168.239595] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9e7e94a5-4b85-4cef-baa4-bc9fc65984e8 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1168.252446] env[67169]: DEBUG nova.compute.provider_tree [None req-e3b2eed0-520e-4d7c-949f-bbfa16b0198a tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Inventory has not changed in ProviderTree for provider: 6570906a-ac37-4859-b1f2-4bbacc48d3f3 
{{(pid=67169) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1168.261280] env[67169]: DEBUG nova.scheduler.client.report [None req-e3b2eed0-520e-4d7c-949f-bbfa16b0198a tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Inventory has not changed for provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 171, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67169) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1168.273918] env[67169]: DEBUG oslo_concurrency.lockutils [None req-e3b2eed0-520e-4d7c-949f-bbfa16b0198a tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.516s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1168.274422] env[67169]: DEBUG nova.compute.manager [None req-e3b2eed0-520e-4d7c-949f-bbfa16b0198a tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: a86fa702-2040-4e22-9eaa-5d64bc16f036] Start building networks asynchronously for instance. 
{{(pid=67169) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1168.310029] env[67169]: DEBUG nova.compute.utils [None req-e3b2eed0-520e-4d7c-949f-bbfa16b0198a tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Using /dev/sd instead of None {{(pid=67169) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1168.311608] env[67169]: DEBUG nova.compute.manager [None req-e3b2eed0-520e-4d7c-949f-bbfa16b0198a tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: a86fa702-2040-4e22-9eaa-5d64bc16f036] Allocating IP information in the background. {{(pid=67169) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1168.311898] env[67169]: DEBUG nova.network.neutron [None req-e3b2eed0-520e-4d7c-949f-bbfa16b0198a tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: a86fa702-2040-4e22-9eaa-5d64bc16f036] allocate_for_instance() {{(pid=67169) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1168.321787] env[67169]: DEBUG nova.compute.manager [None req-e3b2eed0-520e-4d7c-949f-bbfa16b0198a tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: a86fa702-2040-4e22-9eaa-5d64bc16f036] Start building block device mappings for instance. 
{{(pid=67169) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1168.379877] env[67169]: DEBUG nova.policy [None req-e3b2eed0-520e-4d7c-949f-bbfa16b0198a tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '615c1061ae884c3b91ce1b072249717c', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b1162bad4f2e4722aed4ff2c657e9dc9', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67169) authorize /opt/stack/nova/nova/policy.py:203}} [ 1168.387330] env[67169]: DEBUG nova.compute.manager [None req-e3b2eed0-520e-4d7c-949f-bbfa16b0198a tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: a86fa702-2040-4e22-9eaa-5d64bc16f036] Start spawning the instance on the hypervisor. 
{{(pid=67169) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1168.412633] env[67169]: DEBUG nova.virt.hardware [None req-e3b2eed0-520e-4d7c-949f-bbfa16b0198a tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-03-06T19:55:10Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-03-06T19:54:55Z,direct_url=,disk_format='vmdk',id=285931c9-8b83-4997-8c4d-6a79005e36ba,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c31f6504bb73492890b262ff43fdf9bc',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-03-06T19:54:55Z,virtual_size=,visibility=), allow threads: False {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1168.412889] env[67169]: DEBUG nova.virt.hardware [None req-e3b2eed0-520e-4d7c-949f-bbfa16b0198a tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Flavor limits 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1168.413064] env[67169]: DEBUG nova.virt.hardware [None req-e3b2eed0-520e-4d7c-949f-bbfa16b0198a tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Image limits 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1168.413253] env[67169]: DEBUG nova.virt.hardware [None req-e3b2eed0-520e-4d7c-949f-bbfa16b0198a tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Flavor pref 0:0:0 {{(pid=67169) 
get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1168.413472] env[67169]: DEBUG nova.virt.hardware [None req-e3b2eed0-520e-4d7c-949f-bbfa16b0198a tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Image pref 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1168.413646] env[67169]: DEBUG nova.virt.hardware [None req-e3b2eed0-520e-4d7c-949f-bbfa16b0198a tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1168.413853] env[67169]: DEBUG nova.virt.hardware [None req-e3b2eed0-520e-4d7c-949f-bbfa16b0198a tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1168.414018] env[67169]: DEBUG nova.virt.hardware [None req-e3b2eed0-520e-4d7c-949f-bbfa16b0198a tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67169) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1168.414186] env[67169]: DEBUG nova.virt.hardware [None req-e3b2eed0-520e-4d7c-949f-bbfa16b0198a tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Got 1 possible topologies {{(pid=67169) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1168.414346] env[67169]: DEBUG nova.virt.hardware [None req-e3b2eed0-520e-4d7c-949f-bbfa16b0198a tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Possible topologies 
[VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1168.414516] env[67169]: DEBUG nova.virt.hardware [None req-e3b2eed0-520e-4d7c-949f-bbfa16b0198a tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1168.415390] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ebe5b140-b660-4599-8977-bb484014a48a {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1168.423235] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-81de1df7-8e84-4cf2-88e2-f5fb6d3984f1 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1168.865232] env[67169]: DEBUG nova.network.neutron [None req-e3b2eed0-520e-4d7c-949f-bbfa16b0198a tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: a86fa702-2040-4e22-9eaa-5d64bc16f036] Successfully created port: d7cf0e2e-7ef5-4954-8dbb-dfb1f5902c24 {{(pid=67169) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1169.645786] env[67169]: DEBUG nova.compute.manager [req-5d314692-decc-452f-8ca0-4ae6117f2646 req-cfe6254e-33b5-4b57-b316-460945bb17cd service nova] [instance: a86fa702-2040-4e22-9eaa-5d64bc16f036] Received event network-vif-plugged-d7cf0e2e-7ef5-4954-8dbb-dfb1f5902c24 {{(pid=67169) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1169.646026] env[67169]: DEBUG oslo_concurrency.lockutils [req-5d314692-decc-452f-8ca0-4ae6117f2646 req-cfe6254e-33b5-4b57-b316-460945bb17cd service nova] Acquiring lock 
"a86fa702-2040-4e22-9eaa-5d64bc16f036-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1169.646245] env[67169]: DEBUG oslo_concurrency.lockutils [req-5d314692-decc-452f-8ca0-4ae6117f2646 req-cfe6254e-33b5-4b57-b316-460945bb17cd service nova] Lock "a86fa702-2040-4e22-9eaa-5d64bc16f036-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1169.646415] env[67169]: DEBUG oslo_concurrency.lockutils [req-5d314692-decc-452f-8ca0-4ae6117f2646 req-cfe6254e-33b5-4b57-b316-460945bb17cd service nova] Lock "a86fa702-2040-4e22-9eaa-5d64bc16f036-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1169.646585] env[67169]: DEBUG nova.compute.manager [req-5d314692-decc-452f-8ca0-4ae6117f2646 req-cfe6254e-33b5-4b57-b316-460945bb17cd service nova] [instance: a86fa702-2040-4e22-9eaa-5d64bc16f036] No waiting events found dispatching network-vif-plugged-d7cf0e2e-7ef5-4954-8dbb-dfb1f5902c24 {{(pid=67169) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1169.646751] env[67169]: WARNING nova.compute.manager [req-5d314692-decc-452f-8ca0-4ae6117f2646 req-cfe6254e-33b5-4b57-b316-460945bb17cd service nova] [instance: a86fa702-2040-4e22-9eaa-5d64bc16f036] Received unexpected event network-vif-plugged-d7cf0e2e-7ef5-4954-8dbb-dfb1f5902c24 for instance with vm_state building and task_state spawning. 
[ 1169.754808] env[67169]: DEBUG nova.network.neutron [None req-e3b2eed0-520e-4d7c-949f-bbfa16b0198a tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: a86fa702-2040-4e22-9eaa-5d64bc16f036] Successfully updated port: d7cf0e2e-7ef5-4954-8dbb-dfb1f5902c24 {{(pid=67169) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1169.768730] env[67169]: DEBUG oslo_concurrency.lockutils [None req-e3b2eed0-520e-4d7c-949f-bbfa16b0198a tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Acquiring lock "refresh_cache-a86fa702-2040-4e22-9eaa-5d64bc16f036" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1169.769013] env[67169]: DEBUG oslo_concurrency.lockutils [None req-e3b2eed0-520e-4d7c-949f-bbfa16b0198a tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Acquired lock "refresh_cache-a86fa702-2040-4e22-9eaa-5d64bc16f036" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1169.769080] env[67169]: DEBUG nova.network.neutron [None req-e3b2eed0-520e-4d7c-949f-bbfa16b0198a tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: a86fa702-2040-4e22-9eaa-5d64bc16f036] Building network info cache for instance {{(pid=67169) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1169.841079] env[67169]: DEBUG nova.network.neutron [None req-e3b2eed0-520e-4d7c-949f-bbfa16b0198a tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: a86fa702-2040-4e22-9eaa-5d64bc16f036] Instance cache missing network info. 
{{(pid=67169) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1170.011723] env[67169]: DEBUG nova.network.neutron [None req-e3b2eed0-520e-4d7c-949f-bbfa16b0198a tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: a86fa702-2040-4e22-9eaa-5d64bc16f036] Updating instance_info_cache with network_info: [{"id": "d7cf0e2e-7ef5-4954-8dbb-dfb1f5902c24", "address": "fa:16:3e:6c:dd:7f", "network": {"id": "05c41aa5-dcb7-46fa-ba23-2f4b7685b6a9", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1740060268-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "b1162bad4f2e4722aed4ff2c657e9dc9", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "24210a23-d8ac-4f4f-84ac-dc0636de9a72", "external-id": "nsx-vlan-transportzone-257", "segmentation_id": 257, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd7cf0e2e-7e", "ovs_interfaceid": "d7cf0e2e-7ef5-4954-8dbb-dfb1f5902c24", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1170.026704] env[67169]: DEBUG oslo_concurrency.lockutils [None req-e3b2eed0-520e-4d7c-949f-bbfa16b0198a tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Releasing lock "refresh_cache-a86fa702-2040-4e22-9eaa-5d64bc16f036" {{(pid=67169) lock 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1170.027055] env[67169]: DEBUG nova.compute.manager [None req-e3b2eed0-520e-4d7c-949f-bbfa16b0198a tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: a86fa702-2040-4e22-9eaa-5d64bc16f036] Instance network_info: |[{"id": "d7cf0e2e-7ef5-4954-8dbb-dfb1f5902c24", "address": "fa:16:3e:6c:dd:7f", "network": {"id": "05c41aa5-dcb7-46fa-ba23-2f4b7685b6a9", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1740060268-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "b1162bad4f2e4722aed4ff2c657e9dc9", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "24210a23-d8ac-4f4f-84ac-dc0636de9a72", "external-id": "nsx-vlan-transportzone-257", "segmentation_id": 257, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd7cf0e2e-7e", "ovs_interfaceid": "d7cf0e2e-7ef5-4954-8dbb-dfb1f5902c24", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67169) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1170.027446] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-e3b2eed0-520e-4d7c-949f-bbfa16b0198a tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: a86fa702-2040-4e22-9eaa-5d64bc16f036] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:6c:dd:7f', 'network_ref': {'type': 'OpaqueNetwork', 
'network-id': '24210a23-d8ac-4f4f-84ac-dc0636de9a72', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'd7cf0e2e-7ef5-4954-8dbb-dfb1f5902c24', 'vif_model': 'vmxnet3'}] {{(pid=67169) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1170.035431] env[67169]: DEBUG oslo.service.loopingcall [None req-e3b2eed0-520e-4d7c-949f-bbfa16b0198a tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67169) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1170.035980] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: a86fa702-2040-4e22-9eaa-5d64bc16f036] Creating VM on the ESX host {{(pid=67169) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1170.036391] env[67169]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-0f6fba38-5d8c-47f3-8ac8-f605982b5d07 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1170.058956] env[67169]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1170.058956] env[67169]: value = "task-2819170" [ 1170.058956] env[67169]: _type = "Task" [ 1170.058956] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1170.067841] env[67169]: DEBUG oslo_vmware.api [-] Task: {'id': task-2819170, 'name': CreateVM_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1170.570644] env[67169]: DEBUG oslo_vmware.api [-] Task: {'id': task-2819170, 'name': CreateVM_Task, 'duration_secs': 0.497736} completed successfully. 
{{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1170.570833] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: a86fa702-2040-4e22-9eaa-5d64bc16f036] Created VM on the ESX host {{(pid=67169) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1170.571565] env[67169]: DEBUG oslo_concurrency.lockutils [None req-e3b2eed0-520e-4d7c-949f-bbfa16b0198a tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1170.571736] env[67169]: DEBUG oslo_concurrency.lockutils [None req-e3b2eed0-520e-4d7c-949f-bbfa16b0198a tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Acquired lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1170.572072] env[67169]: DEBUG oslo_concurrency.lockutils [None req-e3b2eed0-520e-4d7c-949f-bbfa16b0198a tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1170.573176] env[67169]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-5dc9ea3a-96e7-4281-9d71-a7a1cc91c06a {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1170.577123] env[67169]: DEBUG oslo_vmware.api [None req-e3b2eed0-520e-4d7c-949f-bbfa16b0198a tempest-DeleteServersTestJSON-867121436 
tempest-DeleteServersTestJSON-867121436-project-member] Waiting for the task: (returnval){ [ 1170.577123] env[67169]: value = "session[52518c09-e5f2-035d-4dc1-3e29ddf94015]52f2ab49-5280-8685-6217-453d3f813bb1" [ 1170.577123] env[67169]: _type = "Task" [ 1170.577123] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1170.588624] env[67169]: DEBUG oslo_vmware.api [None req-e3b2eed0-520e-4d7c-949f-bbfa16b0198a tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Task: {'id': session[52518c09-e5f2-035d-4dc1-3e29ddf94015]52f2ab49-5280-8685-6217-453d3f813bb1, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1171.087742] env[67169]: DEBUG oslo_concurrency.lockutils [None req-e3b2eed0-520e-4d7c-949f-bbfa16b0198a tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Releasing lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1171.088082] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-e3b2eed0-520e-4d7c-949f-bbfa16b0198a tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: a86fa702-2040-4e22-9eaa-5d64bc16f036] Processing image 285931c9-8b83-4997-8c4d-6a79005e36ba {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1171.088224] env[67169]: DEBUG oslo_concurrency.lockutils [None req-e3b2eed0-520e-4d7c-949f-bbfa16b0198a tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk" {{(pid=67169) lock 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1171.708620] env[67169]: DEBUG nova.compute.manager [req-db655032-13c3-4388-bd3a-ab9037327087 req-60c6b88f-47dd-40c6-ac51-f0167396e492 service nova] [instance: a86fa702-2040-4e22-9eaa-5d64bc16f036] Received event network-changed-d7cf0e2e-7ef5-4954-8dbb-dfb1f5902c24 {{(pid=67169) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1171.708796] env[67169]: DEBUG nova.compute.manager [req-db655032-13c3-4388-bd3a-ab9037327087 req-60c6b88f-47dd-40c6-ac51-f0167396e492 service nova] [instance: a86fa702-2040-4e22-9eaa-5d64bc16f036] Refreshing instance network info cache due to event network-changed-d7cf0e2e-7ef5-4954-8dbb-dfb1f5902c24. {{(pid=67169) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1171.709027] env[67169]: DEBUG oslo_concurrency.lockutils [req-db655032-13c3-4388-bd3a-ab9037327087 req-60c6b88f-47dd-40c6-ac51-f0167396e492 service nova] Acquiring lock "refresh_cache-a86fa702-2040-4e22-9eaa-5d64bc16f036" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1171.709178] env[67169]: DEBUG oslo_concurrency.lockutils [req-db655032-13c3-4388-bd3a-ab9037327087 req-60c6b88f-47dd-40c6-ac51-f0167396e492 service nova] Acquired lock "refresh_cache-a86fa702-2040-4e22-9eaa-5d64bc16f036" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1171.709397] env[67169]: DEBUG nova.network.neutron [req-db655032-13c3-4388-bd3a-ab9037327087 req-60c6b88f-47dd-40c6-ac51-f0167396e492 service nova] [instance: a86fa702-2040-4e22-9eaa-5d64bc16f036] Refreshing network info cache for port d7cf0e2e-7ef5-4954-8dbb-dfb1f5902c24 {{(pid=67169) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1171.998627] env[67169]: DEBUG nova.network.neutron [req-db655032-13c3-4388-bd3a-ab9037327087 req-60c6b88f-47dd-40c6-ac51-f0167396e492 service 
nova] [instance: a86fa702-2040-4e22-9eaa-5d64bc16f036] Updated VIF entry in instance network info cache for port d7cf0e2e-7ef5-4954-8dbb-dfb1f5902c24. {{(pid=67169) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1171.998992] env[67169]: DEBUG nova.network.neutron [req-db655032-13c3-4388-bd3a-ab9037327087 req-60c6b88f-47dd-40c6-ac51-f0167396e492 service nova] [instance: a86fa702-2040-4e22-9eaa-5d64bc16f036] Updating instance_info_cache with network_info: [{"id": "d7cf0e2e-7ef5-4954-8dbb-dfb1f5902c24", "address": "fa:16:3e:6c:dd:7f", "network": {"id": "05c41aa5-dcb7-46fa-ba23-2f4b7685b6a9", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1740060268-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "b1162bad4f2e4722aed4ff2c657e9dc9", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "24210a23-d8ac-4f4f-84ac-dc0636de9a72", "external-id": "nsx-vlan-transportzone-257", "segmentation_id": 257, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd7cf0e2e-7e", "ovs_interfaceid": "d7cf0e2e-7ef5-4954-8dbb-dfb1f5902c24", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1172.008459] env[67169]: DEBUG oslo_concurrency.lockutils [req-db655032-13c3-4388-bd3a-ab9037327087 req-60c6b88f-47dd-40c6-ac51-f0167396e492 service nova] Releasing lock "refresh_cache-a86fa702-2040-4e22-9eaa-5d64bc16f036" 
{{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1174.208911] env[67169]: DEBUG oslo_concurrency.lockutils [None req-002e25f7-7c63-44e9-bab9-7e17a862dc86 tempest-AttachInterfacesV270Test-2005290711 tempest-AttachInterfacesV270Test-2005290711-project-member] Acquiring lock "bab5d630-fec0-44e5-8088-12c8855aad66" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1178.370019] env[67169]: DEBUG oslo_concurrency.lockutils [None req-01a0cbac-e2cb-4d1a-98ca-95aa894d437a tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Acquiring lock "a86fa702-2040-4e22-9eaa-5d64bc16f036" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1179.980312] env[67169]: DEBUG oslo_concurrency.lockutils [None req-5ef31fa3-d243-488f-9847-4509615e7a51 tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Acquiring lock "48376572-9e3a-4579-b2d7-b8b63312fab1" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1179.980610] env[67169]: DEBUG oslo_concurrency.lockutils [None req-5ef31fa3-d243-488f-9847-4509615e7a51 tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Lock "48376572-9e3a-4579-b2d7-b8b63312fab1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1185.656665] env[67169]: DEBUG 
oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._sync_power_states {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1185.679443] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Getting list of instances from cluster (obj){ [ 1185.679443] env[67169]: value = "domain-c8" [ 1185.679443] env[67169]: _type = "ClusterComputeResource" [ 1185.679443] env[67169]: } {{(pid=67169) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 1185.680767] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6c34a4d0-75db-4079-b13d-bddb594c7ed6 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1185.697691] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Got total of 10 instances {{(pid=67169) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 1185.697863] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Triggering sync for uuid 7a42aeb9-0518-448d-a3a6-8e68d6497922 {{(pid=67169) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1185.698068] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Triggering sync for uuid 43b73a7c-eda8-4239-885f-d4fb8fa6f28a {{(pid=67169) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1185.698235] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Triggering sync for uuid 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b {{(pid=67169) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1185.698390] env[67169]: DEBUG nova.compute.manager [None 
req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Triggering sync for uuid 47ffcce9-3afc-41be-b38e-dacfeb535a2c {{(pid=67169) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1185.698541] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Triggering sync for uuid 1f0f1960-0c77-4e72-86ee-807819e75d2a {{(pid=67169) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1185.698690] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Triggering sync for uuid ceec0dd3-097b-4ab4-8e16-420d40bbe3d5 {{(pid=67169) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1185.698839] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Triggering sync for uuid c86c3850-39bb-4a08-8dbf-f69bd8ca21c9 {{(pid=67169) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1185.698988] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Triggering sync for uuid 7bf839c0-3ec8-4329-823d-de1fae4833cb {{(pid=67169) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1185.699155] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Triggering sync for uuid bab5d630-fec0-44e5-8088-12c8855aad66 {{(pid=67169) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1185.699314] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Triggering sync for uuid a86fa702-2040-4e22-9eaa-5d64bc16f036 {{(pid=67169) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1185.699609] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Acquiring lock "7a42aeb9-0518-448d-a3a6-8e68d6497922" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" 
{{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1185.699836] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Acquiring lock "43b73a7c-eda8-4239-885f-d4fb8fa6f28a" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1185.700046] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Acquiring lock "1e43c263-c527-4349-8e9c-3f4a3ffc9d8b" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1185.700256] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Acquiring lock "47ffcce9-3afc-41be-b38e-dacfeb535a2c" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1185.700496] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Acquiring lock "1f0f1960-0c77-4e72-86ee-807819e75d2a" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1185.700669] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Acquiring lock "ceec0dd3-097b-4ab4-8e16-420d40bbe3d5" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=67169) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1185.700861] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Acquiring lock "c86c3850-39bb-4a08-8dbf-f69bd8ca21c9" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1185.701070] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Acquiring lock "7bf839c0-3ec8-4329-823d-de1fae4833cb" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1185.701308] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Acquiring lock "bab5d630-fec0-44e5-8088-12c8855aad66" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1185.701454] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Acquiring lock "a86fa702-2040-4e22-9eaa-5d64bc16f036" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1185.703954] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1191.660264] env[67169]: DEBUG oslo_service.periodic_task [None 
req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1191.660264] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Starting heal instance info cache {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1191.660264] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Rebuilding the list of instances to heal {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1191.685037] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1191.685037] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 43b73a7c-eda8-4239-885f-d4fb8fa6f28a] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1191.685037] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1191.685037] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 47ffcce9-3afc-41be-b38e-dacfeb535a2c] Skipping network cache update for instance because it is Building. 
{{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1191.685037] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1191.685037] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: ceec0dd3-097b-4ab4-8e16-420d40bbe3d5] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1191.685037] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: c86c3850-39bb-4a08-8dbf-f69bd8ca21c9] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1191.685037] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 7bf839c0-3ec8-4329-823d-de1fae4833cb] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1191.685037] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: bab5d630-fec0-44e5-8088-12c8855aad66] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1191.685037] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: a86fa702-2040-4e22-9eaa-5d64bc16f036] Skipping network cache update for instance because it is Building. 
{{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1191.685037] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Didn't find any instances for network info cache update. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1191.685037] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1191.685037] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=67169) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1191.685037] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1191.697096] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1191.697096] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1191.697096] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock 
"compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1191.697096] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67169) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1191.697096] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-50975840-d853-4cef-ac22-df34b1aeb0a7 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1191.707333] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ab267af8-b0fe-46f3-a72e-9b5b3ef3a018 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1191.727443] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-94545073-13ef-4870-9837-2346533ebb72 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1191.735094] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1df367fb-4977-4911-9196-5dca8eb3a129 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1191.770111] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180988MB free_disk=171GB free_vcpus=48 pci_devices=None {{(pid=67169) _report_hypervisor_resource_view 
/opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1191.770285] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1191.770486] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1191.948742] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 7a42aeb9-0518-448d-a3a6-8e68d6497922 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1191.948921] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 43b73a7c-eda8-4239-885f-d4fb8fa6f28a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1191.949064] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1191.949193] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 47ffcce9-3afc-41be-b38e-dacfeb535a2c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1191.949315] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 1f0f1960-0c77-4e72-86ee-807819e75d2a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1191.949434] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance ceec0dd3-097b-4ab4-8e16-420d40bbe3d5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1191.949558] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance c86c3850-39bb-4a08-8dbf-f69bd8ca21c9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1191.949676] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 7bf839c0-3ec8-4329-823d-de1fae4833cb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1191.949785] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance bab5d630-fec0-44e5-8088-12c8855aad66 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1191.949902] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance a86fa702-2040-4e22-9eaa-5d64bc16f036 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1191.964374] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance d964ad35-8d3f-45f3-b799-aebddf295012 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1191.976886] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 0b78afae-71e9-4ba9-903a-03c8a98cd91e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1191.988731] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 54b1337f-4ac8-4718-b273-2f078782b491 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1191.999880] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance bf6857fb-2088-4e2c-b1a4-4c4b631f0153 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1192.014358] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance cdca51b4-b059-48b6-ae81-ced1a447f10d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1192.027437] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 1a04a0fd-11d5-4fce-ba32-d90e39a13ff9 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1192.036932] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1192.050280] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 3966e1f7-2107-4ddb-8077-ab37ef1a9b92 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1192.064049] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance c57df23b-3348-41fa-a976-421f98cab569 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1192.076065] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance c769a8f3-6f9f-4e5b-bfec-345c97da5d83 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1192.107674] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 7817b417-599c-4619-8bd3-28d2e8236b9f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1192.120121] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 4f57a0db-fe0b-4983-9e07-62485a53f918 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1192.132188] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 3930edcb-c8ce-44f4-84ae-a2b59f99bc82 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1192.145614] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 883a792f-ae72-4475-8592-3076c2c2c2ae has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1192.157953] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 48376572-9e3a-4579-b2d7-b8b63312fab1 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1192.158365] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67169) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1192.158629] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67169) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1192.178739] env[67169]: DEBUG nova.scheduler.client.report [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Refreshing inventories for resource provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 {{(pid=67169) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} [ 1192.194524] env[67169]: DEBUG nova.scheduler.client.report [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Updating ProviderTree inventory for provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 171, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67169) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} [ 1192.194524] env[67169]: DEBUG nova.compute.provider_tree [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Updating inventory in ProviderTree for provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 with inventory: {'VCPU': {'total': 
48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 171, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67169) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 1192.207871] env[67169]: DEBUG nova.scheduler.client.report [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Refreshing aggregate associations for resource provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3, aggregates: None {{(pid=67169) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} [ 1192.230281] env[67169]: DEBUG nova.scheduler.client.report [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Refreshing trait associations for resource provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3, traits: COMPUTE_NODE,COMPUTE_SAME_HOST_COLD_MIGRATE,COMPUTE_IMAGE_TYPE_VMDK,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_ISO {{(pid=67169) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} [ 1192.623028] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f343e0ed-ee6e-450e-b648-911762f4c628 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1192.631403] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-64f2c835-4e66-4321-83b7-f942cb734ec3 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1192.663591] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7633b779-ce94-41a4-9c4d-abfc28260c87 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1192.671267] 
env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-787b1a57-a5a1-457c-b2b9-8c0dcf0e5762 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1192.684462] env[67169]: DEBUG nova.compute.provider_tree [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Inventory has not changed in ProviderTree for provider: 6570906a-ac37-4859-b1f2-4bbacc48d3f3 {{(pid=67169) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1192.693245] env[67169]: DEBUG nova.scheduler.client.report [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Inventory has not changed for provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 171, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67169) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1192.712936] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67169) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1192.712936] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.942s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1193.714462] env[67169]: DEBUG oslo_concurrency.lockutils [None req-69385497-98b5-454b-af8f-b152ac80f6f3 
tempest-ServerExternalEventsTest-1384768552 tempest-ServerExternalEventsTest-1384768552-project-member] Acquiring lock "74ea66f0-391c-437b-8aee-f784528d7963" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1193.714764] env[67169]: DEBUG oslo_concurrency.lockutils [None req-69385497-98b5-454b-af8f-b152ac80f6f3 tempest-ServerExternalEventsTest-1384768552 tempest-ServerExternalEventsTest-1384768552-project-member] Lock "74ea66f0-391c-437b-8aee-f784528d7963" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1194.687656] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1194.688568] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1194.688568] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1194.688568] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67169) run_periodic_tasks 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1195.660114] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1203.436632] env[67169]: DEBUG oslo_concurrency.lockutils [None req-9969fe74-4992-49fc-a69d-23e6586f88e7 tempest-AttachVolumeTestJSON-1669563252 tempest-AttachVolumeTestJSON-1669563252-project-member] Acquiring lock "70ce9280-fb86-4e6a-a824-a174d44b4ec4" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1203.436921] env[67169]: DEBUG oslo_concurrency.lockutils [None req-9969fe74-4992-49fc-a69d-23e6586f88e7 tempest-AttachVolumeTestJSON-1669563252 tempest-AttachVolumeTestJSON-1669563252-project-member] Lock "70ce9280-fb86-4e6a-a824-a174d44b4ec4" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1215.495472] env[67169]: WARNING oslo_vmware.rw_handles [None req-97f5b735-959c-46f5-9133-f471ad4178d8 tempest-MigrationsAdminTest-2069277652 tempest-MigrationsAdminTest-2069277652-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1215.495472] env[67169]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1215.495472] env[67169]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1215.495472] env[67169]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1215.495472] 
env[67169]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1215.495472] env[67169]: ERROR oslo_vmware.rw_handles response.begin() [ 1215.495472] env[67169]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1215.495472] env[67169]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1215.495472] env[67169]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1215.495472] env[67169]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1215.495472] env[67169]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1215.495472] env[67169]: ERROR oslo_vmware.rw_handles [ 1215.496098] env[67169]: DEBUG nova.virt.vmwareapi.images [None req-97f5b735-959c-46f5-9133-f471ad4178d8 tempest-MigrationsAdminTest-2069277652 tempest-MigrationsAdminTest-2069277652-project-member] [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] Downloaded image file data 285931c9-8b83-4997-8c4d-6a79005e36ba to vmware_temp/91409eea-5e08-4f1f-8129-3e152c269647/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk on the data store datastore2 {{(pid=67169) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1215.498100] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-97f5b735-959c-46f5-9133-f471ad4178d8 tempest-MigrationsAdminTest-2069277652 tempest-MigrationsAdminTest-2069277652-project-member] [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] Caching image {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1215.498383] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [None req-97f5b735-959c-46f5-9133-f471ad4178d8 tempest-MigrationsAdminTest-2069277652 tempest-MigrationsAdminTest-2069277652-project-member] Copying Virtual Disk [datastore2] 
vmware_temp/91409eea-5e08-4f1f-8129-3e152c269647/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk to [datastore2] vmware_temp/91409eea-5e08-4f1f-8129-3e152c269647/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk {{(pid=67169) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1215.498680] env[67169]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-80532b69-67ae-4689-94f5-93c68f4e2d69 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1215.506544] env[67169]: DEBUG oslo_vmware.api [None req-97f5b735-959c-46f5-9133-f471ad4178d8 tempest-MigrationsAdminTest-2069277652 tempest-MigrationsAdminTest-2069277652-project-member] Waiting for the task: (returnval){ [ 1215.506544] env[67169]: value = "task-2819171" [ 1215.506544] env[67169]: _type = "Task" [ 1215.506544] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1215.514824] env[67169]: DEBUG oslo_vmware.api [None req-97f5b735-959c-46f5-9133-f471ad4178d8 tempest-MigrationsAdminTest-2069277652 tempest-MigrationsAdminTest-2069277652-project-member] Task: {'id': task-2819171, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1216.019058] env[67169]: DEBUG oslo_vmware.exceptions [None req-97f5b735-959c-46f5-9133-f471ad4178d8 tempest-MigrationsAdminTest-2069277652 tempest-MigrationsAdminTest-2069277652-project-member] Fault InvalidArgument not matched. 
{{(pid=67169) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1216.019058] env[67169]: DEBUG oslo_concurrency.lockutils [None req-97f5b735-959c-46f5-9133-f471ad4178d8 tempest-MigrationsAdminTest-2069277652 tempest-MigrationsAdminTest-2069277652-project-member] Releasing lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1216.019342] env[67169]: ERROR nova.compute.manager [None req-97f5b735-959c-46f5-9133-f471ad4178d8 tempest-MigrationsAdminTest-2069277652 tempest-MigrationsAdminTest-2069277652-project-member] [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1216.019342] env[67169]: Faults: ['InvalidArgument'] [ 1216.019342] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] Traceback (most recent call last): [ 1216.019342] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1216.019342] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] yield resources [ 1216.019342] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1216.019342] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] self.driver.spawn(context, instance, image_meta, [ 1216.019342] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1216.019342] env[67169]: ERROR nova.compute.manager [instance: 
7a42aeb9-0518-448d-a3a6-8e68d6497922] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1216.019342] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1216.019342] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] self._fetch_image_if_missing(context, vi) [ 1216.019342] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1216.019342] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] image_cache(vi, tmp_image_ds_loc) [ 1216.019342] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1216.019342] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] vm_util.copy_virtual_disk( [ 1216.019342] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1216.019342] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] session._wait_for_task(vmdk_copy_task) [ 1216.019342] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1216.019342] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] return self.wait_for_task(task_ref) [ 1216.019342] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1216.019342] env[67169]: ERROR nova.compute.manager 
[instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] return evt.wait() [ 1216.019342] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1216.019342] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] result = hub.switch() [ 1216.019342] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1216.019342] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] return self.greenlet.switch() [ 1216.019342] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1216.019342] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] self.f(*self.args, **self.kw) [ 1216.019342] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1216.019342] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] raise exceptions.translate_fault(task_info.error) [ 1216.019342] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1216.019342] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] Faults: ['InvalidArgument'] [ 1216.019342] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] [ 1216.020307] env[67169]: INFO nova.compute.manager [None req-97f5b735-959c-46f5-9133-f471ad4178d8 tempest-MigrationsAdminTest-2069277652 
tempest-MigrationsAdminTest-2069277652-project-member] [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] Terminating instance [ 1216.021178] env[67169]: DEBUG oslo_concurrency.lockutils [None req-8ae57ef2-fe50-47ae-b868-f7b19963d43c tempest-ServersWithSpecificFlavorTestJSON-890966063 tempest-ServersWithSpecificFlavorTestJSON-890966063-project-member] Acquired lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1216.021398] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-8ae57ef2-fe50-47ae-b868-f7b19963d43c tempest-ServersWithSpecificFlavorTestJSON-890966063 tempest-ServersWithSpecificFlavorTestJSON-890966063-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1216.021721] env[67169]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-0a96c98f-8282-4651-a4f7-316d5f5a62b6 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1216.024694] env[67169]: DEBUG nova.compute.manager [None req-97f5b735-959c-46f5-9133-f471ad4178d8 tempest-MigrationsAdminTest-2069277652 tempest-MigrationsAdminTest-2069277652-project-member] [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] Start destroying the instance on the hypervisor. 
{{(pid=67169) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1216.024892] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-97f5b735-959c-46f5-9133-f471ad4178d8 tempest-MigrationsAdminTest-2069277652 tempest-MigrationsAdminTest-2069277652-project-member] [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] Destroying instance {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1216.025625] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ddd019cc-3e3c-41db-94cc-78ba90ad50a8 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1216.029068] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-8ae57ef2-fe50-47ae-b868-f7b19963d43c tempest-ServersWithSpecificFlavorTestJSON-890966063 tempest-ServersWithSpecificFlavorTestJSON-890966063-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1216.029236] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-8ae57ef2-fe50-47ae-b868-f7b19963d43c tempest-ServersWithSpecificFlavorTestJSON-890966063 tempest-ServersWithSpecificFlavorTestJSON-890966063-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=67169) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1216.030175] env[67169]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-ae034f5d-1ae0-418a-9199-0f60dbeb22f8 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1216.033967] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-97f5b735-959c-46f5-9133-f471ad4178d8 tempest-MigrationsAdminTest-2069277652 tempest-MigrationsAdminTest-2069277652-project-member] [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] Unregistering the VM {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1216.034443] env[67169]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-4e81776f-9193-414e-8008-851c0f443087 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1216.036618] env[67169]: DEBUG oslo_vmware.api [None req-8ae57ef2-fe50-47ae-b868-f7b19963d43c tempest-ServersWithSpecificFlavorTestJSON-890966063 tempest-ServersWithSpecificFlavorTestJSON-890966063-project-member] Waiting for the task: (returnval){ [ 1216.036618] env[67169]: value = "session[52518c09-e5f2-035d-4dc1-3e29ddf94015]52a2d722-0137-8fcc-0818-02a74f6689d1" [ 1216.036618] env[67169]: _type = "Task" [ 1216.036618] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1216.044523] env[67169]: DEBUG oslo_vmware.api [None req-8ae57ef2-fe50-47ae-b868-f7b19963d43c tempest-ServersWithSpecificFlavorTestJSON-890966063 tempest-ServersWithSpecificFlavorTestJSON-890966063-project-member] Task: {'id': session[52518c09-e5f2-035d-4dc1-3e29ddf94015]52a2d722-0137-8fcc-0818-02a74f6689d1, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1216.103902] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-97f5b735-959c-46f5-9133-f471ad4178d8 tempest-MigrationsAdminTest-2069277652 tempest-MigrationsAdminTest-2069277652-project-member] [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] Unregistered the VM {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1216.104234] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-97f5b735-959c-46f5-9133-f471ad4178d8 tempest-MigrationsAdminTest-2069277652 tempest-MigrationsAdminTest-2069277652-project-member] [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] Deleting contents of the VM from datastore datastore2 {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1216.104413] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-97f5b735-959c-46f5-9133-f471ad4178d8 tempest-MigrationsAdminTest-2069277652 tempest-MigrationsAdminTest-2069277652-project-member] Deleting the datastore file [datastore2] 7a42aeb9-0518-448d-a3a6-8e68d6497922 {{(pid=67169) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1216.104750] env[67169]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-ee7cbd01-a816-43a0-979c-8849140546ed {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1216.110221] env[67169]: DEBUG oslo_vmware.api [None req-97f5b735-959c-46f5-9133-f471ad4178d8 tempest-MigrationsAdminTest-2069277652 tempest-MigrationsAdminTest-2069277652-project-member] Waiting for the task: (returnval){ [ 1216.110221] env[67169]: value = "task-2819173" [ 1216.110221] env[67169]: _type = "Task" [ 1216.110221] env[67169]: } to complete. 
{{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1216.117676] env[67169]: DEBUG oslo_vmware.api [None req-97f5b735-959c-46f5-9133-f471ad4178d8 tempest-MigrationsAdminTest-2069277652 tempest-MigrationsAdminTest-2069277652-project-member] Task: {'id': task-2819173, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1216.547535] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-8ae57ef2-fe50-47ae-b868-f7b19963d43c tempest-ServersWithSpecificFlavorTestJSON-890966063 tempest-ServersWithSpecificFlavorTestJSON-890966063-project-member] [instance: 43b73a7c-eda8-4239-885f-d4fb8fa6f28a] Preparing fetch location {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1216.547826] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-8ae57ef2-fe50-47ae-b868-f7b19963d43c tempest-ServersWithSpecificFlavorTestJSON-890966063 tempest-ServersWithSpecificFlavorTestJSON-890966063-project-member] Creating directory with path [datastore2] vmware_temp/1d947596-3eb6-476a-bb6b-5e4df5b388f9/285931c9-8b83-4997-8c4d-6a79005e36ba {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1216.549706] env[67169]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-f1b3550d-721a-47c8-b00e-7612a21d339c {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1216.559853] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-8ae57ef2-fe50-47ae-b868-f7b19963d43c tempest-ServersWithSpecificFlavorTestJSON-890966063 tempest-ServersWithSpecificFlavorTestJSON-890966063-project-member] Created directory with path [datastore2] vmware_temp/1d947596-3eb6-476a-bb6b-5e4df5b388f9/285931c9-8b83-4997-8c4d-6a79005e36ba {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1216.560062] 
env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-8ae57ef2-fe50-47ae-b868-f7b19963d43c tempest-ServersWithSpecificFlavorTestJSON-890966063 tempest-ServersWithSpecificFlavorTestJSON-890966063-project-member] [instance: 43b73a7c-eda8-4239-885f-d4fb8fa6f28a] Fetch image to [datastore2] vmware_temp/1d947596-3eb6-476a-bb6b-5e4df5b388f9/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1216.560236] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-8ae57ef2-fe50-47ae-b868-f7b19963d43c tempest-ServersWithSpecificFlavorTestJSON-890966063 tempest-ServersWithSpecificFlavorTestJSON-890966063-project-member] [instance: 43b73a7c-eda8-4239-885f-d4fb8fa6f28a] Downloading image file data 285931c9-8b83-4997-8c4d-6a79005e36ba to [datastore2] vmware_temp/1d947596-3eb6-476a-bb6b-5e4df5b388f9/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk on the data store datastore2 {{(pid=67169) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1216.560998] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d07dc058-44f5-458f-95dd-80c981ab1491 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1216.567512] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5f3a6f7e-63ff-44fc-af32-337b39c265b7 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1216.576657] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-67dfe7e1-92c8-4239-901f-905ca06de8bc {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1216.607121] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-349e1b54-f6b3-462f-b1b4-3bab030da00a {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1216.615600] env[67169]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-1f2698bc-cc87-4acc-9d34-2fda3b949b77 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1216.619859] env[67169]: DEBUG oslo_vmware.api [None req-97f5b735-959c-46f5-9133-f471ad4178d8 tempest-MigrationsAdminTest-2069277652 tempest-MigrationsAdminTest-2069277652-project-member] Task: {'id': task-2819173, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.061833} completed successfully. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1216.620390] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-97f5b735-959c-46f5-9133-f471ad4178d8 tempest-MigrationsAdminTest-2069277652 tempest-MigrationsAdminTest-2069277652-project-member] Deleted the datastore file {{(pid=67169) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1216.620577] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-97f5b735-959c-46f5-9133-f471ad4178d8 tempest-MigrationsAdminTest-2069277652 tempest-MigrationsAdminTest-2069277652-project-member] [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] Deleted contents of the VM from datastore datastore2 {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1216.620801] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-97f5b735-959c-46f5-9133-f471ad4178d8 tempest-MigrationsAdminTest-2069277652 tempest-MigrationsAdminTest-2069277652-project-member] [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] Instance destroyed {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1216.620912] env[67169]: INFO nova.compute.manager [None req-97f5b735-959c-46f5-9133-f471ad4178d8 
tempest-MigrationsAdminTest-2069277652 tempest-MigrationsAdminTest-2069277652-project-member] [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] Took 0.60 seconds to destroy the instance on the hypervisor. [ 1216.622897] env[67169]: DEBUG nova.compute.claims [None req-97f5b735-959c-46f5-9133-f471ad4178d8 tempest-MigrationsAdminTest-2069277652 tempest-MigrationsAdminTest-2069277652-project-member] [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] Aborting claim: {{(pid=67169) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1216.623109] env[67169]: DEBUG oslo_concurrency.lockutils [None req-97f5b735-959c-46f5-9133-f471ad4178d8 tempest-MigrationsAdminTest-2069277652 tempest-MigrationsAdminTest-2069277652-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1216.623334] env[67169]: DEBUG oslo_concurrency.lockutils [None req-97f5b735-959c-46f5-9133-f471ad4178d8 tempest-MigrationsAdminTest-2069277652 tempest-MigrationsAdminTest-2069277652-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1216.644597] env[67169]: DEBUG nova.virt.vmwareapi.images [None req-8ae57ef2-fe50-47ae-b868-f7b19963d43c tempest-ServersWithSpecificFlavorTestJSON-890966063 tempest-ServersWithSpecificFlavorTestJSON-890966063-project-member] [instance: 43b73a7c-eda8-4239-885f-d4fb8fa6f28a] Downloading image file data 285931c9-8b83-4997-8c4d-6a79005e36ba to the data store datastore2 {{(pid=67169) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1216.849880] env[67169]: DEBUG oslo_vmware.rw_handles [None req-8ae57ef2-fe50-47ae-b868-f7b19963d43c tempest-ServersWithSpecificFlavorTestJSON-890966063 
tempest-ServersWithSpecificFlavorTestJSON-890966063-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/1d947596-3eb6-476a-bb6b-5e4df5b388f9/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=67169) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1216.910520] env[67169]: DEBUG oslo_vmware.rw_handles [None req-8ae57ef2-fe50-47ae-b868-f7b19963d43c tempest-ServersWithSpecificFlavorTestJSON-890966063 tempest-ServersWithSpecificFlavorTestJSON-890966063-project-member] Completed reading data from the image iterator. {{(pid=67169) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1216.912452] env[67169]: DEBUG oslo_vmware.rw_handles [None req-8ae57ef2-fe50-47ae-b868-f7b19963d43c tempest-ServersWithSpecificFlavorTestJSON-890966063 tempest-ServersWithSpecificFlavorTestJSON-890966063-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/1d947596-3eb6-476a-bb6b-5e4df5b388f9/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=67169) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1216.975970] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-35d2c060-b2e5-461f-bd95-b49e2d6a705c {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1216.983396] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fd962dd2-2c35-4ae6-aff5-210117614f8a {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1217.013651] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-02ebdd99-33ea-4178-9d12-c6b439fc771f {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1217.020938] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-93c19918-6fd1-4a9e-a5cf-37d9d55bd47e {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1217.037327] env[67169]: DEBUG nova.compute.provider_tree [None req-97f5b735-959c-46f5-9133-f471ad4178d8 tempest-MigrationsAdminTest-2069277652 tempest-MigrationsAdminTest-2069277652-project-member] Inventory has not changed in ProviderTree for provider: 6570906a-ac37-4859-b1f2-4bbacc48d3f3 {{(pid=67169) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1217.046357] env[67169]: DEBUG nova.scheduler.client.report [None req-97f5b735-959c-46f5-9133-f471ad4178d8 tempest-MigrationsAdminTest-2069277652 tempest-MigrationsAdminTest-2069277652-project-member] Inventory has not changed for provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': 
{'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 171, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67169) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1217.061249] env[67169]: DEBUG oslo_concurrency.lockutils [None req-97f5b735-959c-46f5-9133-f471ad4178d8 tempest-MigrationsAdminTest-2069277652 tempest-MigrationsAdminTest-2069277652-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.438s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1217.061775] env[67169]: ERROR nova.compute.manager [None req-97f5b735-959c-46f5-9133-f471ad4178d8 tempest-MigrationsAdminTest-2069277652 tempest-MigrationsAdminTest-2069277652-project-member] [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1217.061775] env[67169]: Faults: ['InvalidArgument'] [ 1217.061775] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] Traceback (most recent call last): [ 1217.061775] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1217.061775] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] self.driver.spawn(context, instance, image_meta, [ 1217.061775] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1217.061775] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] self._vmops.spawn(context, instance, image_meta, injected_files, [ 
1217.061775] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1217.061775] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] self._fetch_image_if_missing(context, vi) [ 1217.061775] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1217.061775] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] image_cache(vi, tmp_image_ds_loc) [ 1217.061775] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1217.061775] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] vm_util.copy_virtual_disk( [ 1217.061775] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1217.061775] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] session._wait_for_task(vmdk_copy_task) [ 1217.061775] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1217.061775] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] return self.wait_for_task(task_ref) [ 1217.061775] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1217.061775] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] return evt.wait() [ 1217.061775] env[67169]: ERROR 
nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1217.061775] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] result = hub.switch() [ 1217.061775] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1217.061775] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] return self.greenlet.switch() [ 1217.061775] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1217.061775] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] self.f(*self.args, **self.kw) [ 1217.061775] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1217.061775] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] raise exceptions.translate_fault(task_info.error) [ 1217.061775] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1217.061775] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] Faults: ['InvalidArgument'] [ 1217.061775] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] [ 1217.062814] env[67169]: DEBUG nova.compute.utils [None req-97f5b735-959c-46f5-9133-f471ad4178d8 tempest-MigrationsAdminTest-2069277652 tempest-MigrationsAdminTest-2069277652-project-member] [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] 
VimFaultException {{(pid=67169) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1217.064187] env[67169]: DEBUG nova.compute.manager [None req-97f5b735-959c-46f5-9133-f471ad4178d8 tempest-MigrationsAdminTest-2069277652 tempest-MigrationsAdminTest-2069277652-project-member] [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] Build of instance 7a42aeb9-0518-448d-a3a6-8e68d6497922 was re-scheduled: A specified parameter was not correct: fileType [ 1217.064187] env[67169]: Faults: ['InvalidArgument'] {{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1217.064580] env[67169]: DEBUG nova.compute.manager [None req-97f5b735-959c-46f5-9133-f471ad4178d8 tempest-MigrationsAdminTest-2069277652 tempest-MigrationsAdminTest-2069277652-project-member] [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] Unplugging VIFs for instance {{(pid=67169) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1217.064769] env[67169]: DEBUG nova.compute.manager [None req-97f5b735-959c-46f5-9133-f471ad4178d8 tempest-MigrationsAdminTest-2069277652 tempest-MigrationsAdminTest-2069277652-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67169) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1217.064925] env[67169]: DEBUG nova.compute.manager [None req-97f5b735-959c-46f5-9133-f471ad4178d8 tempest-MigrationsAdminTest-2069277652 tempest-MigrationsAdminTest-2069277652-project-member] [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] Deallocating network for instance {{(pid=67169) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1217.065099] env[67169]: DEBUG nova.network.neutron [None req-97f5b735-959c-46f5-9133-f471ad4178d8 tempest-MigrationsAdminTest-2069277652 tempest-MigrationsAdminTest-2069277652-project-member] [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] deallocate_for_instance() {{(pid=67169) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1217.283735] env[67169]: DEBUG neutronclient.v2_0.client [None req-97f5b735-959c-46f5-9133-f471ad4178d8 tempest-MigrationsAdminTest-2069277652 tempest-MigrationsAdminTest-2069277652-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=67169) _handle_fault_response /opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py:262}} [ 1217.286183] env[67169]: ERROR nova.compute.manager [None req-97f5b735-959c-46f5-9133-f471ad4178d8 tempest-MigrationsAdminTest-2069277652 tempest-MigrationsAdminTest-2069277652-project-member] [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized. 
[ 1217.286183] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] Traceback (most recent call last): [ 1217.286183] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1217.286183] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] self.driver.spawn(context, instance, image_meta, [ 1217.286183] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1217.286183] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1217.286183] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1217.286183] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] self._fetch_image_if_missing(context, vi) [ 1217.286183] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1217.286183] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] image_cache(vi, tmp_image_ds_loc) [ 1217.286183] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1217.286183] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] vm_util.copy_virtual_disk( [ 1217.286183] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1217.286183] 
env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] session._wait_for_task(vmdk_copy_task) [ 1217.286183] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1217.286183] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] return self.wait_for_task(task_ref) [ 1217.286183] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1217.286183] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] return evt.wait() [ 1217.286183] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1217.286183] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] result = hub.switch() [ 1217.286183] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1217.286183] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] return self.greenlet.switch() [ 1217.286183] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1217.286183] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] self.f(*self.args, **self.kw) [ 1217.286183] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 
1217.286183] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] raise exceptions.translate_fault(task_info.error) [ 1217.286183] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1217.286183] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] Faults: ['InvalidArgument'] [ 1217.286183] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] [ 1217.286183] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] During handling of the above exception, another exception occurred: [ 1217.286183] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] [ 1217.286183] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] Traceback (most recent call last): [ 1217.286183] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] File "/opt/stack/nova/nova/compute/manager.py", line 2430, in _do_build_and_run_instance [ 1217.287277] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] self._build_and_run_instance(context, instance, image, [ 1217.287277] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] File "/opt/stack/nova/nova/compute/manager.py", line 2722, in _build_and_run_instance [ 1217.287277] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] raise exception.RescheduledException( [ 1217.287277] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] nova.exception.RescheduledException: Build of instance 7a42aeb9-0518-448d-a3a6-8e68d6497922 was re-scheduled: A specified parameter was not correct: fileType [ 1217.287277] env[67169]: ERROR nova.compute.manager [instance: 
7a42aeb9-0518-448d-a3a6-8e68d6497922] Faults: ['InvalidArgument'] [ 1217.287277] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] [ 1217.287277] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] During handling of the above exception, another exception occurred: [ 1217.287277] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] [ 1217.287277] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] Traceback (most recent call last): [ 1217.287277] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1217.287277] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] ret = obj(*args, **kwargs) [ 1217.287277] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1217.287277] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] exception_handler_v20(status_code, error_body) [ 1217.287277] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1217.287277] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] raise client_exc(message=error_message, [ 1217.287277] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1217.287277] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] 
Neutron server returns request_ids: ['req-37af404f-9687-4c9a-86e9-622d48234e02'] [ 1217.287277] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] [ 1217.287277] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] During handling of the above exception, another exception occurred: [ 1217.287277] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] [ 1217.287277] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] Traceback (most recent call last): [ 1217.287277] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] File "/opt/stack/nova/nova/compute/manager.py", line 3019, in _cleanup_allocated_networks [ 1217.287277] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] self._deallocate_network(context, instance, requested_networks) [ 1217.287277] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] File "/opt/stack/nova/nova/compute/manager.py", line 2265, in _deallocate_network [ 1217.287277] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] self.network_api.deallocate_for_instance( [ 1217.287277] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1217.287277] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] data = neutron.list_ports(**search_opts) [ 1217.287277] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1217.287277] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] ret = obj(*args, **kwargs) [ 1217.287277] env[67169]: ERROR nova.compute.manager [instance: 
7a42aeb9-0518-448d-a3a6-8e68d6497922] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1217.287277] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] return self.list('ports', self.ports_path, retrieve_all, [ 1217.287277] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1217.287277] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] ret = obj(*args, **kwargs) [ 1217.287277] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1217.288342] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] for r in self._pagination(collection, path, **params): [ 1217.288342] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1217.288342] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] res = self.get(path, params=params) [ 1217.288342] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1217.288342] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] ret = obj(*args, **kwargs) [ 1217.288342] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1217.288342] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] return self.retry_request("GET", action, body=body, [ 
1217.288342] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1217.288342] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] ret = obj(*args, **kwargs) [ 1217.288342] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1217.288342] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] return self.do_request(method, action, body=body, [ 1217.288342] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1217.288342] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] ret = obj(*args, **kwargs) [ 1217.288342] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1217.288342] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] self._handle_fault_response(status_code, replybody, resp) [ 1217.288342] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper [ 1217.288342] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] raise exception.Unauthorized() [ 1217.288342] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] nova.exception.Unauthorized: Not authorized. 
[ 1217.288342] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] [ 1217.344640] env[67169]: INFO nova.scheduler.client.report [None req-97f5b735-959c-46f5-9133-f471ad4178d8 tempest-MigrationsAdminTest-2069277652 tempest-MigrationsAdminTest-2069277652-project-member] Deleted allocations for instance 7a42aeb9-0518-448d-a3a6-8e68d6497922 [ 1217.365026] env[67169]: DEBUG oslo_concurrency.lockutils [None req-97f5b735-959c-46f5-9133-f471ad4178d8 tempest-MigrationsAdminTest-2069277652 tempest-MigrationsAdminTest-2069277652-project-member] Lock "7a42aeb9-0518-448d-a3a6-8e68d6497922" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 619.071s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1217.366193] env[67169]: DEBUG oslo_concurrency.lockutils [None req-ffbf85f6-8c28-467c-87fa-6e8c39cd384e tempest-MigrationsAdminTest-2069277652 tempest-MigrationsAdminTest-2069277652-project-member] Lock "7a42aeb9-0518-448d-a3a6-8e68d6497922" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 419.534s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1217.366449] env[67169]: DEBUG oslo_concurrency.lockutils [None req-ffbf85f6-8c28-467c-87fa-6e8c39cd384e tempest-MigrationsAdminTest-2069277652 tempest-MigrationsAdminTest-2069277652-project-member] Acquiring lock "7a42aeb9-0518-448d-a3a6-8e68d6497922-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1217.366625] env[67169]: DEBUG oslo_concurrency.lockutils [None req-ffbf85f6-8c28-467c-87fa-6e8c39cd384e tempest-MigrationsAdminTest-2069277652 tempest-MigrationsAdminTest-2069277652-project-member] Lock 
"7a42aeb9-0518-448d-a3a6-8e68d6497922-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1217.366799] env[67169]: DEBUG oslo_concurrency.lockutils [None req-ffbf85f6-8c28-467c-87fa-6e8c39cd384e tempest-MigrationsAdminTest-2069277652 tempest-MigrationsAdminTest-2069277652-project-member] Lock "7a42aeb9-0518-448d-a3a6-8e68d6497922-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1217.371732] env[67169]: INFO nova.compute.manager [None req-ffbf85f6-8c28-467c-87fa-6e8c39cd384e tempest-MigrationsAdminTest-2069277652 tempest-MigrationsAdminTest-2069277652-project-member] [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] Terminating instance [ 1217.373492] env[67169]: DEBUG nova.compute.manager [None req-ffbf85f6-8c28-467c-87fa-6e8c39cd384e tempest-MigrationsAdminTest-2069277652 tempest-MigrationsAdminTest-2069277652-project-member] [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] Start destroying the instance on the hypervisor. 
{{(pid=67169) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1217.373684] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-ffbf85f6-8c28-467c-87fa-6e8c39cd384e tempest-MigrationsAdminTest-2069277652 tempest-MigrationsAdminTest-2069277652-project-member] [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] Destroying instance {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1217.373959] env[67169]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-a455d831-caf5-4f26-967f-7d0058b70b68 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1217.378900] env[67169]: DEBUG nova.compute.manager [None req-f7e059ad-cf85-4f1e-9612-3b3d088114e3 tempest-ServerAddressesNegativeTestJSON-244048613 tempest-ServerAddressesNegativeTestJSON-244048613-project-member] [instance: ca657a42-3745-46e1-8fc9-61de31f661d8] Starting instance... {{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1217.385438] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a3f9dead-a25c-4e40-94cb-d13f7a5273d1 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1217.404585] env[67169]: DEBUG nova.compute.manager [None req-f7e059ad-cf85-4f1e-9612-3b3d088114e3 tempest-ServerAddressesNegativeTestJSON-244048613 tempest-ServerAddressesNegativeTestJSON-244048613-project-member] [instance: ca657a42-3745-46e1-8fc9-61de31f661d8] Instance disappeared before build. 
{{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1217.418501] env[67169]: WARNING nova.virt.vmwareapi.vmops [None req-ffbf85f6-8c28-467c-87fa-6e8c39cd384e tempest-MigrationsAdminTest-2069277652 tempest-MigrationsAdminTest-2069277652-project-member] [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 7a42aeb9-0518-448d-a3a6-8e68d6497922 could not be found. [ 1217.419711] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-ffbf85f6-8c28-467c-87fa-6e8c39cd384e tempest-MigrationsAdminTest-2069277652 tempest-MigrationsAdminTest-2069277652-project-member] [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] Instance destroyed {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1217.419711] env[67169]: INFO nova.compute.manager [None req-ffbf85f6-8c28-467c-87fa-6e8c39cd384e tempest-MigrationsAdminTest-2069277652 tempest-MigrationsAdminTest-2069277652-project-member] [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] Took 0.05 seconds to destroy the instance on the hypervisor. [ 1217.419711] env[67169]: DEBUG oslo.service.loopingcall [None req-ffbf85f6-8c28-467c-87fa-6e8c39cd384e tempest-MigrationsAdminTest-2069277652 tempest-MigrationsAdminTest-2069277652-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=67169) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1217.419711] env[67169]: DEBUG nova.compute.manager [-] [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] Deallocating network for instance {{(pid=67169) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1217.419711] env[67169]: DEBUG nova.network.neutron [-] [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] deallocate_for_instance() {{(pid=67169) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1217.436296] env[67169]: DEBUG oslo_concurrency.lockutils [None req-f7e059ad-cf85-4f1e-9612-3b3d088114e3 tempest-ServerAddressesNegativeTestJSON-244048613 tempest-ServerAddressesNegativeTestJSON-244048613-project-member] Lock "ca657a42-3745-46e1-8fc9-61de31f661d8" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 222.604s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1217.448871] env[67169]: DEBUG nova.compute.manager [None req-5678a70a-1a33-4b69-862c-ad483d1e90f6 tempest-MigrationsAdminTest-2069277652 tempest-MigrationsAdminTest-2069277652-project-member] [instance: 3f90c9a4-650d-4280-b155-1315d2f0f281] Starting instance... {{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1217.477719] env[67169]: DEBUG nova.compute.manager [None req-5678a70a-1a33-4b69-862c-ad483d1e90f6 tempest-MigrationsAdminTest-2069277652 tempest-MigrationsAdminTest-2069277652-project-member] [instance: 3f90c9a4-650d-4280-b155-1315d2f0f281] Instance disappeared before build. 
{{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1217.502190] env[67169]: DEBUG oslo_concurrency.lockutils [None req-5678a70a-1a33-4b69-862c-ad483d1e90f6 tempest-MigrationsAdminTest-2069277652 tempest-MigrationsAdminTest-2069277652-project-member] Lock "3f90c9a4-650d-4280-b155-1315d2f0f281" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 221.474s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1217.515533] env[67169]: DEBUG nova.compute.manager [None req-a3591b6f-cb28-42b6-8fac-da83059b80ed tempest-ListServerFiltersTestJSON-1026077778 tempest-ListServerFiltersTestJSON-1026077778-project-member] [instance: d964ad35-8d3f-45f3-b799-aebddf295012] Starting instance... {{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1217.543566] env[67169]: DEBUG nova.compute.manager [None req-a3591b6f-cb28-42b6-8fac-da83059b80ed tempest-ListServerFiltersTestJSON-1026077778 tempest-ListServerFiltersTestJSON-1026077778-project-member] [instance: d964ad35-8d3f-45f3-b799-aebddf295012] Instance disappeared before build. 
{{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1217.551144] env[67169]: DEBUG neutronclient.v2_0.client [-] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=67169) _handle_fault_response /opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py:262}} [ 1217.551394] env[67169]: ERROR nova.network.neutron [-] Neutron client was not able to generate a valid admin token, please verify Neutron admin credential located in nova.conf: neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1217.552146] env[67169]: ERROR oslo.service.loopingcall [-] Dynamic interval looping call 'oslo_service.loopingcall.RetryDecorator.__call__.._func' failed: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
[ 1217.552146] env[67169]: ERROR oslo.service.loopingcall Traceback (most recent call last):
[ 1217.552146] env[67169]: ERROR oslo.service.loopingcall   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1217.552146] env[67169]: ERROR oslo.service.loopingcall     ret = obj(*args, **kwargs)
[ 1217.552146] env[67169]: ERROR oslo.service.loopingcall   File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response
[ 1217.552146] env[67169]: ERROR oslo.service.loopingcall     exception_handler_v20(status_code, error_body)
[ 1217.552146] env[67169]: ERROR oslo.service.loopingcall   File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20
[ 1217.552146] env[67169]: ERROR oslo.service.loopingcall     raise client_exc(message=error_message,
[ 1217.552146] env[67169]: ERROR oslo.service.loopingcall neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}}
[ 1217.552146] env[67169]: ERROR oslo.service.loopingcall Neutron server returns request_ids: ['req-effbce55-2bf0-4f43-9d7e-1c5de499fbae']
[ 1217.552146] env[67169]: ERROR oslo.service.loopingcall
[ 1217.552146] env[67169]: ERROR oslo.service.loopingcall During handling of the above exception, another exception occurred:
[ 1217.552146] env[67169]: ERROR oslo.service.loopingcall
[ 1217.552146] env[67169]: ERROR oslo.service.loopingcall Traceback (most recent call last):
[ 1217.552146] env[67169]: ERROR oslo.service.loopingcall   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop
[ 1217.552146] env[67169]: ERROR oslo.service.loopingcall     result = func(*self.args, **self.kw)
[ 1217.552146] env[67169]: ERROR oslo.service.loopingcall   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func
[ 1217.552146] env[67169]: ERROR oslo.service.loopingcall     result = f(*args, **kwargs)
[ 1217.552146] env[67169]: ERROR oslo.service.loopingcall   File "/opt/stack/nova/nova/compute/manager.py", line 3045, in _deallocate_network_with_retries
[ 1217.552146] env[67169]: ERROR oslo.service.loopingcall     self._deallocate_network(
[ 1217.552146] env[67169]: ERROR oslo.service.loopingcall   File "/opt/stack/nova/nova/compute/manager.py", line 2265, in _deallocate_network
[ 1217.552146] env[67169]: ERROR oslo.service.loopingcall     self.network_api.deallocate_for_instance(
[ 1217.552146] env[67169]: ERROR oslo.service.loopingcall   File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance
[ 1217.552146] env[67169]: ERROR oslo.service.loopingcall     data = neutron.list_ports(**search_opts)
[ 1217.552146] env[67169]: ERROR oslo.service.loopingcall   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1217.552146] env[67169]: ERROR oslo.service.loopingcall     ret = obj(*args, **kwargs)
[ 1217.552146] env[67169]: ERROR oslo.service.loopingcall   File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports
[ 1217.552146] env[67169]: ERROR oslo.service.loopingcall     return self.list('ports', self.ports_path, retrieve_all,
[ 1217.552146] env[67169]: ERROR oslo.service.loopingcall   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1217.552146] env[67169]: ERROR oslo.service.loopingcall     ret = obj(*args, **kwargs)
[ 1217.552146] env[67169]: ERROR oslo.service.loopingcall   File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list
[ 1217.552146] env[67169]: ERROR oslo.service.loopingcall     for r in self._pagination(collection, path, **params):
[ 1217.552146] env[67169]: ERROR oslo.service.loopingcall   File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination
[ 1217.552146] env[67169]: ERROR oslo.service.loopingcall     res = self.get(path, params=params)
[ 1217.552146] env[67169]: ERROR oslo.service.loopingcall   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1217.552146] env[67169]: ERROR oslo.service.loopingcall     ret = obj(*args, **kwargs)
[ 1217.552146] env[67169]: ERROR oslo.service.loopingcall   File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get
[ 1217.552146] env[67169]: ERROR oslo.service.loopingcall     return self.retry_request("GET", action, body=body,
[ 1217.552146] env[67169]: ERROR oslo.service.loopingcall   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1217.552146] env[67169]: ERROR oslo.service.loopingcall     ret = obj(*args, **kwargs)
[ 1217.552146] env[67169]: ERROR oslo.service.loopingcall   File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request
[ 1217.552146] env[67169]: ERROR oslo.service.loopingcall     return self.do_request(method, action, body=body,
[ 1217.552146] env[67169]: ERROR oslo.service.loopingcall   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1217.552146] env[67169]: ERROR oslo.service.loopingcall     ret = obj(*args, **kwargs)
[ 1217.552146] env[67169]: ERROR oslo.service.loopingcall   File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request
[ 1217.553637] env[67169]: ERROR oslo.service.loopingcall     self._handle_fault_response(status_code, replybody, resp)
[ 1217.553637] env[67169]: ERROR oslo.service.loopingcall   File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper
[ 1217.553637] env[67169]: ERROR oslo.service.loopingcall     raise exception.NeutronAdminCredentialConfigurationInvalid()
[ 1217.553637] env[67169]: ERROR oslo.service.loopingcall nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception.
[ 1217.553637] env[67169]: ERROR oslo.service.loopingcall
[ 1217.553637] env[67169]: ERROR nova.compute.manager [None req-ffbf85f6-8c28-467c-87fa-6e8c39cd384e tempest-MigrationsAdminTest-2069277652 tempest-MigrationsAdminTest-2069277652-project-member] [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] Failed to deallocate network for instance. Error: Networking client is experiencing an unauthorized exception.: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception.
[ 1217.568903] env[67169]: DEBUG oslo_concurrency.lockutils [None req-a3591b6f-cb28-42b6-8fac-da83059b80ed tempest-ListServerFiltersTestJSON-1026077778 tempest-ListServerFiltersTestJSON-1026077778-project-member] Lock "d964ad35-8d3f-45f3-b799-aebddf295012" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 218.571s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1217.583440] env[67169]: ERROR nova.compute.manager [None req-ffbf85f6-8c28-467c-87fa-6e8c39cd384e tempest-MigrationsAdminTest-2069277652 tempest-MigrationsAdminTest-2069277652-project-member] [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] Setting instance vm_state to ERROR: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception.
[ 1217.583440] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] Traceback (most recent call last):
[ 1217.583440] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922]   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1217.583440] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922]     ret = obj(*args, **kwargs)
[ 1217.583440] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922]   File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response
[ 1217.583440] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922]     exception_handler_v20(status_code, error_body)
[ 1217.583440] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922]   File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20
[ 1217.583440] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922]     raise client_exc(message=error_message,
[ 1217.583440] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}}
[ 1217.583440] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] Neutron server returns request_ids: ['req-effbce55-2bf0-4f43-9d7e-1c5de499fbae']
[ 1217.583440] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922]
[ 1217.583440] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] During handling of the above exception, another exception occurred:
[ 1217.583440] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922]
[ 1217.583440] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] Traceback (most recent call last):
[ 1217.583440] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922]   File "/opt/stack/nova/nova/compute/manager.py", line 3315, in do_terminate_instance
[ 1217.583440] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922]     self._delete_instance(context, instance, bdms)
[ 1217.583440] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922]   File "/opt/stack/nova/nova/compute/manager.py", line 3250, in _delete_instance
[ 1217.583440] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922]     self._shutdown_instance(context, instance, bdms)
[ 1217.583440] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922]   File "/opt/stack/nova/nova/compute/manager.py", line 3144, in _shutdown_instance
[ 1217.583440] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922]     self._try_deallocate_network(context, instance, requested_networks)
[ 1217.583440] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922]   File "/opt/stack/nova/nova/compute/manager.py", line 3058, in _try_deallocate_network
[ 1217.583440] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922]     with excutils.save_and_reraise_exception():
[ 1217.583440] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 1217.583440] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922]     self.force_reraise()
[ 1217.583440] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 1217.583440] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922]     raise self.value
[ 1217.583440] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922]   File "/opt/stack/nova/nova/compute/manager.py", line 3056, in _try_deallocate_network
[ 1217.583440] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922]     _deallocate_network_with_retries()
[ 1217.583440] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 436, in func
[ 1217.583440] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922]     return evt.wait()
[ 1217.583440] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 1217.583440] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922]     result = hub.switch()
[ 1217.584542] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 1217.584542] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922]     return self.greenlet.switch()
[ 1217.584542] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop
[ 1217.584542] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922]     result = func(*self.args, **self.kw)
[ 1217.584542] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func
[ 1217.584542] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922]     result = f(*args, **kwargs)
[ 1217.584542] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922]   File "/opt/stack/nova/nova/compute/manager.py", line 3045, in _deallocate_network_with_retries
[ 1217.584542] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922]     self._deallocate_network(
[ 1217.584542] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922]   File "/opt/stack/nova/nova/compute/manager.py", line 2265, in _deallocate_network
[ 1217.584542] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922]     self.network_api.deallocate_for_instance(
[ 1217.584542] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922]   File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance
[ 1217.584542] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922]     data = neutron.list_ports(**search_opts)
[ 1217.584542] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922]   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1217.584542] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922]     ret = obj(*args, **kwargs)
[ 1217.584542] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922]   File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports
[ 1217.584542] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922]     return self.list('ports', self.ports_path, retrieve_all,
[ 1217.584542] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922]   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1217.584542] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922]     ret = obj(*args, **kwargs)
[ 1217.584542] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922]   File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list
[ 1217.584542] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922]     for r in self._pagination(collection, path, **params):
[ 1217.584542] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922]   File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination
[ 1217.584542] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922]     res = self.get(path, params=params)
[ 1217.584542] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922]   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1217.584542] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922]     ret = obj(*args, **kwargs)
[ 1217.584542] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922]   File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get
[ 1217.584542] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922]     return self.retry_request("GET", action, body=body,
[ 1217.584542] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922]   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1217.584542] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922]     ret = obj(*args, **kwargs)
[ 1217.584542] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922]   File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request
[ 1217.584542] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922]     return self.do_request(method, action, body=body,
[ 1217.584542] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922]   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1217.584542] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922]     ret = obj(*args, **kwargs)
[ 1217.584542] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922]   File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request
[ 1217.585571] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922]     self._handle_fault_response(status_code, replybody, resp)
[ 1217.585571] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922]   File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper
[ 1217.585571] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922]     raise exception.NeutronAdminCredentialConfigurationInvalid()
[ 1217.585571] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception.
[ 1217.585571] env[67169]: ERROR nova.compute.manager [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922]
[ 1217.589019] env[67169]: DEBUG nova.compute.manager [None req-90998c77-a1b7-454e-8f72-88bd859f39b9 tempest-ListServerFiltersTestJSON-1026077778 tempest-ListServerFiltersTestJSON-1026077778-project-member] [instance: 0b78afae-71e9-4ba9-903a-03c8a98cd91e] Starting instance... {{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}}
[ 1217.612870] env[67169]: DEBUG nova.compute.manager [None req-90998c77-a1b7-454e-8f72-88bd859f39b9 tempest-ListServerFiltersTestJSON-1026077778 tempest-ListServerFiltersTestJSON-1026077778-project-member] [instance: 0b78afae-71e9-4ba9-903a-03c8a98cd91e] Instance disappeared before build. {{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}}
[ 1217.614540] env[67169]: DEBUG oslo_concurrency.lockutils [None req-ffbf85f6-8c28-467c-87fa-6e8c39cd384e tempest-MigrationsAdminTest-2069277652 tempest-MigrationsAdminTest-2069277652-project-member] Lock "7a42aeb9-0518-448d-a3a6-8e68d6497922" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.248s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1217.616365] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "7a42aeb9-0518-448d-a3a6-8e68d6497922" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 31.917s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1217.616554] env[67169]: INFO nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] During sync_power_state the instance has a pending task (deleting). Skip.
[ 1217.616730] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "7a42aeb9-0518-448d-a3a6-8e68d6497922" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1217.639219] env[67169]: DEBUG oslo_concurrency.lockutils [None req-90998c77-a1b7-454e-8f72-88bd859f39b9 tempest-ListServerFiltersTestJSON-1026077778 tempest-ListServerFiltersTestJSON-1026077778-project-member] Lock "0b78afae-71e9-4ba9-903a-03c8a98cd91e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 217.897s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1217.650975] env[67169]: DEBUG nova.compute.manager [None req-18d319a2-8ae3-447e-a465-6e3b089fcdc4 tempest-ListServerFiltersTestJSON-1026077778 tempest-ListServerFiltersTestJSON-1026077778-project-member] [instance: 54b1337f-4ac8-4718-b273-2f078782b491] Starting instance... {{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}}
[ 1217.674586] env[67169]: DEBUG nova.compute.manager [None req-18d319a2-8ae3-447e-a465-6e3b089fcdc4 tempest-ListServerFiltersTestJSON-1026077778 tempest-ListServerFiltersTestJSON-1026077778-project-member] [instance: 54b1337f-4ac8-4718-b273-2f078782b491] Instance disappeared before build. {{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}}
[ 1217.676336] env[67169]: INFO nova.compute.manager [None req-ffbf85f6-8c28-467c-87fa-6e8c39cd384e tempest-MigrationsAdminTest-2069277652 tempest-MigrationsAdminTest-2069277652-project-member] [instance: 7a42aeb9-0518-448d-a3a6-8e68d6497922] Successfully reverted task state from None on failure for instance.
[ 1217.680800] env[67169]: ERROR oslo_messaging.rpc.server [None req-ffbf85f6-8c28-467c-87fa-6e8c39cd384e tempest-MigrationsAdminTest-2069277652 tempest-MigrationsAdminTest-2069277652-project-member] Exception during message handling: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception.
[ 1217.680800] env[67169]: ERROR oslo_messaging.rpc.server Traceback (most recent call last):
[ 1217.680800] env[67169]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1217.680800] env[67169]: ERROR oslo_messaging.rpc.server     ret = obj(*args, **kwargs)
[ 1217.680800] env[67169]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response
[ 1217.680800] env[67169]: ERROR oslo_messaging.rpc.server     exception_handler_v20(status_code, error_body)
[ 1217.680800] env[67169]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20
[ 1217.680800] env[67169]: ERROR oslo_messaging.rpc.server     raise client_exc(message=error_message,
[ 1217.680800] env[67169]: ERROR oslo_messaging.rpc.server neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}}
[ 1217.680800] env[67169]: ERROR oslo_messaging.rpc.server Neutron server returns request_ids: ['req-effbce55-2bf0-4f43-9d7e-1c5de499fbae']
[ 1217.680800] env[67169]: ERROR oslo_messaging.rpc.server
[ 1217.680800] env[67169]: ERROR oslo_messaging.rpc.server During handling of the above exception, another exception occurred:
[ 1217.680800] env[67169]: ERROR oslo_messaging.rpc.server
[ 1217.680800] env[67169]: ERROR oslo_messaging.rpc.server Traceback (most recent call last):
[ 1217.680800] env[67169]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/server.py", line 165, in _process_incoming
[ 1217.680800] env[67169]: ERROR oslo_messaging.rpc.server     res = self.dispatcher.dispatch(message)
[ 1217.680800] env[67169]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/dispatcher.py", line 309, in dispatch
[ 1217.680800] env[67169]: ERROR oslo_messaging.rpc.server     return self._do_dispatch(endpoint, method, ctxt, args)
[ 1217.680800] env[67169]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/dispatcher.py", line 229, in _do_dispatch
[ 1217.680800] env[67169]: ERROR oslo_messaging.rpc.server     result = func(ctxt, **new_args)
[ 1217.680800] env[67169]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/exception_wrapper.py", line 65, in wrapped
[ 1217.680800] env[67169]: ERROR oslo_messaging.rpc.server     with excutils.save_and_reraise_exception():
[ 1217.680800] env[67169]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 1217.680800] env[67169]: ERROR oslo_messaging.rpc.server     self.force_reraise()
[ 1217.680800] env[67169]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 1217.680800] env[67169]: ERROR oslo_messaging.rpc.server     raise self.value
[ 1217.680800] env[67169]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/exception_wrapper.py", line 63, in wrapped
[ 1217.680800] env[67169]: ERROR oslo_messaging.rpc.server     return f(self, context, *args, **kw)
[ 1217.680800] env[67169]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/compute/manager.py", line 166, in decorated_function
[ 1217.680800] env[67169]: ERROR oslo_messaging.rpc.server     with excutils.save_and_reraise_exception():
[ 1217.680800] env[67169]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 1217.680800] env[67169]: ERROR oslo_messaging.rpc.server     self.force_reraise()
[ 1217.680800] env[67169]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 1217.680800] env[67169]: ERROR oslo_messaging.rpc.server     raise self.value
[ 1217.680800] env[67169]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/compute/manager.py", line 157, in decorated_function
[ 1217.680800] env[67169]: ERROR oslo_messaging.rpc.server     return function(self, context, *args, **kwargs)
[ 1217.680800] env[67169]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/compute/utils.py", line 1439, in decorated_function
[ 1217.680800] env[67169]: ERROR oslo_messaging.rpc.server     return function(self, context, *args, **kwargs)
[ 1217.680800] env[67169]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/compute/manager.py", line 213, in decorated_function
[ 1217.680800] env[67169]: ERROR oslo_messaging.rpc.server     with excutils.save_and_reraise_exception():
[ 1217.680800] env[67169]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 1217.680800] env[67169]: ERROR oslo_messaging.rpc.server     self.force_reraise()
[ 1217.680800] env[67169]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 1217.680800] env[67169]: ERROR oslo_messaging.rpc.server     raise self.value
[ 1217.682300] env[67169]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/compute/manager.py", line 203, in decorated_function
[ 1217.682300] env[67169]: ERROR oslo_messaging.rpc.server     return function(self, context, *args, **kwargs)
[ 1217.682300] env[67169]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/compute/manager.py", line 3327, in terminate_instance
[ 1217.682300] env[67169]: ERROR oslo_messaging.rpc.server     do_terminate_instance(instance, bdms)
[ 1217.682300] env[67169]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py", line 414, in inner
[ 1217.682300] env[67169]: ERROR oslo_messaging.rpc.server     return f(*args, **kwargs)
[ 1217.682300] env[67169]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/compute/manager.py", line 3322, in do_terminate_instance
[ 1217.682300] env[67169]: ERROR oslo_messaging.rpc.server     with excutils.save_and_reraise_exception():
[ 1217.682300] env[67169]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 1217.682300] env[67169]: ERROR oslo_messaging.rpc.server     self.force_reraise()
[ 1217.682300] env[67169]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 1217.682300] env[67169]: ERROR oslo_messaging.rpc.server     raise self.value
[ 1217.682300] env[67169]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/compute/manager.py", line 3315, in do_terminate_instance
[ 1217.682300] env[67169]: ERROR oslo_messaging.rpc.server     self._delete_instance(context, instance, bdms)
[ 1217.682300] env[67169]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/compute/manager.py", line 3250, in _delete_instance
[ 1217.682300] env[67169]: ERROR oslo_messaging.rpc.server     self._shutdown_instance(context, instance, bdms)
[ 1217.682300] env[67169]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/compute/manager.py", line 3144, in _shutdown_instance
[ 1217.682300] env[67169]: ERROR oslo_messaging.rpc.server     self._try_deallocate_network(context, instance, requested_networks)
[ 1217.682300] env[67169]: ERROR oslo_messaging.rpc.server   File
"/opt/stack/nova/nova/compute/manager.py", line 3058, in _try_deallocate_network [ 1217.682300] env[67169]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1217.682300] env[67169]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1217.682300] env[67169]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1217.682300] env[67169]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1217.682300] env[67169]: ERROR oslo_messaging.rpc.server raise self.value [ 1217.682300] env[67169]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3056, in _try_deallocate_network [ 1217.682300] env[67169]: ERROR oslo_messaging.rpc.server _deallocate_network_with_retries() [ 1217.682300] env[67169]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 436, in func [ 1217.682300] env[67169]: ERROR oslo_messaging.rpc.server return evt.wait() [ 1217.682300] env[67169]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1217.682300] env[67169]: ERROR oslo_messaging.rpc.server result = hub.switch() [ 1217.682300] env[67169]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1217.682300] env[67169]: ERROR oslo_messaging.rpc.server return self.greenlet.switch() [ 1217.682300] env[67169]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 1217.682300] env[67169]: ERROR oslo_messaging.rpc.server result = func(*self.args, **self.kw) [ 1217.682300] env[67169]: ERROR oslo_messaging.rpc.server File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 1217.682300] env[67169]: ERROR oslo_messaging.rpc.server result = f(*args, **kwargs) [ 1217.682300] env[67169]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3045, in _deallocate_network_with_retries [ 1217.682300] env[67169]: ERROR oslo_messaging.rpc.server self._deallocate_network( [ 1217.682300] env[67169]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 2265, in _deallocate_network [ 1217.682300] env[67169]: ERROR oslo_messaging.rpc.server self.network_api.deallocate_for_instance( [ 1217.682300] env[67169]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1217.682300] env[67169]: ERROR oslo_messaging.rpc.server data = neutron.list_ports(**search_opts) [ 1217.682300] env[67169]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1217.682300] env[67169]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1217.682300] env[67169]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1217.682300] env[67169]: ERROR oslo_messaging.rpc.server return self.list('ports', self.ports_path, retrieve_all, [ 1217.682300] env[67169]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1217.683922] env[67169]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1217.683922] env[67169]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1217.683922] env[67169]: ERROR oslo_messaging.rpc.server for r in self._pagination(collection, path, **params): [ 1217.683922] env[67169]: ERROR oslo_messaging.rpc.server File 
"/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1217.683922] env[67169]: ERROR oslo_messaging.rpc.server res = self.get(path, params=params) [ 1217.683922] env[67169]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1217.683922] env[67169]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1217.683922] env[67169]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1217.683922] env[67169]: ERROR oslo_messaging.rpc.server return self.retry_request("GET", action, body=body, [ 1217.683922] env[67169]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1217.683922] env[67169]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1217.683922] env[67169]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1217.683922] env[67169]: ERROR oslo_messaging.rpc.server return self.do_request(method, action, body=body, [ 1217.683922] env[67169]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1217.683922] env[67169]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1217.683922] env[67169]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1217.683922] env[67169]: ERROR oslo_messaging.rpc.server self._handle_fault_response(status_code, replybody, resp) [ 1217.683922] env[67169]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 1217.683922] env[67169]: ERROR oslo_messaging.rpc.server raise exception.NeutronAdminCredentialConfigurationInvalid() [ 1217.683922] env[67169]: ERROR oslo_messaging.rpc.server 
nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1217.683922] env[67169]: ERROR oslo_messaging.rpc.server [ 1217.699592] env[67169]: DEBUG oslo_concurrency.lockutils [None req-18d319a2-8ae3-447e-a465-6e3b089fcdc4 tempest-ListServerFiltersTestJSON-1026077778 tempest-ListServerFiltersTestJSON-1026077778-project-member] Lock "54b1337f-4ac8-4718-b273-2f078782b491" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 217.036s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1217.710715] env[67169]: DEBUG nova.compute.manager [None req-e7183d51-1057-45cf-8e24-24eae5cf8020 tempest-ServerRescueNegativeTestJSON-303097852 tempest-ServerRescueNegativeTestJSON-303097852-project-member] [instance: bf6857fb-2088-4e2c-b1a4-4c4b631f0153] Starting instance... {{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1217.736846] env[67169]: DEBUG nova.compute.manager [None req-e7183d51-1057-45cf-8e24-24eae5cf8020 tempest-ServerRescueNegativeTestJSON-303097852 tempest-ServerRescueNegativeTestJSON-303097852-project-member] [instance: bf6857fb-2088-4e2c-b1a4-4c4b631f0153] Instance disappeared before build. 
{{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1217.761786] env[67169]: DEBUG oslo_concurrency.lockutils [None req-e7183d51-1057-45cf-8e24-24eae5cf8020 tempest-ServerRescueNegativeTestJSON-303097852 tempest-ServerRescueNegativeTestJSON-303097852-project-member] Lock "bf6857fb-2088-4e2c-b1a4-4c4b631f0153" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 208.466s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1217.770788] env[67169]: DEBUG nova.compute.manager [None req-85a1e9a2-871f-47fc-bc64-588687eba07f tempest-ServerRescueNegativeTestJSON-303097852 tempest-ServerRescueNegativeTestJSON-303097852-project-member] [instance: cdca51b4-b059-48b6-ae81-ced1a447f10d] Starting instance... {{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1217.794918] env[67169]: DEBUG nova.compute.manager [None req-85a1e9a2-871f-47fc-bc64-588687eba07f tempest-ServerRescueNegativeTestJSON-303097852 tempest-ServerRescueNegativeTestJSON-303097852-project-member] [instance: cdca51b4-b059-48b6-ae81-ced1a447f10d] Instance disappeared before build. 
{{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1217.829965] env[67169]: DEBUG oslo_concurrency.lockutils [None req-85a1e9a2-871f-47fc-bc64-588687eba07f tempest-ServerRescueNegativeTestJSON-303097852 tempest-ServerRescueNegativeTestJSON-303097852-project-member] Lock "cdca51b4-b059-48b6-ae81-ced1a447f10d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 206.287s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1217.839304] env[67169]: DEBUG nova.compute.manager [None req-c978442b-871a-4422-ae40-03e18147f708 tempest-ServerRescueTestJSON-1216610948 tempest-ServerRescueTestJSON-1216610948-project-member] [instance: 1a04a0fd-11d5-4fce-ba32-d90e39a13ff9] Starting instance... {{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1217.864707] env[67169]: DEBUG nova.compute.manager [None req-c978442b-871a-4422-ae40-03e18147f708 tempest-ServerRescueTestJSON-1216610948 tempest-ServerRescueTestJSON-1216610948-project-member] [instance: 1a04a0fd-11d5-4fce-ba32-d90e39a13ff9] Instance disappeared before build. 
{{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1217.885189] env[67169]: DEBUG oslo_concurrency.lockutils [None req-c978442b-871a-4422-ae40-03e18147f708 tempest-ServerRescueTestJSON-1216610948 tempest-ServerRescueTestJSON-1216610948-project-member] Lock "1a04a0fd-11d5-4fce-ba32-d90e39a13ff9" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 197.589s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1217.893806] env[67169]: DEBUG nova.compute.manager [None req-ed9e34ee-fd50-4e03-852b-e42a37baf99a tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec] Starting instance... {{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1217.945877] env[67169]: DEBUG oslo_concurrency.lockutils [None req-ed9e34ee-fd50-4e03-852b-e42a37baf99a tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1217.946159] env[67169]: DEBUG oslo_concurrency.lockutils [None req-ed9e34ee-fd50-4e03-852b-e42a37baf99a tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1217.947631] env[67169]: INFO nova.compute.claims [None req-ed9e34ee-fd50-4e03-852b-e42a37baf99a tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec] Claim successful on node 
domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1218.246651] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dcfd308b-7354-4961-8e1f-8ddaa849ab23 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1218.255668] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6618e567-3639-49eb-aa69-23187dff6aa9 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1218.289882] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e7074950-92e4-47aa-b957-cbfe53414351 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1218.297614] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5654bbad-d341-44cc-9d25-d32961e466c4 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1218.315030] env[67169]: DEBUG nova.compute.provider_tree [None req-ed9e34ee-fd50-4e03-852b-e42a37baf99a tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Inventory has not changed in ProviderTree for provider: 6570906a-ac37-4859-b1f2-4bbacc48d3f3 {{(pid=67169) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1218.328848] env[67169]: DEBUG nova.scheduler.client.report [None req-ed9e34ee-fd50-4e03-852b-e42a37baf99a tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Inventory has not changed for provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 
'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 171, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67169) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1218.352221] env[67169]: DEBUG oslo_concurrency.lockutils [None req-ed9e34ee-fd50-4e03-852b-e42a37baf99a tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.406s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1218.352738] env[67169]: DEBUG nova.compute.manager [None req-ed9e34ee-fd50-4e03-852b-e42a37baf99a tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec] Start building networks asynchronously for instance. {{(pid=67169) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1218.396673] env[67169]: DEBUG nova.compute.utils [None req-ed9e34ee-fd50-4e03-852b-e42a37baf99a tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Using /dev/sd instead of None {{(pid=67169) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1218.398015] env[67169]: DEBUG nova.compute.manager [None req-ed9e34ee-fd50-4e03-852b-e42a37baf99a tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec] Allocating IP information in the background. 
{{(pid=67169) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1218.398382] env[67169]: DEBUG nova.network.neutron [None req-ed9e34ee-fd50-4e03-852b-e42a37baf99a tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec] allocate_for_instance() {{(pid=67169) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1218.407118] env[67169]: DEBUG nova.compute.manager [None req-ed9e34ee-fd50-4e03-852b-e42a37baf99a tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec] Start building block device mappings for instance. {{(pid=67169) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1218.480799] env[67169]: DEBUG nova.compute.manager [None req-ed9e34ee-fd50-4e03-852b-e42a37baf99a tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec] Start spawning the instance on the hypervisor. 
{{(pid=67169) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1218.493690] env[67169]: DEBUG nova.policy [None req-ed9e34ee-fd50-4e03-852b-e42a37baf99a tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4d789ec14c2b4d62be952753fb47f0f7', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '00d358bc61014b5cb3ddcdab7785e7e8', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67169) authorize /opt/stack/nova/nova/policy.py:203}} [ 1218.523364] env[67169]: DEBUG nova.virt.hardware [None req-ed9e34ee-fd50-4e03-852b-e42a37baf99a tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-03-06T19:55:10Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-03-06T19:54:55Z,direct_url=,disk_format='vmdk',id=285931c9-8b83-4997-8c4d-6a79005e36ba,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c31f6504bb73492890b262ff43fdf9bc',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-03-06T19:54:55Z,virtual_size=,visibility=), allow threads: False {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1218.523364] env[67169]: DEBUG nova.virt.hardware [None req-ed9e34ee-fd50-4e03-852b-e42a37baf99a 
tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Flavor limits 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1218.523364] env[67169]: DEBUG nova.virt.hardware [None req-ed9e34ee-fd50-4e03-852b-e42a37baf99a tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Image limits 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1218.523364] env[67169]: DEBUG nova.virt.hardware [None req-ed9e34ee-fd50-4e03-852b-e42a37baf99a tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Flavor pref 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1218.523364] env[67169]: DEBUG nova.virt.hardware [None req-ed9e34ee-fd50-4e03-852b-e42a37baf99a tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Image pref 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1218.523364] env[67169]: DEBUG nova.virt.hardware [None req-ed9e34ee-fd50-4e03-852b-e42a37baf99a tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1218.524735] env[67169]: DEBUG nova.virt.hardware [None req-ed9e34ee-fd50-4e03-852b-e42a37baf99a tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1218.525055] env[67169]: DEBUG nova.virt.hardware [None req-ed9e34ee-fd50-4e03-852b-e42a37baf99a tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] 
Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67169) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1218.525710] env[67169]: DEBUG nova.virt.hardware [None req-ed9e34ee-fd50-4e03-852b-e42a37baf99a tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Got 1 possible topologies {{(pid=67169) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1218.526032] env[67169]: DEBUG nova.virt.hardware [None req-ed9e34ee-fd50-4e03-852b-e42a37baf99a tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1218.526344] env[67169]: DEBUG nova.virt.hardware [None req-ed9e34ee-fd50-4e03-852b-e42a37baf99a tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1218.529028] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-82127e5b-88a4-4cf9-8c0c-d1b51554fb17 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1218.537866] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-441fe2c8-a3cf-448f-8d07-6ae07ecc02d4 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1218.961491] env[67169]: DEBUG nova.network.neutron [None req-ed9e34ee-fd50-4e03-852b-e42a37baf99a tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec] Successfully created port: 6dcfd6f5-386c-4f9e-95f6-e618021158bb {{(pid=67169) _create_port_minimal 
/opt/stack/nova/nova/network/neutron.py:548}} [ 1220.046178] env[67169]: DEBUG nova.compute.manager [req-261b9e29-430b-422f-b272-b81702c527ae req-e8b96616-2b10-4d5b-a750-124fff9ce6a6 service nova] [instance: 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec] Received event network-vif-plugged-6dcfd6f5-386c-4f9e-95f6-e618021158bb {{(pid=67169) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1220.046437] env[67169]: DEBUG oslo_concurrency.lockutils [req-261b9e29-430b-422f-b272-b81702c527ae req-e8b96616-2b10-4d5b-a750-124fff9ce6a6 service nova] Acquiring lock "37d7b647-f1ab-494a-8b5a-8e25eec0b9ec-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1220.046614] env[67169]: DEBUG oslo_concurrency.lockutils [req-261b9e29-430b-422f-b272-b81702c527ae req-e8b96616-2b10-4d5b-a750-124fff9ce6a6 service nova] Lock "37d7b647-f1ab-494a-8b5a-8e25eec0b9ec-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1220.046908] env[67169]: DEBUG oslo_concurrency.lockutils [req-261b9e29-430b-422f-b272-b81702c527ae req-e8b96616-2b10-4d5b-a750-124fff9ce6a6 service nova] Lock "37d7b647-f1ab-494a-8b5a-8e25eec0b9ec-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1220.046970] env[67169]: DEBUG nova.compute.manager [req-261b9e29-430b-422f-b272-b81702c527ae req-e8b96616-2b10-4d5b-a750-124fff9ce6a6 service nova] [instance: 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec] No waiting events found dispatching network-vif-plugged-6dcfd6f5-386c-4f9e-95f6-e618021158bb {{(pid=67169) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 
[ 1220.047139] env[67169]: WARNING nova.compute.manager [req-261b9e29-430b-422f-b272-b81702c527ae req-e8b96616-2b10-4d5b-a750-124fff9ce6a6 service nova] [instance: 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec] Received unexpected event network-vif-plugged-6dcfd6f5-386c-4f9e-95f6-e618021158bb for instance with vm_state building and task_state spawning.
[ 1220.062700] env[67169]: DEBUG nova.network.neutron [None req-ed9e34ee-fd50-4e03-852b-e42a37baf99a tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec] Successfully updated port: 6dcfd6f5-386c-4f9e-95f6-e618021158bb {{(pid=67169) _update_port /opt/stack/nova/nova/network/neutron.py:586}}
[ 1220.077125] env[67169]: DEBUG oslo_concurrency.lockutils [None req-ed9e34ee-fd50-4e03-852b-e42a37baf99a tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Acquiring lock "refresh_cache-37d7b647-f1ab-494a-8b5a-8e25eec0b9ec" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1220.077304] env[67169]: DEBUG oslo_concurrency.lockutils [None req-ed9e34ee-fd50-4e03-852b-e42a37baf99a tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Acquired lock "refresh_cache-37d7b647-f1ab-494a-8b5a-8e25eec0b9ec" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1220.077573] env[67169]: DEBUG nova.network.neutron [None req-ed9e34ee-fd50-4e03-852b-e42a37baf99a tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec] Building network info cache for instance {{(pid=67169) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 1220.111897] env[67169]: DEBUG nova.network.neutron [None req-ed9e34ee-fd50-4e03-852b-e42a37baf99a tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec] Instance cache missing network info. {{(pid=67169) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 1220.271640] env[67169]: DEBUG nova.network.neutron [None req-ed9e34ee-fd50-4e03-852b-e42a37baf99a tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec] Updating instance_info_cache with network_info: [{"id": "6dcfd6f5-386c-4f9e-95f6-e618021158bb", "address": "fa:16:3e:96:26:a6", "network": {"id": "ee7bdc29-2aab-4fc5-9b52-cee22ee0f249", "bridge": "br-int", "label": "tempest-ImagesTestJSON-634733000-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "00d358bc61014b5cb3ddcdab7785e7e8", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "20e3f794-c7a3-4696-9488-ecf34c570ef9", "external-id": "nsx-vlan-transportzone-509", "segmentation_id": 509, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap6dcfd6f5-38", "ovs_interfaceid": "6dcfd6f5-386c-4f9e-95f6-e618021158bb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1220.306211] env[67169]: DEBUG oslo_concurrency.lockutils [None req-ed9e34ee-fd50-4e03-852b-e42a37baf99a tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Releasing lock "refresh_cache-37d7b647-f1ab-494a-8b5a-8e25eec0b9ec" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1220.306626] env[67169]: DEBUG nova.compute.manager [None req-ed9e34ee-fd50-4e03-852b-e42a37baf99a tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec] Instance network_info: |[{"id": "6dcfd6f5-386c-4f9e-95f6-e618021158bb", "address": "fa:16:3e:96:26:a6", "network": {"id": "ee7bdc29-2aab-4fc5-9b52-cee22ee0f249", "bridge": "br-int", "label": "tempest-ImagesTestJSON-634733000-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "00d358bc61014b5cb3ddcdab7785e7e8", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "20e3f794-c7a3-4696-9488-ecf34c570ef9", "external-id": "nsx-vlan-transportzone-509", "segmentation_id": 509, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap6dcfd6f5-38", "ovs_interfaceid": "6dcfd6f5-386c-4f9e-95f6-e618021158bb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67169) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}}
[ 1220.307618] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-ed9e34ee-fd50-4e03-852b-e42a37baf99a tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:96:26:a6', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '20e3f794-c7a3-4696-9488-ecf34c570ef9', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '6dcfd6f5-386c-4f9e-95f6-e618021158bb', 'vif_model': 'vmxnet3'}] {{(pid=67169) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}}
[ 1220.315896] env[67169]: DEBUG oslo.service.loopingcall [None req-ed9e34ee-fd50-4e03-852b-e42a37baf99a tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67169) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 1220.316320] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec] Creating VM on the ESX host {{(pid=67169) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}}
[ 1220.316558] env[67169]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-6e4b9033-a5e4-4b8a-ae1c-853e34835aeb {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1220.336973] env[67169]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){
[ 1220.336973] env[67169]: value = "task-2819174"
[ 1220.336973] env[67169]: _type = "Task"
[ 1220.336973] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1220.345194] env[67169]: DEBUG oslo_vmware.api [-] Task: {'id': task-2819174, 'name': CreateVM_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1220.849405] env[67169]: DEBUG oslo_vmware.api [-] Task: {'id': task-2819174, 'name': CreateVM_Task, 'duration_secs': 0.33378} completed successfully. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 1220.849628] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec] Created VM on the ESX host {{(pid=67169) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}}
[ 1220.857876] env[67169]: DEBUG oslo_concurrency.lockutils [None req-ed9e34ee-fd50-4e03-852b-e42a37baf99a tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1220.858102] env[67169]: DEBUG oslo_concurrency.lockutils [None req-ed9e34ee-fd50-4e03-852b-e42a37baf99a tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Acquired lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1220.858425] env[67169]: DEBUG oslo_concurrency.lockutils [None req-ed9e34ee-fd50-4e03-852b-e42a37baf99a tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}}
[ 1220.858754] env[67169]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-fd6c72a1-1641-4b5c-9ec3-d73e5734f9a1 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1220.863478] env[67169]: DEBUG oslo_vmware.api [None req-ed9e34ee-fd50-4e03-852b-e42a37baf99a tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Waiting for the task: (returnval){
[ 1220.863478] env[67169]: value = "session[52518c09-e5f2-035d-4dc1-3e29ddf94015]52a56178-ce1d-7b3a-119b-e5fbc16d558d"
[ 1220.863478] env[67169]: _type = "Task"
[ 1220.863478] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1220.871402] env[67169]: DEBUG oslo_vmware.api [None req-ed9e34ee-fd50-4e03-852b-e42a37baf99a tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Task: {'id': session[52518c09-e5f2-035d-4dc1-3e29ddf94015]52a56178-ce1d-7b3a-119b-e5fbc16d558d, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1221.017910] env[67169]: DEBUG oslo_concurrency.lockutils [None req-192fb336-cf94-4f7d-b677-3d97ac548bab tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Acquiring lock "37d7b647-f1ab-494a-8b5a-8e25eec0b9ec" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1221.374357] env[67169]: DEBUG oslo_concurrency.lockutils [None req-ed9e34ee-fd50-4e03-852b-e42a37baf99a tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Releasing lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1221.374734] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-ed9e34ee-fd50-4e03-852b-e42a37baf99a tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec] Processing image 285931c9-8b83-4997-8c4d-6a79005e36ba {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}}
[ 1221.374846] env[67169]: DEBUG oslo_concurrency.lockutils [None req-ed9e34ee-fd50-4e03-852b-e42a37baf99a tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1222.070613] env[67169]: DEBUG nova.compute.manager [req-db10290e-8c0d-4087-b46e-18d614512706 req-b2cbc133-f44b-4613-8f9d-b73196851c8f service nova] [instance: 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec] Received event network-changed-6dcfd6f5-386c-4f9e-95f6-e618021158bb {{(pid=67169) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}}
[ 1222.070797] env[67169]: DEBUG nova.compute.manager [req-db10290e-8c0d-4087-b46e-18d614512706 req-b2cbc133-f44b-4613-8f9d-b73196851c8f service nova] [instance: 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec] Refreshing instance network info cache due to event network-changed-6dcfd6f5-386c-4f9e-95f6-e618021158bb. {{(pid=67169) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}}
[ 1222.070971] env[67169]: DEBUG oslo_concurrency.lockutils [req-db10290e-8c0d-4087-b46e-18d614512706 req-b2cbc133-f44b-4613-8f9d-b73196851c8f service nova] Acquiring lock "refresh_cache-37d7b647-f1ab-494a-8b5a-8e25eec0b9ec" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1222.071162] env[67169]: DEBUG oslo_concurrency.lockutils [req-db10290e-8c0d-4087-b46e-18d614512706 req-b2cbc133-f44b-4613-8f9d-b73196851c8f service nova] Acquired lock "refresh_cache-37d7b647-f1ab-494a-8b5a-8e25eec0b9ec" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1222.071350] env[67169]: DEBUG nova.network.neutron [req-db10290e-8c0d-4087-b46e-18d614512706 req-b2cbc133-f44b-4613-8f9d-b73196851c8f service nova] [instance: 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec] Refreshing network info cache for port 6dcfd6f5-386c-4f9e-95f6-e618021158bb {{(pid=67169) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}}
[ 1222.410212] env[67169]: DEBUG nova.network.neutron [req-db10290e-8c0d-4087-b46e-18d614512706 req-b2cbc133-f44b-4613-8f9d-b73196851c8f service nova] [instance: 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec] Updated VIF entry in instance network info cache for port 6dcfd6f5-386c-4f9e-95f6-e618021158bb. {{(pid=67169) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}}
[ 1222.410595] env[67169]: DEBUG nova.network.neutron [req-db10290e-8c0d-4087-b46e-18d614512706 req-b2cbc133-f44b-4613-8f9d-b73196851c8f service nova] [instance: 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec] Updating instance_info_cache with network_info: [{"id": "6dcfd6f5-386c-4f9e-95f6-e618021158bb", "address": "fa:16:3e:96:26:a6", "network": {"id": "ee7bdc29-2aab-4fc5-9b52-cee22ee0f249", "bridge": "br-int", "label": "tempest-ImagesTestJSON-634733000-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "00d358bc61014b5cb3ddcdab7785e7e8", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "20e3f794-c7a3-4696-9488-ecf34c570ef9", "external-id": "nsx-vlan-transportzone-509", "segmentation_id": 509, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap6dcfd6f5-38", "ovs_interfaceid": "6dcfd6f5-386c-4f9e-95f6-e618021158bb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1222.420051] env[67169]: DEBUG oslo_concurrency.lockutils [req-db10290e-8c0d-4087-b46e-18d614512706 req-b2cbc133-f44b-4613-8f9d-b73196851c8f service nova] Releasing lock "refresh_cache-37d7b647-f1ab-494a-8b5a-8e25eec0b9ec" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1245.658945] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1252.660525] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1252.660525] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Starting heal instance info cache {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}}
[ 1252.660525] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Rebuilding the list of instances to heal {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}}
[ 1252.690863] env[67169]: DEBUG oslo_concurrency.lockutils [None req-2bf3268b-6d3a-4277-831a-07b32245375f tempest-ServersTestMultiNic-730437419 tempest-ServersTestMultiNic-730437419-project-member] Acquiring lock "7b7c8f84-c2d4-442e-93d3-60124767d096" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1252.690863] env[67169]: DEBUG oslo_concurrency.lockutils [None req-2bf3268b-6d3a-4277-831a-07b32245375f tempest-ServersTestMultiNic-730437419 tempest-ServersTestMultiNic-730437419-project-member] Lock "7b7c8f84-c2d4-442e-93d3-60124767d096" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1252.698027] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 43b73a7c-eda8-4239-885f-d4fb8fa6f28a] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1252.698027] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1252.698027] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 47ffcce9-3afc-41be-b38e-dacfeb535a2c] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1252.698027] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1252.698027] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: ceec0dd3-097b-4ab4-8e16-420d40bbe3d5] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1252.700834] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: c86c3850-39bb-4a08-8dbf-f69bd8ca21c9] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1252.700834] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 7bf839c0-3ec8-4329-823d-de1fae4833cb] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1252.700834] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: bab5d630-fec0-44e5-8088-12c8855aad66] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1252.700834] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: a86fa702-2040-4e22-9eaa-5d64bc16f036] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1252.700834] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1252.700834] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Didn't find any instances for network info cache update. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}}
[ 1252.700834] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1252.700834] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=67169) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}}
[ 1252.700834] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1252.719361] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1252.720115] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1252.720115] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1252.720115] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67169) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}}
[ 1252.720965] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-515f4850-630c-42e8-b7d2-557f60702a63 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1252.735958] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ebe9aa79-3346-485e-ac18-9640b409e184 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1252.751351] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ff2ce948-90d8-4648-a977-983892b480f3 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1252.760018] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b2252fb9-dc84-4f27-9a8c-e83bffec6c19 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1252.791112] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181045MB free_disk=171GB free_vcpus=48 pci_devices=None {{(pid=67169) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}}
[ 1252.791294] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1252.791556] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1252.885161] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 43b73a7c-eda8-4239-885f-d4fb8fa6f28a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 1252.885332] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 1252.885462] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 47ffcce9-3afc-41be-b38e-dacfeb535a2c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 1252.885646] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 1f0f1960-0c77-4e72-86ee-807819e75d2a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 1252.885711] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance ceec0dd3-097b-4ab4-8e16-420d40bbe3d5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 1252.885829] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance c86c3850-39bb-4a08-8dbf-f69bd8ca21c9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 1252.885947] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 7bf839c0-3ec8-4329-823d-de1fae4833cb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 1252.886076] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance bab5d630-fec0-44e5-8088-12c8855aad66 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 1252.886193] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance a86fa702-2040-4e22-9eaa-5d64bc16f036 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 1252.886306] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 1252.903318] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 7817b417-599c-4619-8bd3-28d2e8236b9f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 1252.916571] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 4f57a0db-fe0b-4983-9e07-62485a53f918 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 1252.933276] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 3930edcb-c8ce-44f4-84ae-a2b59f99bc82 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 1252.946814] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 883a792f-ae72-4475-8592-3076c2c2c2ae has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 1252.960128] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 48376572-9e3a-4579-b2d7-b8b63312fab1 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 1252.977595] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 74ea66f0-391c-437b-8aee-f784528d7963 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 1252.991483] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 70ce9280-fb86-4e6a-a824-a174d44b4ec4 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 1253.002069] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 7b7c8f84-c2d4-442e-93d3-60124767d096 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 1253.002307] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67169) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}}
[ 1253.002462] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67169) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}}
[ 1253.182465] env[67169]: DEBUG oslo_concurrency.lockutils [None req-f38336b9-8a52-414c-9609-3b46d9804727 tempest-AttachVolumeNegativeTest-2045904794 tempest-AttachVolumeNegativeTest-2045904794-project-member] Acquiring lock "cbf88ee7-b392-46d5-8645-2b3bea0a53d6" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1253.182821] env[67169]: DEBUG oslo_concurrency.lockutils [None req-f38336b9-8a52-414c-9609-3b46d9804727 tempest-AttachVolumeNegativeTest-2045904794 tempest-AttachVolumeNegativeTest-2045904794-project-member] Lock "cbf88ee7-b392-46d5-8645-2b3bea0a53d6" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1253.238992] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0e9971b6-d63d-406d-844b-1778955d51b7 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1253.246934] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-31c2b0a9-bbc6-4107-8da7-efb9d998c28f {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1253.277428] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-23707063-1f93-44e6-b9eb-8c2b01863b50 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1253.284954] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6442072c-6cff-49f2-b200-30647690af24 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1253.298036] env[67169]: DEBUG nova.compute.provider_tree [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Inventory has not changed in ProviderTree for provider: 6570906a-ac37-4859-b1f2-4bbacc48d3f3 {{(pid=67169) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1253.307027] env[67169]: DEBUG nova.scheduler.client.report [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Inventory has not changed for provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 171, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67169) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1253.321058] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67169) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}}
[ 1253.321261] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.530s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1254.315654] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1254.658312] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1254.658562] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1255.658857] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1255.659230] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1256.659559] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1265.574845] env[67169]: WARNING oslo_vmware.rw_handles [None req-8ae57ef2-fe50-47ae-b868-f7b19963d43c tempest-ServersWithSpecificFlavorTestJSON-890966063 tempest-ServersWithSpecificFlavorTestJSON-890966063-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response
[ 1265.574845] env[67169]: ERROR oslo_vmware.rw_handles Traceback (most recent call last):
[ 1265.574845] env[67169]: ERROR oslo_vmware.rw_handles   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close
[ 1265.574845] env[67169]: ERROR oslo_vmware.rw_handles     self._conn.getresponse()
[ 1265.574845] env[67169]: ERROR oslo_vmware.rw_handles   File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse
[ 1265.574845] env[67169]: ERROR oslo_vmware.rw_handles     response.begin()
[ 1265.574845] env[67169]: ERROR oslo_vmware.rw_handles   File "/usr/lib/python3.10/http/client.py", line 318, in begin
[ 1265.574845] env[67169]: ERROR oslo_vmware.rw_handles     version, status, reason = self._read_status()
[ 1265.574845] env[67169]: ERROR oslo_vmware.rw_handles   File "/usr/lib/python3.10/http/client.py", line 287, in _read_status
[ 1265.574845] env[67169]: ERROR oslo_vmware.rw_handles     raise RemoteDisconnected("Remote end closed connection without"
[ 1265.574845] env[67169]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response
[ 1265.574845] env[67169]: ERROR oslo_vmware.rw_handles
[ 1265.575838] env[67169]: DEBUG nova.virt.vmwareapi.images [None
req-8ae57ef2-fe50-47ae-b868-f7b19963d43c tempest-ServersWithSpecificFlavorTestJSON-890966063 tempest-ServersWithSpecificFlavorTestJSON-890966063-project-member] [instance: 43b73a7c-eda8-4239-885f-d4fb8fa6f28a] Downloaded image file data 285931c9-8b83-4997-8c4d-6a79005e36ba to vmware_temp/1d947596-3eb6-476a-bb6b-5e4df5b388f9/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk on the data store datastore2 {{(pid=67169) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1265.577325] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-8ae57ef2-fe50-47ae-b868-f7b19963d43c tempest-ServersWithSpecificFlavorTestJSON-890966063 tempest-ServersWithSpecificFlavorTestJSON-890966063-project-member] [instance: 43b73a7c-eda8-4239-885f-d4fb8fa6f28a] Caching image {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1265.577604] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [None req-8ae57ef2-fe50-47ae-b868-f7b19963d43c tempest-ServersWithSpecificFlavorTestJSON-890966063 tempest-ServersWithSpecificFlavorTestJSON-890966063-project-member] Copying Virtual Disk [datastore2] vmware_temp/1d947596-3eb6-476a-bb6b-5e4df5b388f9/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk to [datastore2] vmware_temp/1d947596-3eb6-476a-bb6b-5e4df5b388f9/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk {{(pid=67169) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1265.577938] env[67169]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-f842c668-42f2-4368-b146-17284cf6eb1e {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1265.586855] env[67169]: DEBUG oslo_vmware.api [None req-8ae57ef2-fe50-47ae-b868-f7b19963d43c tempest-ServersWithSpecificFlavorTestJSON-890966063 tempest-ServersWithSpecificFlavorTestJSON-890966063-project-member] Waiting for the task: (returnval){ [ 1265.586855] 
env[67169]: value = "task-2819175" [ 1265.586855] env[67169]: _type = "Task" [ 1265.586855] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1265.595296] env[67169]: DEBUG oslo_vmware.api [None req-8ae57ef2-fe50-47ae-b868-f7b19963d43c tempest-ServersWithSpecificFlavorTestJSON-890966063 tempest-ServersWithSpecificFlavorTestJSON-890966063-project-member] Task: {'id': task-2819175, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1266.099574] env[67169]: DEBUG oslo_vmware.exceptions [None req-8ae57ef2-fe50-47ae-b868-f7b19963d43c tempest-ServersWithSpecificFlavorTestJSON-890966063 tempest-ServersWithSpecificFlavorTestJSON-890966063-project-member] Fault InvalidArgument not matched. {{(pid=67169) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1266.099886] env[67169]: DEBUG oslo_concurrency.lockutils [None req-8ae57ef2-fe50-47ae-b868-f7b19963d43c tempest-ServersWithSpecificFlavorTestJSON-890966063 tempest-ServersWithSpecificFlavorTestJSON-890966063-project-member] Releasing lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1266.100482] env[67169]: ERROR nova.compute.manager [None req-8ae57ef2-fe50-47ae-b868-f7b19963d43c tempest-ServersWithSpecificFlavorTestJSON-890966063 tempest-ServersWithSpecificFlavorTestJSON-890966063-project-member] [instance: 43b73a7c-eda8-4239-885f-d4fb8fa6f28a] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1266.100482] env[67169]: Faults: ['InvalidArgument'] [ 1266.100482] env[67169]: ERROR nova.compute.manager [instance: 43b73a7c-eda8-4239-885f-d4fb8fa6f28a] 
Traceback (most recent call last): [ 1266.100482] env[67169]: ERROR nova.compute.manager [instance: 43b73a7c-eda8-4239-885f-d4fb8fa6f28a] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1266.100482] env[67169]: ERROR nova.compute.manager [instance: 43b73a7c-eda8-4239-885f-d4fb8fa6f28a] yield resources [ 1266.100482] env[67169]: ERROR nova.compute.manager [instance: 43b73a7c-eda8-4239-885f-d4fb8fa6f28a] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1266.100482] env[67169]: ERROR nova.compute.manager [instance: 43b73a7c-eda8-4239-885f-d4fb8fa6f28a] self.driver.spawn(context, instance, image_meta, [ 1266.100482] env[67169]: ERROR nova.compute.manager [instance: 43b73a7c-eda8-4239-885f-d4fb8fa6f28a] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1266.100482] env[67169]: ERROR nova.compute.manager [instance: 43b73a7c-eda8-4239-885f-d4fb8fa6f28a] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1266.100482] env[67169]: ERROR nova.compute.manager [instance: 43b73a7c-eda8-4239-885f-d4fb8fa6f28a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1266.100482] env[67169]: ERROR nova.compute.manager [instance: 43b73a7c-eda8-4239-885f-d4fb8fa6f28a] self._fetch_image_if_missing(context, vi) [ 1266.100482] env[67169]: ERROR nova.compute.manager [instance: 43b73a7c-eda8-4239-885f-d4fb8fa6f28a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1266.100482] env[67169]: ERROR nova.compute.manager [instance: 43b73a7c-eda8-4239-885f-d4fb8fa6f28a] image_cache(vi, tmp_image_ds_loc) [ 1266.100482] env[67169]: ERROR nova.compute.manager [instance: 43b73a7c-eda8-4239-885f-d4fb8fa6f28a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1266.100482] env[67169]: ERROR nova.compute.manager [instance: 43b73a7c-eda8-4239-885f-d4fb8fa6f28a] vm_util.copy_virtual_disk( [ 
1266.100482] env[67169]: ERROR nova.compute.manager [instance: 43b73a7c-eda8-4239-885f-d4fb8fa6f28a] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1266.100482] env[67169]: ERROR nova.compute.manager [instance: 43b73a7c-eda8-4239-885f-d4fb8fa6f28a] session._wait_for_task(vmdk_copy_task) [ 1266.100482] env[67169]: ERROR nova.compute.manager [instance: 43b73a7c-eda8-4239-885f-d4fb8fa6f28a] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1266.100482] env[67169]: ERROR nova.compute.manager [instance: 43b73a7c-eda8-4239-885f-d4fb8fa6f28a] return self.wait_for_task(task_ref) [ 1266.100482] env[67169]: ERROR nova.compute.manager [instance: 43b73a7c-eda8-4239-885f-d4fb8fa6f28a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1266.100482] env[67169]: ERROR nova.compute.manager [instance: 43b73a7c-eda8-4239-885f-d4fb8fa6f28a] return evt.wait() [ 1266.100482] env[67169]: ERROR nova.compute.manager [instance: 43b73a7c-eda8-4239-885f-d4fb8fa6f28a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1266.100482] env[67169]: ERROR nova.compute.manager [instance: 43b73a7c-eda8-4239-885f-d4fb8fa6f28a] result = hub.switch() [ 1266.100482] env[67169]: ERROR nova.compute.manager [instance: 43b73a7c-eda8-4239-885f-d4fb8fa6f28a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1266.100482] env[67169]: ERROR nova.compute.manager [instance: 43b73a7c-eda8-4239-885f-d4fb8fa6f28a] return self.greenlet.switch() [ 1266.100482] env[67169]: ERROR nova.compute.manager [instance: 43b73a7c-eda8-4239-885f-d4fb8fa6f28a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1266.100482] env[67169]: ERROR nova.compute.manager [instance: 43b73a7c-eda8-4239-885f-d4fb8fa6f28a] self.f(*self.args, **self.kw) [ 
1266.100482] env[67169]: ERROR nova.compute.manager [instance: 43b73a7c-eda8-4239-885f-d4fb8fa6f28a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1266.100482] env[67169]: ERROR nova.compute.manager [instance: 43b73a7c-eda8-4239-885f-d4fb8fa6f28a] raise exceptions.translate_fault(task_info.error) [ 1266.100482] env[67169]: ERROR nova.compute.manager [instance: 43b73a7c-eda8-4239-885f-d4fb8fa6f28a] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1266.100482] env[67169]: ERROR nova.compute.manager [instance: 43b73a7c-eda8-4239-885f-d4fb8fa6f28a] Faults: ['InvalidArgument'] [ 1266.100482] env[67169]: ERROR nova.compute.manager [instance: 43b73a7c-eda8-4239-885f-d4fb8fa6f28a] [ 1266.101647] env[67169]: INFO nova.compute.manager [None req-8ae57ef2-fe50-47ae-b868-f7b19963d43c tempest-ServersWithSpecificFlavorTestJSON-890966063 tempest-ServersWithSpecificFlavorTestJSON-890966063-project-member] [instance: 43b73a7c-eda8-4239-885f-d4fb8fa6f28a] Terminating instance [ 1266.102444] env[67169]: DEBUG oslo_concurrency.lockutils [None req-ec8ff093-ec56-4835-a968-0cce4238d83b tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Acquired lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1266.102650] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-ec8ff093-ec56-4835-a968-0cce4238d83b tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1266.102890] env[67169]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-cb7e3bd0-6180-4902-8781-65cb9f1bf397 {{(pid=67169) 
request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1266.105143] env[67169]: DEBUG nova.compute.manager [None req-8ae57ef2-fe50-47ae-b868-f7b19963d43c tempest-ServersWithSpecificFlavorTestJSON-890966063 tempest-ServersWithSpecificFlavorTestJSON-890966063-project-member] [instance: 43b73a7c-eda8-4239-885f-d4fb8fa6f28a] Start destroying the instance on the hypervisor. {{(pid=67169) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1266.105341] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-8ae57ef2-fe50-47ae-b868-f7b19963d43c tempest-ServersWithSpecificFlavorTestJSON-890966063 tempest-ServersWithSpecificFlavorTestJSON-890966063-project-member] [instance: 43b73a7c-eda8-4239-885f-d4fb8fa6f28a] Destroying instance {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1266.106084] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aebb118a-bb8f-47f6-8bc7-bb50f226aefc {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1266.113071] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-8ae57ef2-fe50-47ae-b868-f7b19963d43c tempest-ServersWithSpecificFlavorTestJSON-890966063 tempest-ServersWithSpecificFlavorTestJSON-890966063-project-member] [instance: 43b73a7c-eda8-4239-885f-d4fb8fa6f28a] Unregistering the VM {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1266.113226] env[67169]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-b712b134-cefc-44f9-bfbc-1a2c715caea7 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1266.115459] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-ec8ff093-ec56-4835-a968-0cce4238d83b tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Created directory with path 
[datastore2] devstack-image-cache_base {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1266.115628] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-ec8ff093-ec56-4835-a968-0cce4238d83b tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=67169) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1266.116565] env[67169]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-52145700-c1f2-43ce-b56a-b0d176d66ac8 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1266.121419] env[67169]: DEBUG oslo_vmware.api [None req-ec8ff093-ec56-4835-a968-0cce4238d83b tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Waiting for the task: (returnval){ [ 1266.121419] env[67169]: value = "session[52518c09-e5f2-035d-4dc1-3e29ddf94015]52b72f29-f12b-1e92-c20c-192049ce9335" [ 1266.121419] env[67169]: _type = "Task" [ 1266.121419] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1266.129668] env[67169]: DEBUG oslo_vmware.api [None req-ec8ff093-ec56-4835-a968-0cce4238d83b tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Task: {'id': session[52518c09-e5f2-035d-4dc1-3e29ddf94015]52b72f29-f12b-1e92-c20c-192049ce9335, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1266.185067] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-8ae57ef2-fe50-47ae-b868-f7b19963d43c tempest-ServersWithSpecificFlavorTestJSON-890966063 tempest-ServersWithSpecificFlavorTestJSON-890966063-project-member] [instance: 43b73a7c-eda8-4239-885f-d4fb8fa6f28a] Unregistered the VM {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1266.185307] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-8ae57ef2-fe50-47ae-b868-f7b19963d43c tempest-ServersWithSpecificFlavorTestJSON-890966063 tempest-ServersWithSpecificFlavorTestJSON-890966063-project-member] [instance: 43b73a7c-eda8-4239-885f-d4fb8fa6f28a] Deleting contents of the VM from datastore datastore2 {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1266.185485] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-8ae57ef2-fe50-47ae-b868-f7b19963d43c tempest-ServersWithSpecificFlavorTestJSON-890966063 tempest-ServersWithSpecificFlavorTestJSON-890966063-project-member] Deleting the datastore file [datastore2] 43b73a7c-eda8-4239-885f-d4fb8fa6f28a {{(pid=67169) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1266.185757] env[67169]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-435d99ae-897f-404b-b8b5-f7e35d119846 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1266.191981] env[67169]: DEBUG oslo_vmware.api [None req-8ae57ef2-fe50-47ae-b868-f7b19963d43c tempest-ServersWithSpecificFlavorTestJSON-890966063 tempest-ServersWithSpecificFlavorTestJSON-890966063-project-member] Waiting for the task: (returnval){ [ 1266.191981] env[67169]: value = "task-2819177" [ 1266.191981] env[67169]: _type = "Task" [ 1266.191981] env[67169]: } to complete. 
{{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1266.199621] env[67169]: DEBUG oslo_vmware.api [None req-8ae57ef2-fe50-47ae-b868-f7b19963d43c tempest-ServersWithSpecificFlavorTestJSON-890966063 tempest-ServersWithSpecificFlavorTestJSON-890966063-project-member] Task: {'id': task-2819177, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1266.632771] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-ec8ff093-ec56-4835-a968-0cce4238d83b tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b] Preparing fetch location {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1266.633072] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-ec8ff093-ec56-4835-a968-0cce4238d83b tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Creating directory with path [datastore2] vmware_temp/119d1432-bc42-4591-8720-ea8e63065f7b/285931c9-8b83-4997-8c4d-6a79005e36ba {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1266.633318] env[67169]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-4f6a7d85-b8ed-46ce-953a-3e2098f68759 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1266.644370] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-ec8ff093-ec56-4835-a968-0cce4238d83b tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Created directory with path [datastore2] vmware_temp/119d1432-bc42-4591-8720-ea8e63065f7b/285931c9-8b83-4997-8c4d-6a79005e36ba {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1266.644581] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-ec8ff093-ec56-4835-a968-0cce4238d83b 
tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b] Fetch image to [datastore2] vmware_temp/119d1432-bc42-4591-8720-ea8e63065f7b/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1266.644768] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-ec8ff093-ec56-4835-a968-0cce4238d83b tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b] Downloading image file data 285931c9-8b83-4997-8c4d-6a79005e36ba to [datastore2] vmware_temp/119d1432-bc42-4591-8720-ea8e63065f7b/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk on the data store datastore2 {{(pid=67169) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1266.645502] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5e33f060-3ae1-4908-af58-a6feb75a1198 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1266.652338] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1ac0d6c8-8f02-4284-94b2-c1cdea5fd5d0 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1266.661499] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b38d4d97-7668-4010-ba35-15b51d334324 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1266.696046] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2ba61242-e607-4b18-ae96-33c6f1c04409 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1266.702985] 
env[67169]: DEBUG oslo_vmware.api [None req-8ae57ef2-fe50-47ae-b868-f7b19963d43c tempest-ServersWithSpecificFlavorTestJSON-890966063 tempest-ServersWithSpecificFlavorTestJSON-890966063-project-member] Task: {'id': task-2819177, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.079024} completed successfully. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1266.704513] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-8ae57ef2-fe50-47ae-b868-f7b19963d43c tempest-ServersWithSpecificFlavorTestJSON-890966063 tempest-ServersWithSpecificFlavorTestJSON-890966063-project-member] Deleted the datastore file {{(pid=67169) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1266.704699] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-8ae57ef2-fe50-47ae-b868-f7b19963d43c tempest-ServersWithSpecificFlavorTestJSON-890966063 tempest-ServersWithSpecificFlavorTestJSON-890966063-project-member] [instance: 43b73a7c-eda8-4239-885f-d4fb8fa6f28a] Deleted contents of the VM from datastore datastore2 {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1266.704871] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-8ae57ef2-fe50-47ae-b868-f7b19963d43c tempest-ServersWithSpecificFlavorTestJSON-890966063 tempest-ServersWithSpecificFlavorTestJSON-890966063-project-member] [instance: 43b73a7c-eda8-4239-885f-d4fb8fa6f28a] Instance destroyed {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1266.705054] env[67169]: INFO nova.compute.manager [None req-8ae57ef2-fe50-47ae-b868-f7b19963d43c tempest-ServersWithSpecificFlavorTestJSON-890966063 tempest-ServersWithSpecificFlavorTestJSON-890966063-project-member] [instance: 43b73a7c-eda8-4239-885f-d4fb8fa6f28a] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 1266.706808] env[67169]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-5285ca91-37c5-45cf-b948-f4c840ac7bdd {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1266.708717] env[67169]: DEBUG nova.compute.claims [None req-8ae57ef2-fe50-47ae-b868-f7b19963d43c tempest-ServersWithSpecificFlavorTestJSON-890966063 tempest-ServersWithSpecificFlavorTestJSON-890966063-project-member] [instance: 43b73a7c-eda8-4239-885f-d4fb8fa6f28a] Aborting claim: {{(pid=67169) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1266.708895] env[67169]: DEBUG oslo_concurrency.lockutils [None req-8ae57ef2-fe50-47ae-b868-f7b19963d43c tempest-ServersWithSpecificFlavorTestJSON-890966063 tempest-ServersWithSpecificFlavorTestJSON-890966063-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1266.709120] env[67169]: DEBUG oslo_concurrency.lockutils [None req-8ae57ef2-fe50-47ae-b868-f7b19963d43c tempest-ServersWithSpecificFlavorTestJSON-890966063 tempest-ServersWithSpecificFlavorTestJSON-890966063-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1266.730996] env[67169]: DEBUG nova.virt.vmwareapi.images [None req-ec8ff093-ec56-4835-a968-0cce4238d83b tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b] Downloading image file data 285931c9-8b83-4997-8c4d-6a79005e36ba to the data store datastore2 {{(pid=67169) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1266.788378] env[67169]: DEBUG oslo_vmware.rw_handles [None 
req-ec8ff093-ec56-4835-a968-0cce4238d83b tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/119d1432-bc42-4591-8720-ea8e63065f7b/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=67169) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1266.850320] env[67169]: DEBUG oslo_vmware.rw_handles [None req-ec8ff093-ec56-4835-a968-0cce4238d83b tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Completed reading data from the image iterator. {{(pid=67169) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1266.850518] env[67169]: DEBUG oslo_vmware.rw_handles [None req-ec8ff093-ec56-4835-a968-0cce4238d83b tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/119d1432-bc42-4591-8720-ea8e63065f7b/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=67169) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1267.125631] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f08b6d08-6fd1-4d23-acef-ee49a1f26ff7 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1267.134061] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6afdb01f-2671-44aa-a507-04579dc17947 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1267.162694] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a77ddde2-0eeb-4c6a-866f-a68e48d25666 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1267.170219] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7c2de2be-3c62-4264-94ba-768409d1dda6 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1267.183238] env[67169]: DEBUG nova.compute.provider_tree [None req-8ae57ef2-fe50-47ae-b868-f7b19963d43c tempest-ServersWithSpecificFlavorTestJSON-890966063 tempest-ServersWithSpecificFlavorTestJSON-890966063-project-member] Inventory has not changed in ProviderTree for provider: 6570906a-ac37-4859-b1f2-4bbacc48d3f3 {{(pid=67169) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1267.191546] env[67169]: DEBUG nova.scheduler.client.report [None req-8ae57ef2-fe50-47ae-b868-f7b19963d43c tempest-ServersWithSpecificFlavorTestJSON-890966063 tempest-ServersWithSpecificFlavorTestJSON-890966063-project-member] Inventory has not changed for provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 
'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 171, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67169) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1267.207598] env[67169]: DEBUG oslo_concurrency.lockutils [None req-8ae57ef2-fe50-47ae-b868-f7b19963d43c tempest-ServersWithSpecificFlavorTestJSON-890966063 tempest-ServersWithSpecificFlavorTestJSON-890966063-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.497s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1267.207598] env[67169]: ERROR nova.compute.manager [None req-8ae57ef2-fe50-47ae-b868-f7b19963d43c tempest-ServersWithSpecificFlavorTestJSON-890966063 tempest-ServersWithSpecificFlavorTestJSON-890966063-project-member] [instance: 43b73a7c-eda8-4239-885f-d4fb8fa6f28a] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1267.207598] env[67169]: Faults: ['InvalidArgument'] [ 1267.207598] env[67169]: ERROR nova.compute.manager [instance: 43b73a7c-eda8-4239-885f-d4fb8fa6f28a] Traceback (most recent call last): [ 1267.207598] env[67169]: ERROR nova.compute.manager [instance: 43b73a7c-eda8-4239-885f-d4fb8fa6f28a] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1267.207598] env[67169]: ERROR nova.compute.manager [instance: 43b73a7c-eda8-4239-885f-d4fb8fa6f28a] self.driver.spawn(context, instance, image_meta, [ 1267.207598] env[67169]: ERROR nova.compute.manager [instance: 43b73a7c-eda8-4239-885f-d4fb8fa6f28a] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1267.207598] env[67169]: ERROR nova.compute.manager 
[instance: 43b73a7c-eda8-4239-885f-d4fb8fa6f28a] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1267.207598] env[67169]: ERROR nova.compute.manager [instance: 43b73a7c-eda8-4239-885f-d4fb8fa6f28a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1267.207598] env[67169]: ERROR nova.compute.manager [instance: 43b73a7c-eda8-4239-885f-d4fb8fa6f28a] self._fetch_image_if_missing(context, vi) [ 1267.207598] env[67169]: ERROR nova.compute.manager [instance: 43b73a7c-eda8-4239-885f-d4fb8fa6f28a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1267.207598] env[67169]: ERROR nova.compute.manager [instance: 43b73a7c-eda8-4239-885f-d4fb8fa6f28a] image_cache(vi, tmp_image_ds_loc) [ 1267.207598] env[67169]: ERROR nova.compute.manager [instance: 43b73a7c-eda8-4239-885f-d4fb8fa6f28a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1267.207598] env[67169]: ERROR nova.compute.manager [instance: 43b73a7c-eda8-4239-885f-d4fb8fa6f28a] vm_util.copy_virtual_disk( [ 1267.207598] env[67169]: ERROR nova.compute.manager [instance: 43b73a7c-eda8-4239-885f-d4fb8fa6f28a] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1267.207598] env[67169]: ERROR nova.compute.manager [instance: 43b73a7c-eda8-4239-885f-d4fb8fa6f28a] session._wait_for_task(vmdk_copy_task) [ 1267.207598] env[67169]: ERROR nova.compute.manager [instance: 43b73a7c-eda8-4239-885f-d4fb8fa6f28a] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1267.207598] env[67169]: ERROR nova.compute.manager [instance: 43b73a7c-eda8-4239-885f-d4fb8fa6f28a] return self.wait_for_task(task_ref) [ 1267.207598] env[67169]: ERROR nova.compute.manager [instance: 43b73a7c-eda8-4239-885f-d4fb8fa6f28a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1267.207598] env[67169]: ERROR 
nova.compute.manager [instance: 43b73a7c-eda8-4239-885f-d4fb8fa6f28a] return evt.wait() [ 1267.207598] env[67169]: ERROR nova.compute.manager [instance: 43b73a7c-eda8-4239-885f-d4fb8fa6f28a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1267.207598] env[67169]: ERROR nova.compute.manager [instance: 43b73a7c-eda8-4239-885f-d4fb8fa6f28a] result = hub.switch() [ 1267.207598] env[67169]: ERROR nova.compute.manager [instance: 43b73a7c-eda8-4239-885f-d4fb8fa6f28a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1267.207598] env[67169]: ERROR nova.compute.manager [instance: 43b73a7c-eda8-4239-885f-d4fb8fa6f28a] return self.greenlet.switch() [ 1267.207598] env[67169]: ERROR nova.compute.manager [instance: 43b73a7c-eda8-4239-885f-d4fb8fa6f28a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1267.207598] env[67169]: ERROR nova.compute.manager [instance: 43b73a7c-eda8-4239-885f-d4fb8fa6f28a] self.f(*self.args, **self.kw) [ 1267.207598] env[67169]: ERROR nova.compute.manager [instance: 43b73a7c-eda8-4239-885f-d4fb8fa6f28a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1267.207598] env[67169]: ERROR nova.compute.manager [instance: 43b73a7c-eda8-4239-885f-d4fb8fa6f28a] raise exceptions.translate_fault(task_info.error) [ 1267.207598] env[67169]: ERROR nova.compute.manager [instance: 43b73a7c-eda8-4239-885f-d4fb8fa6f28a] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1267.207598] env[67169]: ERROR nova.compute.manager [instance: 43b73a7c-eda8-4239-885f-d4fb8fa6f28a] Faults: ['InvalidArgument'] [ 1267.207598] env[67169]: ERROR nova.compute.manager [instance: 43b73a7c-eda8-4239-885f-d4fb8fa6f28a] [ 1267.208786] env[67169]: DEBUG nova.compute.utils [None req-8ae57ef2-fe50-47ae-b868-f7b19963d43c 
tempest-ServersWithSpecificFlavorTestJSON-890966063 tempest-ServersWithSpecificFlavorTestJSON-890966063-project-member] [instance: 43b73a7c-eda8-4239-885f-d4fb8fa6f28a] VimFaultException {{(pid=67169) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1267.208786] env[67169]: DEBUG nova.compute.manager [None req-8ae57ef2-fe50-47ae-b868-f7b19963d43c tempest-ServersWithSpecificFlavorTestJSON-890966063 tempest-ServersWithSpecificFlavorTestJSON-890966063-project-member] [instance: 43b73a7c-eda8-4239-885f-d4fb8fa6f28a] Build of instance 43b73a7c-eda8-4239-885f-d4fb8fa6f28a was re-scheduled: A specified parameter was not correct: fileType [ 1267.208786] env[67169]: Faults: ['InvalidArgument'] {{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1267.208978] env[67169]: DEBUG nova.compute.manager [None req-8ae57ef2-fe50-47ae-b868-f7b19963d43c tempest-ServersWithSpecificFlavorTestJSON-890966063 tempest-ServersWithSpecificFlavorTestJSON-890966063-project-member] [instance: 43b73a7c-eda8-4239-885f-d4fb8fa6f28a] Unplugging VIFs for instance {{(pid=67169) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1267.209177] env[67169]: DEBUG nova.compute.manager [None req-8ae57ef2-fe50-47ae-b868-f7b19963d43c tempest-ServersWithSpecificFlavorTestJSON-890966063 tempest-ServersWithSpecificFlavorTestJSON-890966063-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67169) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1267.209348] env[67169]: DEBUG nova.compute.manager [None req-8ae57ef2-fe50-47ae-b868-f7b19963d43c tempest-ServersWithSpecificFlavorTestJSON-890966063 tempest-ServersWithSpecificFlavorTestJSON-890966063-project-member] [instance: 43b73a7c-eda8-4239-885f-d4fb8fa6f28a] Deallocating network for instance {{(pid=67169) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1267.209509] env[67169]: DEBUG nova.network.neutron [None req-8ae57ef2-fe50-47ae-b868-f7b19963d43c tempest-ServersWithSpecificFlavorTestJSON-890966063 tempest-ServersWithSpecificFlavorTestJSON-890966063-project-member] [instance: 43b73a7c-eda8-4239-885f-d4fb8fa6f28a] deallocate_for_instance() {{(pid=67169) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1267.660294] env[67169]: DEBUG nova.network.neutron [None req-8ae57ef2-fe50-47ae-b868-f7b19963d43c tempest-ServersWithSpecificFlavorTestJSON-890966063 tempest-ServersWithSpecificFlavorTestJSON-890966063-project-member] [instance: 43b73a7c-eda8-4239-885f-d4fb8fa6f28a] Updating instance_info_cache with network_info: [] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1267.675326] env[67169]: INFO nova.compute.manager [None req-8ae57ef2-fe50-47ae-b868-f7b19963d43c tempest-ServersWithSpecificFlavorTestJSON-890966063 tempest-ServersWithSpecificFlavorTestJSON-890966063-project-member] [instance: 43b73a7c-eda8-4239-885f-d4fb8fa6f28a] Took 0.47 seconds to deallocate network for instance. 
[ 1267.786716] env[67169]: INFO nova.scheduler.client.report [None req-8ae57ef2-fe50-47ae-b868-f7b19963d43c tempest-ServersWithSpecificFlavorTestJSON-890966063 tempest-ServersWithSpecificFlavorTestJSON-890966063-project-member] Deleted allocations for instance 43b73a7c-eda8-4239-885f-d4fb8fa6f28a [ 1267.816602] env[67169]: DEBUG oslo_concurrency.lockutils [None req-8ae57ef2-fe50-47ae-b868-f7b19963d43c tempest-ServersWithSpecificFlavorTestJSON-890966063 tempest-ServersWithSpecificFlavorTestJSON-890966063-project-member] Lock "43b73a7c-eda8-4239-885f-d4fb8fa6f28a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 669.101s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1267.818633] env[67169]: DEBUG oslo_concurrency.lockutils [None req-f9efb269-f338-4e01-954f-eaef00d66828 tempest-ServersWithSpecificFlavorTestJSON-890966063 tempest-ServersWithSpecificFlavorTestJSON-890966063-project-member] Lock "43b73a7c-eda8-4239-885f-d4fb8fa6f28a" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 469.834s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1267.818633] env[67169]: DEBUG oslo_concurrency.lockutils [None req-f9efb269-f338-4e01-954f-eaef00d66828 tempest-ServersWithSpecificFlavorTestJSON-890966063 tempest-ServersWithSpecificFlavorTestJSON-890966063-project-member] Acquiring lock "43b73a7c-eda8-4239-885f-d4fb8fa6f28a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1267.818633] env[67169]: DEBUG oslo_concurrency.lockutils [None req-f9efb269-f338-4e01-954f-eaef00d66828 tempest-ServersWithSpecificFlavorTestJSON-890966063 tempest-ServersWithSpecificFlavorTestJSON-890966063-project-member] Lock 
"43b73a7c-eda8-4239-885f-d4fb8fa6f28a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1267.818633] env[67169]: DEBUG oslo_concurrency.lockutils [None req-f9efb269-f338-4e01-954f-eaef00d66828 tempest-ServersWithSpecificFlavorTestJSON-890966063 tempest-ServersWithSpecificFlavorTestJSON-890966063-project-member] Lock "43b73a7c-eda8-4239-885f-d4fb8fa6f28a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1267.823040] env[67169]: INFO nova.compute.manager [None req-f9efb269-f338-4e01-954f-eaef00d66828 tempest-ServersWithSpecificFlavorTestJSON-890966063 tempest-ServersWithSpecificFlavorTestJSON-890966063-project-member] [instance: 43b73a7c-eda8-4239-885f-d4fb8fa6f28a] Terminating instance [ 1267.823040] env[67169]: DEBUG nova.compute.manager [None req-f9efb269-f338-4e01-954f-eaef00d66828 tempest-ServersWithSpecificFlavorTestJSON-890966063 tempest-ServersWithSpecificFlavorTestJSON-890966063-project-member] [instance: 43b73a7c-eda8-4239-885f-d4fb8fa6f28a] Start destroying the instance on the hypervisor. 
{{(pid=67169) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1267.823201] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-f9efb269-f338-4e01-954f-eaef00d66828 tempest-ServersWithSpecificFlavorTestJSON-890966063 tempest-ServersWithSpecificFlavorTestJSON-890966063-project-member] [instance: 43b73a7c-eda8-4239-885f-d4fb8fa6f28a] Destroying instance {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1267.823441] env[67169]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-e10e92e5-f961-4f80-a6c4-9c442a50a264 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1267.835224] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dfdd2cf1-1248-4408-844e-e2f4b9a57854 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1267.851205] env[67169]: DEBUG nova.compute.manager [None req-57d7723f-f308-4517-8973-d308d992f41e tempest-ServersTestBootFromVolume-711128650 tempest-ServersTestBootFromVolume-711128650-project-member] [instance: 3966e1f7-2107-4ddb-8077-ab37ef1a9b92] Starting instance... {{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1267.875228] env[67169]: WARNING nova.virt.vmwareapi.vmops [None req-f9efb269-f338-4e01-954f-eaef00d66828 tempest-ServersWithSpecificFlavorTestJSON-890966063 tempest-ServersWithSpecificFlavorTestJSON-890966063-project-member] [instance: 43b73a7c-eda8-4239-885f-d4fb8fa6f28a] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 43b73a7c-eda8-4239-885f-d4fb8fa6f28a could not be found. 
[ 1267.875438] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-f9efb269-f338-4e01-954f-eaef00d66828 tempest-ServersWithSpecificFlavorTestJSON-890966063 tempest-ServersWithSpecificFlavorTestJSON-890966063-project-member] [instance: 43b73a7c-eda8-4239-885f-d4fb8fa6f28a] Instance destroyed {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1267.875616] env[67169]: INFO nova.compute.manager [None req-f9efb269-f338-4e01-954f-eaef00d66828 tempest-ServersWithSpecificFlavorTestJSON-890966063 tempest-ServersWithSpecificFlavorTestJSON-890966063-project-member] [instance: 43b73a7c-eda8-4239-885f-d4fb8fa6f28a] Took 0.05 seconds to destroy the instance on the hypervisor. [ 1267.875870] env[67169]: DEBUG oslo.service.loopingcall [None req-f9efb269-f338-4e01-954f-eaef00d66828 tempest-ServersWithSpecificFlavorTestJSON-890966063 tempest-ServersWithSpecificFlavorTestJSON-890966063-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=67169) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1267.876139] env[67169]: DEBUG nova.compute.manager [-] [instance: 43b73a7c-eda8-4239-885f-d4fb8fa6f28a] Deallocating network for instance {{(pid=67169) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1267.876239] env[67169]: DEBUG nova.network.neutron [-] [instance: 43b73a7c-eda8-4239-885f-d4fb8fa6f28a] deallocate_for_instance() {{(pid=67169) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1267.888399] env[67169]: DEBUG nova.compute.manager [None req-57d7723f-f308-4517-8973-d308d992f41e tempest-ServersTestBootFromVolume-711128650 tempest-ServersTestBootFromVolume-711128650-project-member] [instance: 3966e1f7-2107-4ddb-8077-ab37ef1a9b92] Instance disappeared before build. 
{{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1267.912852] env[67169]: DEBUG oslo_concurrency.lockutils [None req-57d7723f-f308-4517-8973-d308d992f41e tempest-ServersTestBootFromVolume-711128650 tempest-ServersTestBootFromVolume-711128650-project-member] Lock "3966e1f7-2107-4ddb-8077-ab37ef1a9b92" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 234.965s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1267.915413] env[67169]: DEBUG nova.network.neutron [-] [instance: 43b73a7c-eda8-4239-885f-d4fb8fa6f28a] Updating instance_info_cache with network_info: [] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1267.922578] env[67169]: INFO nova.compute.manager [-] [instance: 43b73a7c-eda8-4239-885f-d4fb8fa6f28a] Took 0.05 seconds to deallocate network for instance. [ 1267.927670] env[67169]: DEBUG nova.compute.manager [None req-f4774cbe-50db-4e49-ba06-358bb98216ff tempest-ServersTestMultiNic-730437419 tempest-ServersTestMultiNic-730437419-project-member] [instance: c57df23b-3348-41fa-a976-421f98cab569] Starting instance... {{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1267.957021] env[67169]: DEBUG nova.compute.manager [None req-f4774cbe-50db-4e49-ba06-358bb98216ff tempest-ServersTestMultiNic-730437419 tempest-ServersTestMultiNic-730437419-project-member] [instance: c57df23b-3348-41fa-a976-421f98cab569] Instance disappeared before build. 
{{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1267.979760] env[67169]: DEBUG oslo_concurrency.lockutils [None req-f4774cbe-50db-4e49-ba06-358bb98216ff tempest-ServersTestMultiNic-730437419 tempest-ServersTestMultiNic-730437419-project-member] Lock "c57df23b-3348-41fa-a976-421f98cab569" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 214.213s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1267.990711] env[67169]: DEBUG nova.compute.manager [None req-ef0a7f94-c032-4a1f-af92-8f4abde4bd65 tempest-AttachVolumeNegativeTest-2045904794 tempest-AttachVolumeNegativeTest-2045904794-project-member] [instance: c769a8f3-6f9f-4e5b-bfec-345c97da5d83] Starting instance... {{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1268.045478] env[67169]: DEBUG nova.compute.manager [None req-ef0a7f94-c032-4a1f-af92-8f4abde4bd65 tempest-AttachVolumeNegativeTest-2045904794 tempest-AttachVolumeNegativeTest-2045904794-project-member] [instance: c769a8f3-6f9f-4e5b-bfec-345c97da5d83] Instance disappeared before build. 
{{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1268.062667] env[67169]: DEBUG oslo_concurrency.lockutils [None req-ef0a7f94-c032-4a1f-af92-8f4abde4bd65 tempest-AttachVolumeNegativeTest-2045904794 tempest-AttachVolumeNegativeTest-2045904794-project-member] Lock "c769a8f3-6f9f-4e5b-bfec-345c97da5d83" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 213.794s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1268.064144] env[67169]: DEBUG oslo_concurrency.lockutils [None req-f9efb269-f338-4e01-954f-eaef00d66828 tempest-ServersWithSpecificFlavorTestJSON-890966063 tempest-ServersWithSpecificFlavorTestJSON-890966063-project-member] Lock "43b73a7c-eda8-4239-885f-d4fb8fa6f28a" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.246s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1268.065022] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "43b73a7c-eda8-4239-885f-d4fb8fa6f28a" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 82.365s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1268.065222] env[67169]: INFO nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 43b73a7c-eda8-4239-885f-d4fb8fa6f28a] During sync_power_state the instance has a pending task (deleting). Skip. 
[ 1268.065395] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "43b73a7c-eda8-4239-885f-d4fb8fa6f28a" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.001s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1268.075057] env[67169]: DEBUG nova.compute.manager [None req-32caf906-c820-4c4a-a535-1175960d4c45 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] [instance: 7817b417-599c-4619-8bd3-28d2e8236b9f] Starting instance... {{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1268.127835] env[67169]: DEBUG oslo_concurrency.lockutils [None req-32caf906-c820-4c4a-a535-1175960d4c45 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1268.128417] env[67169]: DEBUG oslo_concurrency.lockutils [None req-32caf906-c820-4c4a-a535-1175960d4c45 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1268.131106] env[67169]: INFO nova.compute.claims [None req-32caf906-c820-4c4a-a535-1175960d4c45 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] [instance: 7817b417-599c-4619-8bd3-28d2e8236b9f] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1268.400566] env[67169]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ce96e482-1589-4f6e-ae9b-dcc714f3c509 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1268.408559] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1dc14fce-09f7-4b5b-9ad4-81ff8398901a {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1268.438051] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a836d68d-a213-4f70-8d90-c9cafad0720b {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1268.445726] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-deee47df-a2e8-4279-9c33-a5db1aeec4cd {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1268.459092] env[67169]: DEBUG nova.compute.provider_tree [None req-32caf906-c820-4c4a-a535-1175960d4c45 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Inventory has not changed in ProviderTree for provider: 6570906a-ac37-4859-b1f2-4bbacc48d3f3 {{(pid=67169) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1268.467436] env[67169]: DEBUG nova.scheduler.client.report [None req-32caf906-c820-4c4a-a535-1175960d4c45 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Inventory has not changed for provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 
1, 'max_unit': 171, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67169) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1268.481941] env[67169]: DEBUG oslo_concurrency.lockutils [None req-32caf906-c820-4c4a-a535-1175960d4c45 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.354s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1268.482456] env[67169]: DEBUG nova.compute.manager [None req-32caf906-c820-4c4a-a535-1175960d4c45 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] [instance: 7817b417-599c-4619-8bd3-28d2e8236b9f] Start building networks asynchronously for instance. {{(pid=67169) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1268.515714] env[67169]: DEBUG nova.compute.utils [None req-32caf906-c820-4c4a-a535-1175960d4c45 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Using /dev/sd instead of None {{(pid=67169) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1268.517139] env[67169]: DEBUG nova.compute.manager [None req-32caf906-c820-4c4a-a535-1175960d4c45 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] [instance: 7817b417-599c-4619-8bd3-28d2e8236b9f] Allocating IP information in the background. 
{{(pid=67169) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1268.517310] env[67169]: DEBUG nova.network.neutron [None req-32caf906-c820-4c4a-a535-1175960d4c45 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] [instance: 7817b417-599c-4619-8bd3-28d2e8236b9f] allocate_for_instance() {{(pid=67169) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1268.526686] env[67169]: DEBUG nova.compute.manager [None req-32caf906-c820-4c4a-a535-1175960d4c45 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] [instance: 7817b417-599c-4619-8bd3-28d2e8236b9f] Start building block device mappings for instance. {{(pid=67169) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1268.575793] env[67169]: DEBUG nova.policy [None req-32caf906-c820-4c4a-a535-1175960d4c45 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1cd77c2a5f07460da364f0ec256c5f1f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '37492586ebba45c7893955c459766b5d', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67169) authorize /opt/stack/nova/nova/policy.py:203}} [ 1268.596688] env[67169]: DEBUG nova.compute.manager [None req-32caf906-c820-4c4a-a535-1175960d4c45 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] [instance: 7817b417-599c-4619-8bd3-28d2e8236b9f] Start spawning the instance on the hypervisor. 
{{(pid=67169) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1268.625839] env[67169]: DEBUG nova.virt.hardware [None req-32caf906-c820-4c4a-a535-1175960d4c45 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-03-06T19:55:10Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-03-06T19:54:55Z,direct_url=,disk_format='vmdk',id=285931c9-8b83-4997-8c4d-6a79005e36ba,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c31f6504bb73492890b262ff43fdf9bc',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-03-06T19:54:55Z,virtual_size=,visibility=), allow threads: False {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1268.626095] env[67169]: DEBUG nova.virt.hardware [None req-32caf906-c820-4c4a-a535-1175960d4c45 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Flavor limits 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1268.626261] env[67169]: DEBUG nova.virt.hardware [None req-32caf906-c820-4c4a-a535-1175960d4c45 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Image limits 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1268.626663] env[67169]: DEBUG nova.virt.hardware [None req-32caf906-c820-4c4a-a535-1175960d4c45 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Flavor pref 0:0:0 {{(pid=67169) 
get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1268.626663] env[67169]: DEBUG nova.virt.hardware [None req-32caf906-c820-4c4a-a535-1175960d4c45 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Image pref 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1268.626816] env[67169]: DEBUG nova.virt.hardware [None req-32caf906-c820-4c4a-a535-1175960d4c45 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1268.626905] env[67169]: DEBUG nova.virt.hardware [None req-32caf906-c820-4c4a-a535-1175960d4c45 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1268.627082] env[67169]: DEBUG nova.virt.hardware [None req-32caf906-c820-4c4a-a535-1175960d4c45 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67169) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1268.627255] env[67169]: DEBUG nova.virt.hardware [None req-32caf906-c820-4c4a-a535-1175960d4c45 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Got 1 possible topologies {{(pid=67169) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1268.627417] env[67169]: DEBUG nova.virt.hardware [None req-32caf906-c820-4c4a-a535-1175960d4c45 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Possible topologies 
[VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1268.627589] env[67169]: DEBUG nova.virt.hardware [None req-32caf906-c820-4c4a-a535-1175960d4c45 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1268.628463] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9df86904-7b63-4e42-9cdc-73df72ad896c {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1268.637337] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a81441af-e73c-4ac9-afaa-c17e258f9db9 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1269.129539] env[67169]: DEBUG nova.network.neutron [None req-32caf906-c820-4c4a-a535-1175960d4c45 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] [instance: 7817b417-599c-4619-8bd3-28d2e8236b9f] Successfully created port: 9e24322c-0b24-41f7-9150-d38a51b7635c {{(pid=67169) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1269.800169] env[67169]: DEBUG nova.compute.manager [req-ce72b660-3cd3-4ab6-8218-fa84d80a17b8 req-d2c428af-46ce-492e-bdd7-eaf90aa8f2ff service nova] [instance: 7817b417-599c-4619-8bd3-28d2e8236b9f] Received event network-vif-plugged-9e24322c-0b24-41f7-9150-d38a51b7635c {{(pid=67169) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1269.800397] env[67169]: DEBUG oslo_concurrency.lockutils [req-ce72b660-3cd3-4ab6-8218-fa84d80a17b8 req-d2c428af-46ce-492e-bdd7-eaf90aa8f2ff service nova] Acquiring lock 
"7817b417-599c-4619-8bd3-28d2e8236b9f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1269.800607] env[67169]: DEBUG oslo_concurrency.lockutils [req-ce72b660-3cd3-4ab6-8218-fa84d80a17b8 req-d2c428af-46ce-492e-bdd7-eaf90aa8f2ff service nova] Lock "7817b417-599c-4619-8bd3-28d2e8236b9f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1269.800795] env[67169]: DEBUG oslo_concurrency.lockutils [req-ce72b660-3cd3-4ab6-8218-fa84d80a17b8 req-d2c428af-46ce-492e-bdd7-eaf90aa8f2ff service nova] Lock "7817b417-599c-4619-8bd3-28d2e8236b9f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1269.800937] env[67169]: DEBUG nova.compute.manager [req-ce72b660-3cd3-4ab6-8218-fa84d80a17b8 req-d2c428af-46ce-492e-bdd7-eaf90aa8f2ff service nova] [instance: 7817b417-599c-4619-8bd3-28d2e8236b9f] No waiting events found dispatching network-vif-plugged-9e24322c-0b24-41f7-9150-d38a51b7635c {{(pid=67169) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1269.801248] env[67169]: WARNING nova.compute.manager [req-ce72b660-3cd3-4ab6-8218-fa84d80a17b8 req-d2c428af-46ce-492e-bdd7-eaf90aa8f2ff service nova] [instance: 7817b417-599c-4619-8bd3-28d2e8236b9f] Received unexpected event network-vif-plugged-9e24322c-0b24-41f7-9150-d38a51b7635c for instance with vm_state building and task_state spawning. 
[ 1269.942039] env[67169]: DEBUG nova.network.neutron [None req-32caf906-c820-4c4a-a535-1175960d4c45 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] [instance: 7817b417-599c-4619-8bd3-28d2e8236b9f] Successfully updated port: 9e24322c-0b24-41f7-9150-d38a51b7635c {{(pid=67169) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1269.956348] env[67169]: DEBUG oslo_concurrency.lockutils [None req-32caf906-c820-4c4a-a535-1175960d4c45 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Acquiring lock "refresh_cache-7817b417-599c-4619-8bd3-28d2e8236b9f" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1269.956604] env[67169]: DEBUG oslo_concurrency.lockutils [None req-32caf906-c820-4c4a-a535-1175960d4c45 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Acquired lock "refresh_cache-7817b417-599c-4619-8bd3-28d2e8236b9f" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1269.956691] env[67169]: DEBUG nova.network.neutron [None req-32caf906-c820-4c4a-a535-1175960d4c45 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] [instance: 7817b417-599c-4619-8bd3-28d2e8236b9f] Building network info cache for instance {{(pid=67169) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1270.236075] env[67169]: DEBUG nova.network.neutron [None req-32caf906-c820-4c4a-a535-1175960d4c45 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] [instance: 7817b417-599c-4619-8bd3-28d2e8236b9f] Instance cache missing network info. 
{{(pid=67169) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1270.474915] env[67169]: DEBUG nova.network.neutron [None req-32caf906-c820-4c4a-a535-1175960d4c45 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] [instance: 7817b417-599c-4619-8bd3-28d2e8236b9f] Updating instance_info_cache with network_info: [{"id": "9e24322c-0b24-41f7-9150-d38a51b7635c", "address": "fa:16:3e:43:8f:d1", "network": {"id": "b61be2af-391d-401b-8e5f-b343a30fd98f", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-771864181-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "37492586ebba45c7893955c459766b5d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "56834f67-27a8-43dc-bbc6-a74aaa08959b", "external-id": "nsx-vlan-transportzone-949", "segmentation_id": 949, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap9e24322c-0b", "ovs_interfaceid": "9e24322c-0b24-41f7-9150-d38a51b7635c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1270.488378] env[67169]: DEBUG oslo_concurrency.lockutils [None req-32caf906-c820-4c4a-a535-1175960d4c45 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Releasing lock "refresh_cache-7817b417-599c-4619-8bd3-28d2e8236b9f" {{(pid=67169) lock 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1270.488666] env[67169]: DEBUG nova.compute.manager [None req-32caf906-c820-4c4a-a535-1175960d4c45 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] [instance: 7817b417-599c-4619-8bd3-28d2e8236b9f] Instance network_info: |[{"id": "9e24322c-0b24-41f7-9150-d38a51b7635c", "address": "fa:16:3e:43:8f:d1", "network": {"id": "b61be2af-391d-401b-8e5f-b343a30fd98f", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-771864181-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "37492586ebba45c7893955c459766b5d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "56834f67-27a8-43dc-bbc6-a74aaa08959b", "external-id": "nsx-vlan-transportzone-949", "segmentation_id": 949, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap9e24322c-0b", "ovs_interfaceid": "9e24322c-0b24-41f7-9150-d38a51b7635c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67169) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1270.489096] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-32caf906-c820-4c4a-a535-1175960d4c45 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] [instance: 7817b417-599c-4619-8bd3-28d2e8236b9f] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:43:8f:d1', 'network_ref': {'type': 'OpaqueNetwork', 
'network-id': '56834f67-27a8-43dc-bbc6-a74aaa08959b', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '9e24322c-0b24-41f7-9150-d38a51b7635c', 'vif_model': 'vmxnet3'}] {{(pid=67169) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1270.497883] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [None req-32caf906-c820-4c4a-a535-1175960d4c45 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Creating folder: Project (37492586ebba45c7893955c459766b5d). Parent ref: group-v566843. {{(pid=67169) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1270.498467] env[67169]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-6d4f9476-d80c-4589-badb-2f564bad61b8 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1270.509695] env[67169]: INFO nova.virt.vmwareapi.vm_util [None req-32caf906-c820-4c4a-a535-1175960d4c45 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Created folder: Project (37492586ebba45c7893955c459766b5d) in parent group-v566843. [ 1270.509695] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [None req-32caf906-c820-4c4a-a535-1175960d4c45 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Creating folder: Instances. Parent ref: group-v566918. 
{{(pid=67169) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1270.509872] env[67169]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-d7075594-0196-4df1-98e7-93186cee2797 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1270.518874] env[67169]: INFO nova.virt.vmwareapi.vm_util [None req-32caf906-c820-4c4a-a535-1175960d4c45 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Created folder: Instances in parent group-v566918. [ 1270.519128] env[67169]: DEBUG oslo.service.loopingcall [None req-32caf906-c820-4c4a-a535-1175960d4c45 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67169) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1270.519320] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 7817b417-599c-4619-8bd3-28d2e8236b9f] Creating VM on the ESX host {{(pid=67169) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1270.519844] env[67169]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-990d3176-7b9e-4fdb-af42-4adcd833fb46 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1270.540022] env[67169]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1270.540022] env[67169]: value = "task-2819180" [ 1270.540022] env[67169]: _type = "Task" [ 1270.540022] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1270.548762] env[67169]: DEBUG oslo_vmware.api [-] Task: {'id': task-2819180, 'name': CreateVM_Task} progress is 0%. 
{{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1271.053763] env[67169]: DEBUG oslo_vmware.api [-] Task: {'id': task-2819180, 'name': CreateVM_Task, 'duration_secs': 0.300503} completed successfully. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1271.053954] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 7817b417-599c-4619-8bd3-28d2e8236b9f] Created VM on the ESX host {{(pid=67169) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1271.054668] env[67169]: DEBUG oslo_concurrency.lockutils [None req-32caf906-c820-4c4a-a535-1175960d4c45 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1271.054852] env[67169]: DEBUG oslo_concurrency.lockutils [None req-32caf906-c820-4c4a-a535-1175960d4c45 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Acquired lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1271.055197] env[67169]: DEBUG oslo_concurrency.lockutils [None req-32caf906-c820-4c4a-a535-1175960d4c45 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1271.055451] env[67169]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-a6740591-5aef-4d3c-a43e-874b1e0141f6 {{(pid=67169) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1271.060754] env[67169]: DEBUG oslo_vmware.api [None req-32caf906-c820-4c4a-a535-1175960d4c45 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Waiting for the task: (returnval){ [ 1271.060754] env[67169]: value = "session[52518c09-e5f2-035d-4dc1-3e29ddf94015]52f6717d-527d-ca5a-ad73-61b1f6e56f37" [ 1271.060754] env[67169]: _type = "Task" [ 1271.060754] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1271.070037] env[67169]: DEBUG oslo_vmware.api [None req-32caf906-c820-4c4a-a535-1175960d4c45 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Task: {'id': session[52518c09-e5f2-035d-4dc1-3e29ddf94015]52f6717d-527d-ca5a-ad73-61b1f6e56f37, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1271.572846] env[67169]: DEBUG oslo_concurrency.lockutils [None req-32caf906-c820-4c4a-a535-1175960d4c45 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Releasing lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1271.573169] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-32caf906-c820-4c4a-a535-1175960d4c45 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] [instance: 7817b417-599c-4619-8bd3-28d2e8236b9f] Processing image 285931c9-8b83-4997-8c4d-6a79005e36ba {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1271.573521] env[67169]: DEBUG oslo_concurrency.lockutils [None req-32caf906-c820-4c4a-a535-1175960d4c45 tempest-MultipleCreateTestJSON-80480106 
tempest-MultipleCreateTestJSON-80480106-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1271.813704] env[67169]: DEBUG oslo_concurrency.lockutils [None req-1f74b6c8-d5a6-4795-bf0a-f9fc6bd88668 tempest-ServersNegativeTestJSON-1601465401 tempest-ServersNegativeTestJSON-1601465401-project-member] Acquiring lock "2e156908-c313-4229-840d-13ed8e6d4074" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1271.814252] env[67169]: DEBUG oslo_concurrency.lockutils [None req-1f74b6c8-d5a6-4795-bf0a-f9fc6bd88668 tempest-ServersNegativeTestJSON-1601465401 tempest-ServersNegativeTestJSON-1601465401-project-member] Lock "2e156908-c313-4229-840d-13ed8e6d4074" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1271.859569] env[67169]: DEBUG nova.compute.manager [req-a355821b-b5c0-4683-8023-3a7d1aaf6631 req-d9de6a64-8bba-4b9a-bec9-fee748d4b17c service nova] [instance: 7817b417-599c-4619-8bd3-28d2e8236b9f] Received event network-changed-9e24322c-0b24-41f7-9150-d38a51b7635c {{(pid=67169) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1271.859823] env[67169]: DEBUG nova.compute.manager [req-a355821b-b5c0-4683-8023-3a7d1aaf6631 req-d9de6a64-8bba-4b9a-bec9-fee748d4b17c service nova] [instance: 7817b417-599c-4619-8bd3-28d2e8236b9f] Refreshing instance network info cache due to event network-changed-9e24322c-0b24-41f7-9150-d38a51b7635c. 
{{(pid=67169) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1271.860877] env[67169]: DEBUG oslo_concurrency.lockutils [req-a355821b-b5c0-4683-8023-3a7d1aaf6631 req-d9de6a64-8bba-4b9a-bec9-fee748d4b17c service nova] Acquiring lock "refresh_cache-7817b417-599c-4619-8bd3-28d2e8236b9f" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1271.860877] env[67169]: DEBUG oslo_concurrency.lockutils [req-a355821b-b5c0-4683-8023-3a7d1aaf6631 req-d9de6a64-8bba-4b9a-bec9-fee748d4b17c service nova] Acquired lock "refresh_cache-7817b417-599c-4619-8bd3-28d2e8236b9f" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1271.860877] env[67169]: DEBUG nova.network.neutron [req-a355821b-b5c0-4683-8023-3a7d1aaf6631 req-d9de6a64-8bba-4b9a-bec9-fee748d4b17c service nova] [instance: 7817b417-599c-4619-8bd3-28d2e8236b9f] Refreshing network info cache for port 9e24322c-0b24-41f7-9150-d38a51b7635c {{(pid=67169) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1272.298271] env[67169]: DEBUG nova.network.neutron [req-a355821b-b5c0-4683-8023-3a7d1aaf6631 req-d9de6a64-8bba-4b9a-bec9-fee748d4b17c service nova] [instance: 7817b417-599c-4619-8bd3-28d2e8236b9f] Updated VIF entry in instance network info cache for port 9e24322c-0b24-41f7-9150-d38a51b7635c. 
{{(pid=67169) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1272.298653] env[67169]: DEBUG nova.network.neutron [req-a355821b-b5c0-4683-8023-3a7d1aaf6631 req-d9de6a64-8bba-4b9a-bec9-fee748d4b17c service nova] [instance: 7817b417-599c-4619-8bd3-28d2e8236b9f] Updating instance_info_cache with network_info: [{"id": "9e24322c-0b24-41f7-9150-d38a51b7635c", "address": "fa:16:3e:43:8f:d1", "network": {"id": "b61be2af-391d-401b-8e5f-b343a30fd98f", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-771864181-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "37492586ebba45c7893955c459766b5d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "56834f67-27a8-43dc-bbc6-a74aaa08959b", "external-id": "nsx-vlan-transportzone-949", "segmentation_id": 949, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap9e24322c-0b", "ovs_interfaceid": "9e24322c-0b24-41f7-9150-d38a51b7635c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1272.311295] env[67169]: DEBUG oslo_concurrency.lockutils [req-a355821b-b5c0-4683-8023-3a7d1aaf6631 req-d9de6a64-8bba-4b9a-bec9-fee748d4b17c service nova] Releasing lock "refresh_cache-7817b417-599c-4619-8bd3-28d2e8236b9f" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1274.742130] env[67169]: DEBUG 
oslo_concurrency.lockutils [None req-1c5ca8ef-6753-4abe-bf14-c238954b63af tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Acquiring lock "7817b417-599c-4619-8bd3-28d2e8236b9f" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1307.659693] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1313.658695] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1313.671267] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1313.671510] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1313.671686] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67169) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1313.671842] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67169) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1313.673007] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-04355030-6dfb-4b78-833e-418be44c4a3f {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1313.682153] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3c3c3602-da0a-40f0-9198-72610e217992 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1313.695922] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3827bc04-bf6c-425e-9128-924d6496b8a4 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1313.702279] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ead8601b-2e26-4a06-a0f6-8a50f11c4987 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1313.732709] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181023MB free_disk=171GB free_vcpus=48 pci_devices=None {{(pid=67169) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1313.732709] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] 
Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1313.732709] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1313.807968] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1313.808158] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 47ffcce9-3afc-41be-b38e-dacfeb535a2c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1313.808293] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 1f0f1960-0c77-4e72-86ee-807819e75d2a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1313.808418] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance ceec0dd3-097b-4ab4-8e16-420d40bbe3d5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1313.808538] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance c86c3850-39bb-4a08-8dbf-f69bd8ca21c9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1313.808658] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 7bf839c0-3ec8-4329-823d-de1fae4833cb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1313.808777] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance bab5d630-fec0-44e5-8088-12c8855aad66 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1313.808891] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance a86fa702-2040-4e22-9eaa-5d64bc16f036 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1313.809020] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1313.809136] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 7817b417-599c-4619-8bd3-28d2e8236b9f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1313.820222] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 883a792f-ae72-4475-8592-3076c2c2c2ae has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1313.830671] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 48376572-9e3a-4579-b2d7-b8b63312fab1 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1313.843301] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 74ea66f0-391c-437b-8aee-f784528d7963 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1313.853624] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 70ce9280-fb86-4e6a-a824-a174d44b4ec4 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1313.881695] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 7b7c8f84-c2d4-442e-93d3-60124767d096 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1313.892835] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance cbf88ee7-b392-46d5-8645-2b3bea0a53d6 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1313.902517] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 2e156908-c313-4229-840d-13ed8e6d4074 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1313.902745] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67169) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1313.902892] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67169) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1314.086825] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d4d697db-186e-401c-964c-c1ff111f0d49 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1314.094576] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2198450f-6805-4c3d-845c-5775e9e74435 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1314.125491] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-212094a2-3a14-4ee6-ab8b-8224a047bc71 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1314.132648] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d0550fd7-5865-41df-b1ad-431e6426a5a4 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1314.145728] env[67169]: DEBUG nova.compute.provider_tree [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Inventory has 
not changed in ProviderTree for provider: 6570906a-ac37-4859-b1f2-4bbacc48d3f3 {{(pid=67169) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1314.153990] env[67169]: DEBUG nova.scheduler.client.report [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Inventory has not changed for provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 171, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67169) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1314.168911] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67169) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1314.169123] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.436s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1315.169799] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1315.170256] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Starting heal instance info cache {{(pid=67169) _heal_instance_info_cache 
/opt/stack/nova/nova/compute/manager.py:9911}} [ 1315.170256] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Rebuilding the list of instances to heal {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1315.190898] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1315.191075] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 47ffcce9-3afc-41be-b38e-dacfeb535a2c] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1315.191218] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1315.191346] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: ceec0dd3-097b-4ab4-8e16-420d40bbe3d5] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1315.191470] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: c86c3850-39bb-4a08-8dbf-f69bd8ca21c9] Skipping network cache update for instance because it is Building. 
{{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1315.191593] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 7bf839c0-3ec8-4329-823d-de1fae4833cb] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1315.191714] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: bab5d630-fec0-44e5-8088-12c8855aad66] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1315.191830] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: a86fa702-2040-4e22-9eaa-5d64bc16f036] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1315.191973] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1315.192128] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 7817b417-599c-4619-8bd3-28d2e8236b9f] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1315.192255] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Didn't find any instances for network info cache update. 
{{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1315.192753] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1315.192928] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1315.193075] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=67169) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1315.676922] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1316.658369] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1316.849432] env[67169]: WARNING oslo_vmware.rw_handles [None req-ec8ff093-ec56-4835-a968-0cce4238d83b tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1316.849432] env[67169]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1316.849432] env[67169]: ERROR oslo_vmware.rw_handles File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1316.849432] env[67169]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1316.849432] env[67169]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1316.849432] env[67169]: ERROR oslo_vmware.rw_handles response.begin() [ 1316.849432] env[67169]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1316.849432] env[67169]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1316.849432] env[67169]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1316.849432] env[67169]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1316.849432] env[67169]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1316.849432] env[67169]: ERROR oslo_vmware.rw_handles [ 1316.849913] env[67169]: DEBUG nova.virt.vmwareapi.images [None req-ec8ff093-ec56-4835-a968-0cce4238d83b tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b] Downloaded image file data 285931c9-8b83-4997-8c4d-6a79005e36ba to vmware_temp/119d1432-bc42-4591-8720-ea8e63065f7b/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk on the data store datastore2 {{(pid=67169) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1316.851850] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-ec8ff093-ec56-4835-a968-0cce4238d83b tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b] Caching image {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1316.852173] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [None req-ec8ff093-ec56-4835-a968-0cce4238d83b 
tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Copying Virtual Disk [datastore2] vmware_temp/119d1432-bc42-4591-8720-ea8e63065f7b/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk to [datastore2] vmware_temp/119d1432-bc42-4591-8720-ea8e63065f7b/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk {{(pid=67169) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1316.852463] env[67169]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-45f2a0df-d748-46b8-ad7e-ff3f7a65c74c {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1316.860851] env[67169]: DEBUG oslo_vmware.api [None req-ec8ff093-ec56-4835-a968-0cce4238d83b tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Waiting for the task: (returnval){ [ 1316.860851] env[67169]: value = "task-2819181" [ 1316.860851] env[67169]: _type = "Task" [ 1316.860851] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1316.869271] env[67169]: DEBUG oslo_vmware.api [None req-ec8ff093-ec56-4835-a968-0cce4238d83b tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Task: {'id': task-2819181, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1317.371538] env[67169]: DEBUG oslo_vmware.exceptions [None req-ec8ff093-ec56-4835-a968-0cce4238d83b tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Fault InvalidArgument not matched. 
{{(pid=67169) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1317.371869] env[67169]: DEBUG oslo_concurrency.lockutils [None req-ec8ff093-ec56-4835-a968-0cce4238d83b tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Releasing lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1317.372441] env[67169]: ERROR nova.compute.manager [None req-ec8ff093-ec56-4835-a968-0cce4238d83b tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1317.372441] env[67169]: Faults: ['InvalidArgument'] [ 1317.372441] env[67169]: ERROR nova.compute.manager [instance: 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b] Traceback (most recent call last): [ 1317.372441] env[67169]: ERROR nova.compute.manager [instance: 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1317.372441] env[67169]: ERROR nova.compute.manager [instance: 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b] yield resources [ 1317.372441] env[67169]: ERROR nova.compute.manager [instance: 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1317.372441] env[67169]: ERROR nova.compute.manager [instance: 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b] self.driver.spawn(context, instance, image_meta, [ 1317.372441] env[67169]: ERROR nova.compute.manager [instance: 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1317.372441] env[67169]: ERROR nova.compute.manager [instance: 
1e43c263-c527-4349-8e9c-3f4a3ffc9d8b] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1317.372441] env[67169]: ERROR nova.compute.manager [instance: 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1317.372441] env[67169]: ERROR nova.compute.manager [instance: 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b] self._fetch_image_if_missing(context, vi) [ 1317.372441] env[67169]: ERROR nova.compute.manager [instance: 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1317.372441] env[67169]: ERROR nova.compute.manager [instance: 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b] image_cache(vi, tmp_image_ds_loc) [ 1317.372441] env[67169]: ERROR nova.compute.manager [instance: 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1317.372441] env[67169]: ERROR nova.compute.manager [instance: 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b] vm_util.copy_virtual_disk( [ 1317.372441] env[67169]: ERROR nova.compute.manager [instance: 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1317.372441] env[67169]: ERROR nova.compute.manager [instance: 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b] session._wait_for_task(vmdk_copy_task) [ 1317.372441] env[67169]: ERROR nova.compute.manager [instance: 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1317.372441] env[67169]: ERROR nova.compute.manager [instance: 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b] return self.wait_for_task(task_ref) [ 1317.372441] env[67169]: ERROR nova.compute.manager [instance: 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1317.372441] env[67169]: ERROR nova.compute.manager 
[instance: 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b] return evt.wait() [ 1317.372441] env[67169]: ERROR nova.compute.manager [instance: 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1317.372441] env[67169]: ERROR nova.compute.manager [instance: 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b] result = hub.switch() [ 1317.372441] env[67169]: ERROR nova.compute.manager [instance: 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1317.372441] env[67169]: ERROR nova.compute.manager [instance: 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b] return self.greenlet.switch() [ 1317.372441] env[67169]: ERROR nova.compute.manager [instance: 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1317.372441] env[67169]: ERROR nova.compute.manager [instance: 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b] self.f(*self.args, **self.kw) [ 1317.372441] env[67169]: ERROR nova.compute.manager [instance: 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1317.372441] env[67169]: ERROR nova.compute.manager [instance: 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b] raise exceptions.translate_fault(task_info.error) [ 1317.372441] env[67169]: ERROR nova.compute.manager [instance: 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1317.372441] env[67169]: ERROR nova.compute.manager [instance: 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b] Faults: ['InvalidArgument'] [ 1317.372441] env[67169]: ERROR nova.compute.manager [instance: 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b] [ 1317.373650] env[67169]: INFO nova.compute.manager [None req-ec8ff093-ec56-4835-a968-0cce4238d83b tempest-ImagesTestJSON-1590952821 
tempest-ImagesTestJSON-1590952821-project-member] [instance: 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b] Terminating instance [ 1317.374418] env[67169]: DEBUG oslo_concurrency.lockutils [None req-bc42a747-6b10-42c3-8e2b-0674634135bb tempest-ServerPasswordTestJSON-586101290 tempest-ServerPasswordTestJSON-586101290-project-member] Acquired lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1317.374678] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-bc42a747-6b10-42c3-8e2b-0674634135bb tempest-ServerPasswordTestJSON-586101290 tempest-ServerPasswordTestJSON-586101290-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1317.374971] env[67169]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-611bf72b-1873-41e9-863f-c9971944ee22 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1317.377280] env[67169]: DEBUG nova.compute.manager [None req-ec8ff093-ec56-4835-a968-0cce4238d83b tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b] Start destroying the instance on the hypervisor. 
{{(pid=67169) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1317.377528] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-ec8ff093-ec56-4835-a968-0cce4238d83b tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b] Destroying instance {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1317.378247] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b6f00e60-9755-48ea-b875-e7f3c6bb2aaa {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1317.385385] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-ec8ff093-ec56-4835-a968-0cce4238d83b tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b] Unregistering the VM {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1317.386438] env[67169]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-d0c0c1e9-92bf-460d-a0a7-d701cedb1d4c {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1317.387796] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-bc42a747-6b10-42c3-8e2b-0674634135bb tempest-ServerPasswordTestJSON-586101290 tempest-ServerPasswordTestJSON-586101290-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1317.387986] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-bc42a747-6b10-42c3-8e2b-0674634135bb tempest-ServerPasswordTestJSON-586101290 tempest-ServerPasswordTestJSON-586101290-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=67169) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1317.388660] env[67169]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-8b323dd5-77e6-4898-a5ce-8514762780b0 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1317.393163] env[67169]: DEBUG oslo_vmware.api [None req-bc42a747-6b10-42c3-8e2b-0674634135bb tempest-ServerPasswordTestJSON-586101290 tempest-ServerPasswordTestJSON-586101290-project-member] Waiting for the task: (returnval){ [ 1317.393163] env[67169]: value = "session[52518c09-e5f2-035d-4dc1-3e29ddf94015]5287d110-c8f1-1377-b837-6e8b7d83337b" [ 1317.393163] env[67169]: _type = "Task" [ 1317.393163] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1317.400122] env[67169]: DEBUG oslo_vmware.api [None req-bc42a747-6b10-42c3-8e2b-0674634135bb tempest-ServerPasswordTestJSON-586101290 tempest-ServerPasswordTestJSON-586101290-project-member] Task: {'id': session[52518c09-e5f2-035d-4dc1-3e29ddf94015]5287d110-c8f1-1377-b837-6e8b7d83337b, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1317.454142] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-ec8ff093-ec56-4835-a968-0cce4238d83b tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b] Unregistered the VM {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1317.454401] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-ec8ff093-ec56-4835-a968-0cce4238d83b tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b] Deleting contents of the VM from datastore datastore2 {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1317.454592] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-ec8ff093-ec56-4835-a968-0cce4238d83b tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Deleting the datastore file [datastore2] 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b {{(pid=67169) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1317.455688] env[67169]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-722a6899-0b20-4f96-82a0-5d083b0f864e {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1317.460780] env[67169]: DEBUG oslo_vmware.api [None req-ec8ff093-ec56-4835-a968-0cce4238d83b tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Waiting for the task: (returnval){ [ 1317.460780] env[67169]: value = "task-2819183" [ 1317.460780] env[67169]: _type = "Task" [ 1317.460780] env[67169]: } to complete. 
{{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1317.468592] env[67169]: DEBUG oslo_vmware.api [None req-ec8ff093-ec56-4835-a968-0cce4238d83b tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Task: {'id': task-2819183, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1317.658496] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1317.903050] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-bc42a747-6b10-42c3-8e2b-0674634135bb tempest-ServerPasswordTestJSON-586101290 tempest-ServerPasswordTestJSON-586101290-project-member] [instance: 47ffcce9-3afc-41be-b38e-dacfeb535a2c] Preparing fetch location {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1317.903335] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-bc42a747-6b10-42c3-8e2b-0674634135bb tempest-ServerPasswordTestJSON-586101290 tempest-ServerPasswordTestJSON-586101290-project-member] Creating directory with path [datastore2] vmware_temp/ab0f8100-b53f-459a-9a1e-58298e63a9e0/285931c9-8b83-4997-8c4d-6a79005e36ba {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1317.903560] env[67169]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-b720ae31-95d5-4320-9919-438681903675 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1317.915013] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-bc42a747-6b10-42c3-8e2b-0674634135bb tempest-ServerPasswordTestJSON-586101290 tempest-ServerPasswordTestJSON-586101290-project-member] 
Created directory with path [datastore2] vmware_temp/ab0f8100-b53f-459a-9a1e-58298e63a9e0/285931c9-8b83-4997-8c4d-6a79005e36ba {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1317.915125] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-bc42a747-6b10-42c3-8e2b-0674634135bb tempest-ServerPasswordTestJSON-586101290 tempest-ServerPasswordTestJSON-586101290-project-member] [instance: 47ffcce9-3afc-41be-b38e-dacfeb535a2c] Fetch image to [datastore2] vmware_temp/ab0f8100-b53f-459a-9a1e-58298e63a9e0/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1317.915251] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-bc42a747-6b10-42c3-8e2b-0674634135bb tempest-ServerPasswordTestJSON-586101290 tempest-ServerPasswordTestJSON-586101290-project-member] [instance: 47ffcce9-3afc-41be-b38e-dacfeb535a2c] Downloading image file data 285931c9-8b83-4997-8c4d-6a79005e36ba to [datastore2] vmware_temp/ab0f8100-b53f-459a-9a1e-58298e63a9e0/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk on the data store datastore2 {{(pid=67169) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1317.915950] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9b69fff2-7de6-4131-ba64-d56b8e945f02 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1317.922506] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2075d9f6-b3bd-4e4d-a916-15b756d8cd2d {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1317.931948] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aaf43369-8ca3-4b00-a9dd-8614af31881d {{(pid=67169) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1317.966362] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b24c500d-5483-4111-83f9-80f4f78dd722 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1317.973693] env[67169]: DEBUG oslo_vmware.api [None req-ec8ff093-ec56-4835-a968-0cce4238d83b tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Task: {'id': task-2819183, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.071602} completed successfully. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1317.975130] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-ec8ff093-ec56-4835-a968-0cce4238d83b tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Deleted the datastore file {{(pid=67169) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1317.975337] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-ec8ff093-ec56-4835-a968-0cce4238d83b tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b] Deleted contents of the VM from datastore datastore2 {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1317.975514] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-ec8ff093-ec56-4835-a968-0cce4238d83b tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b] Instance destroyed {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1317.975683] env[67169]: INFO nova.compute.manager [None req-ec8ff093-ec56-4835-a968-0cce4238d83b tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b] Took 0.60 
seconds to destroy the instance on the hypervisor. [ 1317.977396] env[67169]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-de474281-51b2-473b-858c-9ae5b0c1af0b {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1317.979245] env[67169]: DEBUG nova.compute.claims [None req-ec8ff093-ec56-4835-a968-0cce4238d83b tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b] Aborting claim: {{(pid=67169) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1317.979412] env[67169]: DEBUG oslo_concurrency.lockutils [None req-ec8ff093-ec56-4835-a968-0cce4238d83b tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1317.979625] env[67169]: DEBUG oslo_concurrency.lockutils [None req-ec8ff093-ec56-4835-a968-0cce4238d83b tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1318.003688] env[67169]: DEBUG nova.virt.vmwareapi.images [None req-bc42a747-6b10-42c3-8e2b-0674634135bb tempest-ServerPasswordTestJSON-586101290 tempest-ServerPasswordTestJSON-586101290-project-member] [instance: 47ffcce9-3afc-41be-b38e-dacfeb535a2c] Downloading image file data 285931c9-8b83-4997-8c4d-6a79005e36ba to the data store datastore2 {{(pid=67169) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1318.196981] env[67169]: DEBUG oslo_vmware.rw_handles [None req-bc42a747-6b10-42c3-8e2b-0674634135bb 
tempest-ServerPasswordTestJSON-586101290 tempest-ServerPasswordTestJSON-586101290-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/ab0f8100-b53f-459a-9a1e-58298e63a9e0/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=67169) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1318.258047] env[67169]: DEBUG oslo_vmware.rw_handles [None req-bc42a747-6b10-42c3-8e2b-0674634135bb tempest-ServerPasswordTestJSON-586101290 tempest-ServerPasswordTestJSON-586101290-project-member] Completed reading data from the image iterator. {{(pid=67169) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1318.258198] env[67169]: DEBUG oslo_vmware.rw_handles [None req-bc42a747-6b10-42c3-8e2b-0674634135bb tempest-ServerPasswordTestJSON-586101290 tempest-ServerPasswordTestJSON-586101290-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/ab0f8100-b53f-459a-9a1e-58298e63a9e0/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=67169) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1318.266700] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-feb567c9-6415-451b-8de2-2055a2f29977 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1318.275018] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e11fdc01-1485-457a-8009-26e3f1dc15d9 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1318.304881] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-49783c97-1dba-4faa-8c9c-c942b5ffb5cb {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1318.311764] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ef30197c-fd03-48c2-b903-2485d29ebb76 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1318.324634] env[67169]: DEBUG nova.compute.provider_tree [None req-ec8ff093-ec56-4835-a968-0cce4238d83b tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Inventory has not changed in ProviderTree for provider: 6570906a-ac37-4859-b1f2-4bbacc48d3f3 {{(pid=67169) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1318.332654] env[67169]: DEBUG nova.scheduler.client.report [None req-ec8ff093-ec56-4835-a968-0cce4238d83b tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Inventory has not changed for provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 
'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 171, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67169) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1318.346405] env[67169]: DEBUG oslo_concurrency.lockutils [None req-ec8ff093-ec56-4835-a968-0cce4238d83b tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.367s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1318.346939] env[67169]: ERROR nova.compute.manager [None req-ec8ff093-ec56-4835-a968-0cce4238d83b tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1318.346939] env[67169]: Faults: ['InvalidArgument'] [ 1318.346939] env[67169]: ERROR nova.compute.manager [instance: 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b] Traceback (most recent call last): [ 1318.346939] env[67169]: ERROR nova.compute.manager [instance: 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1318.346939] env[67169]: ERROR nova.compute.manager [instance: 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b] self.driver.spawn(context, instance, image_meta, [ 1318.346939] env[67169]: ERROR nova.compute.manager [instance: 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1318.346939] env[67169]: ERROR nova.compute.manager [instance: 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1318.346939] env[67169]: ERROR 
nova.compute.manager [instance: 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1318.346939] env[67169]: ERROR nova.compute.manager [instance: 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b] self._fetch_image_if_missing(context, vi) [ 1318.346939] env[67169]: ERROR nova.compute.manager [instance: 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1318.346939] env[67169]: ERROR nova.compute.manager [instance: 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b] image_cache(vi, tmp_image_ds_loc) [ 1318.346939] env[67169]: ERROR nova.compute.manager [instance: 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1318.346939] env[67169]: ERROR nova.compute.manager [instance: 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b] vm_util.copy_virtual_disk( [ 1318.346939] env[67169]: ERROR nova.compute.manager [instance: 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1318.346939] env[67169]: ERROR nova.compute.manager [instance: 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b] session._wait_for_task(vmdk_copy_task) [ 1318.346939] env[67169]: ERROR nova.compute.manager [instance: 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1318.346939] env[67169]: ERROR nova.compute.manager [instance: 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b] return self.wait_for_task(task_ref) [ 1318.346939] env[67169]: ERROR nova.compute.manager [instance: 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1318.346939] env[67169]: ERROR nova.compute.manager [instance: 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b] return evt.wait() [ 1318.346939] env[67169]: ERROR nova.compute.manager [instance: 
1e43c263-c527-4349-8e9c-3f4a3ffc9d8b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1318.346939] env[67169]: ERROR nova.compute.manager [instance: 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b] result = hub.switch() [ 1318.346939] env[67169]: ERROR nova.compute.manager [instance: 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1318.346939] env[67169]: ERROR nova.compute.manager [instance: 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b] return self.greenlet.switch() [ 1318.346939] env[67169]: ERROR nova.compute.manager [instance: 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1318.346939] env[67169]: ERROR nova.compute.manager [instance: 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b] self.f(*self.args, **self.kw) [ 1318.346939] env[67169]: ERROR nova.compute.manager [instance: 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1318.346939] env[67169]: ERROR nova.compute.manager [instance: 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b] raise exceptions.translate_fault(task_info.error) [ 1318.346939] env[67169]: ERROR nova.compute.manager [instance: 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1318.346939] env[67169]: ERROR nova.compute.manager [instance: 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b] Faults: ['InvalidArgument'] [ 1318.346939] env[67169]: ERROR nova.compute.manager [instance: 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b] [ 1318.347837] env[67169]: DEBUG nova.compute.utils [None req-ec8ff093-ec56-4835-a968-0cce4238d83b tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b] VimFaultException {{(pid=67169) 
notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1318.348978] env[67169]: DEBUG nova.compute.manager [None req-ec8ff093-ec56-4835-a968-0cce4238d83b tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b] Build of instance 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b was re-scheduled: A specified parameter was not correct: fileType [ 1318.348978] env[67169]: Faults: ['InvalidArgument'] {{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1318.349367] env[67169]: DEBUG nova.compute.manager [None req-ec8ff093-ec56-4835-a968-0cce4238d83b tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b] Unplugging VIFs for instance {{(pid=67169) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1318.349541] env[67169]: DEBUG nova.compute.manager [None req-ec8ff093-ec56-4835-a968-0cce4238d83b tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67169) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1318.349709] env[67169]: DEBUG nova.compute.manager [None req-ec8ff093-ec56-4835-a968-0cce4238d83b tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b] Deallocating network for instance {{(pid=67169) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1318.349868] env[67169]: DEBUG nova.network.neutron [None req-ec8ff093-ec56-4835-a968-0cce4238d83b tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b] deallocate_for_instance() {{(pid=67169) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1318.648896] env[67169]: DEBUG nova.network.neutron [None req-ec8ff093-ec56-4835-a968-0cce4238d83b tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b] Updating instance_info_cache with network_info: [] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1318.659713] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1318.660110] env[67169]: INFO nova.compute.manager [None req-ec8ff093-ec56-4835-a968-0cce4238d83b tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b] Took 0.31 seconds to deallocate network for instance. 
[ 1318.753031] env[67169]: INFO nova.scheduler.client.report [None req-ec8ff093-ec56-4835-a968-0cce4238d83b tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Deleted allocations for instance 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b [ 1318.777226] env[67169]: DEBUG oslo_concurrency.lockutils [None req-ec8ff093-ec56-4835-a968-0cce4238d83b tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Lock "1e43c263-c527-4349-8e9c-3f4a3ffc9d8b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 690.386s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1318.778674] env[67169]: DEBUG oslo_concurrency.lockutils [None req-070646a8-5a1a-445f-82cd-3048606b4760 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Lock "1e43c263-c527-4349-8e9c-3f4a3ffc9d8b" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 492.306s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1318.778927] env[67169]: DEBUG oslo_concurrency.lockutils [None req-070646a8-5a1a-445f-82cd-3048606b4760 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Acquiring lock "1e43c263-c527-4349-8e9c-3f4a3ffc9d8b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1318.779160] env[67169]: DEBUG oslo_concurrency.lockutils [None req-070646a8-5a1a-445f-82cd-3048606b4760 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Lock "1e43c263-c527-4349-8e9c-3f4a3ffc9d8b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67169) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1318.779361] env[67169]: DEBUG oslo_concurrency.lockutils [None req-070646a8-5a1a-445f-82cd-3048606b4760 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Lock "1e43c263-c527-4349-8e9c-3f4a3ffc9d8b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1318.782715] env[67169]: INFO nova.compute.manager [None req-070646a8-5a1a-445f-82cd-3048606b4760 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b] Terminating instance [ 1318.785079] env[67169]: DEBUG oslo_concurrency.lockutils [None req-070646a8-5a1a-445f-82cd-3048606b4760 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Acquiring lock "refresh_cache-1e43c263-c527-4349-8e9c-3f4a3ffc9d8b" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1318.785079] env[67169]: DEBUG oslo_concurrency.lockutils [None req-070646a8-5a1a-445f-82cd-3048606b4760 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Acquired lock "refresh_cache-1e43c263-c527-4349-8e9c-3f4a3ffc9d8b" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1318.785079] env[67169]: DEBUG nova.network.neutron [None req-070646a8-5a1a-445f-82cd-3048606b4760 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b] Building network info cache for instance {{(pid=67169) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1318.792401] env[67169]: DEBUG nova.compute.manager [None req-32caf906-c820-4c4a-a535-1175960d4c45 
tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] [instance: 4f57a0db-fe0b-4983-9e07-62485a53f918] Starting instance... {{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1318.821040] env[67169]: DEBUG nova.compute.manager [None req-32caf906-c820-4c4a-a535-1175960d4c45 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] [instance: 4f57a0db-fe0b-4983-9e07-62485a53f918] Instance disappeared before build. {{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1318.832071] env[67169]: DEBUG nova.network.neutron [None req-070646a8-5a1a-445f-82cd-3048606b4760 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b] Instance cache missing network info. {{(pid=67169) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1318.850462] env[67169]: DEBUG oslo_concurrency.lockutils [None req-32caf906-c820-4c4a-a535-1175960d4c45 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Lock "4f57a0db-fe0b-4983-9e07-62485a53f918" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 240.630s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1318.867807] env[67169]: DEBUG nova.compute.manager [None req-4a549810-e7cd-467a-a3e5-26aa35a98b4a tempest-ServerRescueTestJSONUnderV235-1040427181 tempest-ServerRescueTestJSONUnderV235-1040427181-project-member] [instance: 3930edcb-c8ce-44f4-84ae-a2b59f99bc82] Starting instance... 
{{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1318.892910] env[67169]: DEBUG nova.compute.manager [None req-4a549810-e7cd-467a-a3e5-26aa35a98b4a tempest-ServerRescueTestJSONUnderV235-1040427181 tempest-ServerRescueTestJSONUnderV235-1040427181-project-member] [instance: 3930edcb-c8ce-44f4-84ae-a2b59f99bc82] Instance disappeared before build. {{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1318.913935] env[67169]: DEBUG oslo_concurrency.lockutils [None req-4a549810-e7cd-467a-a3e5-26aa35a98b4a tempest-ServerRescueTestJSONUnderV235-1040427181 tempest-ServerRescueTestJSONUnderV235-1040427181-project-member] Lock "3930edcb-c8ce-44f4-84ae-a2b59f99bc82" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 237.342s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1318.922857] env[67169]: DEBUG nova.compute.manager [None req-4835c22e-883d-43d2-82df-7b3fc294d6b4 tempest-ImagesOneServerNegativeTestJSON-20346663 tempest-ImagesOneServerNegativeTestJSON-20346663-project-member] [instance: 883a792f-ae72-4475-8592-3076c2c2c2ae] Starting instance... 
{{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1318.982143] env[67169]: DEBUG oslo_concurrency.lockutils [None req-4835c22e-883d-43d2-82df-7b3fc294d6b4 tempest-ImagesOneServerNegativeTestJSON-20346663 tempest-ImagesOneServerNegativeTestJSON-20346663-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1318.982410] env[67169]: DEBUG oslo_concurrency.lockutils [None req-4835c22e-883d-43d2-82df-7b3fc294d6b4 tempest-ImagesOneServerNegativeTestJSON-20346663 tempest-ImagesOneServerNegativeTestJSON-20346663-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1318.984227] env[67169]: INFO nova.compute.claims [None req-4835c22e-883d-43d2-82df-7b3fc294d6b4 tempest-ImagesOneServerNegativeTestJSON-20346663 tempest-ImagesOneServerNegativeTestJSON-20346663-project-member] [instance: 883a792f-ae72-4475-8592-3076c2c2c2ae] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1319.043252] env[67169]: DEBUG nova.network.neutron [None req-070646a8-5a1a-445f-82cd-3048606b4760 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b] Updating instance_info_cache with network_info: [] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1319.052722] env[67169]: DEBUG oslo_concurrency.lockutils [None req-070646a8-5a1a-445f-82cd-3048606b4760 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Releasing lock "refresh_cache-1e43c263-c527-4349-8e9c-3f4a3ffc9d8b" {{(pid=67169) lock 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1319.053289] env[67169]: DEBUG nova.compute.manager [None req-070646a8-5a1a-445f-82cd-3048606b4760 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b] Start destroying the instance on the hypervisor. {{(pid=67169) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1319.053377] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-070646a8-5a1a-445f-82cd-3048606b4760 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b] Destroying instance {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1319.053799] env[67169]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-04258f48-5c7e-45e0-a22b-1d1c5cffc8e2 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1319.067037] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bda15c4e-dce6-4a59-a4dc-2b7690e7319d {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1319.098489] env[67169]: WARNING nova.virt.vmwareapi.vmops [None req-070646a8-5a1a-445f-82cd-3048606b4760 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b could not be found. 
[ 1319.098705] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-070646a8-5a1a-445f-82cd-3048606b4760 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b] Instance destroyed {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1319.098883] env[67169]: INFO nova.compute.manager [None req-070646a8-5a1a-445f-82cd-3048606b4760 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b] Took 0.05 seconds to destroy the instance on the hypervisor. [ 1319.099144] env[67169]: DEBUG oslo.service.loopingcall [None req-070646a8-5a1a-445f-82cd-3048606b4760 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=67169) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1319.101640] env[67169]: DEBUG nova.compute.manager [-] [instance: 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b] Deallocating network for instance {{(pid=67169) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1319.101741] env[67169]: DEBUG nova.network.neutron [-] [instance: 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b] deallocate_for_instance() {{(pid=67169) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1319.245937] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c9555e80-c9be-4485-a3cd-ed43e1f9c567 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1319.253406] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4b7de5ae-e2d9-496c-b594-72d223360fd5 {{(pid=67169) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1319.284100] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-357e0efd-a9e2-4442-8109-acdde2cac2aa {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1319.291398] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d25dc428-e4b7-4780-a4f5-56f40093124e {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1319.303999] env[67169]: DEBUG nova.compute.provider_tree [None req-4835c22e-883d-43d2-82df-7b3fc294d6b4 tempest-ImagesOneServerNegativeTestJSON-20346663 tempest-ImagesOneServerNegativeTestJSON-20346663-project-member] Inventory has not changed in ProviderTree for provider: 6570906a-ac37-4859-b1f2-4bbacc48d3f3 {{(pid=67169) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1319.314202] env[67169]: DEBUG nova.scheduler.client.report [None req-4835c22e-883d-43d2-82df-7b3fc294d6b4 tempest-ImagesOneServerNegativeTestJSON-20346663 tempest-ImagesOneServerNegativeTestJSON-20346663-project-member] Inventory has not changed for provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 171, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67169) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1319.323467] env[67169]: DEBUG nova.network.neutron [-] [instance: 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b] Instance cache missing network info. 
{{(pid=67169) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1319.327218] env[67169]: DEBUG oslo_concurrency.lockutils [None req-4835c22e-883d-43d2-82df-7b3fc294d6b4 tempest-ImagesOneServerNegativeTestJSON-20346663 tempest-ImagesOneServerNegativeTestJSON-20346663-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.345s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1319.327759] env[67169]: DEBUG nova.compute.manager [None req-4835c22e-883d-43d2-82df-7b3fc294d6b4 tempest-ImagesOneServerNegativeTestJSON-20346663 tempest-ImagesOneServerNegativeTestJSON-20346663-project-member] [instance: 883a792f-ae72-4475-8592-3076c2c2c2ae] Start building networks asynchronously for instance. {{(pid=67169) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1319.331228] env[67169]: DEBUG nova.network.neutron [-] [instance: 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b] Updating instance_info_cache with network_info: [] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1319.338938] env[67169]: INFO nova.compute.manager [-] [instance: 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b] Took 0.24 seconds to deallocate network for instance. [ 1319.370817] env[67169]: DEBUG nova.compute.utils [None req-4835c22e-883d-43d2-82df-7b3fc294d6b4 tempest-ImagesOneServerNegativeTestJSON-20346663 tempest-ImagesOneServerNegativeTestJSON-20346663-project-member] Using /dev/sd instead of None {{(pid=67169) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1319.372254] env[67169]: DEBUG nova.compute.manager [None req-4835c22e-883d-43d2-82df-7b3fc294d6b4 tempest-ImagesOneServerNegativeTestJSON-20346663 tempest-ImagesOneServerNegativeTestJSON-20346663-project-member] [instance: 883a792f-ae72-4475-8592-3076c2c2c2ae] Allocating IP information in the background. 
{{(pid=67169) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1319.372454] env[67169]: DEBUG nova.network.neutron [None req-4835c22e-883d-43d2-82df-7b3fc294d6b4 tempest-ImagesOneServerNegativeTestJSON-20346663 tempest-ImagesOneServerNegativeTestJSON-20346663-project-member] [instance: 883a792f-ae72-4475-8592-3076c2c2c2ae] allocate_for_instance() {{(pid=67169) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1319.382153] env[67169]: DEBUG nova.compute.manager [None req-4835c22e-883d-43d2-82df-7b3fc294d6b4 tempest-ImagesOneServerNegativeTestJSON-20346663 tempest-ImagesOneServerNegativeTestJSON-20346663-project-member] [instance: 883a792f-ae72-4475-8592-3076c2c2c2ae] Start building block device mappings for instance. {{(pid=67169) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1319.461458] env[67169]: DEBUG nova.compute.manager [None req-4835c22e-883d-43d2-82df-7b3fc294d6b4 tempest-ImagesOneServerNegativeTestJSON-20346663 tempest-ImagesOneServerNegativeTestJSON-20346663-project-member] [instance: 883a792f-ae72-4475-8592-3076c2c2c2ae] Start spawning the instance on the hypervisor. 
{{(pid=67169) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1319.472498] env[67169]: DEBUG oslo_concurrency.lockutils [None req-070646a8-5a1a-445f-82cd-3048606b4760 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Lock "1e43c263-c527-4349-8e9c-3f4a3ffc9d8b" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.694s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1319.474636] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "1e43c263-c527-4349-8e9c-3f4a3ffc9d8b" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 133.773s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1319.474636] env[67169]: INFO nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 1e43c263-c527-4349-8e9c-3f4a3ffc9d8b] During sync_power_state the instance has a pending task (deleting). Skip. 
[ 1319.474636] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "1e43c263-c527-4349-8e9c-3f4a3ffc9d8b" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1319.488989] env[67169]: DEBUG nova.virt.hardware [None req-4835c22e-883d-43d2-82df-7b3fc294d6b4 tempest-ImagesOneServerNegativeTestJSON-20346663 tempest-ImagesOneServerNegativeTestJSON-20346663-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-03-06T19:55:10Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-03-06T19:54:55Z,direct_url=,disk_format='vmdk',id=285931c9-8b83-4997-8c4d-6a79005e36ba,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c31f6504bb73492890b262ff43fdf9bc',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-03-06T19:54:55Z,virtual_size=,visibility=), allow threads: False {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1319.489245] env[67169]: DEBUG nova.virt.hardware [None req-4835c22e-883d-43d2-82df-7b3fc294d6b4 tempest-ImagesOneServerNegativeTestJSON-20346663 tempest-ImagesOneServerNegativeTestJSON-20346663-project-member] Flavor limits 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1319.489405] env[67169]: DEBUG nova.virt.hardware [None req-4835c22e-883d-43d2-82df-7b3fc294d6b4 tempest-ImagesOneServerNegativeTestJSON-20346663 
tempest-ImagesOneServerNegativeTestJSON-20346663-project-member] Image limits 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1319.489590] env[67169]: DEBUG nova.virt.hardware [None req-4835c22e-883d-43d2-82df-7b3fc294d6b4 tempest-ImagesOneServerNegativeTestJSON-20346663 tempest-ImagesOneServerNegativeTestJSON-20346663-project-member] Flavor pref 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1319.489829] env[67169]: DEBUG nova.virt.hardware [None req-4835c22e-883d-43d2-82df-7b3fc294d6b4 tempest-ImagesOneServerNegativeTestJSON-20346663 tempest-ImagesOneServerNegativeTestJSON-20346663-project-member] Image pref 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1319.489988] env[67169]: DEBUG nova.virt.hardware [None req-4835c22e-883d-43d2-82df-7b3fc294d6b4 tempest-ImagesOneServerNegativeTestJSON-20346663 tempest-ImagesOneServerNegativeTestJSON-20346663-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1319.490097] env[67169]: DEBUG nova.virt.hardware [None req-4835c22e-883d-43d2-82df-7b3fc294d6b4 tempest-ImagesOneServerNegativeTestJSON-20346663 tempest-ImagesOneServerNegativeTestJSON-20346663-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1319.490261] env[67169]: DEBUG nova.virt.hardware [None req-4835c22e-883d-43d2-82df-7b3fc294d6b4 tempest-ImagesOneServerNegativeTestJSON-20346663 tempest-ImagesOneServerNegativeTestJSON-20346663-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67169) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1319.490430] env[67169]: DEBUG 
nova.virt.hardware [None req-4835c22e-883d-43d2-82df-7b3fc294d6b4 tempest-ImagesOneServerNegativeTestJSON-20346663 tempest-ImagesOneServerNegativeTestJSON-20346663-project-member] Got 1 possible topologies {{(pid=67169) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1319.490586] env[67169]: DEBUG nova.virt.hardware [None req-4835c22e-883d-43d2-82df-7b3fc294d6b4 tempest-ImagesOneServerNegativeTestJSON-20346663 tempest-ImagesOneServerNegativeTestJSON-20346663-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1319.490759] env[67169]: DEBUG nova.virt.hardware [None req-4835c22e-883d-43d2-82df-7b3fc294d6b4 tempest-ImagesOneServerNegativeTestJSON-20346663 tempest-ImagesOneServerNegativeTestJSON-20346663-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1319.491830] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-abd17d51-11a7-49c2-8dc7-36d99d6fc8af {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1319.499710] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6535b353-7f96-4ba8-b88f-5d4fcd09d00f {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1319.504919] env[67169]: DEBUG nova.policy [None req-4835c22e-883d-43d2-82df-7b3fc294d6b4 tempest-ImagesOneServerNegativeTestJSON-20346663 tempest-ImagesOneServerNegativeTestJSON-20346663-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '20cff242a3704b259c7541b534416f7f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 
'project_id': '3655a09faf0b4b8da009dea9b26d6e47', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67169) authorize /opt/stack/nova/nova/policy.py:203}} [ 1319.893605] env[67169]: DEBUG nova.network.neutron [None req-4835c22e-883d-43d2-82df-7b3fc294d6b4 tempest-ImagesOneServerNegativeTestJSON-20346663 tempest-ImagesOneServerNegativeTestJSON-20346663-project-member] [instance: 883a792f-ae72-4475-8592-3076c2c2c2ae] Successfully created port: 6205e2c6-b9ff-4afa-8e07-f1363b5f3501 {{(pid=67169) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1320.800556] env[67169]: DEBUG nova.compute.manager [req-80788a89-4392-468d-ac26-48f41280b7a1 req-94244349-e339-4441-983f-c8ac365a8172 service nova] [instance: 883a792f-ae72-4475-8592-3076c2c2c2ae] Received event network-vif-plugged-6205e2c6-b9ff-4afa-8e07-f1363b5f3501 {{(pid=67169) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1320.800819] env[67169]: DEBUG oslo_concurrency.lockutils [req-80788a89-4392-468d-ac26-48f41280b7a1 req-94244349-e339-4441-983f-c8ac365a8172 service nova] Acquiring lock "883a792f-ae72-4475-8592-3076c2c2c2ae-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1320.801042] env[67169]: DEBUG oslo_concurrency.lockutils [req-80788a89-4392-468d-ac26-48f41280b7a1 req-94244349-e339-4441-983f-c8ac365a8172 service nova] Lock "883a792f-ae72-4475-8592-3076c2c2c2ae-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1320.801221] env[67169]: DEBUG oslo_concurrency.lockutils 
[req-80788a89-4392-468d-ac26-48f41280b7a1 req-94244349-e339-4441-983f-c8ac365a8172 service nova] Lock "883a792f-ae72-4475-8592-3076c2c2c2ae-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1320.801392] env[67169]: DEBUG nova.compute.manager [req-80788a89-4392-468d-ac26-48f41280b7a1 req-94244349-e339-4441-983f-c8ac365a8172 service nova] [instance: 883a792f-ae72-4475-8592-3076c2c2c2ae] No waiting events found dispatching network-vif-plugged-6205e2c6-b9ff-4afa-8e07-f1363b5f3501 {{(pid=67169) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1320.801553] env[67169]: WARNING nova.compute.manager [req-80788a89-4392-468d-ac26-48f41280b7a1 req-94244349-e339-4441-983f-c8ac365a8172 service nova] [instance: 883a792f-ae72-4475-8592-3076c2c2c2ae] Received unexpected event network-vif-plugged-6205e2c6-b9ff-4afa-8e07-f1363b5f3501 for instance with vm_state building and task_state spawning. 
[ 1320.830920] env[67169]: DEBUG nova.network.neutron [None req-4835c22e-883d-43d2-82df-7b3fc294d6b4 tempest-ImagesOneServerNegativeTestJSON-20346663 tempest-ImagesOneServerNegativeTestJSON-20346663-project-member] [instance: 883a792f-ae72-4475-8592-3076c2c2c2ae] Successfully updated port: 6205e2c6-b9ff-4afa-8e07-f1363b5f3501 {{(pid=67169) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1320.846736] env[67169]: DEBUG oslo_concurrency.lockutils [None req-4835c22e-883d-43d2-82df-7b3fc294d6b4 tempest-ImagesOneServerNegativeTestJSON-20346663 tempest-ImagesOneServerNegativeTestJSON-20346663-project-member] Acquiring lock "refresh_cache-883a792f-ae72-4475-8592-3076c2c2c2ae" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1320.847276] env[67169]: DEBUG oslo_concurrency.lockutils [None req-4835c22e-883d-43d2-82df-7b3fc294d6b4 tempest-ImagesOneServerNegativeTestJSON-20346663 tempest-ImagesOneServerNegativeTestJSON-20346663-project-member] Acquired lock "refresh_cache-883a792f-ae72-4475-8592-3076c2c2c2ae" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1320.847521] env[67169]: DEBUG nova.network.neutron [None req-4835c22e-883d-43d2-82df-7b3fc294d6b4 tempest-ImagesOneServerNegativeTestJSON-20346663 tempest-ImagesOneServerNegativeTestJSON-20346663-project-member] [instance: 883a792f-ae72-4475-8592-3076c2c2c2ae] Building network info cache for instance {{(pid=67169) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1320.914414] env[67169]: DEBUG nova.network.neutron [None req-4835c22e-883d-43d2-82df-7b3fc294d6b4 tempest-ImagesOneServerNegativeTestJSON-20346663 tempest-ImagesOneServerNegativeTestJSON-20346663-project-member] [instance: 883a792f-ae72-4475-8592-3076c2c2c2ae] Instance cache missing network info. 
{{(pid=67169) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1321.095429] env[67169]: DEBUG nova.network.neutron [None req-4835c22e-883d-43d2-82df-7b3fc294d6b4 tempest-ImagesOneServerNegativeTestJSON-20346663 tempest-ImagesOneServerNegativeTestJSON-20346663-project-member] [instance: 883a792f-ae72-4475-8592-3076c2c2c2ae] Updating instance_info_cache with network_info: [{"id": "6205e2c6-b9ff-4afa-8e07-f1363b5f3501", "address": "fa:16:3e:e0:50:60", "network": {"id": "fdbaef5e-d098-474b-9688-0cc2635dd600", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1375181218-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "3655a09faf0b4b8da009dea9b26d6e47", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "8039f411-8c97-48fe-a5a9-9f5a42e4e7c6", "external-id": "nsx-vlan-transportzone-12", "segmentation_id": 12, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap6205e2c6-b9", "ovs_interfaceid": "6205e2c6-b9ff-4afa-8e07-f1363b5f3501", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1321.105793] env[67169]: DEBUG oslo_concurrency.lockutils [None req-4835c22e-883d-43d2-82df-7b3fc294d6b4 tempest-ImagesOneServerNegativeTestJSON-20346663 tempest-ImagesOneServerNegativeTestJSON-20346663-project-member] Releasing lock "refresh_cache-883a792f-ae72-4475-8592-3076c2c2c2ae" {{(pid=67169) lock 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1321.106082] env[67169]: DEBUG nova.compute.manager [None req-4835c22e-883d-43d2-82df-7b3fc294d6b4 tempest-ImagesOneServerNegativeTestJSON-20346663 tempest-ImagesOneServerNegativeTestJSON-20346663-project-member] [instance: 883a792f-ae72-4475-8592-3076c2c2c2ae] Instance network_info: |[{"id": "6205e2c6-b9ff-4afa-8e07-f1363b5f3501", "address": "fa:16:3e:e0:50:60", "network": {"id": "fdbaef5e-d098-474b-9688-0cc2635dd600", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1375181218-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "3655a09faf0b4b8da009dea9b26d6e47", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "8039f411-8c97-48fe-a5a9-9f5a42e4e7c6", "external-id": "nsx-vlan-transportzone-12", "segmentation_id": 12, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap6205e2c6-b9", "ovs_interfaceid": "6205e2c6-b9ff-4afa-8e07-f1363b5f3501", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67169) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1321.106453] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-4835c22e-883d-43d2-82df-7b3fc294d6b4 tempest-ImagesOneServerNegativeTestJSON-20346663 tempest-ImagesOneServerNegativeTestJSON-20346663-project-member] [instance: 883a792f-ae72-4475-8592-3076c2c2c2ae] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:e0:50:60', 
'network_ref': {'type': 'OpaqueNetwork', 'network-id': '8039f411-8c97-48fe-a5a9-9f5a42e4e7c6', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '6205e2c6-b9ff-4afa-8e07-f1363b5f3501', 'vif_model': 'vmxnet3'}] {{(pid=67169) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1321.114351] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [None req-4835c22e-883d-43d2-82df-7b3fc294d6b4 tempest-ImagesOneServerNegativeTestJSON-20346663 tempest-ImagesOneServerNegativeTestJSON-20346663-project-member] Creating folder: Project (3655a09faf0b4b8da009dea9b26d6e47). Parent ref: group-v566843. {{(pid=67169) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1321.114821] env[67169]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-7e95eb21-ed5b-43ac-8e04-2dad54aa5076 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1321.125808] env[67169]: INFO nova.virt.vmwareapi.vm_util [None req-4835c22e-883d-43d2-82df-7b3fc294d6b4 tempest-ImagesOneServerNegativeTestJSON-20346663 tempest-ImagesOneServerNegativeTestJSON-20346663-project-member] Created folder: Project (3655a09faf0b4b8da009dea9b26d6e47) in parent group-v566843. [ 1321.126009] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [None req-4835c22e-883d-43d2-82df-7b3fc294d6b4 tempest-ImagesOneServerNegativeTestJSON-20346663 tempest-ImagesOneServerNegativeTestJSON-20346663-project-member] Creating folder: Instances. Parent ref: group-v566921. 
{{(pid=67169) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1321.126262] env[67169]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-54d88585-418f-4d99-a44c-3a6dec0e5991 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1321.134797] env[67169]: INFO nova.virt.vmwareapi.vm_util [None req-4835c22e-883d-43d2-82df-7b3fc294d6b4 tempest-ImagesOneServerNegativeTestJSON-20346663 tempest-ImagesOneServerNegativeTestJSON-20346663-project-member] Created folder: Instances in parent group-v566921. [ 1321.135025] env[67169]: DEBUG oslo.service.loopingcall [None req-4835c22e-883d-43d2-82df-7b3fc294d6b4 tempest-ImagesOneServerNegativeTestJSON-20346663 tempest-ImagesOneServerNegativeTestJSON-20346663-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67169) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1321.135208] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 883a792f-ae72-4475-8592-3076c2c2c2ae] Creating VM on the ESX host {{(pid=67169) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1321.135396] env[67169]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-4f3bc0f2-9ea2-42b1-8af3-9a39112385ce {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1321.153836] env[67169]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1321.153836] env[67169]: value = "task-2819186" [ 1321.153836] env[67169]: _type = "Task" [ 1321.153836] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1321.160966] env[67169]: DEBUG oslo_vmware.api [-] Task: {'id': task-2819186, 'name': CreateVM_Task} progress is 0%. 
{{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1321.588740] env[67169]: DEBUG oslo_concurrency.lockutils [None req-ae9f4c45-189f-4d85-9ca5-c9ae8ec39dac tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Acquiring lock "2d7d3386-9854-4bf1-a680-5aed0a2329cb" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1321.588740] env[67169]: DEBUG oslo_concurrency.lockutils [None req-ae9f4c45-189f-4d85-9ca5-c9ae8ec39dac tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Lock "2d7d3386-9854-4bf1-a680-5aed0a2329cb" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1321.666044] env[67169]: DEBUG oslo_vmware.api [-] Task: {'id': task-2819186, 'name': CreateVM_Task, 'duration_secs': 0.281644} completed successfully. 
{{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1321.666044] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 883a792f-ae72-4475-8592-3076c2c2c2ae] Created VM on the ESX host {{(pid=67169) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1321.666346] env[67169]: DEBUG oslo_concurrency.lockutils [None req-4835c22e-883d-43d2-82df-7b3fc294d6b4 tempest-ImagesOneServerNegativeTestJSON-20346663 tempest-ImagesOneServerNegativeTestJSON-20346663-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1321.666555] env[67169]: DEBUG oslo_concurrency.lockutils [None req-4835c22e-883d-43d2-82df-7b3fc294d6b4 tempest-ImagesOneServerNegativeTestJSON-20346663 tempest-ImagesOneServerNegativeTestJSON-20346663-project-member] Acquired lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1321.666881] env[67169]: DEBUG oslo_concurrency.lockutils [None req-4835c22e-883d-43d2-82df-7b3fc294d6b4 tempest-ImagesOneServerNegativeTestJSON-20346663 tempest-ImagesOneServerNegativeTestJSON-20346663-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1321.667140] env[67169]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-32c0ede0-06be-4580-a985-0570d3b47a15 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1321.671243] env[67169]: DEBUG oslo_vmware.api [None req-4835c22e-883d-43d2-82df-7b3fc294d6b4 
tempest-ImagesOneServerNegativeTestJSON-20346663 tempest-ImagesOneServerNegativeTestJSON-20346663-project-member] Waiting for the task: (returnval){ [ 1321.671243] env[67169]: value = "session[52518c09-e5f2-035d-4dc1-3e29ddf94015]52ee3459-7dbd-3c65-3bd3-47827d000be7" [ 1321.671243] env[67169]: _type = "Task" [ 1321.671243] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1321.680412] env[67169]: DEBUG oslo_vmware.api [None req-4835c22e-883d-43d2-82df-7b3fc294d6b4 tempest-ImagesOneServerNegativeTestJSON-20346663 tempest-ImagesOneServerNegativeTestJSON-20346663-project-member] Task: {'id': session[52518c09-e5f2-035d-4dc1-3e29ddf94015]52ee3459-7dbd-3c65-3bd3-47827d000be7, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1321.883932] env[67169]: DEBUG oslo_concurrency.lockutils [None req-faad8e12-a376-4beb-b3bb-9456f3e9bcbb tempest-ImagesOneServerNegativeTestJSON-20346663 tempest-ImagesOneServerNegativeTestJSON-20346663-project-member] Acquiring lock "883a792f-ae72-4475-8592-3076c2c2c2ae" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1322.182067] env[67169]: DEBUG oslo_concurrency.lockutils [None req-4835c22e-883d-43d2-82df-7b3fc294d6b4 tempest-ImagesOneServerNegativeTestJSON-20346663 tempest-ImagesOneServerNegativeTestJSON-20346663-project-member] Releasing lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1322.182414] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-4835c22e-883d-43d2-82df-7b3fc294d6b4 tempest-ImagesOneServerNegativeTestJSON-20346663 
tempest-ImagesOneServerNegativeTestJSON-20346663-project-member] [instance: 883a792f-ae72-4475-8592-3076c2c2c2ae] Processing image 285931c9-8b83-4997-8c4d-6a79005e36ba {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1322.182551] env[67169]: DEBUG oslo_concurrency.lockutils [None req-4835c22e-883d-43d2-82df-7b3fc294d6b4 tempest-ImagesOneServerNegativeTestJSON-20346663 tempest-ImagesOneServerNegativeTestJSON-20346663-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1322.936498] env[67169]: DEBUG nova.compute.manager [req-c4c12283-7ee8-40c9-adf3-df317d1d82e4 req-56ff9fd7-88cf-45f5-8621-0493a3ef0022 service nova] [instance: 883a792f-ae72-4475-8592-3076c2c2c2ae] Received event network-changed-6205e2c6-b9ff-4afa-8e07-f1363b5f3501 {{(pid=67169) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1322.936783] env[67169]: DEBUG nova.compute.manager [req-c4c12283-7ee8-40c9-adf3-df317d1d82e4 req-56ff9fd7-88cf-45f5-8621-0493a3ef0022 service nova] [instance: 883a792f-ae72-4475-8592-3076c2c2c2ae] Refreshing instance network info cache due to event network-changed-6205e2c6-b9ff-4afa-8e07-f1363b5f3501. 
{{(pid=67169) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1322.937014] env[67169]: DEBUG oslo_concurrency.lockutils [req-c4c12283-7ee8-40c9-adf3-df317d1d82e4 req-56ff9fd7-88cf-45f5-8621-0493a3ef0022 service nova] Acquiring lock "refresh_cache-883a792f-ae72-4475-8592-3076c2c2c2ae" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1322.937136] env[67169]: DEBUG oslo_concurrency.lockutils [req-c4c12283-7ee8-40c9-adf3-df317d1d82e4 req-56ff9fd7-88cf-45f5-8621-0493a3ef0022 service nova] Acquired lock "refresh_cache-883a792f-ae72-4475-8592-3076c2c2c2ae" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1322.937238] env[67169]: DEBUG nova.network.neutron [req-c4c12283-7ee8-40c9-adf3-df317d1d82e4 req-56ff9fd7-88cf-45f5-8621-0493a3ef0022 service nova] [instance: 883a792f-ae72-4475-8592-3076c2c2c2ae] Refreshing network info cache for port 6205e2c6-b9ff-4afa-8e07-f1363b5f3501 {{(pid=67169) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1323.355223] env[67169]: DEBUG nova.network.neutron [req-c4c12283-7ee8-40c9-adf3-df317d1d82e4 req-56ff9fd7-88cf-45f5-8621-0493a3ef0022 service nova] [instance: 883a792f-ae72-4475-8592-3076c2c2c2ae] Updated VIF entry in instance network info cache for port 6205e2c6-b9ff-4afa-8e07-f1363b5f3501. 
{{(pid=67169) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1323.355599] env[67169]: DEBUG nova.network.neutron [req-c4c12283-7ee8-40c9-adf3-df317d1d82e4 req-56ff9fd7-88cf-45f5-8621-0493a3ef0022 service nova] [instance: 883a792f-ae72-4475-8592-3076c2c2c2ae] Updating instance_info_cache with network_info: [{"id": "6205e2c6-b9ff-4afa-8e07-f1363b5f3501", "address": "fa:16:3e:e0:50:60", "network": {"id": "fdbaef5e-d098-474b-9688-0cc2635dd600", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1375181218-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "3655a09faf0b4b8da009dea9b26d6e47", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "8039f411-8c97-48fe-a5a9-9f5a42e4e7c6", "external-id": "nsx-vlan-transportzone-12", "segmentation_id": 12, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap6205e2c6-b9", "ovs_interfaceid": "6205e2c6-b9ff-4afa-8e07-f1363b5f3501", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1323.366748] env[67169]: DEBUG oslo_concurrency.lockutils [req-c4c12283-7ee8-40c9-adf3-df317d1d82e4 req-56ff9fd7-88cf-45f5-8621-0493a3ef0022 service nova] Releasing lock "refresh_cache-883a792f-ae72-4475-8592-3076c2c2c2ae" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1363.919055] env[67169]: WARNING 
oslo_vmware.rw_handles [None req-bc42a747-6b10-42c3-8e2b-0674634135bb tempest-ServerPasswordTestJSON-586101290 tempest-ServerPasswordTestJSON-586101290-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1363.919055] env[67169]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1363.919055] env[67169]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1363.919055] env[67169]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1363.919055] env[67169]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1363.919055] env[67169]: ERROR oslo_vmware.rw_handles response.begin() [ 1363.919055] env[67169]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1363.919055] env[67169]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1363.919055] env[67169]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1363.919055] env[67169]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1363.919055] env[67169]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1363.919055] env[67169]: ERROR oslo_vmware.rw_handles [ 1363.919716] env[67169]: DEBUG nova.virt.vmwareapi.images [None req-bc42a747-6b10-42c3-8e2b-0674634135bb tempest-ServerPasswordTestJSON-586101290 tempest-ServerPasswordTestJSON-586101290-project-member] [instance: 47ffcce9-3afc-41be-b38e-dacfeb535a2c] Downloaded image file data 285931c9-8b83-4997-8c4d-6a79005e36ba to vmware_temp/ab0f8100-b53f-459a-9a1e-58298e63a9e0/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk on the data store datastore2 {{(pid=67169) fetch_image 
/opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1363.921785] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-bc42a747-6b10-42c3-8e2b-0674634135bb tempest-ServerPasswordTestJSON-586101290 tempest-ServerPasswordTestJSON-586101290-project-member] [instance: 47ffcce9-3afc-41be-b38e-dacfeb535a2c] Caching image {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1363.922034] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [None req-bc42a747-6b10-42c3-8e2b-0674634135bb tempest-ServerPasswordTestJSON-586101290 tempest-ServerPasswordTestJSON-586101290-project-member] Copying Virtual Disk [datastore2] vmware_temp/ab0f8100-b53f-459a-9a1e-58298e63a9e0/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk to [datastore2] vmware_temp/ab0f8100-b53f-459a-9a1e-58298e63a9e0/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk {{(pid=67169) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1363.922352] env[67169]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-dcce0509-74e7-4e15-88b2-506963171134 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1363.931212] env[67169]: DEBUG oslo_vmware.api [None req-bc42a747-6b10-42c3-8e2b-0674634135bb tempest-ServerPasswordTestJSON-586101290 tempest-ServerPasswordTestJSON-586101290-project-member] Waiting for the task: (returnval){ [ 1363.931212] env[67169]: value = "task-2819187" [ 1363.931212] env[67169]: _type = "Task" [ 1363.931212] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1363.939300] env[67169]: DEBUG oslo_vmware.api [None req-bc42a747-6b10-42c3-8e2b-0674634135bb tempest-ServerPasswordTestJSON-586101290 tempest-ServerPasswordTestJSON-586101290-project-member] Task: {'id': task-2819187, 'name': CopyVirtualDisk_Task} progress is 0%. 
{{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1364.441258] env[67169]: DEBUG oslo_vmware.exceptions [None req-bc42a747-6b10-42c3-8e2b-0674634135bb tempest-ServerPasswordTestJSON-586101290 tempest-ServerPasswordTestJSON-586101290-project-member] Fault InvalidArgument not matched. {{(pid=67169) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}}
[ 1364.441606] env[67169]: DEBUG oslo_concurrency.lockutils [None req-bc42a747-6b10-42c3-8e2b-0674634135bb tempest-ServerPasswordTestJSON-586101290 tempest-ServerPasswordTestJSON-586101290-project-member] Releasing lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1364.443669] env[67169]: ERROR nova.compute.manager [None req-bc42a747-6b10-42c3-8e2b-0674634135bb tempest-ServerPasswordTestJSON-586101290 tempest-ServerPasswordTestJSON-586101290-project-member] [instance: 47ffcce9-3afc-41be-b38e-dacfeb535a2c] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1364.443669] env[67169]: Faults: ['InvalidArgument']
[ 1364.443669] env[67169]: ERROR nova.compute.manager [instance: 47ffcce9-3afc-41be-b38e-dacfeb535a2c] Traceback (most recent call last):
[ 1364.443669] env[67169]: ERROR nova.compute.manager [instance: 47ffcce9-3afc-41be-b38e-dacfeb535a2c]   File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources
[ 1364.443669] env[67169]: ERROR nova.compute.manager [instance: 47ffcce9-3afc-41be-b38e-dacfeb535a2c]     yield resources
[ 1364.443669] env[67169]: ERROR nova.compute.manager [instance: 47ffcce9-3afc-41be-b38e-dacfeb535a2c]   File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 1364.443669] env[67169]: ERROR nova.compute.manager [instance: 47ffcce9-3afc-41be-b38e-dacfeb535a2c]     self.driver.spawn(context, instance, image_meta,
[ 1364.443669] env[67169]: ERROR nova.compute.manager [instance: 47ffcce9-3afc-41be-b38e-dacfeb535a2c]   File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 1364.443669] env[67169]: ERROR nova.compute.manager [instance: 47ffcce9-3afc-41be-b38e-dacfeb535a2c]     self._vmops.spawn(context, instance, image_meta, injected_files,
[ 1364.443669] env[67169]: ERROR nova.compute.manager [instance: 47ffcce9-3afc-41be-b38e-dacfeb535a2c]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 1364.443669] env[67169]: ERROR nova.compute.manager [instance: 47ffcce9-3afc-41be-b38e-dacfeb535a2c]     self._fetch_image_if_missing(context, vi)
[ 1364.443669] env[67169]: ERROR nova.compute.manager [instance: 47ffcce9-3afc-41be-b38e-dacfeb535a2c]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 1364.443669] env[67169]: ERROR nova.compute.manager [instance: 47ffcce9-3afc-41be-b38e-dacfeb535a2c]     image_cache(vi, tmp_image_ds_loc)
[ 1364.443669] env[67169]: ERROR nova.compute.manager [instance: 47ffcce9-3afc-41be-b38e-dacfeb535a2c]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 1364.443669] env[67169]: ERROR nova.compute.manager [instance: 47ffcce9-3afc-41be-b38e-dacfeb535a2c]     vm_util.copy_virtual_disk(
[ 1364.443669] env[67169]: ERROR nova.compute.manager [instance: 47ffcce9-3afc-41be-b38e-dacfeb535a2c]   File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 1364.443669] env[67169]: ERROR nova.compute.manager [instance: 47ffcce9-3afc-41be-b38e-dacfeb535a2c]     session._wait_for_task(vmdk_copy_task)
[ 1364.443669] env[67169]: ERROR nova.compute.manager [instance: 47ffcce9-3afc-41be-b38e-dacfeb535a2c]   File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 1364.443669] env[67169]: ERROR nova.compute.manager [instance: 47ffcce9-3afc-41be-b38e-dacfeb535a2c]     return self.wait_for_task(task_ref)
[ 1364.443669] env[67169]: ERROR nova.compute.manager [instance: 47ffcce9-3afc-41be-b38e-dacfeb535a2c]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 1364.443669] env[67169]: ERROR nova.compute.manager [instance: 47ffcce9-3afc-41be-b38e-dacfeb535a2c]     return evt.wait()
[ 1364.443669] env[67169]: ERROR nova.compute.manager [instance: 47ffcce9-3afc-41be-b38e-dacfeb535a2c]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 1364.443669] env[67169]: ERROR nova.compute.manager [instance: 47ffcce9-3afc-41be-b38e-dacfeb535a2c]     result = hub.switch()
[ 1364.443669] env[67169]: ERROR nova.compute.manager [instance: 47ffcce9-3afc-41be-b38e-dacfeb535a2c]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 1364.443669] env[67169]: ERROR nova.compute.manager [instance: 47ffcce9-3afc-41be-b38e-dacfeb535a2c]     return self.greenlet.switch()
[ 1364.443669] env[67169]: ERROR nova.compute.manager [instance: 47ffcce9-3afc-41be-b38e-dacfeb535a2c]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 1364.443669] env[67169]: ERROR nova.compute.manager [instance: 47ffcce9-3afc-41be-b38e-dacfeb535a2c]     self.f(*self.args, **self.kw)
[ 1364.443669] env[67169]: ERROR nova.compute.manager [instance: 47ffcce9-3afc-41be-b38e-dacfeb535a2c]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 1364.443669] env[67169]: ERROR nova.compute.manager [instance: 47ffcce9-3afc-41be-b38e-dacfeb535a2c]     raise exceptions.translate_fault(task_info.error)
[ 1364.443669] env[67169]: ERROR nova.compute.manager [instance: 47ffcce9-3afc-41be-b38e-dacfeb535a2c] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1364.443669] env[67169]: ERROR nova.compute.manager [instance: 47ffcce9-3afc-41be-b38e-dacfeb535a2c] Faults: ['InvalidArgument']
[ 1364.443669] env[67169]: ERROR nova.compute.manager [instance: 47ffcce9-3afc-41be-b38e-dacfeb535a2c]
[ 1364.443669] env[67169]: INFO nova.compute.manager [None req-bc42a747-6b10-42c3-8e2b-0674634135bb tempest-ServerPasswordTestJSON-586101290 tempest-ServerPasswordTestJSON-586101290-project-member] [instance: 47ffcce9-3afc-41be-b38e-dacfeb535a2c] Terminating instance
[ 1364.445077] env[67169]: DEBUG oslo_concurrency.lockutils [None req-3f08fb17-986a-4906-8db3-9cba171f3daf tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] Acquired lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1364.445704] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-3f08fb17-986a-4906-8db3-9cba171f3daf tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 1364.445704] env[67169]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-25fa61fe-fdaa-48ef-b9a6-03b6af8cad71 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1364.448075] env[67169]: DEBUG nova.compute.manager [None req-bc42a747-6b10-42c3-8e2b-0674634135bb tempest-ServerPasswordTestJSON-586101290 tempest-ServerPasswordTestJSON-586101290-project-member] [instance: 47ffcce9-3afc-41be-b38e-dacfeb535a2c] Start destroying the instance on the hypervisor.
{{(pid=67169) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}}
[ 1364.448251] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-bc42a747-6b10-42c3-8e2b-0674634135bb tempest-ServerPasswordTestJSON-586101290 tempest-ServerPasswordTestJSON-586101290-project-member] [instance: 47ffcce9-3afc-41be-b38e-dacfeb535a2c] Destroying instance {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 1364.448986] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-afe8fe63-9734-46f4-9f3f-94d5a31fb209 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1364.455965] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-bc42a747-6b10-42c3-8e2b-0674634135bb tempest-ServerPasswordTestJSON-586101290 tempest-ServerPasswordTestJSON-586101290-project-member] [instance: 47ffcce9-3afc-41be-b38e-dacfeb535a2c] Unregistering the VM {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}}
[ 1364.456275] env[67169]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-7fb958b0-f1db-467a-a223-f32c540ae78a {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1364.458630] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-3f08fb17-986a-4906-8db3-9cba171f3daf tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 1364.458713] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-3f08fb17-986a-4906-8db3-9cba171f3daf tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=67169) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}}
[ 1364.459679] env[67169]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-6e071819-fd6c-48a2-9b14-9c61ce92866a {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1364.466009] env[67169]: DEBUG oslo_vmware.api [None req-3f08fb17-986a-4906-8db3-9cba171f3daf tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] Waiting for the task: (returnval){
[ 1364.466009] env[67169]: value = "session[52518c09-e5f2-035d-4dc1-3e29ddf94015]52238527-9380-a13f-9926-29c3479a8925"
[ 1364.466009] env[67169]: _type = "Task"
[ 1364.466009] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1364.473257] env[67169]: DEBUG oslo_vmware.api [None req-3f08fb17-986a-4906-8db3-9cba171f3daf tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] Task: {'id': session[52518c09-e5f2-035d-4dc1-3e29ddf94015]52238527-9380-a13f-9926-29c3479a8925, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1364.527777] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-bc42a747-6b10-42c3-8e2b-0674634135bb tempest-ServerPasswordTestJSON-586101290 tempest-ServerPasswordTestJSON-586101290-project-member] [instance: 47ffcce9-3afc-41be-b38e-dacfeb535a2c] Unregistered the VM {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}}
[ 1364.528012] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-bc42a747-6b10-42c3-8e2b-0674634135bb tempest-ServerPasswordTestJSON-586101290 tempest-ServerPasswordTestJSON-586101290-project-member] [instance: 47ffcce9-3afc-41be-b38e-dacfeb535a2c] Deleting contents of the VM from datastore datastore2 {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}}
[ 1364.528207] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-bc42a747-6b10-42c3-8e2b-0674634135bb tempest-ServerPasswordTestJSON-586101290 tempest-ServerPasswordTestJSON-586101290-project-member] Deleting the datastore file [datastore2] 47ffcce9-3afc-41be-b38e-dacfeb535a2c {{(pid=67169) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}}
[ 1364.528472] env[67169]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-9c330dd7-e954-47d2-8d34-89b035562d5b {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1364.536063] env[67169]: DEBUG oslo_vmware.api [None req-bc42a747-6b10-42c3-8e2b-0674634135bb tempest-ServerPasswordTestJSON-586101290 tempest-ServerPasswordTestJSON-586101290-project-member] Waiting for the task: (returnval){
[ 1364.536063] env[67169]: value = "task-2819189"
[ 1364.536063] env[67169]: _type = "Task"
[ 1364.536063] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1364.542762] env[67169]: DEBUG oslo_vmware.api [None req-bc42a747-6b10-42c3-8e2b-0674634135bb tempest-ServerPasswordTestJSON-586101290 tempest-ServerPasswordTestJSON-586101290-project-member] Task: {'id': task-2819189, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1364.977016] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-3f08fb17-986a-4906-8db3-9cba171f3daf tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] Preparing fetch location {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}}
[ 1364.977340] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-3f08fb17-986a-4906-8db3-9cba171f3daf tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] Creating directory with path [datastore2] vmware_temp/f1246bfe-93d9-44da-bbfa-a26d4d33f6f3/285931c9-8b83-4997-8c4d-6a79005e36ba {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 1364.977528] env[67169]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-6de5e0b8-0711-4338-bb07-87dbbe961fa9 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1364.988519] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-3f08fb17-986a-4906-8db3-9cba171f3daf tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] Created directory with path [datastore2] vmware_temp/f1246bfe-93d9-44da-bbfa-a26d4d33f6f3/285931c9-8b83-4997-8c4d-6a79005e36ba {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 1364.988697] env[67169]: DEBUG nova.virt.vmwareapi.vmops
[None req-3f08fb17-986a-4906-8db3-9cba171f3daf tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] Fetch image to [datastore2] vmware_temp/f1246bfe-93d9-44da-bbfa-a26d4d33f6f3/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}}
[ 1364.988862] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-3f08fb17-986a-4906-8db3-9cba171f3daf tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] Downloading image file data 285931c9-8b83-4997-8c4d-6a79005e36ba to [datastore2] vmware_temp/f1246bfe-93d9-44da-bbfa-a26d4d33f6f3/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk on the data store datastore2 {{(pid=67169) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}}
[ 1364.989576] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8c859f86-457c-49a7-ab2a-65598b5f76b5 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1364.995920] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-28d98eba-47cc-4b62-8d3a-8579a419694c {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1365.004531] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4f4577e6-303b-4b72-bb22-b7801487f6ce {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1365.034776] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-842ff3af-07e3-4fc2-8945-808d9088529f {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1365.045905] env[67169]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-8411c771-bf6a-4431-b9e5-0c196888ac40 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1365.048469] env[67169]: DEBUG oslo_vmware.api [None req-bc42a747-6b10-42c3-8e2b-0674634135bb tempest-ServerPasswordTestJSON-586101290 tempest-ServerPasswordTestJSON-586101290-project-member] Task: {'id': task-2819189, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.076396} completed successfully. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 1365.048702] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-bc42a747-6b10-42c3-8e2b-0674634135bb tempest-ServerPasswordTestJSON-586101290 tempest-ServerPasswordTestJSON-586101290-project-member] Deleted the datastore file {{(pid=67169) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}}
[ 1365.048883] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-bc42a747-6b10-42c3-8e2b-0674634135bb tempest-ServerPasswordTestJSON-586101290 tempest-ServerPasswordTestJSON-586101290-project-member] [instance: 47ffcce9-3afc-41be-b38e-dacfeb535a2c] Deleted contents of the VM from datastore datastore2 {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}}
[ 1365.049169] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-bc42a747-6b10-42c3-8e2b-0674634135bb tempest-ServerPasswordTestJSON-586101290 tempest-ServerPasswordTestJSON-586101290-project-member] [instance: 47ffcce9-3afc-41be-b38e-dacfeb535a2c] Instance destroyed {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 1365.049232] env[67169]: INFO nova.compute.manager [None req-bc42a747-6b10-42c3-8e2b-0674634135bb tempest-ServerPasswordTestJSON-586101290 tempest-ServerPasswordTestJSON-586101290-project-member] [instance: 47ffcce9-3afc-41be-b38e-dacfeb535a2c] Took 0.60 seconds to destroy the instance on the hypervisor.
[ 1365.051303] env[67169]: DEBUG nova.compute.claims [None req-bc42a747-6b10-42c3-8e2b-0674634135bb tempest-ServerPasswordTestJSON-586101290 tempest-ServerPasswordTestJSON-586101290-project-member] [instance: 47ffcce9-3afc-41be-b38e-dacfeb535a2c] Aborting claim: {{(pid=67169) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 1365.051487] env[67169]: DEBUG oslo_concurrency.lockutils [None req-bc42a747-6b10-42c3-8e2b-0674634135bb tempest-ServerPasswordTestJSON-586101290 tempest-ServerPasswordTestJSON-586101290-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1365.051701] env[67169]: DEBUG oslo_concurrency.lockutils [None req-bc42a747-6b10-42c3-8e2b-0674634135bb tempest-ServerPasswordTestJSON-586101290 tempest-ServerPasswordTestJSON-586101290-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1365.069776] env[67169]: DEBUG nova.virt.vmwareapi.images [None req-3f08fb17-986a-4906-8db3-9cba171f3daf tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] Downloading image file data 285931c9-8b83-4997-8c4d-6a79005e36ba to the data store datastore2 {{(pid=67169) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}}
[ 1365.184166] env[67169]: DEBUG oslo_concurrency.lockutils [None req-3f08fb17-986a-4906-8db3-9cba171f3daf tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] Releasing lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1365.186063] env[67169]: ERROR nova.compute.manager [None req-3f08fb17-986a-4906-8db3-9cba171f3daf tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image 285931c9-8b83-4997-8c4d-6a79005e36ba.
[ 1365.186063] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] Traceback (most recent call last):
[ 1365.186063] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]   File "/opt/stack/nova/nova/image/glance.py", line 285, in show
[ 1365.186063] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]     image = self._client.call(context, 2, 'get', args=(image_id,))
[ 1365.186063] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]   File "/opt/stack/nova/nova/image/glance.py", line 191, in call
[ 1365.186063] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]     result = getattr(controller, method)(*args, **kwargs)
[ 1365.186063] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]   File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get
[ 1365.186063] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]     return self._get(image_id)
[ 1365.186063] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]   File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner
[ 1365.186063] env[67169]: ERROR
nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]     return RequestIdProxy(wrapped(*args, **kwargs))
[ 1365.186063] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]   File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get
[ 1365.186063] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]     resp, body = self.http_client.get(url, headers=header)
[ 1365.186063] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]   File "/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 395, in get
[ 1365.186063] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]     return self.request(url, 'GET', **kwargs)
[ 1365.186063] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]   File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request
[ 1365.186063] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]     return self._handle_response(resp)
[ 1365.186063] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]   File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response
[ 1365.186063] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]     raise exc.from_response(resp, resp.content)
[ 1365.186063] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required.
[ 1365.186063] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]
[ 1365.186063] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] During handling of the above exception, another exception occurred:
[ 1365.186063] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]
[ 1365.186063] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] Traceback (most recent call last):
[ 1365.186063] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]   File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources
[ 1365.186063] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]     yield resources
[ 1365.186063] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]   File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 1365.186063] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]     self.driver.spawn(context, instance, image_meta,
[ 1365.186063] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]   File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 1365.186063] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]     self._vmops.spawn(context, instance, image_meta, injected_files,
[ 1365.186063] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 1365.186063] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]     self._fetch_image_if_missing(context, vi)
[ 1365.186823] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing
[ 1365.186823] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]     image_fetch(context, vi, tmp_image_ds_loc)
[ 1365.186823] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file
[ 1365.186823] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]     images.fetch_image(
[ 1365.186823] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]   File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image
[ 1365.186823] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]     metadata = IMAGE_API.get(context, image_ref)
[ 1365.186823] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]   File "/opt/stack/nova/nova/image/glance.py", line 1205, in get
[ 1365.186823] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]     return session.show(context, image_id,
[ 1365.186823] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]   File "/opt/stack/nova/nova/image/glance.py", line 287, in show
[ 1365.186823] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]     _reraise_translated_image_exception(image_id)
[ 1365.186823] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]   File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception
[ 1365.186823] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]     raise new_exc.with_traceback(exc_trace)
[ 1365.186823] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]   File "/opt/stack/nova/nova/image/glance.py", line 285, in show
[ 1365.186823] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]     image = self._client.call(context, 2, 'get', args=(image_id,))
[ 1365.186823] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]   File "/opt/stack/nova/nova/image/glance.py", line 191, in call
[ 1365.186823] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]     result = getattr(controller, method)(*args, **kwargs)
[ 1365.186823] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]   File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get
[ 1365.186823] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]     return self._get(image_id)
[ 1365.186823] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]   File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner
[ 1365.186823] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]     return RequestIdProxy(wrapped(*args, **kwargs))
[ 1365.186823] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]   File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get
[ 1365.186823] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]     resp, body = self.http_client.get(url, headers=header)
[ 1365.186823] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]   File "/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 395, in get
[ 1365.186823] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]     return self.request(url, 'GET', **kwargs)
[ 1365.186823] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]   File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request
[ 1365.186823] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]     return self._handle_response(resp)
[ 1365.186823] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]   File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response
[ 1365.186823] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]     raise exc.from_response(resp, resp.content)
[ 1365.186823] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] nova.exception.ImageNotAuthorized: Not authorized for image 285931c9-8b83-4997-8c4d-6a79005e36ba.
[ 1365.186823] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]
[ 1365.186823] env[67169]: INFO nova.compute.manager [None req-3f08fb17-986a-4906-8db3-9cba171f3daf tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] Terminating instance
[ 1365.187876] env[67169]: DEBUG oslo_concurrency.lockutils [None req-ce2d0787-afc0-4f58-9a98-4b3cd03f7ccd tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Acquired lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1365.188129] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-ce2d0787-afc0-4f58-9a98-4b3cd03f7ccd tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[
1365.188752] env[67169]: DEBUG nova.compute.manager [None req-3f08fb17-986a-4906-8db3-9cba171f3daf tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] Start destroying the instance on the hypervisor. {{(pid=67169) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}}
[ 1365.188945] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-3f08fb17-986a-4906-8db3-9cba171f3daf tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] Destroying instance {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 1365.191245] env[67169]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-a9dbe103-5a2f-44ee-8e5f-93d372edc0da {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1365.195207] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4ce9344b-363e-4aa7-8ec5-49268f9cbd32 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1365.202145] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-3f08fb17-986a-4906-8db3-9cba171f3daf tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] Unregistering the VM {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}}
[ 1365.202365] env[67169]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-ebab507f-eee6-4a5d-98c0-fe5caf21c7cf {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1365.204610] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-ce2d0787-afc0-4f58-9a98-4b3cd03f7ccd tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 1365.204786] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-ce2d0787-afc0-4f58-9a98-4b3cd03f7ccd tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=67169) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}}
[ 1365.205740] env[67169]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-46b8187f-695a-4f5b-9815-caa3a9a5bf00 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1365.212654] env[67169]: DEBUG oslo_vmware.api [None req-ce2d0787-afc0-4f58-9a98-4b3cd03f7ccd tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Waiting for the task: (returnval){
[ 1365.212654] env[67169]: value = "session[52518c09-e5f2-035d-4dc1-3e29ddf94015]5281d3b0-25d1-1a81-e1de-a70b3682a286"
[ 1365.212654] env[67169]: _type = "Task"
[ 1365.212654] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1365.220306] env[67169]: DEBUG oslo_vmware.api [None req-ce2d0787-afc0-4f58-9a98-4b3cd03f7ccd tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Task: {'id': session[52518c09-e5f2-035d-4dc1-3e29ddf94015]5281d3b0-25d1-1a81-e1de-a70b3682a286, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1365.276366] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-3f08fb17-986a-4906-8db3-9cba171f3daf tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] Unregistered the VM {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}}
[ 1365.276366] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-3f08fb17-986a-4906-8db3-9cba171f3daf tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] Deleting contents of the VM from datastore datastore2 {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}}
[ 1365.276506] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-3f08fb17-986a-4906-8db3-9cba171f3daf tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] Deleting the datastore file [datastore2] 1f0f1960-0c77-4e72-86ee-807819e75d2a {{(pid=67169) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}}
[ 1365.276685] env[67169]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-1f9b1fb2-403d-4bfa-b62b-c55d0fb64fd3 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1365.284037] env[67169]: DEBUG oslo_vmware.api [None req-3f08fb17-986a-4906-8db3-9cba171f3daf tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] Waiting for the task: (returnval){
[ 1365.284037] env[67169]: value = "task-2819191"
[ 1365.284037] env[67169]: _type = "Task"
[ 1365.284037] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1365.291668] env[67169]: DEBUG oslo_vmware.api [None req-3f08fb17-986a-4906-8db3-9cba171f3daf tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] Task: {'id': task-2819191, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1365.315172] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1e5edbce-ee9e-44f0-bbf0-f87db4e5694f {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1365.322015] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2076ddc4-f7d1-4259-bee7-3a12b00589e1 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1365.352283] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c6071c21-75d9-4438-884d-163500125010 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1365.359420] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5fc91ac1-ade2-4c7b-bb34-2162600a0dd2 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1365.373051] env[67169]: DEBUG nova.compute.provider_tree [None req-bc42a747-6b10-42c3-8e2b-0674634135bb tempest-ServerPasswordTestJSON-586101290 tempest-ServerPasswordTestJSON-586101290-project-member] Inventory has not changed in ProviderTree for provider: 6570906a-ac37-4859-b1f2-4bbacc48d3f3 {{(pid=67169) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1365.382074] env[67169]: DEBUG
nova.scheduler.client.report [None req-bc42a747-6b10-42c3-8e2b-0674634135bb tempest-ServerPasswordTestJSON-586101290 tempest-ServerPasswordTestJSON-586101290-project-member] Inventory has not changed for provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 171, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67169) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1365.398021] env[67169]: DEBUG oslo_concurrency.lockutils [None req-bc42a747-6b10-42c3-8e2b-0674634135bb tempest-ServerPasswordTestJSON-586101290 tempest-ServerPasswordTestJSON-586101290-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.346s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1365.398619] env[67169]: ERROR nova.compute.manager [None req-bc42a747-6b10-42c3-8e2b-0674634135bb tempest-ServerPasswordTestJSON-586101290 tempest-ServerPasswordTestJSON-586101290-project-member] [instance: 47ffcce9-3afc-41be-b38e-dacfeb535a2c] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1365.398619] env[67169]: Faults: ['InvalidArgument'] [ 1365.398619] env[67169]: ERROR nova.compute.manager [instance: 47ffcce9-3afc-41be-b38e-dacfeb535a2c] Traceback (most recent call last): [ 1365.398619] env[67169]: ERROR nova.compute.manager [instance: 47ffcce9-3afc-41be-b38e-dacfeb535a2c] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1365.398619] env[67169]: ERROR nova.compute.manager [instance: 
47ffcce9-3afc-41be-b38e-dacfeb535a2c] self.driver.spawn(context, instance, image_meta, [ 1365.398619] env[67169]: ERROR nova.compute.manager [instance: 47ffcce9-3afc-41be-b38e-dacfeb535a2c] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1365.398619] env[67169]: ERROR nova.compute.manager [instance: 47ffcce9-3afc-41be-b38e-dacfeb535a2c] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1365.398619] env[67169]: ERROR nova.compute.manager [instance: 47ffcce9-3afc-41be-b38e-dacfeb535a2c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1365.398619] env[67169]: ERROR nova.compute.manager [instance: 47ffcce9-3afc-41be-b38e-dacfeb535a2c] self._fetch_image_if_missing(context, vi) [ 1365.398619] env[67169]: ERROR nova.compute.manager [instance: 47ffcce9-3afc-41be-b38e-dacfeb535a2c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1365.398619] env[67169]: ERROR nova.compute.manager [instance: 47ffcce9-3afc-41be-b38e-dacfeb535a2c] image_cache(vi, tmp_image_ds_loc) [ 1365.398619] env[67169]: ERROR nova.compute.manager [instance: 47ffcce9-3afc-41be-b38e-dacfeb535a2c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1365.398619] env[67169]: ERROR nova.compute.manager [instance: 47ffcce9-3afc-41be-b38e-dacfeb535a2c] vm_util.copy_virtual_disk( [ 1365.398619] env[67169]: ERROR nova.compute.manager [instance: 47ffcce9-3afc-41be-b38e-dacfeb535a2c] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1365.398619] env[67169]: ERROR nova.compute.manager [instance: 47ffcce9-3afc-41be-b38e-dacfeb535a2c] session._wait_for_task(vmdk_copy_task) [ 1365.398619] env[67169]: ERROR nova.compute.manager [instance: 47ffcce9-3afc-41be-b38e-dacfeb535a2c] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1365.398619] env[67169]: ERROR nova.compute.manager [instance: 
47ffcce9-3afc-41be-b38e-dacfeb535a2c] return self.wait_for_task(task_ref) [ 1365.398619] env[67169]: ERROR nova.compute.manager [instance: 47ffcce9-3afc-41be-b38e-dacfeb535a2c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1365.398619] env[67169]: ERROR nova.compute.manager [instance: 47ffcce9-3afc-41be-b38e-dacfeb535a2c] return evt.wait() [ 1365.398619] env[67169]: ERROR nova.compute.manager [instance: 47ffcce9-3afc-41be-b38e-dacfeb535a2c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1365.398619] env[67169]: ERROR nova.compute.manager [instance: 47ffcce9-3afc-41be-b38e-dacfeb535a2c] result = hub.switch() [ 1365.398619] env[67169]: ERROR nova.compute.manager [instance: 47ffcce9-3afc-41be-b38e-dacfeb535a2c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1365.398619] env[67169]: ERROR nova.compute.manager [instance: 47ffcce9-3afc-41be-b38e-dacfeb535a2c] return self.greenlet.switch() [ 1365.398619] env[67169]: ERROR nova.compute.manager [instance: 47ffcce9-3afc-41be-b38e-dacfeb535a2c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1365.398619] env[67169]: ERROR nova.compute.manager [instance: 47ffcce9-3afc-41be-b38e-dacfeb535a2c] self.f(*self.args, **self.kw) [ 1365.398619] env[67169]: ERROR nova.compute.manager [instance: 47ffcce9-3afc-41be-b38e-dacfeb535a2c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1365.398619] env[67169]: ERROR nova.compute.manager [instance: 47ffcce9-3afc-41be-b38e-dacfeb535a2c] raise exceptions.translate_fault(task_info.error) [ 1365.398619] env[67169]: ERROR nova.compute.manager [instance: 47ffcce9-3afc-41be-b38e-dacfeb535a2c] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1365.398619] env[67169]: ERROR 
nova.compute.manager [instance: 47ffcce9-3afc-41be-b38e-dacfeb535a2c] Faults: ['InvalidArgument'] [ 1365.398619] env[67169]: ERROR nova.compute.manager [instance: 47ffcce9-3afc-41be-b38e-dacfeb535a2c] [ 1365.399456] env[67169]: DEBUG nova.compute.utils [None req-bc42a747-6b10-42c3-8e2b-0674634135bb tempest-ServerPasswordTestJSON-586101290 tempest-ServerPasswordTestJSON-586101290-project-member] [instance: 47ffcce9-3afc-41be-b38e-dacfeb535a2c] VimFaultException {{(pid=67169) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1365.400795] env[67169]: DEBUG nova.compute.manager [None req-bc42a747-6b10-42c3-8e2b-0674634135bb tempest-ServerPasswordTestJSON-586101290 tempest-ServerPasswordTestJSON-586101290-project-member] [instance: 47ffcce9-3afc-41be-b38e-dacfeb535a2c] Build of instance 47ffcce9-3afc-41be-b38e-dacfeb535a2c was re-scheduled: A specified parameter was not correct: fileType [ 1365.400795] env[67169]: Faults: ['InvalidArgument'] {{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1365.401269] env[67169]: DEBUG nova.compute.manager [None req-bc42a747-6b10-42c3-8e2b-0674634135bb tempest-ServerPasswordTestJSON-586101290 tempest-ServerPasswordTestJSON-586101290-project-member] [instance: 47ffcce9-3afc-41be-b38e-dacfeb535a2c] Unplugging VIFs for instance {{(pid=67169) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1365.401417] env[67169]: DEBUG nova.compute.manager [None req-bc42a747-6b10-42c3-8e2b-0674634135bb tempest-ServerPasswordTestJSON-586101290 tempest-ServerPasswordTestJSON-586101290-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67169) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1365.401591] env[67169]: DEBUG nova.compute.manager [None req-bc42a747-6b10-42c3-8e2b-0674634135bb tempest-ServerPasswordTestJSON-586101290 tempest-ServerPasswordTestJSON-586101290-project-member] [instance: 47ffcce9-3afc-41be-b38e-dacfeb535a2c] Deallocating network for instance {{(pid=67169) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1365.401766] env[67169]: DEBUG nova.network.neutron [None req-bc42a747-6b10-42c3-8e2b-0674634135bb tempest-ServerPasswordTestJSON-586101290 tempest-ServerPasswordTestJSON-586101290-project-member] [instance: 47ffcce9-3afc-41be-b38e-dacfeb535a2c] deallocate_for_instance() {{(pid=67169) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1365.723329] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-ce2d0787-afc0-4f58-9a98-4b3cd03f7ccd tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: ceec0dd3-097b-4ab4-8e16-420d40bbe3d5] Preparing fetch location {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1365.723606] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-ce2d0787-afc0-4f58-9a98-4b3cd03f7ccd tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Creating directory with path [datastore2] vmware_temp/e006ccee-ffff-4ce7-8ccf-5d485938bc54/285931c9-8b83-4997-8c4d-6a79005e36ba {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1365.723874] env[67169]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-4af03c45-0445-4b37-93b2-2d9de43cbbc5 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1365.735144] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-ce2d0787-afc0-4f58-9a98-4b3cd03f7ccd 
tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Created directory with path [datastore2] vmware_temp/e006ccee-ffff-4ce7-8ccf-5d485938bc54/285931c9-8b83-4997-8c4d-6a79005e36ba {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1365.735274] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-ce2d0787-afc0-4f58-9a98-4b3cd03f7ccd tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: ceec0dd3-097b-4ab4-8e16-420d40bbe3d5] Fetch image to [datastore2] vmware_temp/e006ccee-ffff-4ce7-8ccf-5d485938bc54/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1365.735357] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-ce2d0787-afc0-4f58-9a98-4b3cd03f7ccd tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: ceec0dd3-097b-4ab4-8e16-420d40bbe3d5] Downloading image file data 285931c9-8b83-4997-8c4d-6a79005e36ba to [datastore2] vmware_temp/e006ccee-ffff-4ce7-8ccf-5d485938bc54/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk on the data store datastore2 {{(pid=67169) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1365.736143] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e35c35a3-4cf4-4cbe-9765-9186364558ea {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1365.744300] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e540cc14-1934-49ae-a410-f8cf64ac7d3f {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1365.753487] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-a33fa995-be33-4f49-924b-a188103062de {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1365.789168] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-67196873-5444-4ee0-bc40-1aa7500a7abb {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1365.800754] env[67169]: DEBUG oslo_vmware.api [None req-3f08fb17-986a-4906-8db3-9cba171f3daf tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] Task: {'id': task-2819191, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.06956} completed successfully. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1365.801318] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-3f08fb17-986a-4906-8db3-9cba171f3daf tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] Deleted the datastore file {{(pid=67169) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1365.801524] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-3f08fb17-986a-4906-8db3-9cba171f3daf tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] Deleted contents of the VM from datastore datastore2 {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1365.801705] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-3f08fb17-986a-4906-8db3-9cba171f3daf tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] Instance destroyed {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1365.801884] env[67169]: INFO nova.compute.manager [None 
req-3f08fb17-986a-4906-8db3-9cba171f3daf tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] Took 0.61 seconds to destroy the instance on the hypervisor. [ 1365.803507] env[67169]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-df22462e-2806-40c1-aa2b-f9cf3ec7bd4b {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1365.805421] env[67169]: DEBUG nova.compute.claims [None req-3f08fb17-986a-4906-8db3-9cba171f3daf tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] Aborting claim: {{(pid=67169) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1365.805638] env[67169]: DEBUG oslo_concurrency.lockutils [None req-3f08fb17-986a-4906-8db3-9cba171f3daf tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1365.805901] env[67169]: DEBUG oslo_concurrency.lockutils [None req-3f08fb17-986a-4906-8db3-9cba171f3daf tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1365.831046] env[67169]: DEBUG nova.virt.vmwareapi.images [None req-ce2d0787-afc0-4f58-9a98-4b3cd03f7ccd tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: ceec0dd3-097b-4ab4-8e16-420d40bbe3d5] 
Downloading image file data 285931c9-8b83-4997-8c4d-6a79005e36ba to the data store datastore2 {{(pid=67169) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1365.894986] env[67169]: DEBUG oslo_vmware.rw_handles [None req-ce2d0787-afc0-4f58-9a98-4b3cd03f7ccd tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/e006ccee-ffff-4ce7-8ccf-5d485938bc54/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=67169) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1365.976515] env[67169]: DEBUG nova.network.neutron [None req-bc42a747-6b10-42c3-8e2b-0674634135bb tempest-ServerPasswordTestJSON-586101290 tempest-ServerPasswordTestJSON-586101290-project-member] [instance: 47ffcce9-3afc-41be-b38e-dacfeb535a2c] Updating instance_info_cache with network_info: [] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1365.983174] env[67169]: DEBUG oslo_vmware.rw_handles [None req-ce2d0787-afc0-4f58-9a98-4b3cd03f7ccd tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Completed reading data from the image iterator. {{(pid=67169) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1365.983174] env[67169]: DEBUG oslo_vmware.rw_handles [None req-ce2d0787-afc0-4f58-9a98-4b3cd03f7ccd tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/e006ccee-ffff-4ce7-8ccf-5d485938bc54/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=67169) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1365.991254] env[67169]: INFO nova.compute.manager [None req-bc42a747-6b10-42c3-8e2b-0674634135bb tempest-ServerPasswordTestJSON-586101290 tempest-ServerPasswordTestJSON-586101290-project-member] [instance: 47ffcce9-3afc-41be-b38e-dacfeb535a2c] Took 0.59 seconds to deallocate network for instance. [ 1366.096810] env[67169]: INFO nova.scheduler.client.report [None req-bc42a747-6b10-42c3-8e2b-0674634135bb tempest-ServerPasswordTestJSON-586101290 tempest-ServerPasswordTestJSON-586101290-project-member] Deleted allocations for instance 47ffcce9-3afc-41be-b38e-dacfeb535a2c [ 1366.116165] env[67169]: DEBUG oslo_concurrency.lockutils [None req-bc42a747-6b10-42c3-8e2b-0674634135bb tempest-ServerPasswordTestJSON-586101290 tempest-ServerPasswordTestJSON-586101290-project-member] Lock "47ffcce9-3afc-41be-b38e-dacfeb535a2c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 683.069s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1366.117604] env[67169]: DEBUG oslo_concurrency.lockutils [None req-977b05a3-2344-4cd2-8da9-6153c93e0a28 tempest-ServerPasswordTestJSON-586101290 tempest-ServerPasswordTestJSON-586101290-project-member] Lock "47ffcce9-3afc-41be-b38e-dacfeb535a2c" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 486.286s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1366.117844] env[67169]: DEBUG oslo_concurrency.lockutils [None req-977b05a3-2344-4cd2-8da9-6153c93e0a28 tempest-ServerPasswordTestJSON-586101290 tempest-ServerPasswordTestJSON-586101290-project-member] Acquiring lock "47ffcce9-3afc-41be-b38e-dacfeb535a2c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67169) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1366.118070] env[67169]: DEBUG oslo_concurrency.lockutils [None req-977b05a3-2344-4cd2-8da9-6153c93e0a28 tempest-ServerPasswordTestJSON-586101290 tempest-ServerPasswordTestJSON-586101290-project-member] Lock "47ffcce9-3afc-41be-b38e-dacfeb535a2c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1366.118254] env[67169]: DEBUG oslo_concurrency.lockutils [None req-977b05a3-2344-4cd2-8da9-6153c93e0a28 tempest-ServerPasswordTestJSON-586101290 tempest-ServerPasswordTestJSON-586101290-project-member] Lock "47ffcce9-3afc-41be-b38e-dacfeb535a2c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1366.120822] env[67169]: INFO nova.compute.manager [None req-977b05a3-2344-4cd2-8da9-6153c93e0a28 tempest-ServerPasswordTestJSON-586101290 tempest-ServerPasswordTestJSON-586101290-project-member] [instance: 47ffcce9-3afc-41be-b38e-dacfeb535a2c] Terminating instance [ 1366.122663] env[67169]: DEBUG nova.compute.manager [None req-977b05a3-2344-4cd2-8da9-6153c93e0a28 tempest-ServerPasswordTestJSON-586101290 tempest-ServerPasswordTestJSON-586101290-project-member] [instance: 47ffcce9-3afc-41be-b38e-dacfeb535a2c] Start destroying the instance on the hypervisor. 
{{(pid=67169) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1366.122869] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-977b05a3-2344-4cd2-8da9-6153c93e0a28 tempest-ServerPasswordTestJSON-586101290 tempest-ServerPasswordTestJSON-586101290-project-member] [instance: 47ffcce9-3afc-41be-b38e-dacfeb535a2c] Destroying instance {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1366.123365] env[67169]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-84483eeb-79e0-4161-9d30-21628be2ba88 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1366.137173] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b384470a-7db8-41f6-b5cc-7dd385365185 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1366.148794] env[67169]: DEBUG nova.compute.manager [None req-5ef31fa3-d243-488f-9847-4509615e7a51 tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: 48376572-9e3a-4579-b2d7-b8b63312fab1] Starting instance... {{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1366.174884] env[67169]: WARNING nova.virt.vmwareapi.vmops [None req-977b05a3-2344-4cd2-8da9-6153c93e0a28 tempest-ServerPasswordTestJSON-586101290 tempest-ServerPasswordTestJSON-586101290-project-member] [instance: 47ffcce9-3afc-41be-b38e-dacfeb535a2c] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 47ffcce9-3afc-41be-b38e-dacfeb535a2c could not be found. 
[ 1366.175146] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-977b05a3-2344-4cd2-8da9-6153c93e0a28 tempest-ServerPasswordTestJSON-586101290 tempest-ServerPasswordTestJSON-586101290-project-member] [instance: 47ffcce9-3afc-41be-b38e-dacfeb535a2c] Instance destroyed {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1366.175370] env[67169]: INFO nova.compute.manager [None req-977b05a3-2344-4cd2-8da9-6153c93e0a28 tempest-ServerPasswordTestJSON-586101290 tempest-ServerPasswordTestJSON-586101290-project-member] [instance: 47ffcce9-3afc-41be-b38e-dacfeb535a2c] Took 0.05 seconds to destroy the instance on the hypervisor. [ 1366.175677] env[67169]: DEBUG oslo.service.loopingcall [None req-977b05a3-2344-4cd2-8da9-6153c93e0a28 tempest-ServerPasswordTestJSON-586101290 tempest-ServerPasswordTestJSON-586101290-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=67169) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1366.175957] env[67169]: DEBUG nova.compute.manager [-] [instance: 47ffcce9-3afc-41be-b38e-dacfeb535a2c] Deallocating network for instance {{(pid=67169) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1366.176116] env[67169]: DEBUG nova.network.neutron [-] [instance: 47ffcce9-3afc-41be-b38e-dacfeb535a2c] deallocate_for_instance() {{(pid=67169) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1366.199633] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6c618039-6375-4599-b244-8d3706bb13e3 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1366.203724] env[67169]: DEBUG oslo_concurrency.lockutils [None req-5ef31fa3-d243-488f-9847-4509615e7a51 tempest-AttachInterfacesTestJSON-1789409843 
tempest-AttachInterfacesTestJSON-1789409843-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1366.204024] env[67169]: DEBUG nova.network.neutron [-] [instance: 47ffcce9-3afc-41be-b38e-dacfeb535a2c] Updating instance_info_cache with network_info: [] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1366.210059] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-211472a4-285b-48dd-a353-2ecdff38af25 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1366.213624] env[67169]: INFO nova.compute.manager [-] [instance: 47ffcce9-3afc-41be-b38e-dacfeb535a2c] Took 0.04 seconds to deallocate network for instance. [ 1366.245269] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1ba58f18-e199-4be9-98af-0f8ba51baae8 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1366.255870] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0650f4d1-225f-40e3-8dd6-2a03891c33d5 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1366.272520] env[67169]: DEBUG nova.compute.provider_tree [None req-3f08fb17-986a-4906-8db3-9cba171f3daf tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] Inventory has not changed in ProviderTree for provider: 6570906a-ac37-4859-b1f2-4bbacc48d3f3 {{(pid=67169) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1366.279984] env[67169]: DEBUG nova.scheduler.client.report [None req-3f08fb17-986a-4906-8db3-9cba171f3daf 
tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] Inventory has not changed for provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 171, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67169) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1366.299870] env[67169]: DEBUG oslo_concurrency.lockutils [None req-3f08fb17-986a-4906-8db3-9cba171f3daf tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.494s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1366.300609] env[67169]: ERROR nova.compute.manager [None req-3f08fb17-986a-4906-8db3-9cba171f3daf tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] Failed to build and run instance: nova.exception.ImageNotAuthorized: Not authorized for image 285931c9-8b83-4997-8c4d-6a79005e36ba. 
[ 1366.300609] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] Traceback (most recent call last):
[ 1366.300609] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]   File "/opt/stack/nova/nova/image/glance.py", line 285, in show
[ 1366.300609] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]     image = self._client.call(context, 2, 'get', args=(image_id,))
[ 1366.300609] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]   File "/opt/stack/nova/nova/image/glance.py", line 191, in call
[ 1366.300609] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]     result = getattr(controller, method)(*args, **kwargs)
[ 1366.300609] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]   File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get
[ 1366.300609] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]     return self._get(image_id)
[ 1366.300609] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]   File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner
[ 1366.300609] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]     return RequestIdProxy(wrapped(*args, **kwargs))
[ 1366.300609] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]   File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get
[ 1366.300609] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]     resp, body = self.http_client.get(url, headers=header)
[ 1366.300609] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]   File "/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 395, in get
[ 1366.300609] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]     return self.request(url, 'GET', **kwargs)
[ 1366.300609] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]   File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request
[ 1366.300609] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]     return self._handle_response(resp)
[ 1366.300609] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]   File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response
[ 1366.300609] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]     raise exc.from_response(resp, resp.content)
[ 1366.300609] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required.
[ 1366.300609] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]
[ 1366.300609] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] During handling of the above exception, another exception occurred:
[ 1366.300609] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]
[ 1366.300609] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] Traceback (most recent call last):
[ 1366.300609] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]   File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 1366.300609] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]     self.driver.spawn(context, instance, image_meta,
[ 1366.300609] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]   File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 1366.300609] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]     self._vmops.spawn(context, instance, image_meta, injected_files,
[ 1366.300609] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 1366.300609] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]     self._fetch_image_if_missing(context, vi)
[ 1366.300609] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing
[ 1366.300609] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]     image_fetch(context, vi, tmp_image_ds_loc)
[ 1366.301324] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file
[ 1366.301324] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]     images.fetch_image(
[ 1366.301324] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]   File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image
[ 1366.301324] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]     metadata = IMAGE_API.get(context, image_ref)
[ 1366.301324] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]   File "/opt/stack/nova/nova/image/glance.py", line 1205, in get
[ 1366.301324] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]     return session.show(context, image_id,
[ 1366.301324] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]   File "/opt/stack/nova/nova/image/glance.py", line 287, in show
[ 1366.301324] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]     _reraise_translated_image_exception(image_id)
[ 1366.301324] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]   File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception
[ 1366.301324] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]     raise new_exc.with_traceback(exc_trace)
[ 1366.301324] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]   File "/opt/stack/nova/nova/image/glance.py", line 285, in show
[ 1366.301324] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]     image = self._client.call(context, 2, 'get', args=(image_id,))
[ 1366.301324] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]   File "/opt/stack/nova/nova/image/glance.py", line 191, in call
[ 1366.301324] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]     result = getattr(controller, method)(*args, **kwargs)
[ 1366.301324] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]   File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get
[ 1366.301324] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]     return self._get(image_id)
[ 1366.301324] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]   File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner
[ 1366.301324] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]     return RequestIdProxy(wrapped(*args, **kwargs))
[ 1366.301324] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]   File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get
[ 1366.301324] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]     resp, body = self.http_client.get(url, headers=header)
[ 1366.301324] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]   File "/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 395, in get
[ 1366.301324] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]     return self.request(url, 'GET', **kwargs)
[ 1366.301324] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]   File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request
[ 1366.301324] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]     return self._handle_response(resp)
[ 1366.301324] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]   File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response
[ 1366.301324] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]     raise exc.from_response(resp, resp.content)
[ 1366.301324] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] nova.exception.ImageNotAuthorized: Not authorized for image 285931c9-8b83-4997-8c4d-6a79005e36ba.
[ 1366.301324] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]
[ 1366.301324] env[67169]: DEBUG nova.compute.utils [None req-3f08fb17-986a-4906-8db3-9cba171f3daf tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] Not authorized for image 285931c9-8b83-4997-8c4d-6a79005e36ba. {{(pid=67169) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 1366.302366] env[67169]: DEBUG oslo_concurrency.lockutils [None req-5ef31fa3-d243-488f-9847-4509615e7a51 tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.099s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1366.303924] env[67169]: INFO nova.compute.claims [None req-5ef31fa3-d243-488f-9847-4509615e7a51 tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: 48376572-9e3a-4579-b2d7-b8b63312fab1] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 1366.306887] env[67169]: DEBUG nova.compute.manager [None req-3f08fb17-986a-4906-8db3-9cba171f3daf tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] Build of instance 1f0f1960-0c77-4e72-86ee-807819e75d2a was re-scheduled: Not authorized for image 285931c9-8b83-4997-8c4d-6a79005e36ba. {{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}}
[ 1366.307378] env[67169]: DEBUG nova.compute.manager [None req-3f08fb17-986a-4906-8db3-9cba171f3daf tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] Unplugging VIFs for instance {{(pid=67169) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}}
[ 1366.307551] env[67169]: DEBUG nova.compute.manager [None req-3f08fb17-986a-4906-8db3-9cba171f3daf tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=67169) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}}
[ 1366.307708] env[67169]: DEBUG nova.compute.manager [None req-3f08fb17-986a-4906-8db3-9cba171f3daf tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] Deallocating network for instance {{(pid=67169) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}}
[ 1366.307869] env[67169]: DEBUG nova.network.neutron [None req-3f08fb17-986a-4906-8db3-9cba171f3daf tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] deallocate_for_instance() {{(pid=67169) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 1366.312689] env[67169]: DEBUG oslo_concurrency.lockutils [None req-977b05a3-2344-4cd2-8da9-6153c93e0a28 tempest-ServerPasswordTestJSON-586101290 tempest-ServerPasswordTestJSON-586101290-project-member] Lock "47ffcce9-3afc-41be-b38e-dacfeb535a2c" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.195s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1366.314329] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "47ffcce9-3afc-41be-b38e-dacfeb535a2c" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 180.614s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1366.314510] env[67169]: INFO nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 47ffcce9-3afc-41be-b38e-dacfeb535a2c] During sync_power_state the instance has a pending task (deleting). Skip.
[ 1366.314840] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "47ffcce9-3afc-41be-b38e-dacfeb535a2c" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1366.447256] env[67169]: DEBUG neutronclient.v2_0.client [None req-3f08fb17-986a-4906-8db3-9cba171f3daf tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=67169) _handle_fault_response /opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py:262}}
[ 1366.448493] env[67169]: ERROR nova.compute.manager [None req-3f08fb17-986a-4906-8db3-9cba171f3daf tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized.
[ 1366.448493] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] Traceback (most recent call last):
[ 1366.448493] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]   File "/opt/stack/nova/nova/image/glance.py", line 285, in show
[ 1366.448493] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]     image = self._client.call(context, 2, 'get', args=(image_id,))
[ 1366.448493] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]   File "/opt/stack/nova/nova/image/glance.py", line 191, in call
[ 1366.448493] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]     result = getattr(controller, method)(*args, **kwargs)
[ 1366.448493] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]   File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get
[ 1366.448493] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]     return self._get(image_id)
[ 1366.448493] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]   File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner
[ 1366.448493] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]     return RequestIdProxy(wrapped(*args, **kwargs))
[ 1366.448493] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]   File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get
[ 1366.448493] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]     resp, body = self.http_client.get(url, headers=header)
[ 1366.448493] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]   File "/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 395, in get
[ 1366.448493] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]     return self.request(url, 'GET', **kwargs)
[ 1366.448493] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]   File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request
[ 1366.448493] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]     return self._handle_response(resp)
[ 1366.448493] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]   File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response
[ 1366.448493] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]     raise exc.from_response(resp, resp.content)
[ 1366.448493] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required.
[ 1366.448493] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]
[ 1366.448493] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] During handling of the above exception, another exception occurred:
[ 1366.448493] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]
[ 1366.448493] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] Traceback (most recent call last):
[ 1366.448493] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]   File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 1366.448493] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]     self.driver.spawn(context, instance, image_meta,
[ 1366.448493] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]   File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 1366.448493] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]     self._vmops.spawn(context, instance, image_meta, injected_files,
[ 1366.448493] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 1366.448493] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]     self._fetch_image_if_missing(context, vi)
[ 1366.448493] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing
[ 1366.448493] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]     image_fetch(context, vi, tmp_image_ds_loc)
[ 1366.449291] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file
[ 1366.449291] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]     images.fetch_image(
[ 1366.449291] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]   File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image
[ 1366.449291] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]     metadata = IMAGE_API.get(context, image_ref)
[ 1366.449291] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]   File "/opt/stack/nova/nova/image/glance.py", line 1205, in get
[ 1366.449291] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]     return session.show(context, image_id,
[ 1366.449291] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]   File "/opt/stack/nova/nova/image/glance.py", line 287, in show
[ 1366.449291] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]     _reraise_translated_image_exception(image_id)
[ 1366.449291] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]   File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception
[ 1366.449291] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]     raise new_exc.with_traceback(exc_trace)
[ 1366.449291] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]   File "/opt/stack/nova/nova/image/glance.py", line 285, in show
[ 1366.449291] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]     image = self._client.call(context, 2, 'get', args=(image_id,))
[ 1366.449291] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]   File "/opt/stack/nova/nova/image/glance.py", line 191, in call
[ 1366.449291] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]     result = getattr(controller, method)(*args, **kwargs)
[ 1366.449291] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]   File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get
[ 1366.449291] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]     return self._get(image_id)
[ 1366.449291] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]   File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner
[ 1366.449291] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]     return RequestIdProxy(wrapped(*args, **kwargs))
[ 1366.449291] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]   File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get
[ 1366.449291] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]     resp, body = self.http_client.get(url, headers=header)
[ 1366.449291] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]   File "/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 395, in get
[ 1366.449291] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]     return self.request(url, 'GET', **kwargs)
[ 1366.449291] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]   File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request
[ 1366.449291] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]     return self._handle_response(resp)
[ 1366.449291] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]   File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response
[ 1366.449291] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]     raise exc.from_response(resp, resp.content)
[ 1366.449291] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] nova.exception.ImageNotAuthorized: Not authorized for image 285931c9-8b83-4997-8c4d-6a79005e36ba.
[ 1366.449291] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]
[ 1366.449291] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] During handling of the above exception, another exception occurred:
[ 1366.449291] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]
[ 1366.449291] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] Traceback (most recent call last):
[ 1366.449291] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]   File "/opt/stack/nova/nova/compute/manager.py", line 2430, in _do_build_and_run_instance
[ 1366.449291] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]     self._build_and_run_instance(context, instance, image,
[ 1366.449291] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]   File "/opt/stack/nova/nova/compute/manager.py", line 2722, in _build_and_run_instance
[ 1366.450134] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]     raise exception.RescheduledException(
[ 1366.450134] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] nova.exception.RescheduledException: Build of instance 1f0f1960-0c77-4e72-86ee-807819e75d2a was re-scheduled: Not authorized for image 285931c9-8b83-4997-8c4d-6a79005e36ba.
[ 1366.450134] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]
[ 1366.450134] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] During handling of the above exception, another exception occurred:
[ 1366.450134] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]
[ 1366.450134] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] Traceback (most recent call last):
[ 1366.450134] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1366.450134] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]     ret = obj(*args, **kwargs)
[ 1366.450134] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]   File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response
[ 1366.450134] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]     exception_handler_v20(status_code, error_body)
[ 1366.450134] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]   File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20
[ 1366.450134] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]     raise client_exc(message=error_message,
[ 1366.450134] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}}
[ 1366.450134] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] Neutron server returns request_ids: ['req-dac28ae1-bb24-4a64-96ac-21f9abfbcf97']
[ 1366.450134] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]
[ 1366.450134] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] During handling of the above exception, another exception occurred:
[ 1366.450134] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]
[ 1366.450134] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] Traceback (most recent call last):
[ 1366.450134] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]   File "/opt/stack/nova/nova/compute/manager.py", line 3019, in _cleanup_allocated_networks
[ 1366.450134] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]     self._deallocate_network(context, instance, requested_networks)
[ 1366.450134] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]   File "/opt/stack/nova/nova/compute/manager.py", line 2265, in _deallocate_network
[ 1366.450134] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]     self.network_api.deallocate_for_instance(
[ 1366.450134] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]   File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance
[ 1366.450134] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]     data = neutron.list_ports(**search_opts)
[ 1366.450134] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1366.450134] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]     ret = obj(*args, **kwargs)
[ 1366.450134] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]   File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports
[ 1366.450134] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]     return self.list('ports', self.ports_path, retrieve_all,
[ 1366.450134] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1366.450134] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]     ret = obj(*args, **kwargs)
[ 1366.450134] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]   File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list
[ 1366.450134] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]     for r in self._pagination(collection, path, **params):
[ 1366.450134] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]   File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination
[ 1366.450134] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]     res = self.get(path, params=params)
[ 1366.451514] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1366.451514] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]     ret = obj(*args, **kwargs)
[ 1366.451514] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]   File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get
[ 1366.451514] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]     return self.retry_request("GET", action, body=body,
[ 1366.451514] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1366.451514] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]     ret = obj(*args, **kwargs)
[ 1366.451514] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]   File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request
[ 1366.451514] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]     return self.do_request(method, action, body=body,
[ 1366.451514] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1366.451514] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]     ret = obj(*args, **kwargs)
[ 1366.451514] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]   File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request
[ 1366.451514] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]     self._handle_fault_response(status_code, replybody, resp)
[ 1366.451514] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]   File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper
[ 1366.451514] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]     raise exception.Unauthorized()
[ 1366.451514] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] nova.exception.Unauthorized: Not authorized.
[ 1366.451514] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] [ 1366.499420] env[67169]: INFO nova.scheduler.client.report [None req-3f08fb17-986a-4906-8db3-9cba171f3daf tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] Deleted allocations for instance 1f0f1960-0c77-4e72-86ee-807819e75d2a [ 1366.515391] env[67169]: DEBUG oslo_concurrency.lockutils [None req-3f08fb17-986a-4906-8db3-9cba171f3daf tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] Lock "1f0f1960-0c77-4e72-86ee-807819e75d2a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 640.016s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1366.516680] env[67169]: DEBUG oslo_concurrency.lockutils [None req-6adc6d6a-e56e-495d-b704-34db936e70c4 tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] Lock "1f0f1960-0c77-4e72-86ee-807819e75d2a" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 443.121s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1366.517089] env[67169]: DEBUG oslo_concurrency.lockutils [None req-6adc6d6a-e56e-495d-b704-34db936e70c4 tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] Acquiring lock "1f0f1960-0c77-4e72-86ee-807819e75d2a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1366.517178] env[67169]: DEBUG oslo_concurrency.lockutils [None req-6adc6d6a-e56e-495d-b704-34db936e70c4 tempest-DeleteServersAdminTestJSON-214609884 
tempest-DeleteServersAdminTestJSON-214609884-project-member] Lock "1f0f1960-0c77-4e72-86ee-807819e75d2a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1366.517289] env[67169]: DEBUG oslo_concurrency.lockutils [None req-6adc6d6a-e56e-495d-b704-34db936e70c4 tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] Lock "1f0f1960-0c77-4e72-86ee-807819e75d2a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1366.521311] env[67169]: INFO nova.compute.manager [None req-6adc6d6a-e56e-495d-b704-34db936e70c4 tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] Terminating instance [ 1366.522937] env[67169]: DEBUG nova.compute.manager [None req-6adc6d6a-e56e-495d-b704-34db936e70c4 tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] Start destroying the instance on the hypervisor. 
{{(pid=67169) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1366.523149] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-6adc6d6a-e56e-495d-b704-34db936e70c4 tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] Destroying instance {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1366.523591] env[67169]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-ee3c481b-a302-469b-b5f1-ef0fb4329451 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1366.532386] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-045bc493-2c28-46cf-a058-88c19791b1c9 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1366.546059] env[67169]: DEBUG nova.compute.manager [None req-69385497-98b5-454b-af8f-b152ac80f6f3 tempest-ServerExternalEventsTest-1384768552 tempest-ServerExternalEventsTest-1384768552-project-member] [instance: 74ea66f0-391c-437b-8aee-f784528d7963] Starting instance... {{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1366.565982] env[67169]: WARNING nova.virt.vmwareapi.vmops [None req-6adc6d6a-e56e-495d-b704-34db936e70c4 tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 1f0f1960-0c77-4e72-86ee-807819e75d2a could not be found. 
[ 1366.566172] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-6adc6d6a-e56e-495d-b704-34db936e70c4 tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] Instance destroyed {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1366.566346] env[67169]: INFO nova.compute.manager [None req-6adc6d6a-e56e-495d-b704-34db936e70c4 tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1366.566592] env[67169]: DEBUG oslo.service.loopingcall [None req-6adc6d6a-e56e-495d-b704-34db936e70c4 tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=67169) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1366.569223] env[67169]: DEBUG nova.compute.manager [-] [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] Deallocating network for instance {{(pid=67169) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1366.569338] env[67169]: DEBUG nova.network.neutron [-] [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] deallocate_for_instance() {{(pid=67169) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1366.598924] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4ae362a6-0295-4e1b-a1d5-fb2acece9bd9 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1366.605201] env[67169]: DEBUG oslo_concurrency.lockutils [None req-69385497-98b5-454b-af8f-b152ac80f6f3 tempest-ServerExternalEventsTest-1384768552 tempest-ServerExternalEventsTest-1384768552-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1366.609939] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6b28a0bd-1ff8-483f-9fc4-4f12bcdd654c {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1367.333440] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-620b00bd-e90c-4621-840b-417212f1a37a {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1367.341282] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a63b32ad-b94b-478c-9189-3217e60e3215 {{(pid=67169) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1367.354199] env[67169]: DEBUG nova.compute.provider_tree [None req-5ef31fa3-d243-488f-9847-4509615e7a51 tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Inventory has not changed in ProviderTree for provider: 6570906a-ac37-4859-b1f2-4bbacc48d3f3 {{(pid=67169) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1367.362626] env[67169]: DEBUG nova.scheduler.client.report [None req-5ef31fa3-d243-488f-9847-4509615e7a51 tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Inventory has not changed for provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 171, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67169) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1367.375832] env[67169]: DEBUG oslo_concurrency.lockutils [None req-5ef31fa3-d243-488f-9847-4509615e7a51 tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.073s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1367.376366] env[67169]: DEBUG nova.compute.manager [None req-5ef31fa3-d243-488f-9847-4509615e7a51 tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: 48376572-9e3a-4579-b2d7-b8b63312fab1] Start building networks asynchronously for instance. 
{{(pid=67169) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1367.378895] env[67169]: DEBUG oslo_concurrency.lockutils [None req-69385497-98b5-454b-af8f-b152ac80f6f3 tempest-ServerExternalEventsTest-1384768552 tempest-ServerExternalEventsTest-1384768552-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.775s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1367.380249] env[67169]: INFO nova.compute.claims [None req-69385497-98b5-454b-af8f-b152ac80f6f3 tempest-ServerExternalEventsTest-1384768552 tempest-ServerExternalEventsTest-1384768552-project-member] [instance: 74ea66f0-391c-437b-8aee-f784528d7963] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1367.386009] env[67169]: DEBUG neutronclient.v2_0.client [-] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=67169) _handle_fault_response /opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py:262}} [ 1367.386237] env[67169]: ERROR nova.network.neutron [-] Neutron client was not able to generate a valid admin token, please verify Neutron admin credential located in nova.conf: neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1367.386722] env[67169]: ERROR oslo.service.loopingcall [-] Dynamic interval looping call 'oslo_service.loopingcall.RetryDecorator.__call__.._func' failed: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
[ 1367.386722] env[67169]: ERROR oslo.service.loopingcall Traceback (most recent call last): [ 1367.386722] env[67169]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1367.386722] env[67169]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1367.386722] env[67169]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1367.386722] env[67169]: ERROR oslo.service.loopingcall exception_handler_v20(status_code, error_body) [ 1367.386722] env[67169]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1367.386722] env[67169]: ERROR oslo.service.loopingcall raise client_exc(message=error_message, [ 1367.386722] env[67169]: ERROR oslo.service.loopingcall neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1367.386722] env[67169]: ERROR oslo.service.loopingcall Neutron server returns request_ids: ['req-d2543060-7cf8-4e0a-b22f-93c5b58aec82'] [ 1367.386722] env[67169]: ERROR oslo.service.loopingcall [ 1367.386722] env[67169]: ERROR oslo.service.loopingcall During handling of the above exception, another exception occurred: [ 1367.386722] env[67169]: ERROR oslo.service.loopingcall [ 1367.386722] env[67169]: ERROR oslo.service.loopingcall Traceback (most recent call last): [ 1367.386722] env[67169]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 1367.386722] env[67169]: ERROR oslo.service.loopingcall result = func(*self.args, **self.kw) [ 1367.386722] env[67169]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 
1367.386722] env[67169]: ERROR oslo.service.loopingcall result = f(*args, **kwargs) [ 1367.386722] env[67169]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/compute/manager.py", line 3045, in _deallocate_network_with_retries [ 1367.386722] env[67169]: ERROR oslo.service.loopingcall self._deallocate_network( [ 1367.386722] env[67169]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/compute/manager.py", line 2265, in _deallocate_network [ 1367.386722] env[67169]: ERROR oslo.service.loopingcall self.network_api.deallocate_for_instance( [ 1367.386722] env[67169]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1367.386722] env[67169]: ERROR oslo.service.loopingcall data = neutron.list_ports(**search_opts) [ 1367.386722] env[67169]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1367.386722] env[67169]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1367.386722] env[67169]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1367.386722] env[67169]: ERROR oslo.service.loopingcall return self.list('ports', self.ports_path, retrieve_all, [ 1367.386722] env[67169]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1367.386722] env[67169]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1367.386722] env[67169]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1367.386722] env[67169]: ERROR oslo.service.loopingcall for r in self._pagination(collection, path, **params): [ 1367.386722] env[67169]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1367.386722] env[67169]: ERROR 
oslo.service.loopingcall res = self.get(path, params=params) [ 1367.386722] env[67169]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1367.386722] env[67169]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1367.386722] env[67169]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1367.386722] env[67169]: ERROR oslo.service.loopingcall return self.retry_request("GET", action, body=body, [ 1367.386722] env[67169]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1367.386722] env[67169]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1367.386722] env[67169]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1367.386722] env[67169]: ERROR oslo.service.loopingcall return self.do_request(method, action, body=body, [ 1367.386722] env[67169]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1367.386722] env[67169]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1367.386722] env[67169]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1367.387962] env[67169]: ERROR oslo.service.loopingcall self._handle_fault_response(status_code, replybody, resp) [ 1367.387962] env[67169]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 1367.387962] env[67169]: ERROR oslo.service.loopingcall raise exception.NeutronAdminCredentialConfigurationInvalid() [ 1367.387962] env[67169]: ERROR oslo.service.loopingcall nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
[ 1367.387962] env[67169]: ERROR oslo.service.loopingcall [ 1367.387962] env[67169]: ERROR nova.compute.manager [None req-6adc6d6a-e56e-495d-b704-34db936e70c4 tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] Failed to deallocate network for instance. Error: Networking client is experiencing an unauthorized exception.: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1367.423847] env[67169]: DEBUG nova.compute.utils [None req-5ef31fa3-d243-488f-9847-4509615e7a51 tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Using /dev/sd instead of None {{(pid=67169) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1367.425291] env[67169]: DEBUG nova.compute.manager [None req-5ef31fa3-d243-488f-9847-4509615e7a51 tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: 48376572-9e3a-4579-b2d7-b8b63312fab1] Allocating IP information in the background. 
{{(pid=67169) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1367.425618] env[67169]: DEBUG nova.network.neutron [None req-5ef31fa3-d243-488f-9847-4509615e7a51 tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: 48376572-9e3a-4579-b2d7-b8b63312fab1] allocate_for_instance() {{(pid=67169) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1367.431636] env[67169]: ERROR nova.compute.manager [None req-6adc6d6a-e56e-495d-b704-34db936e70c4 tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] Setting instance vm_state to ERROR: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1367.431636] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] Traceback (most recent call last): [ 1367.431636] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1367.431636] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] ret = obj(*args, **kwargs) [ 1367.431636] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1367.431636] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] exception_handler_v20(status_code, error_body) [ 1367.431636] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1367.431636] env[67169]: ERROR nova.compute.manager [instance: 
1f0f1960-0c77-4e72-86ee-807819e75d2a] raise client_exc(message=error_message, [ 1367.431636] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1367.431636] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] Neutron server returns request_ids: ['req-d2543060-7cf8-4e0a-b22f-93c5b58aec82'] [ 1367.431636] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] [ 1367.431636] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] During handling of the above exception, another exception occurred: [ 1367.431636] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] [ 1367.431636] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] Traceback (most recent call last): [ 1367.431636] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] File "/opt/stack/nova/nova/compute/manager.py", line 3315, in do_terminate_instance [ 1367.431636] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] self._delete_instance(context, instance, bdms) [ 1367.431636] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] File "/opt/stack/nova/nova/compute/manager.py", line 3250, in _delete_instance [ 1367.431636] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] self._shutdown_instance(context, instance, bdms) [ 1367.431636] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] File "/opt/stack/nova/nova/compute/manager.py", line 3144, in _shutdown_instance [ 1367.431636] env[67169]: ERROR nova.compute.manager [instance: 
1f0f1960-0c77-4e72-86ee-807819e75d2a] self._try_deallocate_network(context, instance, requested_networks) [ 1367.431636] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] File "/opt/stack/nova/nova/compute/manager.py", line 3058, in _try_deallocate_network [ 1367.431636] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] with excutils.save_and_reraise_exception(): [ 1367.431636] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1367.431636] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] self.force_reraise() [ 1367.431636] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1367.431636] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] raise self.value [ 1367.431636] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] File "/opt/stack/nova/nova/compute/manager.py", line 3056, in _try_deallocate_network [ 1367.431636] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] _deallocate_network_with_retries() [ 1367.431636] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 436, in func [ 1367.431636] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] return evt.wait() [ 1367.431636] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1367.431636] env[67169]: ERROR 
nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] result = hub.switch() [ 1367.432538] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1367.432538] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] return self.greenlet.switch() [ 1367.432538] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 1367.432538] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] result = func(*self.args, **self.kw) [ 1367.432538] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 1367.432538] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] result = f(*args, **kwargs) [ 1367.432538] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] File "/opt/stack/nova/nova/compute/manager.py", line 3045, in _deallocate_network_with_retries [ 1367.432538] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] self._deallocate_network( [ 1367.432538] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] File "/opt/stack/nova/nova/compute/manager.py", line 2265, in _deallocate_network [ 1367.432538] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] self.network_api.deallocate_for_instance( [ 1367.432538] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1367.432538] env[67169]: ERROR 
nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] data = neutron.list_ports(**search_opts) [ 1367.432538] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1367.432538] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] ret = obj(*args, **kwargs) [ 1367.432538] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1367.432538] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] return self.list('ports', self.ports_path, retrieve_all, [ 1367.432538] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1367.432538] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] ret = obj(*args, **kwargs) [ 1367.432538] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1367.432538] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] for r in self._pagination(collection, path, **params): [ 1367.432538] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1367.432538] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] res = self.get(path, params=params) [ 1367.432538] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1367.432538] 
env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] ret = obj(*args, **kwargs)
[ 1367.432538] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get
[ 1367.432538] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] return self.retry_request("GET", action, body=body,
[ 1367.432538] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1367.432538] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] ret = obj(*args, **kwargs)
[ 1367.432538] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request
[ 1367.432538] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] return self.do_request(method, action, body=body,
[ 1367.432538] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1367.432538] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] ret = obj(*args, **kwargs)
[ 1367.432538] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request
[ 1367.433346] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] self._handle_fault_response(status_code, replybody, resp)
[ 1367.433346] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper
[ 1367.433346] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] raise exception.NeutronAdminCredentialConfigurationInvalid()
[ 1367.433346] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception.
[ 1367.433346] env[67169]: ERROR nova.compute.manager [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a]
[ 1367.435053] env[67169]: DEBUG nova.compute.manager [None req-5ef31fa3-d243-488f-9847-4509615e7a51 tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: 48376572-9e3a-4579-b2d7-b8b63312fab1] Start building block device mappings for instance. {{(pid=67169) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}}
[ 1367.460148] env[67169]: DEBUG oslo_concurrency.lockutils [None req-6adc6d6a-e56e-495d-b704-34db936e70c4 tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] Lock "1f0f1960-0c77-4e72-86ee-807819e75d2a" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.943s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1367.461299] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "1f0f1960-0c77-4e72-86ee-807819e75d2a" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 181.761s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1367.461484] env[67169]: INFO nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] During sync_power_state the instance has a pending task (deleting). Skip.
[ 1367.461659] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "1f0f1960-0c77-4e72-86ee-807819e75d2a" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1367.517037] env[67169]: DEBUG nova.policy [None req-5ef31fa3-d243-488f-9847-4509615e7a51 tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ce57069286b34b5da298e9b01f4bd39e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '3d3275803e654637b85c8f15583e2e25', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67169) authorize /opt/stack/nova/nova/policy.py:203}}
[ 1367.518359] env[67169]: DEBUG nova.compute.manager [None req-5ef31fa3-d243-488f-9847-4509615e7a51 tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: 48376572-9e3a-4579-b2d7-b8b63312fab1] Start spawning the instance on the hypervisor. {{(pid=67169) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}}
[ 1367.530625] env[67169]: INFO nova.compute.manager [None req-6adc6d6a-e56e-495d-b704-34db936e70c4 tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] [instance: 1f0f1960-0c77-4e72-86ee-807819e75d2a] Successfully reverted task state from None on failure for instance.
[ 1367.534098] env[67169]: ERROR oslo_messaging.rpc.server [None req-6adc6d6a-e56e-495d-b704-34db936e70c4 tempest-DeleteServersAdminTestJSON-214609884 tempest-DeleteServersAdminTestJSON-214609884-project-member] Exception during message handling: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception.
[ 1367.534098] env[67169]: ERROR oslo_messaging.rpc.server Traceback (most recent call last):
[ 1367.534098] env[67169]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1367.534098] env[67169]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs)
[ 1367.534098] env[67169]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response
[ 1367.534098] env[67169]: ERROR oslo_messaging.rpc.server exception_handler_v20(status_code, error_body)
[ 1367.534098] env[67169]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20
[ 1367.534098] env[67169]: ERROR oslo_messaging.rpc.server raise client_exc(message=error_message,
[ 1367.534098] env[67169]: ERROR oslo_messaging.rpc.server neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}}
[ 1367.534098] env[67169]: ERROR oslo_messaging.rpc.server Neutron server returns request_ids: ['req-d2543060-7cf8-4e0a-b22f-93c5b58aec82']
[ 1367.534098] env[67169]: ERROR oslo_messaging.rpc.server
[ 1367.534098] env[67169]: ERROR oslo_messaging.rpc.server During handling of the above exception, another exception occurred:
[ 1367.534098] env[67169]: ERROR oslo_messaging.rpc.server
[ 1367.534098] env[67169]: ERROR oslo_messaging.rpc.server Traceback (most recent call last):
[ 1367.534098] env[67169]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/server.py", line 165, in _process_incoming
[ 1367.534098] env[67169]: ERROR oslo_messaging.rpc.server res = self.dispatcher.dispatch(message)
[ 1367.534098] env[67169]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/dispatcher.py", line 309, in dispatch
[ 1367.534098] env[67169]: ERROR oslo_messaging.rpc.server return self._do_dispatch(endpoint, method, ctxt, args)
[ 1367.534098] env[67169]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/dispatcher.py", line 229, in _do_dispatch
[ 1367.534098] env[67169]: ERROR oslo_messaging.rpc.server result = func(ctxt, **new_args)
[ 1367.534098] env[67169]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/exception_wrapper.py", line 65, in wrapped
[ 1367.534098] env[67169]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception():
[ 1367.534098] env[67169]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 1367.534098] env[67169]: ERROR oslo_messaging.rpc.server self.force_reraise()
[ 1367.534098] env[67169]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 1367.534098] env[67169]: ERROR oslo_messaging.rpc.server raise self.value
[ 1367.534098] env[67169]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/exception_wrapper.py", line 63, in wrapped
[ 1367.534098] env[67169]: ERROR oslo_messaging.rpc.server return f(self, context, *args, **kw)
[ 1367.534098] env[67169]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 166, in decorated_function
[ 1367.534098] env[67169]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception():
[ 1367.534098] env[67169]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 1367.534098] env[67169]: ERROR oslo_messaging.rpc.server self.force_reraise()
[ 1367.534098] env[67169]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 1367.534098] env[67169]: ERROR oslo_messaging.rpc.server raise self.value
[ 1367.534098] env[67169]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 157, in decorated_function
[ 1367.534098] env[67169]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs)
[ 1367.534098] env[67169]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/utils.py", line 1439, in decorated_function
[ 1367.534098] env[67169]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs)
[ 1367.534098] env[67169]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 213, in decorated_function
[ 1367.534098] env[67169]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception():
[ 1367.534098] env[67169]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 1367.534098] env[67169]: ERROR oslo_messaging.rpc.server self.force_reraise()
[ 1367.534098] env[67169]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 1367.534098] env[67169]: ERROR oslo_messaging.rpc.server raise self.value
[ 1367.535671] env[67169]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 203, in decorated_function
[ 1367.535671] env[67169]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs)
[ 1367.535671] env[67169]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3327, in terminate_instance
[ 1367.535671] env[67169]: ERROR oslo_messaging.rpc.server do_terminate_instance(instance, bdms)
[ 1367.535671] env[67169]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py", line 414, in inner
[ 1367.535671] env[67169]: ERROR oslo_messaging.rpc.server return f(*args, **kwargs)
[ 1367.535671] env[67169]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3322, in do_terminate_instance
[ 1367.535671] env[67169]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception():
[ 1367.535671] env[67169]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 1367.535671] env[67169]: ERROR oslo_messaging.rpc.server self.force_reraise()
[ 1367.535671] env[67169]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 1367.535671] env[67169]: ERROR oslo_messaging.rpc.server raise self.value
[ 1367.535671] env[67169]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3315, in do_terminate_instance
[ 1367.535671] env[67169]: ERROR oslo_messaging.rpc.server self._delete_instance(context, instance, bdms)
[ 1367.535671] env[67169]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3250, in _delete_instance
[ 1367.535671] env[67169]: ERROR oslo_messaging.rpc.server self._shutdown_instance(context, instance, bdms)
[ 1367.535671] env[67169]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3144, in _shutdown_instance
[ 1367.535671] env[67169]: ERROR oslo_messaging.rpc.server self._try_deallocate_network(context, instance, requested_networks)
[ 1367.535671] env[67169]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3058, in _try_deallocate_network
[ 1367.535671] env[67169]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception():
[ 1367.535671] env[67169]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 1367.535671] env[67169]: ERROR oslo_messaging.rpc.server self.force_reraise()
[ 1367.535671] env[67169]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 1367.535671] env[67169]: ERROR oslo_messaging.rpc.server raise self.value
[ 1367.535671] env[67169]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3056, in _try_deallocate_network
[ 1367.535671] env[67169]: ERROR oslo_messaging.rpc.server _deallocate_network_with_retries()
[ 1367.535671] env[67169]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 436, in func
[ 1367.535671] env[67169]: ERROR oslo_messaging.rpc.server return evt.wait()
[ 1367.535671] env[67169]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 1367.535671] env[67169]: ERROR oslo_messaging.rpc.server result = hub.switch()
[ 1367.535671] env[67169]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 1367.535671] env[67169]: ERROR oslo_messaging.rpc.server return self.greenlet.switch()
[ 1367.535671] env[67169]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop
[ 1367.535671] env[67169]: ERROR oslo_messaging.rpc.server result = func(*self.args, **self.kw)
[ 1367.535671] env[67169]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func
[ 1367.535671] env[67169]: ERROR oslo_messaging.rpc.server result = f(*args, **kwargs)
[ 1367.535671] env[67169]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3045, in _deallocate_network_with_retries
[ 1367.535671] env[67169]: ERROR oslo_messaging.rpc.server self._deallocate_network(
[ 1367.535671] env[67169]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 2265, in _deallocate_network
[ 1367.535671] env[67169]: ERROR oslo_messaging.rpc.server self.network_api.deallocate_for_instance(
[ 1367.535671] env[67169]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance
[ 1367.535671] env[67169]: ERROR oslo_messaging.rpc.server data = neutron.list_ports(**search_opts)
[ 1367.535671] env[67169]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1367.535671] env[67169]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs)
[ 1367.535671] env[67169]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports
[ 1367.535671] env[67169]: ERROR oslo_messaging.rpc.server return self.list('ports', self.ports_path, retrieve_all,
[ 1367.535671] env[67169]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1367.538909] env[67169]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs)
[ 1367.538909] env[67169]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list
[ 1367.538909] env[67169]: ERROR oslo_messaging.rpc.server for r in self._pagination(collection, path, **params):
[ 1367.538909] env[67169]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination
[ 1367.538909] env[67169]: ERROR oslo_messaging.rpc.server res = self.get(path, params=params)
[ 1367.538909] env[67169]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1367.538909] env[67169]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs)
[ 1367.538909] env[67169]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get
[ 1367.538909] env[67169]: ERROR oslo_messaging.rpc.server return self.retry_request("GET", action, body=body,
[ 1367.538909] env[67169]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1367.538909] env[67169]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs)
[ 1367.538909] env[67169]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request
[ 1367.538909] env[67169]: ERROR oslo_messaging.rpc.server return self.do_request(method, action, body=body,
[ 1367.538909] env[67169]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1367.538909] env[67169]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs)
[ 1367.538909] env[67169]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request
[ 1367.538909] env[67169]: ERROR oslo_messaging.rpc.server self._handle_fault_response(status_code, replybody, resp)
[ 1367.538909] env[67169]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper
[ 1367.538909] env[67169]: ERROR oslo_messaging.rpc.server raise exception.NeutronAdminCredentialConfigurationInvalid()
[ 1367.538909] env[67169]: ERROR oslo_messaging.rpc.server nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception.
[ 1367.538909] env[67169]: ERROR oslo_messaging.rpc.server
[ 1367.547366] env[67169]: DEBUG nova.virt.hardware [None req-5ef31fa3-d243-488f-9847-4509615e7a51 tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-03-06T19:55:10Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-03-06T19:54:55Z,direct_url=,disk_format='vmdk',id=285931c9-8b83-4997-8c4d-6a79005e36ba,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c31f6504bb73492890b262ff43fdf9bc',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-03-06T19:54:55Z,virtual_size=,visibility=), allow threads: False {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 1367.548021] env[67169]: DEBUG nova.virt.hardware [None req-5ef31fa3-d243-488f-9847-4509615e7a51 tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Flavor limits 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 1367.548021] env[67169]: DEBUG nova.virt.hardware [None req-5ef31fa3-d243-488f-9847-4509615e7a51 tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Image limits 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 1367.548021] env[67169]: DEBUG nova.virt.hardware [None req-5ef31fa3-d243-488f-9847-4509615e7a51 tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Flavor pref 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 1367.548158] env[67169]: DEBUG nova.virt.hardware [None req-5ef31fa3-d243-488f-9847-4509615e7a51 tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Image pref 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 1367.548211] env[67169]: DEBUG nova.virt.hardware [None req-5ef31fa3-d243-488f-9847-4509615e7a51 tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 1367.548414] env[67169]: DEBUG nova.virt.hardware [None req-5ef31fa3-d243-488f-9847-4509615e7a51 tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 1367.548565] env[67169]: DEBUG nova.virt.hardware [None req-5ef31fa3-d243-488f-9847-4509615e7a51 tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67169) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 1367.549353] env[67169]: DEBUG nova.virt.hardware [None req-5ef31fa3-d243-488f-9847-4509615e7a51 tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Got 1 possible topologies {{(pid=67169) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 1367.549353] env[67169]: DEBUG nova.virt.hardware [None req-5ef31fa3-d243-488f-9847-4509615e7a51 tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 1367.549353] env[67169]: DEBUG nova.virt.hardware [None req-5ef31fa3-d243-488f-9847-4509615e7a51 tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 1367.550770] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a7bd7952-8b0f-4cc4-8689-d57f2655ea21 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1367.562073] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-196a177b-9ae7-4437-ba18-da5db3563f64 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1367.647195] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-59c744ec-abc1-42e2-be19-a407b76dbecf {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1367.654357] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-92bf164c-d61c-41b0-b2ed-581461df3d4c {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1367.659051] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1367.685617] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2655dad1-f0dd-46cc-84e0-bc5d81897a2c {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1367.692716] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b5cedda2-4abc-466a-8d60-56fefd0b6f69 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1367.705637] env[67169]: DEBUG nova.compute.provider_tree [None req-69385497-98b5-454b-af8f-b152ac80f6f3 tempest-ServerExternalEventsTest-1384768552 tempest-ServerExternalEventsTest-1384768552-project-member] Inventory has not changed in ProviderTree for provider: 6570906a-ac37-4859-b1f2-4bbacc48d3f3 {{(pid=67169) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1367.714745] env[67169]: DEBUG nova.scheduler.client.report [None req-69385497-98b5-454b-af8f-b152ac80f6f3 tempest-ServerExternalEventsTest-1384768552 tempest-ServerExternalEventsTest-1384768552-project-member] Inventory has not changed for provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 171, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67169) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1367.728790] env[67169]: DEBUG oslo_concurrency.lockutils [None req-69385497-98b5-454b-af8f-b152ac80f6f3 tempest-ServerExternalEventsTest-1384768552 tempest-ServerExternalEventsTest-1384768552-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.350s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1367.729312] env[67169]: DEBUG nova.compute.manager [None req-69385497-98b5-454b-af8f-b152ac80f6f3 tempest-ServerExternalEventsTest-1384768552 tempest-ServerExternalEventsTest-1384768552-project-member] [instance: 74ea66f0-391c-437b-8aee-f784528d7963] Start building networks asynchronously for instance. {{(pid=67169) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}}
[ 1367.765058] env[67169]: DEBUG nova.compute.utils [None req-69385497-98b5-454b-af8f-b152ac80f6f3 tempest-ServerExternalEventsTest-1384768552 tempest-ServerExternalEventsTest-1384768552-project-member] Using /dev/sd instead of None {{(pid=67169) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 1367.766317] env[67169]: DEBUG nova.compute.manager [None req-69385497-98b5-454b-af8f-b152ac80f6f3 tempest-ServerExternalEventsTest-1384768552 tempest-ServerExternalEventsTest-1384768552-project-member] [instance: 74ea66f0-391c-437b-8aee-f784528d7963] Allocating IP information in the background. {{(pid=67169) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}}
[ 1367.766494] env[67169]: DEBUG nova.network.neutron [None req-69385497-98b5-454b-af8f-b152ac80f6f3 tempest-ServerExternalEventsTest-1384768552 tempest-ServerExternalEventsTest-1384768552-project-member] [instance: 74ea66f0-391c-437b-8aee-f784528d7963] allocate_for_instance() {{(pid=67169) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 1367.777107] env[67169]: DEBUG nova.compute.manager [None req-69385497-98b5-454b-af8f-b152ac80f6f3 tempest-ServerExternalEventsTest-1384768552 tempest-ServerExternalEventsTest-1384768552-project-member] [instance: 74ea66f0-391c-437b-8aee-f784528d7963] Start building block device mappings for instance. {{(pid=67169) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}}
[ 1367.839715] env[67169]: DEBUG nova.policy [None req-69385497-98b5-454b-af8f-b152ac80f6f3 tempest-ServerExternalEventsTest-1384768552 tempest-ServerExternalEventsTest-1384768552-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '5bfe53005c444608bc95c739ac2cd065', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c75e25d17f4f4e5b92e7aad2459a6392', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67169) authorize /opt/stack/nova/nova/policy.py:203}}
[ 1367.843412] env[67169]: DEBUG nova.compute.manager [None req-69385497-98b5-454b-af8f-b152ac80f6f3 tempest-ServerExternalEventsTest-1384768552 tempest-ServerExternalEventsTest-1384768552-project-member] [instance: 74ea66f0-391c-437b-8aee-f784528d7963] Start spawning the instance on the hypervisor. {{(pid=67169) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}}
[ 1367.866289] env[67169]: DEBUG nova.virt.hardware [None req-69385497-98b5-454b-af8f-b152ac80f6f3 tempest-ServerExternalEventsTest-1384768552 tempest-ServerExternalEventsTest-1384768552-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-03-06T19:55:10Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-03-06T19:54:55Z,direct_url=,disk_format='vmdk',id=285931c9-8b83-4997-8c4d-6a79005e36ba,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c31f6504bb73492890b262ff43fdf9bc',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-03-06T19:54:55Z,virtual_size=,visibility=), allow threads: False {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 1367.866526] env[67169]: DEBUG nova.virt.hardware [None req-69385497-98b5-454b-af8f-b152ac80f6f3 tempest-ServerExternalEventsTest-1384768552 tempest-ServerExternalEventsTest-1384768552-project-member] Flavor limits 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 1367.866686] env[67169]: DEBUG nova.virt.hardware [None req-69385497-98b5-454b-af8f-b152ac80f6f3 tempest-ServerExternalEventsTest-1384768552 tempest-ServerExternalEventsTest-1384768552-project-member] Image limits 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 1367.866867] env[67169]: DEBUG nova.virt.hardware [None req-69385497-98b5-454b-af8f-b152ac80f6f3 tempest-ServerExternalEventsTest-1384768552 tempest-ServerExternalEventsTest-1384768552-project-member] Flavor pref 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 1367.867022] env[67169]: DEBUG nova.virt.hardware [None req-69385497-98b5-454b-af8f-b152ac80f6f3 tempest-ServerExternalEventsTest-1384768552 tempest-ServerExternalEventsTest-1384768552-project-member] Image pref 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 1367.867171] env[67169]: DEBUG nova.virt.hardware [None req-69385497-98b5-454b-af8f-b152ac80f6f3 tempest-ServerExternalEventsTest-1384768552 tempest-ServerExternalEventsTest-1384768552-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 1367.867371] env[67169]: DEBUG nova.virt.hardware [None req-69385497-98b5-454b-af8f-b152ac80f6f3 tempest-ServerExternalEventsTest-1384768552 tempest-ServerExternalEventsTest-1384768552-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 1367.867526] env[67169]: DEBUG nova.virt.hardware [None req-69385497-98b5-454b-af8f-b152ac80f6f3 tempest-ServerExternalEventsTest-1384768552 tempest-ServerExternalEventsTest-1384768552-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67169) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 1367.867686] env[67169]: DEBUG nova.virt.hardware [None req-69385497-98b5-454b-af8f-b152ac80f6f3 tempest-ServerExternalEventsTest-1384768552 tempest-ServerExternalEventsTest-1384768552-project-member] Got 1 possible topologies {{(pid=67169) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 1367.867848] env[67169]: DEBUG nova.virt.hardware [None req-69385497-98b5-454b-af8f-b152ac80f6f3 tempest-ServerExternalEventsTest-1384768552 tempest-ServerExternalEventsTest-1384768552-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 1367.868035] env[67169]: DEBUG nova.virt.hardware [None req-69385497-98b5-454b-af8f-b152ac80f6f3 tempest-ServerExternalEventsTest-1384768552 tempest-ServerExternalEventsTest-1384768552-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 1367.868878] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5c750545-774c-4796-aebe-b65fcac5dc97 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1367.876758] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-340e53fe-1a52-4e69-906c-1a9af5bb21ca {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1368.193231] env[67169]: DEBUG nova.network.neutron [None req-69385497-98b5-454b-af8f-b152ac80f6f3 tempest-ServerExternalEventsTest-1384768552 tempest-ServerExternalEventsTest-1384768552-project-member] [instance: 74ea66f0-391c-437b-8aee-f784528d7963] Successfully created port: ec7d2a5c-bb2a-4bb6-86fe-9b88154e2bcd {{(pid=67169) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 1368.296212] env[67169]: DEBUG nova.network.neutron [None req-5ef31fa3-d243-488f-9847-4509615e7a51 tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: 48376572-9e3a-4579-b2d7-b8b63312fab1] Successfully created port: a3d828c6-2d0e-46c7-8613-63649dfd0ae3 {{(pid=67169) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 1368.770245] env[67169]: DEBUG nova.network.neutron [None req-69385497-98b5-454b-af8f-b152ac80f6f3 tempest-ServerExternalEventsTest-1384768552 tempest-ServerExternalEventsTest-1384768552-project-member] [instance: 74ea66f0-391c-437b-8aee-f784528d7963] Successfully updated port: ec7d2a5c-bb2a-4bb6-86fe-9b88154e2bcd {{(pid=67169) _update_port /opt/stack/nova/nova/network/neutron.py:586}}
[ 1368.782906] env[67169]: DEBUG nova.compute.manager [req-b7a0ae24-493b-499f-9f6b-4137df4b5852 req-89b95bab-76a7-48dd-95a7-c558e12bad8e service nova] [instance: 74ea66f0-391c-437b-8aee-f784528d7963] Received event network-vif-plugged-ec7d2a5c-bb2a-4bb6-86fe-9b88154e2bcd {{(pid=67169) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}}
[ 1368.782906] env[67169]: DEBUG oslo_concurrency.lockutils [req-b7a0ae24-493b-499f-9f6b-4137df4b5852 req-89b95bab-76a7-48dd-95a7-c558e12bad8e service nova] Acquiring lock "74ea66f0-391c-437b-8aee-f784528d7963-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1368.782906] env[67169]: DEBUG oslo_concurrency.lockutils [req-b7a0ae24-493b-499f-9f6b-4137df4b5852 req-89b95bab-76a7-48dd-95a7-c558e12bad8e service nova] Lock "74ea66f0-391c-437b-8aee-f784528d7963-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1368.782906] env[67169]: DEBUG oslo_concurrency.lockutils [req-b7a0ae24-493b-499f-9f6b-4137df4b5852 req-89b95bab-76a7-48dd-95a7-c558e12bad8e service nova] Lock "74ea66f0-391c-437b-8aee-f784528d7963-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1368.782906] env[67169]: DEBUG nova.compute.manager
[req-b7a0ae24-493b-499f-9f6b-4137df4b5852 req-89b95bab-76a7-48dd-95a7-c558e12bad8e service nova] [instance: 74ea66f0-391c-437b-8aee-f784528d7963] No waiting events found dispatching network-vif-plugged-ec7d2a5c-bb2a-4bb6-86fe-9b88154e2bcd {{(pid=67169) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1368.783516] env[67169]: WARNING nova.compute.manager [req-b7a0ae24-493b-499f-9f6b-4137df4b5852 req-89b95bab-76a7-48dd-95a7-c558e12bad8e service nova] [instance: 74ea66f0-391c-437b-8aee-f784528d7963] Received unexpected event network-vif-plugged-ec7d2a5c-bb2a-4bb6-86fe-9b88154e2bcd for instance with vm_state building and task_state spawning. [ 1368.785715] env[67169]: DEBUG oslo_concurrency.lockutils [None req-69385497-98b5-454b-af8f-b152ac80f6f3 tempest-ServerExternalEventsTest-1384768552 tempest-ServerExternalEventsTest-1384768552-project-member] Acquiring lock "refresh_cache-74ea66f0-391c-437b-8aee-f784528d7963" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1368.785988] env[67169]: DEBUG oslo_concurrency.lockutils [None req-69385497-98b5-454b-af8f-b152ac80f6f3 tempest-ServerExternalEventsTest-1384768552 tempest-ServerExternalEventsTest-1384768552-project-member] Acquired lock "refresh_cache-74ea66f0-391c-437b-8aee-f784528d7963" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1368.786440] env[67169]: DEBUG nova.network.neutron [None req-69385497-98b5-454b-af8f-b152ac80f6f3 tempest-ServerExternalEventsTest-1384768552 tempest-ServerExternalEventsTest-1384768552-project-member] [instance: 74ea66f0-391c-437b-8aee-f784528d7963] Building network info cache for instance {{(pid=67169) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1368.828450] env[67169]: DEBUG nova.network.neutron [None req-69385497-98b5-454b-af8f-b152ac80f6f3 tempest-ServerExternalEventsTest-1384768552 
tempest-ServerExternalEventsTest-1384768552-project-member] [instance: 74ea66f0-391c-437b-8aee-f784528d7963] Instance cache missing network info. {{(pid=67169) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1369.139236] env[67169]: DEBUG nova.compute.manager [req-d60f8df0-9b71-4724-ac5d-b222c224184a req-1e4c35f7-637c-48c5-a8ba-da032c896c97 service nova] [instance: 48376572-9e3a-4579-b2d7-b8b63312fab1] Received event network-vif-plugged-a3d828c6-2d0e-46c7-8613-63649dfd0ae3 {{(pid=67169) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1369.139919] env[67169]: DEBUG oslo_concurrency.lockutils [req-d60f8df0-9b71-4724-ac5d-b222c224184a req-1e4c35f7-637c-48c5-a8ba-da032c896c97 service nova] Acquiring lock "48376572-9e3a-4579-b2d7-b8b63312fab1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1369.139919] env[67169]: DEBUG oslo_concurrency.lockutils [req-d60f8df0-9b71-4724-ac5d-b222c224184a req-1e4c35f7-637c-48c5-a8ba-da032c896c97 service nova] Lock "48376572-9e3a-4579-b2d7-b8b63312fab1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1369.139919] env[67169]: DEBUG oslo_concurrency.lockutils [req-d60f8df0-9b71-4724-ac5d-b222c224184a req-1e4c35f7-637c-48c5-a8ba-da032c896c97 service nova] Lock "48376572-9e3a-4579-b2d7-b8b63312fab1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1369.139919] env[67169]: DEBUG nova.compute.manager [req-d60f8df0-9b71-4724-ac5d-b222c224184a req-1e4c35f7-637c-48c5-a8ba-da032c896c97 service nova] [instance: 
48376572-9e3a-4579-b2d7-b8b63312fab1] No waiting events found dispatching network-vif-plugged-a3d828c6-2d0e-46c7-8613-63649dfd0ae3 {{(pid=67169) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1369.140121] env[67169]: WARNING nova.compute.manager [req-d60f8df0-9b71-4724-ac5d-b222c224184a req-1e4c35f7-637c-48c5-a8ba-da032c896c97 service nova] [instance: 48376572-9e3a-4579-b2d7-b8b63312fab1] Received unexpected event network-vif-plugged-a3d828c6-2d0e-46c7-8613-63649dfd0ae3 for instance with vm_state building and task_state spawning. [ 1369.158859] env[67169]: DEBUG nova.network.neutron [None req-69385497-98b5-454b-af8f-b152ac80f6f3 tempest-ServerExternalEventsTest-1384768552 tempest-ServerExternalEventsTest-1384768552-project-member] [instance: 74ea66f0-391c-437b-8aee-f784528d7963] Updating instance_info_cache with network_info: [{"id": "ec7d2a5c-bb2a-4bb6-86fe-9b88154e2bcd", "address": "fa:16:3e:ea:7e:2f", "network": {"id": "617508ba-3567-4508-96b5-a01447ece634", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.146", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "c31f6504bb73492890b262ff43fdf9bc", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "c9bc2632-36f9-4912-8782-8bbb789f909d", "external-id": "nsx-vlan-transportzone-897", "segmentation_id": 897, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapec7d2a5c-bb", "ovs_interfaceid": "ec7d2a5c-bb2a-4bb6-86fe-9b88154e2bcd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] 
{{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1369.174524] env[67169]: DEBUG oslo_concurrency.lockutils [None req-69385497-98b5-454b-af8f-b152ac80f6f3 tempest-ServerExternalEventsTest-1384768552 tempest-ServerExternalEventsTest-1384768552-project-member] Releasing lock "refresh_cache-74ea66f0-391c-437b-8aee-f784528d7963" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1369.174524] env[67169]: DEBUG nova.compute.manager [None req-69385497-98b5-454b-af8f-b152ac80f6f3 tempest-ServerExternalEventsTest-1384768552 tempest-ServerExternalEventsTest-1384768552-project-member] [instance: 74ea66f0-391c-437b-8aee-f784528d7963] Instance network_info: |[{"id": "ec7d2a5c-bb2a-4bb6-86fe-9b88154e2bcd", "address": "fa:16:3e:ea:7e:2f", "network": {"id": "617508ba-3567-4508-96b5-a01447ece634", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.146", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "c31f6504bb73492890b262ff43fdf9bc", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "c9bc2632-36f9-4912-8782-8bbb789f909d", "external-id": "nsx-vlan-transportzone-897", "segmentation_id": 897, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapec7d2a5c-bb", "ovs_interfaceid": "ec7d2a5c-bb2a-4bb6-86fe-9b88154e2bcd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67169) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1369.174904] 
env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-69385497-98b5-454b-af8f-b152ac80f6f3 tempest-ServerExternalEventsTest-1384768552 tempest-ServerExternalEventsTest-1384768552-project-member] [instance: 74ea66f0-391c-437b-8aee-f784528d7963] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:ea:7e:2f', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'c9bc2632-36f9-4912-8782-8bbb789f909d', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'ec7d2a5c-bb2a-4bb6-86fe-9b88154e2bcd', 'vif_model': 'vmxnet3'}] {{(pid=67169) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1369.184058] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [None req-69385497-98b5-454b-af8f-b152ac80f6f3 tempest-ServerExternalEventsTest-1384768552 tempest-ServerExternalEventsTest-1384768552-project-member] Creating folder: Project (c75e25d17f4f4e5b92e7aad2459a6392). Parent ref: group-v566843. {{(pid=67169) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1369.184058] env[67169]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-77bc47e6-c09b-4eb8-9d86-3de46803e9ef {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1369.195915] env[67169]: INFO nova.virt.vmwareapi.vm_util [None req-69385497-98b5-454b-af8f-b152ac80f6f3 tempest-ServerExternalEventsTest-1384768552 tempest-ServerExternalEventsTest-1384768552-project-member] Created folder: Project (c75e25d17f4f4e5b92e7aad2459a6392) in parent group-v566843. [ 1369.196120] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [None req-69385497-98b5-454b-af8f-b152ac80f6f3 tempest-ServerExternalEventsTest-1384768552 tempest-ServerExternalEventsTest-1384768552-project-member] Creating folder: Instances. Parent ref: group-v566924. 
{{(pid=67169) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1369.196353] env[67169]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-7bebcabf-1bec-4f1c-813b-986e0bbb6e58 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1369.205136] env[67169]: INFO nova.virt.vmwareapi.vm_util [None req-69385497-98b5-454b-af8f-b152ac80f6f3 tempest-ServerExternalEventsTest-1384768552 tempest-ServerExternalEventsTest-1384768552-project-member] Created folder: Instances in parent group-v566924. [ 1369.205366] env[67169]: DEBUG oslo.service.loopingcall [None req-69385497-98b5-454b-af8f-b152ac80f6f3 tempest-ServerExternalEventsTest-1384768552 tempest-ServerExternalEventsTest-1384768552-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67169) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1369.206043] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 74ea66f0-391c-437b-8aee-f784528d7963] Creating VM on the ESX host {{(pid=67169) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1369.206304] env[67169]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-ba5e0f3c-5b1b-427b-aa1e-c525b3f5c070 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1369.226919] env[67169]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1369.226919] env[67169]: value = "task-2819194" [ 1369.226919] env[67169]: _type = "Task" [ 1369.226919] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1369.235024] env[67169]: DEBUG oslo_vmware.api [-] Task: {'id': task-2819194, 'name': CreateVM_Task} progress is 0%. 
{{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1369.301099] env[67169]: DEBUG nova.network.neutron [None req-5ef31fa3-d243-488f-9847-4509615e7a51 tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: 48376572-9e3a-4579-b2d7-b8b63312fab1] Successfully updated port: a3d828c6-2d0e-46c7-8613-63649dfd0ae3 {{(pid=67169) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1369.313728] env[67169]: DEBUG oslo_concurrency.lockutils [None req-5ef31fa3-d243-488f-9847-4509615e7a51 tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Acquiring lock "refresh_cache-48376572-9e3a-4579-b2d7-b8b63312fab1" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1369.313894] env[67169]: DEBUG oslo_concurrency.lockutils [None req-5ef31fa3-d243-488f-9847-4509615e7a51 tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Acquired lock "refresh_cache-48376572-9e3a-4579-b2d7-b8b63312fab1" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1369.314056] env[67169]: DEBUG nova.network.neutron [None req-5ef31fa3-d243-488f-9847-4509615e7a51 tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: 48376572-9e3a-4579-b2d7-b8b63312fab1] Building network info cache for instance {{(pid=67169) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1369.397770] env[67169]: DEBUG nova.network.neutron [None req-5ef31fa3-d243-488f-9847-4509615e7a51 tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: 48376572-9e3a-4579-b2d7-b8b63312fab1] Instance cache missing network info. 
{{(pid=67169) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1369.680390] env[67169]: DEBUG nova.network.neutron [None req-5ef31fa3-d243-488f-9847-4509615e7a51 tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: 48376572-9e3a-4579-b2d7-b8b63312fab1] Updating instance_info_cache with network_info: [{"id": "a3d828c6-2d0e-46c7-8613-63649dfd0ae3", "address": "fa:16:3e:0f:48:90", "network": {"id": "4e24bc87-3a15-4231-a607-f93bb9122dca", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-93817792-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "3d3275803e654637b85c8f15583e2e25", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "298bb8ef-4765-494c-b157-7a349218bd1e", "external-id": "nsx-vlan-transportzone-905", "segmentation_id": 905, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa3d828c6-2d", "ovs_interfaceid": "a3d828c6-2d0e-46c7-8613-63649dfd0ae3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1369.694034] env[67169]: DEBUG oslo_concurrency.lockutils [None req-5ef31fa3-d243-488f-9847-4509615e7a51 tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Releasing lock "refresh_cache-48376572-9e3a-4579-b2d7-b8b63312fab1" {{(pid=67169) lock 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1369.694376] env[67169]: DEBUG nova.compute.manager [None req-5ef31fa3-d243-488f-9847-4509615e7a51 tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: 48376572-9e3a-4579-b2d7-b8b63312fab1] Instance network_info: |[{"id": "a3d828c6-2d0e-46c7-8613-63649dfd0ae3", "address": "fa:16:3e:0f:48:90", "network": {"id": "4e24bc87-3a15-4231-a607-f93bb9122dca", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-93817792-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "3d3275803e654637b85c8f15583e2e25", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "298bb8ef-4765-494c-b157-7a349218bd1e", "external-id": "nsx-vlan-transportzone-905", "segmentation_id": 905, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa3d828c6-2d", "ovs_interfaceid": "a3d828c6-2d0e-46c7-8613-63649dfd0ae3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67169) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1369.694831] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-5ef31fa3-d243-488f-9847-4509615e7a51 tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: 48376572-9e3a-4579-b2d7-b8b63312fab1] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:0f:48:90', 'network_ref': {'type': 
'OpaqueNetwork', 'network-id': '298bb8ef-4765-494c-b157-7a349218bd1e', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'a3d828c6-2d0e-46c7-8613-63649dfd0ae3', 'vif_model': 'vmxnet3'}] {{(pid=67169) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1369.703326] env[67169]: DEBUG oslo.service.loopingcall [None req-5ef31fa3-d243-488f-9847-4509615e7a51 tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67169) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1369.703913] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 48376572-9e3a-4579-b2d7-b8b63312fab1] Creating VM on the ESX host {{(pid=67169) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1369.704465] env[67169]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-8e62bcd8-5ba8-4f24-be94-c90e7e5c6284 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1369.726490] env[67169]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1369.726490] env[67169]: value = "task-2819195" [ 1369.726490] env[67169]: _type = "Task" [ 1369.726490] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1369.737845] env[67169]: DEBUG oslo_vmware.api [-] Task: {'id': task-2819195, 'name': CreateVM_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1369.740873] env[67169]: DEBUG oslo_vmware.api [-] Task: {'id': task-2819194, 'name': CreateVM_Task, 'duration_secs': 0.295655} completed successfully. 
{{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1369.741140] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 74ea66f0-391c-437b-8aee-f784528d7963] Created VM on the ESX host {{(pid=67169) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1369.741684] env[67169]: DEBUG oslo_concurrency.lockutils [None req-69385497-98b5-454b-af8f-b152ac80f6f3 tempest-ServerExternalEventsTest-1384768552 tempest-ServerExternalEventsTest-1384768552-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1369.741850] env[67169]: DEBUG oslo_concurrency.lockutils [None req-69385497-98b5-454b-af8f-b152ac80f6f3 tempest-ServerExternalEventsTest-1384768552 tempest-ServerExternalEventsTest-1384768552-project-member] Acquired lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1369.742191] env[67169]: DEBUG oslo_concurrency.lockutils [None req-69385497-98b5-454b-af8f-b152ac80f6f3 tempest-ServerExternalEventsTest-1384768552 tempest-ServerExternalEventsTest-1384768552-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1369.742429] env[67169]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-79d5646d-8797-415e-a324-558c4c06106f {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1369.746959] env[67169]: DEBUG oslo_vmware.api [None req-69385497-98b5-454b-af8f-b152ac80f6f3 tempest-ServerExternalEventsTest-1384768552 
tempest-ServerExternalEventsTest-1384768552-project-member] Waiting for the task: (returnval){ [ 1369.746959] env[67169]: value = "session[52518c09-e5f2-035d-4dc1-3e29ddf94015]52168f82-1be3-82f9-f1be-0d30352bf546" [ 1369.746959] env[67169]: _type = "Task" [ 1369.746959] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1369.755908] env[67169]: DEBUG oslo_vmware.api [None req-69385497-98b5-454b-af8f-b152ac80f6f3 tempest-ServerExternalEventsTest-1384768552 tempest-ServerExternalEventsTest-1384768552-project-member] Task: {'id': session[52518c09-e5f2-035d-4dc1-3e29ddf94015]52168f82-1be3-82f9-f1be-0d30352bf546, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1370.237210] env[67169]: DEBUG oslo_vmware.api [-] Task: {'id': task-2819195, 'name': CreateVM_Task, 'duration_secs': 0.487873} completed successfully. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1370.237477] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 48376572-9e3a-4579-b2d7-b8b63312fab1] Created VM on the ESX host {{(pid=67169) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1370.238020] env[67169]: DEBUG oslo_concurrency.lockutils [None req-5ef31fa3-d243-488f-9847-4509615e7a51 tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1370.256462] env[67169]: DEBUG oslo_concurrency.lockutils [None req-69385497-98b5-454b-af8f-b152ac80f6f3 tempest-ServerExternalEventsTest-1384768552 tempest-ServerExternalEventsTest-1384768552-project-member] Releasing lock "[datastore2] 
devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1370.256695] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-69385497-98b5-454b-af8f-b152ac80f6f3 tempest-ServerExternalEventsTest-1384768552 tempest-ServerExternalEventsTest-1384768552-project-member] [instance: 74ea66f0-391c-437b-8aee-f784528d7963] Processing image 285931c9-8b83-4997-8c4d-6a79005e36ba {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1370.256907] env[67169]: DEBUG oslo_concurrency.lockutils [None req-69385497-98b5-454b-af8f-b152ac80f6f3 tempest-ServerExternalEventsTest-1384768552 tempest-ServerExternalEventsTest-1384768552-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1370.257139] env[67169]: DEBUG oslo_concurrency.lockutils [None req-5ef31fa3-d243-488f-9847-4509615e7a51 tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Acquired lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1370.257432] env[67169]: DEBUG oslo_concurrency.lockutils [None req-5ef31fa3-d243-488f-9847-4509615e7a51 tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1370.257703] env[67169]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with 
opID=oslo.vmware-ced330d7-7f44-4b0c-bb09-ded7da5491b6 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1370.262240] env[67169]: DEBUG oslo_vmware.api [None req-5ef31fa3-d243-488f-9847-4509615e7a51 tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Waiting for the task: (returnval){ [ 1370.262240] env[67169]: value = "session[52518c09-e5f2-035d-4dc1-3e29ddf94015]52a525ec-0566-1da9-b50b-858cff2116e1" [ 1370.262240] env[67169]: _type = "Task" [ 1370.262240] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1370.271055] env[67169]: DEBUG oslo_vmware.api [None req-5ef31fa3-d243-488f-9847-4509615e7a51 tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Task: {'id': session[52518c09-e5f2-035d-4dc1-3e29ddf94015]52a525ec-0566-1da9-b50b-858cff2116e1, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1370.773096] env[67169]: DEBUG oslo_concurrency.lockutils [None req-5ef31fa3-d243-488f-9847-4509615e7a51 tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Releasing lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1370.773413] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-5ef31fa3-d243-488f-9847-4509615e7a51 tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: 48376572-9e3a-4579-b2d7-b8b63312fab1] Processing image 285931c9-8b83-4997-8c4d-6a79005e36ba {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1370.773627] env[67169]: DEBUG oslo_concurrency.lockutils [None req-5ef31fa3-d243-488f-9847-4509615e7a51 tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1371.034325] env[67169]: DEBUG nova.compute.manager [req-2ae84829-d5b4-4dab-9b4a-c154f03ad3f9 req-56c24396-fefd-4bab-8a0d-0471a25649da service nova] [instance: 74ea66f0-391c-437b-8aee-f784528d7963] Received event network-changed-ec7d2a5c-bb2a-4bb6-86fe-9b88154e2bcd {{(pid=67169) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1371.034538] env[67169]: DEBUG nova.compute.manager [req-2ae84829-d5b4-4dab-9b4a-c154f03ad3f9 req-56c24396-fefd-4bab-8a0d-0471a25649da service nova] [instance: 74ea66f0-391c-437b-8aee-f784528d7963] Refreshing instance network info cache due to event 
network-changed-ec7d2a5c-bb2a-4bb6-86fe-9b88154e2bcd. {{(pid=67169) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1371.034955] env[67169]: DEBUG oslo_concurrency.lockutils [req-2ae84829-d5b4-4dab-9b4a-c154f03ad3f9 req-56c24396-fefd-4bab-8a0d-0471a25649da service nova] Acquiring lock "refresh_cache-74ea66f0-391c-437b-8aee-f784528d7963" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1371.035167] env[67169]: DEBUG oslo_concurrency.lockutils [req-2ae84829-d5b4-4dab-9b4a-c154f03ad3f9 req-56c24396-fefd-4bab-8a0d-0471a25649da service nova] Acquired lock "refresh_cache-74ea66f0-391c-437b-8aee-f784528d7963" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1371.035364] env[67169]: DEBUG nova.network.neutron [req-2ae84829-d5b4-4dab-9b4a-c154f03ad3f9 req-56c24396-fefd-4bab-8a0d-0471a25649da service nova] [instance: 74ea66f0-391c-437b-8aee-f784528d7963] Refreshing network info cache for port ec7d2a5c-bb2a-4bb6-86fe-9b88154e2bcd {{(pid=67169) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1371.214543] env[67169]: DEBUG nova.compute.manager [req-27617057-c8a3-466a-b64e-4a0d27649236 req-9d9580ce-243d-4a55-86ae-9d45d2529f87 service nova] [instance: 48376572-9e3a-4579-b2d7-b8b63312fab1] Received event network-changed-a3d828c6-2d0e-46c7-8613-63649dfd0ae3 {{(pid=67169) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1371.214830] env[67169]: DEBUG nova.compute.manager [req-27617057-c8a3-466a-b64e-4a0d27649236 req-9d9580ce-243d-4a55-86ae-9d45d2529f87 service nova] [instance: 48376572-9e3a-4579-b2d7-b8b63312fab1] Refreshing instance network info cache due to event network-changed-a3d828c6-2d0e-46c7-8613-63649dfd0ae3. 
{{(pid=67169) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1371.215097] env[67169]: DEBUG oslo_concurrency.lockutils [req-27617057-c8a3-466a-b64e-4a0d27649236 req-9d9580ce-243d-4a55-86ae-9d45d2529f87 service nova] Acquiring lock "refresh_cache-48376572-9e3a-4579-b2d7-b8b63312fab1" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1371.215250] env[67169]: DEBUG oslo_concurrency.lockutils [req-27617057-c8a3-466a-b64e-4a0d27649236 req-9d9580ce-243d-4a55-86ae-9d45d2529f87 service nova] Acquired lock "refresh_cache-48376572-9e3a-4579-b2d7-b8b63312fab1" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1371.215414] env[67169]: DEBUG nova.network.neutron [req-27617057-c8a3-466a-b64e-4a0d27649236 req-9d9580ce-243d-4a55-86ae-9d45d2529f87 service nova] [instance: 48376572-9e3a-4579-b2d7-b8b63312fab1] Refreshing network info cache for port a3d828c6-2d0e-46c7-8613-63649dfd0ae3 {{(pid=67169) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1371.466999] env[67169]: DEBUG nova.network.neutron [req-27617057-c8a3-466a-b64e-4a0d27649236 req-9d9580ce-243d-4a55-86ae-9d45d2529f87 service nova] [instance: 48376572-9e3a-4579-b2d7-b8b63312fab1] Updated VIF entry in instance network info cache for port a3d828c6-2d0e-46c7-8613-63649dfd0ae3. 
{{(pid=67169) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1371.467387] env[67169]: DEBUG nova.network.neutron [req-27617057-c8a3-466a-b64e-4a0d27649236 req-9d9580ce-243d-4a55-86ae-9d45d2529f87 service nova] [instance: 48376572-9e3a-4579-b2d7-b8b63312fab1] Updating instance_info_cache with network_info: [{"id": "a3d828c6-2d0e-46c7-8613-63649dfd0ae3", "address": "fa:16:3e:0f:48:90", "network": {"id": "4e24bc87-3a15-4231-a607-f93bb9122dca", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-93817792-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "3d3275803e654637b85c8f15583e2e25", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "298bb8ef-4765-494c-b157-7a349218bd1e", "external-id": "nsx-vlan-transportzone-905", "segmentation_id": 905, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa3d828c6-2d", "ovs_interfaceid": "a3d828c6-2d0e-46c7-8613-63649dfd0ae3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1371.476965] env[67169]: DEBUG oslo_concurrency.lockutils [req-27617057-c8a3-466a-b64e-4a0d27649236 req-9d9580ce-243d-4a55-86ae-9d45d2529f87 service nova] Releasing lock "refresh_cache-48376572-9e3a-4579-b2d7-b8b63312fab1" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1371.667755] env[67169]: DEBUG nova.network.neutron 
[req-2ae84829-d5b4-4dab-9b4a-c154f03ad3f9 req-56c24396-fefd-4bab-8a0d-0471a25649da service nova] [instance: 74ea66f0-391c-437b-8aee-f784528d7963] Updated VIF entry in instance network info cache for port ec7d2a5c-bb2a-4bb6-86fe-9b88154e2bcd. {{(pid=67169) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1371.668202] env[67169]: DEBUG nova.network.neutron [req-2ae84829-d5b4-4dab-9b4a-c154f03ad3f9 req-56c24396-fefd-4bab-8a0d-0471a25649da service nova] [instance: 74ea66f0-391c-437b-8aee-f784528d7963] Updating instance_info_cache with network_info: [{"id": "ec7d2a5c-bb2a-4bb6-86fe-9b88154e2bcd", "address": "fa:16:3e:ea:7e:2f", "network": {"id": "617508ba-3567-4508-96b5-a01447ece634", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.146", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "c31f6504bb73492890b262ff43fdf9bc", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "c9bc2632-36f9-4912-8782-8bbb789f909d", "external-id": "nsx-vlan-transportzone-897", "segmentation_id": 897, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapec7d2a5c-bb", "ovs_interfaceid": "ec7d2a5c-bb2a-4bb6-86fe-9b88154e2bcd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1371.677647] env[67169]: DEBUG oslo_concurrency.lockutils [req-2ae84829-d5b4-4dab-9b4a-c154f03ad3f9 req-56c24396-fefd-4bab-8a0d-0471a25649da service nova] Releasing lock 
"refresh_cache-74ea66f0-391c-437b-8aee-f784528d7963" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1373.658747] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1373.670715] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1373.671011] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1373.671243] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1373.671416] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67169) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1373.672522] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c9378682-d57f-44db-86e4-273371213dbf {{(pid=67169) 
request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1373.680966] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6ce2f78f-1643-436e-b654-5705961bb3af {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1373.694812] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-315042f0-6194-4981-9046-2ef639c075a8 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1373.700946] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-16440b13-3b17-4fca-8b29-113ee1af5105 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1373.730251] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181030MB free_disk=171GB free_vcpus=48 pci_devices=None {{(pid=67169) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1373.730402] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1373.730591] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1373.803566] 
env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance ceec0dd3-097b-4ab4-8e16-420d40bbe3d5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1373.803737] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance c86c3850-39bb-4a08-8dbf-f69bd8ca21c9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1373.803867] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 7bf839c0-3ec8-4329-823d-de1fae4833cb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1373.803989] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance bab5d630-fec0-44e5-8088-12c8855aad66 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1373.804127] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance a86fa702-2040-4e22-9eaa-5d64bc16f036 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1373.804246] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1373.804381] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 7817b417-599c-4619-8bd3-28d2e8236b9f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1373.804488] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 883a792f-ae72-4475-8592-3076c2c2c2ae actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1373.804614] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 48376572-9e3a-4579-b2d7-b8b63312fab1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1373.804781] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 74ea66f0-391c-437b-8aee-f784528d7963 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1373.816340] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 70ce9280-fb86-4e6a-a824-a174d44b4ec4 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1373.828170] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 7b7c8f84-c2d4-442e-93d3-60124767d096 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1373.838877] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance cbf88ee7-b392-46d5-8645-2b3bea0a53d6 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1373.849022] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 2e156908-c313-4229-840d-13ed8e6d4074 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1373.859461] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 2d7d3386-9854-4bf1-a680-5aed0a2329cb has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1373.859677] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67169) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1373.859825] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67169) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1374.032979] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c289981b-0c5e-49f9-ac58-dddc716a6529 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1374.040647] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4985827f-a361-44c0-beac-d28e395e6ecb {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1374.069817] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1e87f031-e41c-4927-8e90-5db10a826ae5 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1374.076470] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b11faa4d-481a-44b4-95c9-86b24add67b4 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1374.089568] env[67169]: DEBUG nova.compute.provider_tree [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Inventory has 
not changed in ProviderTree for provider: 6570906a-ac37-4859-b1f2-4bbacc48d3f3 {{(pid=67169) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1374.097821] env[67169]: DEBUG nova.scheduler.client.report [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Inventory has not changed for provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 171, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67169) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1374.110438] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67169) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1374.110648] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.380s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1375.112681] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1375.112956] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=67169) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1375.654474] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1375.658149] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1375.658308] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Starting heal instance info cache {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1375.658459] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Rebuilding the list of instances to heal {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1375.678731] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: ceec0dd3-097b-4ab4-8e16-420d40bbe3d5] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1375.678894] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: c86c3850-39bb-4a08-8dbf-f69bd8ca21c9] Skipping network cache update for instance because it is Building. 
{{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1375.679035] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 7bf839c0-3ec8-4329-823d-de1fae4833cb] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1375.679166] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: bab5d630-fec0-44e5-8088-12c8855aad66] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1375.679289] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: a86fa702-2040-4e22-9eaa-5d64bc16f036] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1375.679409] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1375.679539] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 7817b417-599c-4619-8bd3-28d2e8236b9f] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1375.679689] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 883a792f-ae72-4475-8592-3076c2c2c2ae] Skipping network cache update for instance because it is Building. 
{{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1375.679815] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 48376572-9e3a-4579-b2d7-b8b63312fab1] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1375.679933] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 74ea66f0-391c-437b-8aee-f784528d7963] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1375.680066] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Didn't find any instances for network info cache update. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1375.680548] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1376.019012] env[67169]: DEBUG oslo_concurrency.lockutils [None req-8eb9a17a-f61b-4025-b929-3564b452eb9c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Acquiring lock "fa24a4a8-895c-4ea6-8e0a-4ed1134beff0" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1376.019310] env[67169]: DEBUG oslo_concurrency.lockutils [None req-8eb9a17a-f61b-4025-b929-3564b452eb9c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Lock "fa24a4a8-895c-4ea6-8e0a-4ed1134beff0" acquired by 
"nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1376.281752] env[67169]: DEBUG oslo_concurrency.lockutils [None req-faee3827-0c6f-4db2-a881-8bf0ff06a871 tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Acquiring lock "48376572-9e3a-4579-b2d7-b8b63312fab1" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1376.658706] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1377.658395] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1378.654063] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1378.680031] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1389.384560] env[67169]: DEBUG oslo_concurrency.lockutils [None req-f8f562ca-4368-425e-8b80-e5b3c615553d 
tempest-ServerExternalEventsTest-1384768552 tempest-ServerExternalEventsTest-1384768552-project-member] Acquiring lock "74ea66f0-391c-437b-8aee-f784528d7963" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1402.723555] env[67169]: DEBUG oslo_concurrency.lockutils [None req-5b0ce0d8-354b-4a12-bc67-271d95a08bb6 tempest-AttachVolumeTestJSON-1669563252 tempest-AttachVolumeTestJSON-1669563252-project-member] Acquiring lock "5c675d7d-8915-4962-8bbd-c9b639ae2cb1" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1402.723555] env[67169]: DEBUG oslo_concurrency.lockutils [None req-5b0ce0d8-354b-4a12-bc67-271d95a08bb6 tempest-AttachVolumeTestJSON-1669563252 tempest-AttachVolumeTestJSON-1669563252-project-member] Lock "5c675d7d-8915-4962-8bbd-c9b639ae2cb1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1414.321390] env[67169]: WARNING oslo_vmware.rw_handles [None req-ce2d0787-afc0-4f58-9a98-4b3cd03f7ccd tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1414.321390] env[67169]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1414.321390] env[67169]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1414.321390] env[67169]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1414.321390] env[67169]: ERROR 
oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1414.321390] env[67169]: ERROR oslo_vmware.rw_handles response.begin() [ 1414.321390] env[67169]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1414.321390] env[67169]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1414.321390] env[67169]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1414.321390] env[67169]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1414.321390] env[67169]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1414.321390] env[67169]: ERROR oslo_vmware.rw_handles [ 1414.322114] env[67169]: DEBUG nova.virt.vmwareapi.images [None req-ce2d0787-afc0-4f58-9a98-4b3cd03f7ccd tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: ceec0dd3-097b-4ab4-8e16-420d40bbe3d5] Downloaded image file data 285931c9-8b83-4997-8c4d-6a79005e36ba to vmware_temp/e006ccee-ffff-4ce7-8ccf-5d485938bc54/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk on the data store datastore2 {{(pid=67169) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1414.324104] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-ce2d0787-afc0-4f58-9a98-4b3cd03f7ccd tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: ceec0dd3-097b-4ab4-8e16-420d40bbe3d5] Caching image {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1414.324413] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [None req-ce2d0787-afc0-4f58-9a98-4b3cd03f7ccd tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Copying Virtual Disk [datastore2] 
vmware_temp/e006ccee-ffff-4ce7-8ccf-5d485938bc54/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk to [datastore2] vmware_temp/e006ccee-ffff-4ce7-8ccf-5d485938bc54/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk {{(pid=67169) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1414.324708] env[67169]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-d56aa000-6041-406f-8bf6-77568e688324 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1414.332766] env[67169]: DEBUG oslo_vmware.api [None req-ce2d0787-afc0-4f58-9a98-4b3cd03f7ccd tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Waiting for the task: (returnval){ [ 1414.332766] env[67169]: value = "task-2819196" [ 1414.332766] env[67169]: _type = "Task" [ 1414.332766] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1414.340197] env[67169]: DEBUG oslo_vmware.api [None req-ce2d0787-afc0-4f58-9a98-4b3cd03f7ccd tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Task: {'id': task-2819196, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1414.843510] env[67169]: DEBUG oslo_vmware.exceptions [None req-ce2d0787-afc0-4f58-9a98-4b3cd03f7ccd tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Fault InvalidArgument not matched. 
{{(pid=67169) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1414.843796] env[67169]: DEBUG oslo_concurrency.lockutils [None req-ce2d0787-afc0-4f58-9a98-4b3cd03f7ccd tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Releasing lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1414.844357] env[67169]: ERROR nova.compute.manager [None req-ce2d0787-afc0-4f58-9a98-4b3cd03f7ccd tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: ceec0dd3-097b-4ab4-8e16-420d40bbe3d5] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1414.844357] env[67169]: Faults: ['InvalidArgument'] [ 1414.844357] env[67169]: ERROR nova.compute.manager [instance: ceec0dd3-097b-4ab4-8e16-420d40bbe3d5] Traceback (most recent call last): [ 1414.844357] env[67169]: ERROR nova.compute.manager [instance: ceec0dd3-097b-4ab4-8e16-420d40bbe3d5] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1414.844357] env[67169]: ERROR nova.compute.manager [instance: ceec0dd3-097b-4ab4-8e16-420d40bbe3d5] yield resources [ 1414.844357] env[67169]: ERROR nova.compute.manager [instance: ceec0dd3-097b-4ab4-8e16-420d40bbe3d5] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1414.844357] env[67169]: ERROR nova.compute.manager [instance: ceec0dd3-097b-4ab4-8e16-420d40bbe3d5] self.driver.spawn(context, instance, image_meta, [ 1414.844357] env[67169]: ERROR nova.compute.manager [instance: ceec0dd3-097b-4ab4-8e16-420d40bbe3d5] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1414.844357] env[67169]: ERROR nova.compute.manager 
[instance: ceec0dd3-097b-4ab4-8e16-420d40bbe3d5] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1414.844357] env[67169]: ERROR nova.compute.manager [instance: ceec0dd3-097b-4ab4-8e16-420d40bbe3d5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1414.844357] env[67169]: ERROR nova.compute.manager [instance: ceec0dd3-097b-4ab4-8e16-420d40bbe3d5] self._fetch_image_if_missing(context, vi) [ 1414.844357] env[67169]: ERROR nova.compute.manager [instance: ceec0dd3-097b-4ab4-8e16-420d40bbe3d5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1414.844357] env[67169]: ERROR nova.compute.manager [instance: ceec0dd3-097b-4ab4-8e16-420d40bbe3d5] image_cache(vi, tmp_image_ds_loc) [ 1414.844357] env[67169]: ERROR nova.compute.manager [instance: ceec0dd3-097b-4ab4-8e16-420d40bbe3d5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1414.844357] env[67169]: ERROR nova.compute.manager [instance: ceec0dd3-097b-4ab4-8e16-420d40bbe3d5] vm_util.copy_virtual_disk( [ 1414.844357] env[67169]: ERROR nova.compute.manager [instance: ceec0dd3-097b-4ab4-8e16-420d40bbe3d5] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1414.844357] env[67169]: ERROR nova.compute.manager [instance: ceec0dd3-097b-4ab4-8e16-420d40bbe3d5] session._wait_for_task(vmdk_copy_task) [ 1414.844357] env[67169]: ERROR nova.compute.manager [instance: ceec0dd3-097b-4ab4-8e16-420d40bbe3d5] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1414.844357] env[67169]: ERROR nova.compute.manager [instance: ceec0dd3-097b-4ab4-8e16-420d40bbe3d5] return self.wait_for_task(task_ref) [ 1414.844357] env[67169]: ERROR nova.compute.manager [instance: ceec0dd3-097b-4ab4-8e16-420d40bbe3d5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1414.844357] env[67169]: ERROR 
nova.compute.manager [instance: ceec0dd3-097b-4ab4-8e16-420d40bbe3d5] return evt.wait() [ 1414.844357] env[67169]: ERROR nova.compute.manager [instance: ceec0dd3-097b-4ab4-8e16-420d40bbe3d5] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1414.844357] env[67169]: ERROR nova.compute.manager [instance: ceec0dd3-097b-4ab4-8e16-420d40bbe3d5] result = hub.switch() [ 1414.844357] env[67169]: ERROR nova.compute.manager [instance: ceec0dd3-097b-4ab4-8e16-420d40bbe3d5] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1414.844357] env[67169]: ERROR nova.compute.manager [instance: ceec0dd3-097b-4ab4-8e16-420d40bbe3d5] return self.greenlet.switch() [ 1414.844357] env[67169]: ERROR nova.compute.manager [instance: ceec0dd3-097b-4ab4-8e16-420d40bbe3d5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1414.844357] env[67169]: ERROR nova.compute.manager [instance: ceec0dd3-097b-4ab4-8e16-420d40bbe3d5] self.f(*self.args, **self.kw) [ 1414.844357] env[67169]: ERROR nova.compute.manager [instance: ceec0dd3-097b-4ab4-8e16-420d40bbe3d5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1414.844357] env[67169]: ERROR nova.compute.manager [instance: ceec0dd3-097b-4ab4-8e16-420d40bbe3d5] raise exceptions.translate_fault(task_info.error) [ 1414.844357] env[67169]: ERROR nova.compute.manager [instance: ceec0dd3-097b-4ab4-8e16-420d40bbe3d5] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1414.844357] env[67169]: ERROR nova.compute.manager [instance: ceec0dd3-097b-4ab4-8e16-420d40bbe3d5] Faults: ['InvalidArgument'] [ 1414.844357] env[67169]: ERROR nova.compute.manager [instance: ceec0dd3-097b-4ab4-8e16-420d40bbe3d5] [ 1414.845308] env[67169]: INFO nova.compute.manager [None req-ce2d0787-afc0-4f58-9a98-4b3cd03f7ccd 
tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: ceec0dd3-097b-4ab4-8e16-420d40bbe3d5] Terminating instance [ 1414.846318] env[67169]: DEBUG oslo_concurrency.lockutils [None req-d696ce3a-62d6-4cfe-ab03-acc4f9294f1b tempest-ServerShowV257Test-257580876 tempest-ServerShowV257Test-257580876-project-member] Acquired lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1414.846528] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-d696ce3a-62d6-4cfe-ab03-acc4f9294f1b tempest-ServerShowV257Test-257580876 tempest-ServerShowV257Test-257580876-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1414.846761] env[67169]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-8f529b85-b07b-4e76-ac72-06e0ad5bf17d {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1414.848891] env[67169]: DEBUG nova.compute.manager [None req-ce2d0787-afc0-4f58-9a98-4b3cd03f7ccd tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: ceec0dd3-097b-4ab4-8e16-420d40bbe3d5] Start destroying the instance on the hypervisor. 
{{(pid=67169) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1414.849104] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-ce2d0787-afc0-4f58-9a98-4b3cd03f7ccd tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: ceec0dd3-097b-4ab4-8e16-420d40bbe3d5] Destroying instance {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1414.849806] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7e2d1fb8-ab34-40fe-96d3-69b456bc1371 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1414.856380] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-ce2d0787-afc0-4f58-9a98-4b3cd03f7ccd tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: ceec0dd3-097b-4ab4-8e16-420d40bbe3d5] Unregistering the VM {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1414.856589] env[67169]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-0e1ca504-83bd-44a0-8441-8d367db23c78 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1414.858672] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-d696ce3a-62d6-4cfe-ab03-acc4f9294f1b tempest-ServerShowV257Test-257580876 tempest-ServerShowV257Test-257580876-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1414.858834] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-d696ce3a-62d6-4cfe-ab03-acc4f9294f1b tempest-ServerShowV257Test-257580876 tempest-ServerShowV257Test-257580876-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=67169) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1414.859761] env[67169]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-c500488e-67f7-4d07-99ee-a8e6ec67945c {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1414.864636] env[67169]: DEBUG oslo_vmware.api [None req-d696ce3a-62d6-4cfe-ab03-acc4f9294f1b tempest-ServerShowV257Test-257580876 tempest-ServerShowV257Test-257580876-project-member] Waiting for the task: (returnval){ [ 1414.864636] env[67169]: value = "session[52518c09-e5f2-035d-4dc1-3e29ddf94015]52ce28eb-c70b-c6ad-2e0e-72273c78c19a" [ 1414.864636] env[67169]: _type = "Task" [ 1414.864636] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1414.878286] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-d696ce3a-62d6-4cfe-ab03-acc4f9294f1b tempest-ServerShowV257Test-257580876 tempest-ServerShowV257Test-257580876-project-member] [instance: c86c3850-39bb-4a08-8dbf-f69bd8ca21c9] Preparing fetch location {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1414.878518] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-d696ce3a-62d6-4cfe-ab03-acc4f9294f1b tempest-ServerShowV257Test-257580876 tempest-ServerShowV257Test-257580876-project-member] Creating directory with path [datastore2] vmware_temp/22bd3a03-8a46-40f8-9505-8cc8e4d23368/285931c9-8b83-4997-8c4d-6a79005e36ba {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1414.878723] env[67169]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-31ea9abe-8b75-4bbf-add4-8cd3852e365e {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1414.897571] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None 
req-d696ce3a-62d6-4cfe-ab03-acc4f9294f1b tempest-ServerShowV257Test-257580876 tempest-ServerShowV257Test-257580876-project-member] Created directory with path [datastore2] vmware_temp/22bd3a03-8a46-40f8-9505-8cc8e4d23368/285931c9-8b83-4997-8c4d-6a79005e36ba {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1414.897807] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-d696ce3a-62d6-4cfe-ab03-acc4f9294f1b tempest-ServerShowV257Test-257580876 tempest-ServerShowV257Test-257580876-project-member] [instance: c86c3850-39bb-4a08-8dbf-f69bd8ca21c9] Fetch image to [datastore2] vmware_temp/22bd3a03-8a46-40f8-9505-8cc8e4d23368/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1414.897987] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-d696ce3a-62d6-4cfe-ab03-acc4f9294f1b tempest-ServerShowV257Test-257580876 tempest-ServerShowV257Test-257580876-project-member] [instance: c86c3850-39bb-4a08-8dbf-f69bd8ca21c9] Downloading image file data 285931c9-8b83-4997-8c4d-6a79005e36ba to [datastore2] vmware_temp/22bd3a03-8a46-40f8-9505-8cc8e4d23368/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk on the data store datastore2 {{(pid=67169) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1414.898763] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c2773f37-a74e-4140-ae92-edc96317bef0 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1414.905789] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6bbe92ae-395b-4c43-a393-14001ac15910 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1414.915389] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-2a0fe612-68b4-4165-abf1-5e987b452547 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1414.922864] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-ce2d0787-afc0-4f58-9a98-4b3cd03f7ccd tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: ceec0dd3-097b-4ab4-8e16-420d40bbe3d5] Unregistered the VM {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1414.923074] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-ce2d0787-afc0-4f58-9a98-4b3cd03f7ccd tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: ceec0dd3-097b-4ab4-8e16-420d40bbe3d5] Deleting contents of the VM from datastore datastore2 {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1414.923278] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-ce2d0787-afc0-4f58-9a98-4b3cd03f7ccd tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Deleting the datastore file [datastore2] ceec0dd3-097b-4ab4-8e16-420d40bbe3d5 {{(pid=67169) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1414.947831] env[67169]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-e3bf83ff-ab5b-4a9c-bcab-55422df5b070 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1414.950381] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-548cdd20-eab3-4385-b0f6-67ec26316876 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1414.956274] env[67169]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with 
opID=oslo.vmware-9ae5703c-f348-46f8-a7bb-0e8c7c3eba1f {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1414.958800] env[67169]: DEBUG oslo_vmware.api [None req-ce2d0787-afc0-4f58-9a98-4b3cd03f7ccd tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Waiting for the task: (returnval){ [ 1414.958800] env[67169]: value = "task-2819198" [ 1414.958800] env[67169]: _type = "Task" [ 1414.958800] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1414.966814] env[67169]: DEBUG oslo_vmware.api [None req-ce2d0787-afc0-4f58-9a98-4b3cd03f7ccd tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Task: {'id': task-2819198, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1414.984787] env[67169]: DEBUG nova.virt.vmwareapi.images [None req-d696ce3a-62d6-4cfe-ab03-acc4f9294f1b tempest-ServerShowV257Test-257580876 tempest-ServerShowV257Test-257580876-project-member] [instance: c86c3850-39bb-4a08-8dbf-f69bd8ca21c9] Downloading image file data 285931c9-8b83-4997-8c4d-6a79005e36ba to the data store datastore2 {{(pid=67169) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1415.036819] env[67169]: DEBUG oslo_vmware.rw_handles [None req-d696ce3a-62d6-4cfe-ab03-acc4f9294f1b tempest-ServerShowV257Test-257580876 tempest-ServerShowV257Test-257580876-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/22bd3a03-8a46-40f8-9505-8cc8e4d23368/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=67169) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1415.096072] env[67169]: DEBUG oslo_vmware.rw_handles [None req-d696ce3a-62d6-4cfe-ab03-acc4f9294f1b tempest-ServerShowV257Test-257580876 tempest-ServerShowV257Test-257580876-project-member] Completed reading data from the image iterator. {{(pid=67169) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1415.096072] env[67169]: DEBUG oslo_vmware.rw_handles [None req-d696ce3a-62d6-4cfe-ab03-acc4f9294f1b tempest-ServerShowV257Test-257580876 tempest-ServerShowV257Test-257580876-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/22bd3a03-8a46-40f8-9505-8cc8e4d23368/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=67169) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1415.469236] env[67169]: DEBUG oslo_vmware.api [None req-ce2d0787-afc0-4f58-9a98-4b3cd03f7ccd tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Task: {'id': task-2819198, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.068889} completed successfully. 
{{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1415.469550] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-ce2d0787-afc0-4f58-9a98-4b3cd03f7ccd tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Deleted the datastore file {{(pid=67169) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1415.469677] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-ce2d0787-afc0-4f58-9a98-4b3cd03f7ccd tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: ceec0dd3-097b-4ab4-8e16-420d40bbe3d5] Deleted contents of the VM from datastore datastore2 {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1415.469846] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-ce2d0787-afc0-4f58-9a98-4b3cd03f7ccd tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: ceec0dd3-097b-4ab4-8e16-420d40bbe3d5] Instance destroyed {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1415.470020] env[67169]: INFO nova.compute.manager [None req-ce2d0787-afc0-4f58-9a98-4b3cd03f7ccd tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: ceec0dd3-097b-4ab4-8e16-420d40bbe3d5] Took 0.62 seconds to destroy the instance on the hypervisor. 
[ 1415.472053] env[67169]: DEBUG nova.compute.claims [None req-ce2d0787-afc0-4f58-9a98-4b3cd03f7ccd tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: ceec0dd3-097b-4ab4-8e16-420d40bbe3d5] Aborting claim: {{(pid=67169) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1415.472231] env[67169]: DEBUG oslo_concurrency.lockutils [None req-ce2d0787-afc0-4f58-9a98-4b3cd03f7ccd tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1415.472440] env[67169]: DEBUG oslo_concurrency.lockutils [None req-ce2d0787-afc0-4f58-9a98-4b3cd03f7ccd tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1415.697603] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1ddd3666-6607-4bcc-9488-576f2309d7ca {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1415.705370] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c7e72f89-38f5-42c1-a128-d256909302bb {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1415.734423] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8bece856-b805-4f79-a7ec-bf67f81da1dd {{(pid=67169) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1415.741523] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d26b49eb-203a-4bf4-800d-a00494321833 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1415.754226] env[67169]: DEBUG nova.compute.provider_tree [None req-ce2d0787-afc0-4f58-9a98-4b3cd03f7ccd tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Inventory has not changed in ProviderTree for provider: 6570906a-ac37-4859-b1f2-4bbacc48d3f3 {{(pid=67169) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1415.762238] env[67169]: DEBUG nova.scheduler.client.report [None req-ce2d0787-afc0-4f58-9a98-4b3cd03f7ccd tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Inventory has not changed for provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 171, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67169) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1415.778562] env[67169]: DEBUG oslo_concurrency.lockutils [None req-ce2d0787-afc0-4f58-9a98-4b3cd03f7ccd tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.306s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1415.779129] env[67169]: ERROR nova.compute.manager [None 
req-ce2d0787-afc0-4f58-9a98-4b3cd03f7ccd tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: ceec0dd3-097b-4ab4-8e16-420d40bbe3d5] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1415.779129] env[67169]: Faults: ['InvalidArgument'] [ 1415.779129] env[67169]: ERROR nova.compute.manager [instance: ceec0dd3-097b-4ab4-8e16-420d40bbe3d5] Traceback (most recent call last): [ 1415.779129] env[67169]: ERROR nova.compute.manager [instance: ceec0dd3-097b-4ab4-8e16-420d40bbe3d5] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1415.779129] env[67169]: ERROR nova.compute.manager [instance: ceec0dd3-097b-4ab4-8e16-420d40bbe3d5] self.driver.spawn(context, instance, image_meta, [ 1415.779129] env[67169]: ERROR nova.compute.manager [instance: ceec0dd3-097b-4ab4-8e16-420d40bbe3d5] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1415.779129] env[67169]: ERROR nova.compute.manager [instance: ceec0dd3-097b-4ab4-8e16-420d40bbe3d5] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1415.779129] env[67169]: ERROR nova.compute.manager [instance: ceec0dd3-097b-4ab4-8e16-420d40bbe3d5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1415.779129] env[67169]: ERROR nova.compute.manager [instance: ceec0dd3-097b-4ab4-8e16-420d40bbe3d5] self._fetch_image_if_missing(context, vi) [ 1415.779129] env[67169]: ERROR nova.compute.manager [instance: ceec0dd3-097b-4ab4-8e16-420d40bbe3d5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1415.779129] env[67169]: ERROR nova.compute.manager [instance: ceec0dd3-097b-4ab4-8e16-420d40bbe3d5] image_cache(vi, tmp_image_ds_loc) [ 1415.779129] env[67169]: ERROR nova.compute.manager [instance: ceec0dd3-097b-4ab4-8e16-420d40bbe3d5] File 
"/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1415.779129] env[67169]: ERROR nova.compute.manager [instance: ceec0dd3-097b-4ab4-8e16-420d40bbe3d5] vm_util.copy_virtual_disk( [ 1415.779129] env[67169]: ERROR nova.compute.manager [instance: ceec0dd3-097b-4ab4-8e16-420d40bbe3d5] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1415.779129] env[67169]: ERROR nova.compute.manager [instance: ceec0dd3-097b-4ab4-8e16-420d40bbe3d5] session._wait_for_task(vmdk_copy_task) [ 1415.779129] env[67169]: ERROR nova.compute.manager [instance: ceec0dd3-097b-4ab4-8e16-420d40bbe3d5] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1415.779129] env[67169]: ERROR nova.compute.manager [instance: ceec0dd3-097b-4ab4-8e16-420d40bbe3d5] return self.wait_for_task(task_ref) [ 1415.779129] env[67169]: ERROR nova.compute.manager [instance: ceec0dd3-097b-4ab4-8e16-420d40bbe3d5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1415.779129] env[67169]: ERROR nova.compute.manager [instance: ceec0dd3-097b-4ab4-8e16-420d40bbe3d5] return evt.wait() [ 1415.779129] env[67169]: ERROR nova.compute.manager [instance: ceec0dd3-097b-4ab4-8e16-420d40bbe3d5] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1415.779129] env[67169]: ERROR nova.compute.manager [instance: ceec0dd3-097b-4ab4-8e16-420d40bbe3d5] result = hub.switch() [ 1415.779129] env[67169]: ERROR nova.compute.manager [instance: ceec0dd3-097b-4ab4-8e16-420d40bbe3d5] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1415.779129] env[67169]: ERROR nova.compute.manager [instance: ceec0dd3-097b-4ab4-8e16-420d40bbe3d5] return self.greenlet.switch() [ 1415.779129] env[67169]: ERROR nova.compute.manager [instance: ceec0dd3-097b-4ab4-8e16-420d40bbe3d5] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1415.779129] env[67169]: ERROR nova.compute.manager [instance: ceec0dd3-097b-4ab4-8e16-420d40bbe3d5] self.f(*self.args, **self.kw) [ 1415.779129] env[67169]: ERROR nova.compute.manager [instance: ceec0dd3-097b-4ab4-8e16-420d40bbe3d5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1415.779129] env[67169]: ERROR nova.compute.manager [instance: ceec0dd3-097b-4ab4-8e16-420d40bbe3d5] raise exceptions.translate_fault(task_info.error) [ 1415.779129] env[67169]: ERROR nova.compute.manager [instance: ceec0dd3-097b-4ab4-8e16-420d40bbe3d5] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1415.779129] env[67169]: ERROR nova.compute.manager [instance: ceec0dd3-097b-4ab4-8e16-420d40bbe3d5] Faults: ['InvalidArgument'] [ 1415.779129] env[67169]: ERROR nova.compute.manager [instance: ceec0dd3-097b-4ab4-8e16-420d40bbe3d5] [ 1415.779839] env[67169]: DEBUG nova.compute.utils [None req-ce2d0787-afc0-4f58-9a98-4b3cd03f7ccd tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: ceec0dd3-097b-4ab4-8e16-420d40bbe3d5] VimFaultException {{(pid=67169) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1415.781199] env[67169]: DEBUG nova.compute.manager [None req-ce2d0787-afc0-4f58-9a98-4b3cd03f7ccd tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: ceec0dd3-097b-4ab4-8e16-420d40bbe3d5] Build of instance ceec0dd3-097b-4ab4-8e16-420d40bbe3d5 was re-scheduled: A specified parameter was not correct: fileType [ 1415.781199] env[67169]: Faults: ['InvalidArgument'] {{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1415.781573] env[67169]: DEBUG nova.compute.manager [None req-ce2d0787-afc0-4f58-9a98-4b3cd03f7ccd 
tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: ceec0dd3-097b-4ab4-8e16-420d40bbe3d5] Unplugging VIFs for instance {{(pid=67169) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1415.781743] env[67169]: DEBUG nova.compute.manager [None req-ce2d0787-afc0-4f58-9a98-4b3cd03f7ccd tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=67169) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1415.781909] env[67169]: DEBUG nova.compute.manager [None req-ce2d0787-afc0-4f58-9a98-4b3cd03f7ccd tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: ceec0dd3-097b-4ab4-8e16-420d40bbe3d5] Deallocating network for instance {{(pid=67169) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1415.782092] env[67169]: DEBUG nova.network.neutron [None req-ce2d0787-afc0-4f58-9a98-4b3cd03f7ccd tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: ceec0dd3-097b-4ab4-8e16-420d40bbe3d5] deallocate_for_instance() {{(pid=67169) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1416.045228] env[67169]: DEBUG nova.network.neutron [None req-ce2d0787-afc0-4f58-9a98-4b3cd03f7ccd tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: ceec0dd3-097b-4ab4-8e16-420d40bbe3d5] Updating instance_info_cache with network_info: [] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1416.060477] env[67169]: INFO nova.compute.manager [None req-ce2d0787-afc0-4f58-9a98-4b3cd03f7ccd tempest-AttachInterfacesTestJSON-1789409843 
tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: ceec0dd3-097b-4ab4-8e16-420d40bbe3d5] Took 0.28 seconds to deallocate network for instance. [ 1416.163209] env[67169]: INFO nova.scheduler.client.report [None req-ce2d0787-afc0-4f58-9a98-4b3cd03f7ccd tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Deleted allocations for instance ceec0dd3-097b-4ab4-8e16-420d40bbe3d5 [ 1416.184143] env[67169]: DEBUG oslo_concurrency.lockutils [None req-ce2d0787-afc0-4f58-9a98-4b3cd03f7ccd tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Lock "ceec0dd3-097b-4ab4-8e16-420d40bbe3d5" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 630.355s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1416.185474] env[67169]: DEBUG oslo_concurrency.lockutils [None req-6fed50b9-66de-4f8f-885d-5a98660d29fa tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Lock "ceec0dd3-097b-4ab4-8e16-420d40bbe3d5" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 434.531s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1416.185710] env[67169]: DEBUG oslo_concurrency.lockutils [None req-6fed50b9-66de-4f8f-885d-5a98660d29fa tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Acquiring lock "ceec0dd3-097b-4ab4-8e16-420d40bbe3d5-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1416.185958] env[67169]: DEBUG oslo_concurrency.lockutils [None req-6fed50b9-66de-4f8f-885d-5a98660d29fa 
tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Lock "ceec0dd3-097b-4ab4-8e16-420d40bbe3d5-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1416.186159] env[67169]: DEBUG oslo_concurrency.lockutils [None req-6fed50b9-66de-4f8f-885d-5a98660d29fa tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Lock "ceec0dd3-097b-4ab4-8e16-420d40bbe3d5-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1416.188232] env[67169]: INFO nova.compute.manager [None req-6fed50b9-66de-4f8f-885d-5a98660d29fa tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: ceec0dd3-097b-4ab4-8e16-420d40bbe3d5] Terminating instance [ 1416.190211] env[67169]: DEBUG nova.compute.manager [None req-6fed50b9-66de-4f8f-885d-5a98660d29fa tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: ceec0dd3-097b-4ab4-8e16-420d40bbe3d5] Start destroying the instance on the hypervisor. 
{{(pid=67169) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1416.190408] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-6fed50b9-66de-4f8f-885d-5a98660d29fa tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: ceec0dd3-097b-4ab4-8e16-420d40bbe3d5] Destroying instance {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1416.190670] env[67169]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-46a8a40a-b519-48cc-81b2-b6b662e6c4ec {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1416.198476] env[67169]: DEBUG nova.compute.manager [None req-9969fe74-4992-49fc-a69d-23e6586f88e7 tempest-AttachVolumeTestJSON-1669563252 tempest-AttachVolumeTestJSON-1669563252-project-member] [instance: 70ce9280-fb86-4e6a-a824-a174d44b4ec4] Starting instance... {{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1416.204212] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1ef76fcf-33a3-48fc-b5bd-a588a4473c47 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1416.222617] env[67169]: DEBUG nova.compute.manager [None req-9969fe74-4992-49fc-a69d-23e6586f88e7 tempest-AttachVolumeTestJSON-1669563252 tempest-AttachVolumeTestJSON-1669563252-project-member] [instance: 70ce9280-fb86-4e6a-a824-a174d44b4ec4] Instance disappeared before build. 
{{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1416.235368] env[67169]: WARNING nova.virt.vmwareapi.vmops [None req-6fed50b9-66de-4f8f-885d-5a98660d29fa tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: ceec0dd3-097b-4ab4-8e16-420d40bbe3d5] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance ceec0dd3-097b-4ab4-8e16-420d40bbe3d5 could not be found. [ 1416.235578] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-6fed50b9-66de-4f8f-885d-5a98660d29fa tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: ceec0dd3-097b-4ab4-8e16-420d40bbe3d5] Instance destroyed {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1416.235764] env[67169]: INFO nova.compute.manager [None req-6fed50b9-66de-4f8f-885d-5a98660d29fa tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: ceec0dd3-097b-4ab4-8e16-420d40bbe3d5] Took 0.05 seconds to destroy the instance on the hypervisor. [ 1416.236016] env[67169]: DEBUG oslo.service.loopingcall [None req-6fed50b9-66de-4f8f-885d-5a98660d29fa tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=67169) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1416.236305] env[67169]: DEBUG nova.compute.manager [-] [instance: ceec0dd3-097b-4ab4-8e16-420d40bbe3d5] Deallocating network for instance {{(pid=67169) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1416.236409] env[67169]: DEBUG nova.network.neutron [-] [instance: ceec0dd3-097b-4ab4-8e16-420d40bbe3d5] deallocate_for_instance() {{(pid=67169) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1416.252431] env[67169]: DEBUG oslo_concurrency.lockutils [None req-9969fe74-4992-49fc-a69d-23e6586f88e7 tempest-AttachVolumeTestJSON-1669563252 tempest-AttachVolumeTestJSON-1669563252-project-member] Lock "70ce9280-fb86-4e6a-a824-a174d44b4ec4" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 212.815s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1416.262250] env[67169]: DEBUG nova.compute.manager [None req-2bf3268b-6d3a-4277-831a-07b32245375f tempest-ServersTestMultiNic-730437419 tempest-ServersTestMultiNic-730437419-project-member] [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] Starting instance... {{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1416.274206] env[67169]: DEBUG nova.network.neutron [-] [instance: ceec0dd3-097b-4ab4-8e16-420d40bbe3d5] Updating instance_info_cache with network_info: [] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1416.283063] env[67169]: INFO nova.compute.manager [-] [instance: ceec0dd3-097b-4ab4-8e16-420d40bbe3d5] Took 0.05 seconds to deallocate network for instance. 
[ 1416.315426] env[67169]: DEBUG oslo_concurrency.lockutils [None req-2bf3268b-6d3a-4277-831a-07b32245375f tempest-ServersTestMultiNic-730437419 tempest-ServersTestMultiNic-730437419-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1416.315697] env[67169]: DEBUG oslo_concurrency.lockutils [None req-2bf3268b-6d3a-4277-831a-07b32245375f tempest-ServersTestMultiNic-730437419 tempest-ServersTestMultiNic-730437419-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1416.317156] env[67169]: INFO nova.compute.claims [None req-2bf3268b-6d3a-4277-831a-07b32245375f tempest-ServersTestMultiNic-730437419 tempest-ServersTestMultiNic-730437419-project-member] [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1416.381015] env[67169]: DEBUG oslo_concurrency.lockutils [None req-6fed50b9-66de-4f8f-885d-5a98660d29fa tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Lock "ceec0dd3-097b-4ab4-8e16-420d40bbe3d5" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.196s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1416.381863] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "ceec0dd3-097b-4ab4-8e16-420d40bbe3d5" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 230.681s {{(pid=67169) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1416.382181] env[67169]: INFO nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: ceec0dd3-097b-4ab4-8e16-420d40bbe3d5] During sync_power_state the instance has a pending task (deleting). Skip. [ 1416.382266] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "ceec0dd3-097b-4ab4-8e16-420d40bbe3d5" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1416.526651] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-70534e3a-35e7-41c6-8640-8cbcd59767dd {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1416.534377] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bfd10553-e249-4c20-8d86-6e443c447f47 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1416.563692] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eb53b125-43b8-4cb5-bde8-030f2b520fcf {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1416.571203] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-71bb2afd-733e-4d7a-bd98-73547df87bc8 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1416.584207] env[67169]: DEBUG nova.compute.provider_tree [None req-2bf3268b-6d3a-4277-831a-07b32245375f tempest-ServersTestMultiNic-730437419 
tempest-ServersTestMultiNic-730437419-project-member] Inventory has not changed in ProviderTree for provider: 6570906a-ac37-4859-b1f2-4bbacc48d3f3 {{(pid=67169) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1416.592906] env[67169]: DEBUG nova.scheduler.client.report [None req-2bf3268b-6d3a-4277-831a-07b32245375f tempest-ServersTestMultiNic-730437419 tempest-ServersTestMultiNic-730437419-project-member] Inventory has not changed for provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 171, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67169) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1416.611071] env[67169]: DEBUG oslo_concurrency.lockutils [None req-2bf3268b-6d3a-4277-831a-07b32245375f tempest-ServersTestMultiNic-730437419 tempest-ServersTestMultiNic-730437419-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.295s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1416.611590] env[67169]: DEBUG nova.compute.manager [None req-2bf3268b-6d3a-4277-831a-07b32245375f tempest-ServersTestMultiNic-730437419 tempest-ServersTestMultiNic-730437419-project-member] [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] Start building networks asynchronously for instance. 
{{(pid=67169) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1416.644081] env[67169]: DEBUG nova.compute.utils [None req-2bf3268b-6d3a-4277-831a-07b32245375f tempest-ServersTestMultiNic-730437419 tempest-ServersTestMultiNic-730437419-project-member] Using /dev/sd instead of None {{(pid=67169) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1416.645653] env[67169]: DEBUG nova.compute.manager [None req-2bf3268b-6d3a-4277-831a-07b32245375f tempest-ServersTestMultiNic-730437419 tempest-ServersTestMultiNic-730437419-project-member] [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] Allocating IP information in the background. {{(pid=67169) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1416.645847] env[67169]: DEBUG nova.network.neutron [None req-2bf3268b-6d3a-4277-831a-07b32245375f tempest-ServersTestMultiNic-730437419 tempest-ServersTestMultiNic-730437419-project-member] [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] allocate_for_instance() {{(pid=67169) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1416.655183] env[67169]: DEBUG nova.compute.manager [None req-2bf3268b-6d3a-4277-831a-07b32245375f tempest-ServersTestMultiNic-730437419 tempest-ServersTestMultiNic-730437419-project-member] [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] Start building block device mappings for instance. {{(pid=67169) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1416.718956] env[67169]: DEBUG nova.compute.manager [None req-2bf3268b-6d3a-4277-831a-07b32245375f tempest-ServersTestMultiNic-730437419 tempest-ServersTestMultiNic-730437419-project-member] [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] Start spawning the instance on the hypervisor. 
{{(pid=67169) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1416.740686] env[67169]: DEBUG nova.policy [None req-2bf3268b-6d3a-4277-831a-07b32245375f tempest-ServersTestMultiNic-730437419 tempest-ServersTestMultiNic-730437419-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'f922f7300dcc40c9bdb933bceedfd2ae', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0938eff566ed4ab5afe3dfed02b60aaf', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67169) authorize /opt/stack/nova/nova/policy.py:203}} [ 1416.743999] env[67169]: DEBUG nova.virt.hardware [None req-2bf3268b-6d3a-4277-831a-07b32245375f tempest-ServersTestMultiNic-730437419 tempest-ServersTestMultiNic-730437419-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-03-06T19:55:10Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-03-06T19:54:55Z,direct_url=,disk_format='vmdk',id=285931c9-8b83-4997-8c4d-6a79005e36ba,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c31f6504bb73492890b262ff43fdf9bc',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-03-06T19:54:55Z,virtual_size=,visibility=), allow threads: False {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1416.744242] env[67169]: DEBUG nova.virt.hardware [None req-2bf3268b-6d3a-4277-831a-07b32245375f 
tempest-ServersTestMultiNic-730437419 tempest-ServersTestMultiNic-730437419-project-member] Flavor limits 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1416.744403] env[67169]: DEBUG nova.virt.hardware [None req-2bf3268b-6d3a-4277-831a-07b32245375f tempest-ServersTestMultiNic-730437419 tempest-ServersTestMultiNic-730437419-project-member] Image limits 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1416.744597] env[67169]: DEBUG nova.virt.hardware [None req-2bf3268b-6d3a-4277-831a-07b32245375f tempest-ServersTestMultiNic-730437419 tempest-ServersTestMultiNic-730437419-project-member] Flavor pref 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1416.744742] env[67169]: DEBUG nova.virt.hardware [None req-2bf3268b-6d3a-4277-831a-07b32245375f tempest-ServersTestMultiNic-730437419 tempest-ServersTestMultiNic-730437419-project-member] Image pref 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1416.744889] env[67169]: DEBUG nova.virt.hardware [None req-2bf3268b-6d3a-4277-831a-07b32245375f tempest-ServersTestMultiNic-730437419 tempest-ServersTestMultiNic-730437419-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1416.745153] env[67169]: DEBUG nova.virt.hardware [None req-2bf3268b-6d3a-4277-831a-07b32245375f tempest-ServersTestMultiNic-730437419 tempest-ServersTestMultiNic-730437419-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1416.745335] env[67169]: DEBUG nova.virt.hardware [None req-2bf3268b-6d3a-4277-831a-07b32245375f 
tempest-ServersTestMultiNic-730437419 tempest-ServersTestMultiNic-730437419-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67169) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1416.745505] env[67169]: DEBUG nova.virt.hardware [None req-2bf3268b-6d3a-4277-831a-07b32245375f tempest-ServersTestMultiNic-730437419 tempest-ServersTestMultiNic-730437419-project-member] Got 1 possible topologies {{(pid=67169) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1416.745667] env[67169]: DEBUG nova.virt.hardware [None req-2bf3268b-6d3a-4277-831a-07b32245375f tempest-ServersTestMultiNic-730437419 tempest-ServersTestMultiNic-730437419-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1416.745838] env[67169]: DEBUG nova.virt.hardware [None req-2bf3268b-6d3a-4277-831a-07b32245375f tempest-ServersTestMultiNic-730437419 tempest-ServersTestMultiNic-730437419-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1416.746949] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0bed8a0f-1a7b-46bd-bed3-44eca61939c6 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1416.754748] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7fad456d-8f21-44a8-a5dd-299931f1f47b {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1417.038268] env[67169]: DEBUG nova.network.neutron [None req-2bf3268b-6d3a-4277-831a-07b32245375f tempest-ServersTestMultiNic-730437419 tempest-ServersTestMultiNic-730437419-project-member] [instance: 
7b7c8f84-c2d4-442e-93d3-60124767d096] Successfully created port: 9744c413-42bd-4a9b-861c-8d609731c128 {{(pid=67169) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1417.437280] env[67169]: DEBUG nova.network.neutron [None req-2bf3268b-6d3a-4277-831a-07b32245375f tempest-ServersTestMultiNic-730437419 tempest-ServersTestMultiNic-730437419-project-member] [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] Successfully created port: d2547805-a5c2-4e48-8694-1868f063a9b0 {{(pid=67169) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1418.627921] env[67169]: DEBUG oslo_concurrency.lockutils [None req-dad53a36-9ec2-447b-b9da-6ae8c0b39c52 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Acquiring lock "04d3ae51-f3f1-427b-ae45-279b02e4b3e6" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1418.628764] env[67169]: DEBUG oslo_concurrency.lockutils [None req-dad53a36-9ec2-447b-b9da-6ae8c0b39c52 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Lock "04d3ae51-f3f1-427b-ae45-279b02e4b3e6" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1418.670279] env[67169]: DEBUG nova.network.neutron [None req-2bf3268b-6d3a-4277-831a-07b32245375f tempest-ServersTestMultiNic-730437419 tempest-ServersTestMultiNic-730437419-project-member] [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] Successfully updated port: 9744c413-42bd-4a9b-861c-8d609731c128 {{(pid=67169) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1418.777955] env[67169]: DEBUG nova.compute.manager [req-39176a14-1535-4443-a19e-b02214fdef8e 
req-c319cf7a-fdff-47ac-810a-46486fb480a3 service nova] [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] Received event network-vif-plugged-9744c413-42bd-4a9b-861c-8d609731c128 {{(pid=67169) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1418.778180] env[67169]: DEBUG oslo_concurrency.lockutils [req-39176a14-1535-4443-a19e-b02214fdef8e req-c319cf7a-fdff-47ac-810a-46486fb480a3 service nova] Acquiring lock "7b7c8f84-c2d4-442e-93d3-60124767d096-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1418.778560] env[67169]: DEBUG oslo_concurrency.lockutils [req-39176a14-1535-4443-a19e-b02214fdef8e req-c319cf7a-fdff-47ac-810a-46486fb480a3 service nova] Lock "7b7c8f84-c2d4-442e-93d3-60124767d096-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1418.778836] env[67169]: DEBUG oslo_concurrency.lockutils [req-39176a14-1535-4443-a19e-b02214fdef8e req-c319cf7a-fdff-47ac-810a-46486fb480a3 service nova] Lock "7b7c8f84-c2d4-442e-93d3-60124767d096-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1418.778836] env[67169]: DEBUG nova.compute.manager [req-39176a14-1535-4443-a19e-b02214fdef8e req-c319cf7a-fdff-47ac-810a-46486fb480a3 service nova] [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] No waiting events found dispatching network-vif-plugged-9744c413-42bd-4a9b-861c-8d609731c128 {{(pid=67169) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1418.778978] env[67169]: WARNING nova.compute.manager [req-39176a14-1535-4443-a19e-b02214fdef8e req-c319cf7a-fdff-47ac-810a-46486fb480a3 
service nova] [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] Received unexpected event network-vif-plugged-9744c413-42bd-4a9b-861c-8d609731c128 for instance with vm_state building and task_state spawning. [ 1419.322362] env[67169]: DEBUG nova.network.neutron [None req-2bf3268b-6d3a-4277-831a-07b32245375f tempest-ServersTestMultiNic-730437419 tempest-ServersTestMultiNic-730437419-project-member] [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] Successfully updated port: d2547805-a5c2-4e48-8694-1868f063a9b0 {{(pid=67169) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1419.334678] env[67169]: DEBUG oslo_concurrency.lockutils [None req-2bf3268b-6d3a-4277-831a-07b32245375f tempest-ServersTestMultiNic-730437419 tempest-ServersTestMultiNic-730437419-project-member] Acquiring lock "refresh_cache-7b7c8f84-c2d4-442e-93d3-60124767d096" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1419.334829] env[67169]: DEBUG oslo_concurrency.lockutils [None req-2bf3268b-6d3a-4277-831a-07b32245375f tempest-ServersTestMultiNic-730437419 tempest-ServersTestMultiNic-730437419-project-member] Acquired lock "refresh_cache-7b7c8f84-c2d4-442e-93d3-60124767d096" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1419.334978] env[67169]: DEBUG nova.network.neutron [None req-2bf3268b-6d3a-4277-831a-07b32245375f tempest-ServersTestMultiNic-730437419 tempest-ServersTestMultiNic-730437419-project-member] [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] Building network info cache for instance {{(pid=67169) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1419.373517] env[67169]: DEBUG nova.network.neutron [None req-2bf3268b-6d3a-4277-831a-07b32245375f tempest-ServersTestMultiNic-730437419 tempest-ServersTestMultiNic-730437419-project-member] [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] Instance cache missing network info. 
{{(pid=67169) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1420.039793] env[67169]: DEBUG nova.network.neutron [None req-2bf3268b-6d3a-4277-831a-07b32245375f tempest-ServersTestMultiNic-730437419 tempest-ServersTestMultiNic-730437419-project-member] [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] Updating instance_info_cache with network_info: [{"id": "9744c413-42bd-4a9b-861c-8d609731c128", "address": "fa:16:3e:7e:45:86", "network": {"id": "d0bd736a-62c4-4669-b3ca-98c1a02a4e69", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-792324366", "subnets": [{"cidr": "192.168.128.0/24", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.66", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "0938eff566ed4ab5afe3dfed02b60aaf", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6e2a9679-6746-40f2-951c-65fcd1af5f7b", "external-id": "nsx-vlan-transportzone-39", "segmentation_id": 39, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap9744c413-42", "ovs_interfaceid": "9744c413-42bd-4a9b-861c-8d609731c128", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "d2547805-a5c2-4e48-8694-1868f063a9b0", "address": "fa:16:3e:22:44:2c", "network": {"id": "67a07957-ba78-4d53-ac90-e3dfc49ac93f", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-217325278", "subnets": [{"cidr": "192.168.129.0/24", "dns": [], "gateway": {"address": "192.168.129.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.129.168", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], 
"routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.129.2"}}], "meta": {"injected": false, "tenant_id": "0938eff566ed4ab5afe3dfed02b60aaf", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f256cfee-512d-4192-9aca-6750fdb1cd4c", "external-id": "nsx-vlan-transportzone-821", "segmentation_id": 821, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd2547805-a5", "ovs_interfaceid": "d2547805-a5c2-4e48-8694-1868f063a9b0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1420.053818] env[67169]: DEBUG oslo_concurrency.lockutils [None req-2bf3268b-6d3a-4277-831a-07b32245375f tempest-ServersTestMultiNic-730437419 tempest-ServersTestMultiNic-730437419-project-member] Releasing lock "refresh_cache-7b7c8f84-c2d4-442e-93d3-60124767d096" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1420.054204] env[67169]: DEBUG nova.compute.manager [None req-2bf3268b-6d3a-4277-831a-07b32245375f tempest-ServersTestMultiNic-730437419 tempest-ServersTestMultiNic-730437419-project-member] [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] Instance network_info: |[{"id": "9744c413-42bd-4a9b-861c-8d609731c128", "address": "fa:16:3e:7e:45:86", "network": {"id": "d0bd736a-62c4-4669-b3ca-98c1a02a4e69", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-792324366", "subnets": [{"cidr": "192.168.128.0/24", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.66", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], 
"meta": {"injected": false, "tenant_id": "0938eff566ed4ab5afe3dfed02b60aaf", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6e2a9679-6746-40f2-951c-65fcd1af5f7b", "external-id": "nsx-vlan-transportzone-39", "segmentation_id": 39, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap9744c413-42", "ovs_interfaceid": "9744c413-42bd-4a9b-861c-8d609731c128", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "d2547805-a5c2-4e48-8694-1868f063a9b0", "address": "fa:16:3e:22:44:2c", "network": {"id": "67a07957-ba78-4d53-ac90-e3dfc49ac93f", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-217325278", "subnets": [{"cidr": "192.168.129.0/24", "dns": [], "gateway": {"address": "192.168.129.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.129.168", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.129.2"}}], "meta": {"injected": false, "tenant_id": "0938eff566ed4ab5afe3dfed02b60aaf", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f256cfee-512d-4192-9aca-6750fdb1cd4c", "external-id": "nsx-vlan-transportzone-821", "segmentation_id": 821, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd2547805-a5", "ovs_interfaceid": "d2547805-a5c2-4e48-8694-1868f063a9b0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67169) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1420.054601] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-2bf3268b-6d3a-4277-831a-07b32245375f 
tempest-ServersTestMultiNic-730437419 tempest-ServersTestMultiNic-730437419-project-member] [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:7e:45:86', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '6e2a9679-6746-40f2-951c-65fcd1af5f7b', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '9744c413-42bd-4a9b-861c-8d609731c128', 'vif_model': 'vmxnet3'}, {'network_name': 'br-int', 'mac_address': 'fa:16:3e:22:44:2c', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'f256cfee-512d-4192-9aca-6750fdb1cd4c', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'd2547805-a5c2-4e48-8694-1868f063a9b0', 'vif_model': 'vmxnet3'}] {{(pid=67169) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1420.076960] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [None req-2bf3268b-6d3a-4277-831a-07b32245375f tempest-ServersTestMultiNic-730437419 tempest-ServersTestMultiNic-730437419-project-member] Creating folder: Project (0938eff566ed4ab5afe3dfed02b60aaf). Parent ref: group-v566843. {{(pid=67169) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1420.076960] env[67169]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-60009f7f-48c6-45ee-8181-66ff5de89350 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1420.086217] env[67169]: INFO nova.virt.vmwareapi.vm_util [None req-2bf3268b-6d3a-4277-831a-07b32245375f tempest-ServersTestMultiNic-730437419 tempest-ServersTestMultiNic-730437419-project-member] Created folder: Project (0938eff566ed4ab5afe3dfed02b60aaf) in parent group-v566843. [ 1420.086412] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [None req-2bf3268b-6d3a-4277-831a-07b32245375f tempest-ServersTestMultiNic-730437419 tempest-ServersTestMultiNic-730437419-project-member] Creating folder: Instances. 
Parent ref: group-v566928. {{(pid=67169) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1420.086675] env[67169]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-076ca628-ed70-4c0c-8dd4-c1d6277bceef {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1420.096628] env[67169]: INFO nova.virt.vmwareapi.vm_util [None req-2bf3268b-6d3a-4277-831a-07b32245375f tempest-ServersTestMultiNic-730437419 tempest-ServersTestMultiNic-730437419-project-member] Created folder: Instances in parent group-v566928. [ 1420.096868] env[67169]: DEBUG oslo.service.loopingcall [None req-2bf3268b-6d3a-4277-831a-07b32245375f tempest-ServersTestMultiNic-730437419 tempest-ServersTestMultiNic-730437419-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67169) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1420.097079] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] Creating VM on the ESX host {{(pid=67169) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1420.097286] env[67169]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-3995fbb4-ce7a-4754-a240-0e5beadcb555 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1420.120406] env[67169]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1420.120406] env[67169]: value = "task-2819201" [ 1420.120406] env[67169]: _type = "Task" [ 1420.120406] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1420.129230] env[67169]: DEBUG oslo_vmware.api [-] Task: {'id': task-2819201, 'name': CreateVM_Task} progress is 0%. 
{{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1420.630787] env[67169]: DEBUG oslo_vmware.api [-] Task: {'id': task-2819201, 'name': CreateVM_Task, 'duration_secs': 0.34171} completed successfully. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1420.631102] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] Created VM on the ESX host {{(pid=67169) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1420.631751] env[67169]: DEBUG oslo_concurrency.lockutils [None req-2bf3268b-6d3a-4277-831a-07b32245375f tempest-ServersTestMultiNic-730437419 tempest-ServersTestMultiNic-730437419-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1420.632045] env[67169]: DEBUG oslo_concurrency.lockutils [None req-2bf3268b-6d3a-4277-831a-07b32245375f tempest-ServersTestMultiNic-730437419 tempest-ServersTestMultiNic-730437419-project-member] Acquired lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1420.632260] env[67169]: DEBUG oslo_concurrency.lockutils [None req-2bf3268b-6d3a-4277-831a-07b32245375f tempest-ServersTestMultiNic-730437419 tempest-ServersTestMultiNic-730437419-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1420.632517] env[67169]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-e88b7709-c93a-44cf-9717-d67cacbe22e1 {{(pid=67169) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1420.637240] env[67169]: DEBUG oslo_vmware.api [None req-2bf3268b-6d3a-4277-831a-07b32245375f tempest-ServersTestMultiNic-730437419 tempest-ServersTestMultiNic-730437419-project-member] Waiting for the task: (returnval){ [ 1420.637240] env[67169]: value = "session[52518c09-e5f2-035d-4dc1-3e29ddf94015]5210751a-0ac5-def8-6805-ef40846d7140" [ 1420.637240] env[67169]: _type = "Task" [ 1420.637240] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1420.645331] env[67169]: DEBUG oslo_vmware.api [None req-2bf3268b-6d3a-4277-831a-07b32245375f tempest-ServersTestMultiNic-730437419 tempest-ServersTestMultiNic-730437419-project-member] Task: {'id': session[52518c09-e5f2-035d-4dc1-3e29ddf94015]5210751a-0ac5-def8-6805-ef40846d7140, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1420.803462] env[67169]: DEBUG nova.compute.manager [req-8ae05ea6-08f9-43be-b3fe-9ae57ce03d62 req-9e235f69-94af-47b5-8f7f-06d60f2ed1fc service nova] [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] Received event network-changed-9744c413-42bd-4a9b-861c-8d609731c128 {{(pid=67169) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1420.803647] env[67169]: DEBUG nova.compute.manager [req-8ae05ea6-08f9-43be-b3fe-9ae57ce03d62 req-9e235f69-94af-47b5-8f7f-06d60f2ed1fc service nova] [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] Refreshing instance network info cache due to event network-changed-9744c413-42bd-4a9b-861c-8d609731c128. 
{{(pid=67169) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1420.803871] env[67169]: DEBUG oslo_concurrency.lockutils [req-8ae05ea6-08f9-43be-b3fe-9ae57ce03d62 req-9e235f69-94af-47b5-8f7f-06d60f2ed1fc service nova] Acquiring lock "refresh_cache-7b7c8f84-c2d4-442e-93d3-60124767d096" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1420.804025] env[67169]: DEBUG oslo_concurrency.lockutils [req-8ae05ea6-08f9-43be-b3fe-9ae57ce03d62 req-9e235f69-94af-47b5-8f7f-06d60f2ed1fc service nova] Acquired lock "refresh_cache-7b7c8f84-c2d4-442e-93d3-60124767d096" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1420.804197] env[67169]: DEBUG nova.network.neutron [req-8ae05ea6-08f9-43be-b3fe-9ae57ce03d62 req-9e235f69-94af-47b5-8f7f-06d60f2ed1fc service nova] [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] Refreshing network info cache for port 9744c413-42bd-4a9b-861c-8d609731c128 {{(pid=67169) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1421.101432] env[67169]: DEBUG nova.network.neutron [req-8ae05ea6-08f9-43be-b3fe-9ae57ce03d62 req-9e235f69-94af-47b5-8f7f-06d60f2ed1fc service nova] [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] Updated VIF entry in instance network info cache for port 9744c413-42bd-4a9b-861c-8d609731c128. 
{{(pid=67169) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1421.101860] env[67169]: DEBUG nova.network.neutron [req-8ae05ea6-08f9-43be-b3fe-9ae57ce03d62 req-9e235f69-94af-47b5-8f7f-06d60f2ed1fc service nova] [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] Updating instance_info_cache with network_info: [{"id": "9744c413-42bd-4a9b-861c-8d609731c128", "address": "fa:16:3e:7e:45:86", "network": {"id": "d0bd736a-62c4-4669-b3ca-98c1a02a4e69", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-792324366", "subnets": [{"cidr": "192.168.128.0/24", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.66", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "0938eff566ed4ab5afe3dfed02b60aaf", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6e2a9679-6746-40f2-951c-65fcd1af5f7b", "external-id": "nsx-vlan-transportzone-39", "segmentation_id": 39, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap9744c413-42", "ovs_interfaceid": "9744c413-42bd-4a9b-861c-8d609731c128", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "d2547805-a5c2-4e48-8694-1868f063a9b0", "address": "fa:16:3e:22:44:2c", "network": {"id": "67a07957-ba78-4d53-ac90-e3dfc49ac93f", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-217325278", "subnets": [{"cidr": "192.168.129.0/24", "dns": [], "gateway": {"address": "192.168.129.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.129.168", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true, "dhcp_server": "192.168.129.2"}}], "meta": {"injected": false, "tenant_id": "0938eff566ed4ab5afe3dfed02b60aaf", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f256cfee-512d-4192-9aca-6750fdb1cd4c", "external-id": "nsx-vlan-transportzone-821", "segmentation_id": 821, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd2547805-a5", "ovs_interfaceid": "d2547805-a5c2-4e48-8694-1868f063a9b0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1421.111584] env[67169]: DEBUG oslo_concurrency.lockutils [req-8ae05ea6-08f9-43be-b3fe-9ae57ce03d62 req-9e235f69-94af-47b5-8f7f-06d60f2ed1fc service nova] Releasing lock "refresh_cache-7b7c8f84-c2d4-442e-93d3-60124767d096" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1421.111913] env[67169]: DEBUG nova.compute.manager [req-8ae05ea6-08f9-43be-b3fe-9ae57ce03d62 req-9e235f69-94af-47b5-8f7f-06d60f2ed1fc service nova] [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] Received event network-vif-plugged-d2547805-a5c2-4e48-8694-1868f063a9b0 {{(pid=67169) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1421.112214] env[67169]: DEBUG oslo_concurrency.lockutils [req-8ae05ea6-08f9-43be-b3fe-9ae57ce03d62 req-9e235f69-94af-47b5-8f7f-06d60f2ed1fc service nova] Acquiring lock "7b7c8f84-c2d4-442e-93d3-60124767d096-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1421.112506] env[67169]: DEBUG oslo_concurrency.lockutils [req-8ae05ea6-08f9-43be-b3fe-9ae57ce03d62 
req-9e235f69-94af-47b5-8f7f-06d60f2ed1fc service nova] Lock "7b7c8f84-c2d4-442e-93d3-60124767d096-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1421.112754] env[67169]: DEBUG oslo_concurrency.lockutils [req-8ae05ea6-08f9-43be-b3fe-9ae57ce03d62 req-9e235f69-94af-47b5-8f7f-06d60f2ed1fc service nova] Lock "7b7c8f84-c2d4-442e-93d3-60124767d096-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1421.113179] env[67169]: DEBUG nova.compute.manager [req-8ae05ea6-08f9-43be-b3fe-9ae57ce03d62 req-9e235f69-94af-47b5-8f7f-06d60f2ed1fc service nova] [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] No waiting events found dispatching network-vif-plugged-d2547805-a5c2-4e48-8694-1868f063a9b0 {{(pid=67169) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1421.113473] env[67169]: WARNING nova.compute.manager [req-8ae05ea6-08f9-43be-b3fe-9ae57ce03d62 req-9e235f69-94af-47b5-8f7f-06d60f2ed1fc service nova] [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] Received unexpected event network-vif-plugged-d2547805-a5c2-4e48-8694-1868f063a9b0 for instance with vm_state building and task_state spawning. 
[ 1421.113940] env[67169]: DEBUG nova.compute.manager [req-8ae05ea6-08f9-43be-b3fe-9ae57ce03d62 req-9e235f69-94af-47b5-8f7f-06d60f2ed1fc service nova] [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] Received event network-changed-d2547805-a5c2-4e48-8694-1868f063a9b0 {{(pid=67169) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1421.114168] env[67169]: DEBUG nova.compute.manager [req-8ae05ea6-08f9-43be-b3fe-9ae57ce03d62 req-9e235f69-94af-47b5-8f7f-06d60f2ed1fc service nova] [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] Refreshing instance network info cache due to event network-changed-d2547805-a5c2-4e48-8694-1868f063a9b0. {{(pid=67169) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1421.114382] env[67169]: DEBUG oslo_concurrency.lockutils [req-8ae05ea6-08f9-43be-b3fe-9ae57ce03d62 req-9e235f69-94af-47b5-8f7f-06d60f2ed1fc service nova] Acquiring lock "refresh_cache-7b7c8f84-c2d4-442e-93d3-60124767d096" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1421.114512] env[67169]: DEBUG oslo_concurrency.lockutils [req-8ae05ea6-08f9-43be-b3fe-9ae57ce03d62 req-9e235f69-94af-47b5-8f7f-06d60f2ed1fc service nova] Acquired lock "refresh_cache-7b7c8f84-c2d4-442e-93d3-60124767d096" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1421.114719] env[67169]: DEBUG nova.network.neutron [req-8ae05ea6-08f9-43be-b3fe-9ae57ce03d62 req-9e235f69-94af-47b5-8f7f-06d60f2ed1fc service nova] [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] Refreshing network info cache for port d2547805-a5c2-4e48-8694-1868f063a9b0 {{(pid=67169) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1421.147103] env[67169]: DEBUG oslo_concurrency.lockutils [None req-2bf3268b-6d3a-4277-831a-07b32245375f tempest-ServersTestMultiNic-730437419 tempest-ServersTestMultiNic-730437419-project-member] Releasing lock "[datastore2] 
devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1421.147347] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-2bf3268b-6d3a-4277-831a-07b32245375f tempest-ServersTestMultiNic-730437419 tempest-ServersTestMultiNic-730437419-project-member] [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] Processing image 285931c9-8b83-4997-8c4d-6a79005e36ba {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1421.147564] env[67169]: DEBUG oslo_concurrency.lockutils [None req-2bf3268b-6d3a-4277-831a-07b32245375f tempest-ServersTestMultiNic-730437419 tempest-ServersTestMultiNic-730437419-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1421.468015] env[67169]: DEBUG nova.network.neutron [req-8ae05ea6-08f9-43be-b3fe-9ae57ce03d62 req-9e235f69-94af-47b5-8f7f-06d60f2ed1fc service nova] [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] Updated VIF entry in instance network info cache for port d2547805-a5c2-4e48-8694-1868f063a9b0. 
{{(pid=67169) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1421.468459] env[67169]: DEBUG nova.network.neutron [req-8ae05ea6-08f9-43be-b3fe-9ae57ce03d62 req-9e235f69-94af-47b5-8f7f-06d60f2ed1fc service nova] [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] Updating instance_info_cache with network_info: [{"id": "9744c413-42bd-4a9b-861c-8d609731c128", "address": "fa:16:3e:7e:45:86", "network": {"id": "d0bd736a-62c4-4669-b3ca-98c1a02a4e69", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-792324366", "subnets": [{"cidr": "192.168.128.0/24", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.66", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "0938eff566ed4ab5afe3dfed02b60aaf", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6e2a9679-6746-40f2-951c-65fcd1af5f7b", "external-id": "nsx-vlan-transportzone-39", "segmentation_id": 39, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap9744c413-42", "ovs_interfaceid": "9744c413-42bd-4a9b-861c-8d609731c128", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "d2547805-a5c2-4e48-8694-1868f063a9b0", "address": "fa:16:3e:22:44:2c", "network": {"id": "67a07957-ba78-4d53-ac90-e3dfc49ac93f", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-217325278", "subnets": [{"cidr": "192.168.129.0/24", "dns": [], "gateway": {"address": "192.168.129.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.129.168", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true, "dhcp_server": "192.168.129.2"}}], "meta": {"injected": false, "tenant_id": "0938eff566ed4ab5afe3dfed02b60aaf", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f256cfee-512d-4192-9aca-6750fdb1cd4c", "external-id": "nsx-vlan-transportzone-821", "segmentation_id": 821, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd2547805-a5", "ovs_interfaceid": "d2547805-a5c2-4e48-8694-1868f063a9b0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1421.477950] env[67169]: DEBUG oslo_concurrency.lockutils [req-8ae05ea6-08f9-43be-b3fe-9ae57ce03d62 req-9e235f69-94af-47b5-8f7f-06d60f2ed1fc service nova] Releasing lock "refresh_cache-7b7c8f84-c2d4-442e-93d3-60124767d096" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1429.659153] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1435.659780] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1435.660068] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Starting heal instance info cache {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1435.660141] env[67169]: DEBUG nova.compute.manager [None 
req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Rebuilding the list of instances to heal {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1435.681189] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: c86c3850-39bb-4a08-8dbf-f69bd8ca21c9] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1435.681344] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 7bf839c0-3ec8-4329-823d-de1fae4833cb] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1435.681474] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: bab5d630-fec0-44e5-8088-12c8855aad66] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1435.681729] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: a86fa702-2040-4e22-9eaa-5d64bc16f036] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1435.681926] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1435.682068] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 7817b417-599c-4619-8bd3-28d2e8236b9f] Skipping network cache update for instance because it is Building. 
{{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1435.682197] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 883a792f-ae72-4475-8592-3076c2c2c2ae] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1435.682321] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 48376572-9e3a-4579-b2d7-b8b63312fab1] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1435.682445] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 74ea66f0-391c-437b-8aee-f784528d7963] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1435.682564] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1435.682681] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Didn't find any instances for network info cache update. 
{{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1435.683196] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1435.683342] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=67169) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1435.683494] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1435.693178] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1435.693408] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1435.693580] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1435.693687] env[67169]: 
DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67169) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1435.694738] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-81894d21-f29d-48fc-822f-77c30d7b9f36 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1435.703627] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-63d99e74-2159-48c7-8c56-a88f92461687 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1435.717450] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-157cf297-816d-438e-bd91-1e95a028f971 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1435.723576] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3a73b02f-b717-441b-8d66-207f212ec2f0 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1435.752507] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181010MB free_disk=171GB free_vcpus=48 pci_devices=None {{(pid=67169) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1435.752660] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" 
{{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1435.752848] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1435.820078] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance c86c3850-39bb-4a08-8dbf-f69bd8ca21c9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1435.820290] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 7bf839c0-3ec8-4329-823d-de1fae4833cb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1435.820430] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance bab5d630-fec0-44e5-8088-12c8855aad66 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1435.820553] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance a86fa702-2040-4e22-9eaa-5d64bc16f036 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1435.820670] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1435.820785] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 7817b417-599c-4619-8bd3-28d2e8236b9f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1435.820899] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 883a792f-ae72-4475-8592-3076c2c2c2ae actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1435.821028] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 48376572-9e3a-4579-b2d7-b8b63312fab1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1435.821182] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 74ea66f0-391c-437b-8aee-f784528d7963 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1435.821301] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 7b7c8f84-c2d4-442e-93d3-60124767d096 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1435.831911] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance cbf88ee7-b392-46d5-8645-2b3bea0a53d6 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1435.842162] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 2e156908-c313-4229-840d-13ed8e6d4074 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1435.851464] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 2d7d3386-9854-4bf1-a680-5aed0a2329cb has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1435.860981] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance fa24a4a8-895c-4ea6-8e0a-4ed1134beff0 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1435.870283] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 5c675d7d-8915-4962-8bbd-c9b639ae2cb1 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1435.879363] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 04d3ae51-f3f1-427b-ae45-279b02e4b3e6 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1435.879583] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67169) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1435.879728] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67169) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1436.048491] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-76c920a4-c7dc-4153-8925-1ed31cd8585b {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1436.055978] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1a5ad435-afa8-43b3-bd63-9bd1ff0ab072 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1436.084771] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ac9d160b-bc65-48fa-9f29-caed891ab5bb {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1436.091297] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e1005caf-9b44-4ac5-bbbe-c0bfa169509d {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1436.103774] env[67169]: DEBUG 
nova.compute.provider_tree [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Inventory has not changed in ProviderTree for provider: 6570906a-ac37-4859-b1f2-4bbacc48d3f3 {{(pid=67169) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1436.111972] env[67169]: DEBUG nova.scheduler.client.report [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Inventory has not changed for provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 171, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67169) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1436.126558] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67169) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1436.126744] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.374s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1437.103407] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1437.654324] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] 
Running periodic task ComputeManager._check_instance_build_time {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1437.658993] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1437.659269] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1438.666641] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1438.666919] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1438.667060] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Cleaning up deleted instances {{(pid=67169) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11198}} [ 1438.678473] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] There are 0 instances to clean {{(pid=67169) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11207}} [ 1439.670963] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task 
ComputeManager._poll_rebooting_instances {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1446.659754] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1446.660182] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Cleaning up deleted instances with incomplete migration {{(pid=67169) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11236}} [ 1448.522867] env[67169]: DEBUG oslo_concurrency.lockutils [None req-c0cc3160-c2ad-4006-b47c-6eb1dd79d4a5 tempest-ServersTestMultiNic-730437419 tempest-ServersTestMultiNic-730437419-project-member] Acquiring lock "7b7c8f84-c2d4-442e-93d3-60124767d096" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1451.218657] env[67169]: DEBUG oslo_concurrency.lockutils [None req-63dec5b3-3efd-41a9-be63-b4fd3bdaef13 tempest-AttachVolumeNegativeTest-2045904794 tempest-AttachVolumeNegativeTest-2045904794-project-member] Acquiring lock "4e978c21-ae48-422e-9126-a4144c86b86f" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1451.218983] env[67169]: DEBUG oslo_concurrency.lockutils [None req-63dec5b3-3efd-41a9-be63-b4fd3bdaef13 tempest-AttachVolumeNegativeTest-2045904794 tempest-AttachVolumeNegativeTest-2045904794-project-member] Lock "4e978c21-ae48-422e-9126-a4144c86b86f" acquired by 
"nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1461.902217] env[67169]: WARNING oslo_vmware.rw_handles [None req-d696ce3a-62d6-4cfe-ab03-acc4f9294f1b tempest-ServerShowV257Test-257580876 tempest-ServerShowV257Test-257580876-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1461.902217] env[67169]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1461.902217] env[67169]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1461.902217] env[67169]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1461.902217] env[67169]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1461.902217] env[67169]: ERROR oslo_vmware.rw_handles response.begin() [ 1461.902217] env[67169]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1461.902217] env[67169]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1461.902217] env[67169]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1461.902217] env[67169]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1461.902217] env[67169]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1461.902217] env[67169]: ERROR oslo_vmware.rw_handles [ 1461.902807] env[67169]: DEBUG nova.virt.vmwareapi.images [None req-d696ce3a-62d6-4cfe-ab03-acc4f9294f1b tempest-ServerShowV257Test-257580876 tempest-ServerShowV257Test-257580876-project-member] [instance: c86c3850-39bb-4a08-8dbf-f69bd8ca21c9] Downloaded image file data 
285931c9-8b83-4997-8c4d-6a79005e36ba to vmware_temp/22bd3a03-8a46-40f8-9505-8cc8e4d23368/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk on the data store datastore2 {{(pid=67169) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1461.904513] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-d696ce3a-62d6-4cfe-ab03-acc4f9294f1b tempest-ServerShowV257Test-257580876 tempest-ServerShowV257Test-257580876-project-member] [instance: c86c3850-39bb-4a08-8dbf-f69bd8ca21c9] Caching image {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1461.904754] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [None req-d696ce3a-62d6-4cfe-ab03-acc4f9294f1b tempest-ServerShowV257Test-257580876 tempest-ServerShowV257Test-257580876-project-member] Copying Virtual Disk [datastore2] vmware_temp/22bd3a03-8a46-40f8-9505-8cc8e4d23368/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk to [datastore2] vmware_temp/22bd3a03-8a46-40f8-9505-8cc8e4d23368/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk {{(pid=67169) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1461.905054] env[67169]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-8e6aed65-78f5-4c2f-b20c-f25881578265 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1461.912625] env[67169]: DEBUG oslo_vmware.api [None req-d696ce3a-62d6-4cfe-ab03-acc4f9294f1b tempest-ServerShowV257Test-257580876 tempest-ServerShowV257Test-257580876-project-member] Waiting for the task: (returnval){ [ 1461.912625] env[67169]: value = "task-2819202" [ 1461.912625] env[67169]: _type = "Task" [ 1461.912625] env[67169]: } to complete. 
{{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1461.920278] env[67169]: DEBUG oslo_vmware.api [None req-d696ce3a-62d6-4cfe-ab03-acc4f9294f1b tempest-ServerShowV257Test-257580876 tempest-ServerShowV257Test-257580876-project-member] Task: {'id': task-2819202, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1462.423510] env[67169]: DEBUG oslo_vmware.exceptions [None req-d696ce3a-62d6-4cfe-ab03-acc4f9294f1b tempest-ServerShowV257Test-257580876 tempest-ServerShowV257Test-257580876-project-member] Fault InvalidArgument not matched. {{(pid=67169) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1462.423865] env[67169]: DEBUG oslo_concurrency.lockutils [None req-d696ce3a-62d6-4cfe-ab03-acc4f9294f1b tempest-ServerShowV257Test-257580876 tempest-ServerShowV257Test-257580876-project-member] Releasing lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1462.424453] env[67169]: ERROR nova.compute.manager [None req-d696ce3a-62d6-4cfe-ab03-acc4f9294f1b tempest-ServerShowV257Test-257580876 tempest-ServerShowV257Test-257580876-project-member] [instance: c86c3850-39bb-4a08-8dbf-f69bd8ca21c9] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1462.424453] env[67169]: Faults: ['InvalidArgument'] [ 1462.424453] env[67169]: ERROR nova.compute.manager [instance: c86c3850-39bb-4a08-8dbf-f69bd8ca21c9] Traceback (most recent call last): [ 1462.424453] env[67169]: ERROR nova.compute.manager [instance: c86c3850-39bb-4a08-8dbf-f69bd8ca21c9] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1462.424453] env[67169]: 
ERROR nova.compute.manager [instance: c86c3850-39bb-4a08-8dbf-f69bd8ca21c9] yield resources [ 1462.424453] env[67169]: ERROR nova.compute.manager [instance: c86c3850-39bb-4a08-8dbf-f69bd8ca21c9] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1462.424453] env[67169]: ERROR nova.compute.manager [instance: c86c3850-39bb-4a08-8dbf-f69bd8ca21c9] self.driver.spawn(context, instance, image_meta, [ 1462.424453] env[67169]: ERROR nova.compute.manager [instance: c86c3850-39bb-4a08-8dbf-f69bd8ca21c9] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1462.424453] env[67169]: ERROR nova.compute.manager [instance: c86c3850-39bb-4a08-8dbf-f69bd8ca21c9] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1462.424453] env[67169]: ERROR nova.compute.manager [instance: c86c3850-39bb-4a08-8dbf-f69bd8ca21c9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1462.424453] env[67169]: ERROR nova.compute.manager [instance: c86c3850-39bb-4a08-8dbf-f69bd8ca21c9] self._fetch_image_if_missing(context, vi) [ 1462.424453] env[67169]: ERROR nova.compute.manager [instance: c86c3850-39bb-4a08-8dbf-f69bd8ca21c9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1462.424453] env[67169]: ERROR nova.compute.manager [instance: c86c3850-39bb-4a08-8dbf-f69bd8ca21c9] image_cache(vi, tmp_image_ds_loc) [ 1462.424453] env[67169]: ERROR nova.compute.manager [instance: c86c3850-39bb-4a08-8dbf-f69bd8ca21c9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1462.424453] env[67169]: ERROR nova.compute.manager [instance: c86c3850-39bb-4a08-8dbf-f69bd8ca21c9] vm_util.copy_virtual_disk( [ 1462.424453] env[67169]: ERROR nova.compute.manager [instance: c86c3850-39bb-4a08-8dbf-f69bd8ca21c9] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1462.424453] env[67169]: ERROR nova.compute.manager 
[instance: c86c3850-39bb-4a08-8dbf-f69bd8ca21c9] session._wait_for_task(vmdk_copy_task) [ 1462.424453] env[67169]: ERROR nova.compute.manager [instance: c86c3850-39bb-4a08-8dbf-f69bd8ca21c9] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1462.424453] env[67169]: ERROR nova.compute.manager [instance: c86c3850-39bb-4a08-8dbf-f69bd8ca21c9] return self.wait_for_task(task_ref) [ 1462.424453] env[67169]: ERROR nova.compute.manager [instance: c86c3850-39bb-4a08-8dbf-f69bd8ca21c9] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1462.424453] env[67169]: ERROR nova.compute.manager [instance: c86c3850-39bb-4a08-8dbf-f69bd8ca21c9] return evt.wait() [ 1462.424453] env[67169]: ERROR nova.compute.manager [instance: c86c3850-39bb-4a08-8dbf-f69bd8ca21c9] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1462.424453] env[67169]: ERROR nova.compute.manager [instance: c86c3850-39bb-4a08-8dbf-f69bd8ca21c9] result = hub.switch() [ 1462.424453] env[67169]: ERROR nova.compute.manager [instance: c86c3850-39bb-4a08-8dbf-f69bd8ca21c9] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1462.424453] env[67169]: ERROR nova.compute.manager [instance: c86c3850-39bb-4a08-8dbf-f69bd8ca21c9] return self.greenlet.switch() [ 1462.424453] env[67169]: ERROR nova.compute.manager [instance: c86c3850-39bb-4a08-8dbf-f69bd8ca21c9] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1462.424453] env[67169]: ERROR nova.compute.manager [instance: c86c3850-39bb-4a08-8dbf-f69bd8ca21c9] self.f(*self.args, **self.kw) [ 1462.424453] env[67169]: ERROR nova.compute.manager [instance: c86c3850-39bb-4a08-8dbf-f69bd8ca21c9] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1462.424453] env[67169]: ERROR 
nova.compute.manager [instance: c86c3850-39bb-4a08-8dbf-f69bd8ca21c9] raise exceptions.translate_fault(task_info.error) [ 1462.424453] env[67169]: ERROR nova.compute.manager [instance: c86c3850-39bb-4a08-8dbf-f69bd8ca21c9] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1462.424453] env[67169]: ERROR nova.compute.manager [instance: c86c3850-39bb-4a08-8dbf-f69bd8ca21c9] Faults: ['InvalidArgument'] [ 1462.424453] env[67169]: ERROR nova.compute.manager [instance: c86c3850-39bb-4a08-8dbf-f69bd8ca21c9] [ 1462.425387] env[67169]: INFO nova.compute.manager [None req-d696ce3a-62d6-4cfe-ab03-acc4f9294f1b tempest-ServerShowV257Test-257580876 tempest-ServerShowV257Test-257580876-project-member] [instance: c86c3850-39bb-4a08-8dbf-f69bd8ca21c9] Terminating instance [ 1462.426324] env[67169]: DEBUG oslo_concurrency.lockutils [None req-0e91d283-0c2c-4495-ad84-ebaaebe9122d tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Acquired lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1462.426610] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-0e91d283-0c2c-4495-ad84-ebaaebe9122d tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1462.426793] env[67169]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-a1ace0df-98cb-4361-bd05-388f4a107a14 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1462.428745] env[67169]: DEBUG oslo_concurrency.lockutils [None req-d696ce3a-62d6-4cfe-ab03-acc4f9294f1b 
tempest-ServerShowV257Test-257580876 tempest-ServerShowV257Test-257580876-project-member] Acquiring lock "refresh_cache-c86c3850-39bb-4a08-8dbf-f69bd8ca21c9" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1462.428905] env[67169]: DEBUG oslo_concurrency.lockutils [None req-d696ce3a-62d6-4cfe-ab03-acc4f9294f1b tempest-ServerShowV257Test-257580876 tempest-ServerShowV257Test-257580876-project-member] Acquired lock "refresh_cache-c86c3850-39bb-4a08-8dbf-f69bd8ca21c9" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1462.429082] env[67169]: DEBUG nova.network.neutron [None req-d696ce3a-62d6-4cfe-ab03-acc4f9294f1b tempest-ServerShowV257Test-257580876 tempest-ServerShowV257Test-257580876-project-member] [instance: c86c3850-39bb-4a08-8dbf-f69bd8ca21c9] Building network info cache for instance {{(pid=67169) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1462.436836] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-0e91d283-0c2c-4495-ad84-ebaaebe9122d tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1462.437015] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-0e91d283-0c2c-4495-ad84-ebaaebe9122d tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=67169) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1462.438178] env[67169]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-f6449e70-2340-466f-a89e-86ae16140947 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1462.445250] env[67169]: DEBUG oslo_vmware.api [None req-0e91d283-0c2c-4495-ad84-ebaaebe9122d tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Waiting for the task: (returnval){ [ 1462.445250] env[67169]: value = "session[52518c09-e5f2-035d-4dc1-3e29ddf94015]52615b48-7e73-0bed-51e0-9ab8a8c76d6c" [ 1462.445250] env[67169]: _type = "Task" [ 1462.445250] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1462.452442] env[67169]: DEBUG oslo_vmware.api [None req-0e91d283-0c2c-4495-ad84-ebaaebe9122d tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Task: {'id': session[52518c09-e5f2-035d-4dc1-3e29ddf94015]52615b48-7e73-0bed-51e0-9ab8a8c76d6c, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1462.458412] env[67169]: DEBUG nova.network.neutron [None req-d696ce3a-62d6-4cfe-ab03-acc4f9294f1b tempest-ServerShowV257Test-257580876 tempest-ServerShowV257Test-257580876-project-member] [instance: c86c3850-39bb-4a08-8dbf-f69bd8ca21c9] Instance cache missing network info. 
{{(pid=67169) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1462.517396] env[67169]: DEBUG nova.network.neutron [None req-d696ce3a-62d6-4cfe-ab03-acc4f9294f1b tempest-ServerShowV257Test-257580876 tempest-ServerShowV257Test-257580876-project-member] [instance: c86c3850-39bb-4a08-8dbf-f69bd8ca21c9] Updating instance_info_cache with network_info: [] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1462.526418] env[67169]: DEBUG oslo_concurrency.lockutils [None req-d696ce3a-62d6-4cfe-ab03-acc4f9294f1b tempest-ServerShowV257Test-257580876 tempest-ServerShowV257Test-257580876-project-member] Releasing lock "refresh_cache-c86c3850-39bb-4a08-8dbf-f69bd8ca21c9" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1462.526826] env[67169]: DEBUG nova.compute.manager [None req-d696ce3a-62d6-4cfe-ab03-acc4f9294f1b tempest-ServerShowV257Test-257580876 tempest-ServerShowV257Test-257580876-project-member] [instance: c86c3850-39bb-4a08-8dbf-f69bd8ca21c9] Start destroying the instance on the hypervisor. 
{{(pid=67169) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1462.527030] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-d696ce3a-62d6-4cfe-ab03-acc4f9294f1b tempest-ServerShowV257Test-257580876 tempest-ServerShowV257Test-257580876-project-member] [instance: c86c3850-39bb-4a08-8dbf-f69bd8ca21c9] Destroying instance {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1462.528045] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b3e2a569-747f-44f8-a1ba-349fa0e3b64c {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1462.536334] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-d696ce3a-62d6-4cfe-ab03-acc4f9294f1b tempest-ServerShowV257Test-257580876 tempest-ServerShowV257Test-257580876-project-member] [instance: c86c3850-39bb-4a08-8dbf-f69bd8ca21c9] Unregistering the VM {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1462.536601] env[67169]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-353b6fb4-7fa4-4c9d-9495-3910199e9418 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1462.563599] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-d696ce3a-62d6-4cfe-ab03-acc4f9294f1b tempest-ServerShowV257Test-257580876 tempest-ServerShowV257Test-257580876-project-member] [instance: c86c3850-39bb-4a08-8dbf-f69bd8ca21c9] Unregistered the VM {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1462.563808] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-d696ce3a-62d6-4cfe-ab03-acc4f9294f1b tempest-ServerShowV257Test-257580876 tempest-ServerShowV257Test-257580876-project-member] [instance: c86c3850-39bb-4a08-8dbf-f69bd8ca21c9] Deleting contents of the VM from datastore datastore2 {{(pid=67169) _destroy_instance 
/opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1462.563984] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-d696ce3a-62d6-4cfe-ab03-acc4f9294f1b tempest-ServerShowV257Test-257580876 tempest-ServerShowV257Test-257580876-project-member] Deleting the datastore file [datastore2] c86c3850-39bb-4a08-8dbf-f69bd8ca21c9 {{(pid=67169) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1462.564222] env[67169]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-0638c178-e7a7-457f-870b-99414a25ca2f {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1462.569976] env[67169]: DEBUG oslo_vmware.api [None req-d696ce3a-62d6-4cfe-ab03-acc4f9294f1b tempest-ServerShowV257Test-257580876 tempest-ServerShowV257Test-257580876-project-member] Waiting for the task: (returnval){ [ 1462.569976] env[67169]: value = "task-2819204" [ 1462.569976] env[67169]: _type = "Task" [ 1462.569976] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1462.578452] env[67169]: DEBUG oslo_vmware.api [None req-d696ce3a-62d6-4cfe-ab03-acc4f9294f1b tempest-ServerShowV257Test-257580876 tempest-ServerShowV257Test-257580876-project-member] Task: {'id': task-2819204, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1462.955767] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-0e91d283-0c2c-4495-ad84-ebaaebe9122d tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 7bf839c0-3ec8-4329-823d-de1fae4833cb] Preparing fetch location {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1462.956055] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-0e91d283-0c2c-4495-ad84-ebaaebe9122d tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Creating directory with path [datastore2] vmware_temp/cce07b6f-a01f-46a3-b6b4-3f95fc0508a3/285931c9-8b83-4997-8c4d-6a79005e36ba {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1462.956300] env[67169]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-55a59043-04cf-43dc-93b3-ac47828961bb {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1462.968022] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-0e91d283-0c2c-4495-ad84-ebaaebe9122d tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Created directory with path [datastore2] vmware_temp/cce07b6f-a01f-46a3-b6b4-3f95fc0508a3/285931c9-8b83-4997-8c4d-6a79005e36ba {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1462.968228] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-0e91d283-0c2c-4495-ad84-ebaaebe9122d tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 7bf839c0-3ec8-4329-823d-de1fae4833cb] Fetch image to [datastore2] vmware_temp/cce07b6f-a01f-46a3-b6b4-3f95fc0508a3/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk {{(pid=67169) _fetch_image_if_missing 
/opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1462.968396] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-0e91d283-0c2c-4495-ad84-ebaaebe9122d tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 7bf839c0-3ec8-4329-823d-de1fae4833cb] Downloading image file data 285931c9-8b83-4997-8c4d-6a79005e36ba to [datastore2] vmware_temp/cce07b6f-a01f-46a3-b6b4-3f95fc0508a3/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk on the data store datastore2 {{(pid=67169) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1462.969155] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bf2282d5-a2b0-41be-ad2d-dc0b1afeda26 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1462.975956] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0c3ad2e3-0a8d-4b97-b687-0f6e07c1ddf1 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1462.984778] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5953d561-4426-4875-bdf6-d8bf90c09161 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1463.015416] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d1cadc2f-c474-4e81-a53d-4a1b8224339b {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1463.021488] env[67169]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-108aa1a4-3d90-4259-b56e-8cc3be307cc0 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1463.041938] 
env[67169]: DEBUG nova.virt.vmwareapi.images [None req-0e91d283-0c2c-4495-ad84-ebaaebe9122d tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 7bf839c0-3ec8-4329-823d-de1fae4833cb] Downloading image file data 285931c9-8b83-4997-8c4d-6a79005e36ba to the data store datastore2 {{(pid=67169) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1463.078148] env[67169]: DEBUG oslo_vmware.api [None req-d696ce3a-62d6-4cfe-ab03-acc4f9294f1b tempest-ServerShowV257Test-257580876 tempest-ServerShowV257Test-257580876-project-member] Task: {'id': task-2819204, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.043731} completed successfully. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1463.078406] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-d696ce3a-62d6-4cfe-ab03-acc4f9294f1b tempest-ServerShowV257Test-257580876 tempest-ServerShowV257Test-257580876-project-member] Deleted the datastore file {{(pid=67169) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1463.078587] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-d696ce3a-62d6-4cfe-ab03-acc4f9294f1b tempest-ServerShowV257Test-257580876 tempest-ServerShowV257Test-257580876-project-member] [instance: c86c3850-39bb-4a08-8dbf-f69bd8ca21c9] Deleted contents of the VM from datastore datastore2 {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1463.078759] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-d696ce3a-62d6-4cfe-ab03-acc4f9294f1b tempest-ServerShowV257Test-257580876 tempest-ServerShowV257Test-257580876-project-member] [instance: c86c3850-39bb-4a08-8dbf-f69bd8ca21c9] Instance destroyed {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1463.078930] env[67169]: INFO nova.compute.manager [None req-d696ce3a-62d6-4cfe-ab03-acc4f9294f1b tempest-ServerShowV257Test-257580876 
tempest-ServerShowV257Test-257580876-project-member] [instance: c86c3850-39bb-4a08-8dbf-f69bd8ca21c9] Took 0.55 seconds to destroy the instance on the hypervisor. [ 1463.079182] env[67169]: DEBUG oslo.service.loopingcall [None req-d696ce3a-62d6-4cfe-ab03-acc4f9294f1b tempest-ServerShowV257Test-257580876 tempest-ServerShowV257Test-257580876-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=67169) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1463.079388] env[67169]: DEBUG nova.compute.manager [-] [instance: c86c3850-39bb-4a08-8dbf-f69bd8ca21c9] Skipping network deallocation for instance since networking was not requested. {{(pid=67169) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1463.083139] env[67169]: DEBUG nova.compute.claims [None req-d696ce3a-62d6-4cfe-ab03-acc4f9294f1b tempest-ServerShowV257Test-257580876 tempest-ServerShowV257Test-257580876-project-member] [instance: c86c3850-39bb-4a08-8dbf-f69bd8ca21c9] Aborting claim: {{(pid=67169) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1463.083139] env[67169]: DEBUG oslo_concurrency.lockutils [None req-d696ce3a-62d6-4cfe-ab03-acc4f9294f1b tempest-ServerShowV257Test-257580876 tempest-ServerShowV257Test-257580876-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1463.083139] env[67169]: DEBUG oslo_concurrency.lockutils [None req-d696ce3a-62d6-4cfe-ab03-acc4f9294f1b tempest-ServerShowV257Test-257580876 tempest-ServerShowV257Test-257580876-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67169) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1463.094930] env[67169]: DEBUG oslo_vmware.rw_handles [None req-0e91d283-0c2c-4495-ad84-ebaaebe9122d tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/cce07b6f-a01f-46a3-b6b4-3f95fc0508a3/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=67169) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1463.154203] env[67169]: DEBUG oslo_vmware.rw_handles [None req-0e91d283-0c2c-4495-ad84-ebaaebe9122d tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Completed reading data from the image iterator. {{(pid=67169) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1463.154366] env[67169]: DEBUG oslo_vmware.rw_handles [None req-0e91d283-0c2c-4495-ad84-ebaaebe9122d tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/cce07b6f-a01f-46a3-b6b4-3f95fc0508a3/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=67169) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1463.346450] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dcc9f852-9bdb-40a6-93b2-9aecdc3c349c {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1463.354197] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-014d686c-4237-443b-994c-75e65abda341 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1463.384628] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4d3aec84-d2d9-4571-b122-33b50fc7c7cd {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1463.391654] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ecf307b8-391c-4802-9fe7-2ecedbd55e93 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1463.404759] env[67169]: DEBUG nova.compute.provider_tree [None req-d696ce3a-62d6-4cfe-ab03-acc4f9294f1b tempest-ServerShowV257Test-257580876 tempest-ServerShowV257Test-257580876-project-member] Inventory has not changed in ProviderTree for provider: 6570906a-ac37-4859-b1f2-4bbacc48d3f3 {{(pid=67169) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1463.413409] env[67169]: DEBUG nova.scheduler.client.report [None req-d696ce3a-62d6-4cfe-ab03-acc4f9294f1b tempest-ServerShowV257Test-257580876 tempest-ServerShowV257Test-257580876-project-member] Inventory has not changed for provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': 
{'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 171, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67169) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1463.427209] env[67169]: DEBUG oslo_concurrency.lockutils [None req-d696ce3a-62d6-4cfe-ab03-acc4f9294f1b tempest-ServerShowV257Test-257580876 tempest-ServerShowV257Test-257580876-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.344s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1463.427757] env[67169]: ERROR nova.compute.manager [None req-d696ce3a-62d6-4cfe-ab03-acc4f9294f1b tempest-ServerShowV257Test-257580876 tempest-ServerShowV257Test-257580876-project-member] [instance: c86c3850-39bb-4a08-8dbf-f69bd8ca21c9] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1463.427757] env[67169]: Faults: ['InvalidArgument'] [ 1463.427757] env[67169]: ERROR nova.compute.manager [instance: c86c3850-39bb-4a08-8dbf-f69bd8ca21c9] Traceback (most recent call last): [ 1463.427757] env[67169]: ERROR nova.compute.manager [instance: c86c3850-39bb-4a08-8dbf-f69bd8ca21c9] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1463.427757] env[67169]: ERROR nova.compute.manager [instance: c86c3850-39bb-4a08-8dbf-f69bd8ca21c9] self.driver.spawn(context, instance, image_meta, [ 1463.427757] env[67169]: ERROR nova.compute.manager [instance: c86c3850-39bb-4a08-8dbf-f69bd8ca21c9] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1463.427757] env[67169]: ERROR nova.compute.manager [instance: c86c3850-39bb-4a08-8dbf-f69bd8ca21c9] self._vmops.spawn(context, instance, image_meta, injected_files, [ 
1463.427757] env[67169]: ERROR nova.compute.manager [instance: c86c3850-39bb-4a08-8dbf-f69bd8ca21c9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1463.427757] env[67169]: ERROR nova.compute.manager [instance: c86c3850-39bb-4a08-8dbf-f69bd8ca21c9] self._fetch_image_if_missing(context, vi) [ 1463.427757] env[67169]: ERROR nova.compute.manager [instance: c86c3850-39bb-4a08-8dbf-f69bd8ca21c9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1463.427757] env[67169]: ERROR nova.compute.manager [instance: c86c3850-39bb-4a08-8dbf-f69bd8ca21c9] image_cache(vi, tmp_image_ds_loc) [ 1463.427757] env[67169]: ERROR nova.compute.manager [instance: c86c3850-39bb-4a08-8dbf-f69bd8ca21c9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1463.427757] env[67169]: ERROR nova.compute.manager [instance: c86c3850-39bb-4a08-8dbf-f69bd8ca21c9] vm_util.copy_virtual_disk( [ 1463.427757] env[67169]: ERROR nova.compute.manager [instance: c86c3850-39bb-4a08-8dbf-f69bd8ca21c9] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1463.427757] env[67169]: ERROR nova.compute.manager [instance: c86c3850-39bb-4a08-8dbf-f69bd8ca21c9] session._wait_for_task(vmdk_copy_task) [ 1463.427757] env[67169]: ERROR nova.compute.manager [instance: c86c3850-39bb-4a08-8dbf-f69bd8ca21c9] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1463.427757] env[67169]: ERROR nova.compute.manager [instance: c86c3850-39bb-4a08-8dbf-f69bd8ca21c9] return self.wait_for_task(task_ref) [ 1463.427757] env[67169]: ERROR nova.compute.manager [instance: c86c3850-39bb-4a08-8dbf-f69bd8ca21c9] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1463.427757] env[67169]: ERROR nova.compute.manager [instance: c86c3850-39bb-4a08-8dbf-f69bd8ca21c9] return evt.wait() [ 1463.427757] env[67169]: ERROR 
nova.compute.manager [instance: c86c3850-39bb-4a08-8dbf-f69bd8ca21c9] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1463.427757] env[67169]: ERROR nova.compute.manager [instance: c86c3850-39bb-4a08-8dbf-f69bd8ca21c9] result = hub.switch() [ 1463.427757] env[67169]: ERROR nova.compute.manager [instance: c86c3850-39bb-4a08-8dbf-f69bd8ca21c9] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1463.427757] env[67169]: ERROR nova.compute.manager [instance: c86c3850-39bb-4a08-8dbf-f69bd8ca21c9] return self.greenlet.switch() [ 1463.427757] env[67169]: ERROR nova.compute.manager [instance: c86c3850-39bb-4a08-8dbf-f69bd8ca21c9] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1463.427757] env[67169]: ERROR nova.compute.manager [instance: c86c3850-39bb-4a08-8dbf-f69bd8ca21c9] self.f(*self.args, **self.kw) [ 1463.427757] env[67169]: ERROR nova.compute.manager [instance: c86c3850-39bb-4a08-8dbf-f69bd8ca21c9] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1463.427757] env[67169]: ERROR nova.compute.manager [instance: c86c3850-39bb-4a08-8dbf-f69bd8ca21c9] raise exceptions.translate_fault(task_info.error) [ 1463.427757] env[67169]: ERROR nova.compute.manager [instance: c86c3850-39bb-4a08-8dbf-f69bd8ca21c9] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1463.427757] env[67169]: ERROR nova.compute.manager [instance: c86c3850-39bb-4a08-8dbf-f69bd8ca21c9] Faults: ['InvalidArgument'] [ 1463.427757] env[67169]: ERROR nova.compute.manager [instance: c86c3850-39bb-4a08-8dbf-f69bd8ca21c9] [ 1463.428536] env[67169]: DEBUG nova.compute.utils [None req-d696ce3a-62d6-4cfe-ab03-acc4f9294f1b tempest-ServerShowV257Test-257580876 tempest-ServerShowV257Test-257580876-project-member] [instance: c86c3850-39bb-4a08-8dbf-f69bd8ca21c9] 
VimFaultException {{(pid=67169) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1463.429901] env[67169]: DEBUG nova.compute.manager [None req-d696ce3a-62d6-4cfe-ab03-acc4f9294f1b tempest-ServerShowV257Test-257580876 tempest-ServerShowV257Test-257580876-project-member] [instance: c86c3850-39bb-4a08-8dbf-f69bd8ca21c9] Build of instance c86c3850-39bb-4a08-8dbf-f69bd8ca21c9 was re-scheduled: A specified parameter was not correct: fileType [ 1463.429901] env[67169]: Faults: ['InvalidArgument'] {{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1463.430307] env[67169]: DEBUG nova.compute.manager [None req-d696ce3a-62d6-4cfe-ab03-acc4f9294f1b tempest-ServerShowV257Test-257580876 tempest-ServerShowV257Test-257580876-project-member] [instance: c86c3850-39bb-4a08-8dbf-f69bd8ca21c9] Unplugging VIFs for instance {{(pid=67169) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1463.430535] env[67169]: DEBUG oslo_concurrency.lockutils [None req-d696ce3a-62d6-4cfe-ab03-acc4f9294f1b tempest-ServerShowV257Test-257580876 tempest-ServerShowV257Test-257580876-project-member] Acquiring lock "refresh_cache-c86c3850-39bb-4a08-8dbf-f69bd8ca21c9" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1463.430685] env[67169]: DEBUG oslo_concurrency.lockutils [None req-d696ce3a-62d6-4cfe-ab03-acc4f9294f1b tempest-ServerShowV257Test-257580876 tempest-ServerShowV257Test-257580876-project-member] Acquired lock "refresh_cache-c86c3850-39bb-4a08-8dbf-f69bd8ca21c9" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1463.430846] env[67169]: DEBUG nova.network.neutron [None req-d696ce3a-62d6-4cfe-ab03-acc4f9294f1b tempest-ServerShowV257Test-257580876 tempest-ServerShowV257Test-257580876-project-member] [instance: c86c3850-39bb-4a08-8dbf-f69bd8ca21c9] Building network info cache for instance {{(pid=67169) 
_get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1463.456436] env[67169]: DEBUG nova.network.neutron [None req-d696ce3a-62d6-4cfe-ab03-acc4f9294f1b tempest-ServerShowV257Test-257580876 tempest-ServerShowV257Test-257580876-project-member] [instance: c86c3850-39bb-4a08-8dbf-f69bd8ca21c9] Instance cache missing network info. {{(pid=67169) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1463.525402] env[67169]: DEBUG nova.network.neutron [None req-d696ce3a-62d6-4cfe-ab03-acc4f9294f1b tempest-ServerShowV257Test-257580876 tempest-ServerShowV257Test-257580876-project-member] [instance: c86c3850-39bb-4a08-8dbf-f69bd8ca21c9] Updating instance_info_cache with network_info: [] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1463.534145] env[67169]: DEBUG oslo_concurrency.lockutils [None req-d696ce3a-62d6-4cfe-ab03-acc4f9294f1b tempest-ServerShowV257Test-257580876 tempest-ServerShowV257Test-257580876-project-member] Releasing lock "refresh_cache-c86c3850-39bb-4a08-8dbf-f69bd8ca21c9" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1463.534388] env[67169]: DEBUG nova.compute.manager [None req-d696ce3a-62d6-4cfe-ab03-acc4f9294f1b tempest-ServerShowV257Test-257580876 tempest-ServerShowV257Test-257580876-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=67169) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1463.534576] env[67169]: DEBUG nova.compute.manager [None req-d696ce3a-62d6-4cfe-ab03-acc4f9294f1b tempest-ServerShowV257Test-257580876 tempest-ServerShowV257Test-257580876-project-member] [instance: c86c3850-39bb-4a08-8dbf-f69bd8ca21c9] Skipping network deallocation for instance since networking was not requested. 
{{(pid=67169) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1463.614201] env[67169]: INFO nova.scheduler.client.report [None req-d696ce3a-62d6-4cfe-ab03-acc4f9294f1b tempest-ServerShowV257Test-257580876 tempest-ServerShowV257Test-257580876-project-member] Deleted allocations for instance c86c3850-39bb-4a08-8dbf-f69bd8ca21c9 [ 1463.634679] env[67169]: DEBUG oslo_concurrency.lockutils [None req-d696ce3a-62d6-4cfe-ab03-acc4f9294f1b tempest-ServerShowV257Test-257580876 tempest-ServerShowV257Test-257580876-project-member] Lock "c86c3850-39bb-4a08-8dbf-f69bd8ca21c9" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 586.943s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1463.635886] env[67169]: DEBUG oslo_concurrency.lockutils [None req-4a5fd5d9-ec2e-42b6-b492-4d9ab4d97914 tempest-ServerShowV257Test-257580876 tempest-ServerShowV257Test-257580876-project-member] Lock "c86c3850-39bb-4a08-8dbf-f69bd8ca21c9" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 392.168s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1463.636113] env[67169]: DEBUG oslo_concurrency.lockutils [None req-4a5fd5d9-ec2e-42b6-b492-4d9ab4d97914 tempest-ServerShowV257Test-257580876 tempest-ServerShowV257Test-257580876-project-member] Acquiring lock "c86c3850-39bb-4a08-8dbf-f69bd8ca21c9-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1463.636318] env[67169]: DEBUG oslo_concurrency.lockutils [None req-4a5fd5d9-ec2e-42b6-b492-4d9ab4d97914 tempest-ServerShowV257Test-257580876 tempest-ServerShowV257Test-257580876-project-member] Lock "c86c3850-39bb-4a08-8dbf-f69bd8ca21c9-events" acquired by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1463.636483] env[67169]: DEBUG oslo_concurrency.lockutils [None req-4a5fd5d9-ec2e-42b6-b492-4d9ab4d97914 tempest-ServerShowV257Test-257580876 tempest-ServerShowV257Test-257580876-project-member] Lock "c86c3850-39bb-4a08-8dbf-f69bd8ca21c9-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1463.638383] env[67169]: INFO nova.compute.manager [None req-4a5fd5d9-ec2e-42b6-b492-4d9ab4d97914 tempest-ServerShowV257Test-257580876 tempest-ServerShowV257Test-257580876-project-member] [instance: c86c3850-39bb-4a08-8dbf-f69bd8ca21c9] Terminating instance [ 1463.639825] env[67169]: DEBUG oslo_concurrency.lockutils [None req-4a5fd5d9-ec2e-42b6-b492-4d9ab4d97914 tempest-ServerShowV257Test-257580876 tempest-ServerShowV257Test-257580876-project-member] Acquiring lock "refresh_cache-c86c3850-39bb-4a08-8dbf-f69bd8ca21c9" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1463.639981] env[67169]: DEBUG oslo_concurrency.lockutils [None req-4a5fd5d9-ec2e-42b6-b492-4d9ab4d97914 tempest-ServerShowV257Test-257580876 tempest-ServerShowV257Test-257580876-project-member] Acquired lock "refresh_cache-c86c3850-39bb-4a08-8dbf-f69bd8ca21c9" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1463.640156] env[67169]: DEBUG nova.network.neutron [None req-4a5fd5d9-ec2e-42b6-b492-4d9ab4d97914 tempest-ServerShowV257Test-257580876 tempest-ServerShowV257Test-257580876-project-member] [instance: c86c3850-39bb-4a08-8dbf-f69bd8ca21c9] Building network info cache for instance {{(pid=67169) _get_instance_nw_info 
/opt/stack/nova/nova/network/neutron.py:2010}} [ 1463.648118] env[67169]: DEBUG nova.compute.manager [None req-f38336b9-8a52-414c-9609-3b46d9804727 tempest-AttachVolumeNegativeTest-2045904794 tempest-AttachVolumeNegativeTest-2045904794-project-member] [instance: cbf88ee7-b392-46d5-8645-2b3bea0a53d6] Starting instance... {{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1463.665158] env[67169]: DEBUG nova.network.neutron [None req-4a5fd5d9-ec2e-42b6-b492-4d9ab4d97914 tempest-ServerShowV257Test-257580876 tempest-ServerShowV257Test-257580876-project-member] [instance: c86c3850-39bb-4a08-8dbf-f69bd8ca21c9] Instance cache missing network info. {{(pid=67169) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1463.675471] env[67169]: DEBUG nova.compute.manager [None req-f38336b9-8a52-414c-9609-3b46d9804727 tempest-AttachVolumeNegativeTest-2045904794 tempest-AttachVolumeNegativeTest-2045904794-project-member] [instance: cbf88ee7-b392-46d5-8645-2b3bea0a53d6] Instance disappeared before build. {{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1463.694445] env[67169]: DEBUG oslo_concurrency.lockutils [None req-f38336b9-8a52-414c-9609-3b46d9804727 tempest-AttachVolumeNegativeTest-2045904794 tempest-AttachVolumeNegativeTest-2045904794-project-member] Lock "cbf88ee7-b392-46d5-8645-2b3bea0a53d6" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 210.512s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1463.702734] env[67169]: DEBUG nova.compute.manager [None req-1f74b6c8-d5a6-4795-bf0a-f9fc6bd88668 tempest-ServersNegativeTestJSON-1601465401 tempest-ServersNegativeTestJSON-1601465401-project-member] [instance: 2e156908-c313-4229-840d-13ed8e6d4074] Starting instance... 
{{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1463.724743] env[67169]: DEBUG nova.network.neutron [None req-4a5fd5d9-ec2e-42b6-b492-4d9ab4d97914 tempest-ServerShowV257Test-257580876 tempest-ServerShowV257Test-257580876-project-member] [instance: c86c3850-39bb-4a08-8dbf-f69bd8ca21c9] Updating instance_info_cache with network_info: [] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1463.735797] env[67169]: DEBUG oslo_concurrency.lockutils [None req-4a5fd5d9-ec2e-42b6-b492-4d9ab4d97914 tempest-ServerShowV257Test-257580876 tempest-ServerShowV257Test-257580876-project-member] Releasing lock "refresh_cache-c86c3850-39bb-4a08-8dbf-f69bd8ca21c9" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1463.736176] env[67169]: DEBUG nova.compute.manager [None req-4a5fd5d9-ec2e-42b6-b492-4d9ab4d97914 tempest-ServerShowV257Test-257580876 tempest-ServerShowV257Test-257580876-project-member] [instance: c86c3850-39bb-4a08-8dbf-f69bd8ca21c9] Start destroying the instance on the hypervisor. 
{{(pid=67169) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1463.736362] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-4a5fd5d9-ec2e-42b6-b492-4d9ab4d97914 tempest-ServerShowV257Test-257580876 tempest-ServerShowV257Test-257580876-project-member] [instance: c86c3850-39bb-4a08-8dbf-f69bd8ca21c9] Destroying instance {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1463.738723] env[67169]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-27071b42-a76d-4a0e-8820-677f1bbf5112 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1463.747860] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5dcbd74f-8b4e-4e4e-8e03-570a68b34516 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1463.758354] env[67169]: DEBUG oslo_concurrency.lockutils [None req-1f74b6c8-d5a6-4795-bf0a-f9fc6bd88668 tempest-ServersNegativeTestJSON-1601465401 tempest-ServersNegativeTestJSON-1601465401-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1463.758569] env[67169]: DEBUG oslo_concurrency.lockutils [None req-1f74b6c8-d5a6-4795-bf0a-f9fc6bd88668 tempest-ServersNegativeTestJSON-1601465401 tempest-ServersNegativeTestJSON-1601465401-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1463.759920] env[67169]: INFO nova.compute.claims [None req-1f74b6c8-d5a6-4795-bf0a-f9fc6bd88668 tempest-ServersNegativeTestJSON-1601465401 
tempest-ServersNegativeTestJSON-1601465401-project-member] [instance: 2e156908-c313-4229-840d-13ed8e6d4074] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1463.779476] env[67169]: WARNING nova.virt.vmwareapi.vmops [None req-4a5fd5d9-ec2e-42b6-b492-4d9ab4d97914 tempest-ServerShowV257Test-257580876 tempest-ServerShowV257Test-257580876-project-member] [instance: c86c3850-39bb-4a08-8dbf-f69bd8ca21c9] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance c86c3850-39bb-4a08-8dbf-f69bd8ca21c9 could not be found. [ 1463.779656] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-4a5fd5d9-ec2e-42b6-b492-4d9ab4d97914 tempest-ServerShowV257Test-257580876 tempest-ServerShowV257Test-257580876-project-member] [instance: c86c3850-39bb-4a08-8dbf-f69bd8ca21c9] Instance destroyed {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1463.779828] env[67169]: INFO nova.compute.manager [None req-4a5fd5d9-ec2e-42b6-b492-4d9ab4d97914 tempest-ServerShowV257Test-257580876 tempest-ServerShowV257Test-257580876-project-member] [instance: c86c3850-39bb-4a08-8dbf-f69bd8ca21c9] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1463.780066] env[67169]: DEBUG oslo.service.loopingcall [None req-4a5fd5d9-ec2e-42b6-b492-4d9ab4d97914 tempest-ServerShowV257Test-257580876 tempest-ServerShowV257Test-257580876-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=67169) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1463.780285] env[67169]: DEBUG nova.compute.manager [-] [instance: c86c3850-39bb-4a08-8dbf-f69bd8ca21c9] Deallocating network for instance {{(pid=67169) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1463.780379] env[67169]: DEBUG nova.network.neutron [-] [instance: c86c3850-39bb-4a08-8dbf-f69bd8ca21c9] deallocate_for_instance() {{(pid=67169) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1463.797364] env[67169]: DEBUG nova.network.neutron [-] [instance: c86c3850-39bb-4a08-8dbf-f69bd8ca21c9] Instance cache missing network info. {{(pid=67169) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1463.805514] env[67169]: DEBUG nova.network.neutron [-] [instance: c86c3850-39bb-4a08-8dbf-f69bd8ca21c9] Updating instance_info_cache with network_info: [] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1463.812337] env[67169]: INFO nova.compute.manager [-] [instance: c86c3850-39bb-4a08-8dbf-f69bd8ca21c9] Took 0.03 seconds to deallocate network for instance. 
[ 1463.904518] env[67169]: DEBUG oslo_concurrency.lockutils [None req-4a5fd5d9-ec2e-42b6-b492-4d9ab4d97914 tempest-ServerShowV257Test-257580876 tempest-ServerShowV257Test-257580876-project-member] Lock "c86c3850-39bb-4a08-8dbf-f69bd8ca21c9" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.269s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1463.906371] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "c86c3850-39bb-4a08-8dbf-f69bd8ca21c9" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 278.205s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1463.906371] env[67169]: INFO nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: c86c3850-39bb-4a08-8dbf-f69bd8ca21c9] During sync_power_state the instance has a pending task (deleting). Skip. 
[ 1463.906501] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "c86c3850-39bb-4a08-8dbf-f69bd8ca21c9" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.001s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1463.983433] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5e894f77-4d20-4194-858f-9390a16bff3d {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1463.991017] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1e3a8665-ce5e-4d6f-b2fd-2ac9b13c248f {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1464.022917] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d581f14a-118f-4a94-a11f-e1d24ba031a7 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1464.029952] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-80b0b80c-1094-4264-8a89-218d6ab464b6 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1464.042762] env[67169]: DEBUG nova.compute.provider_tree [None req-1f74b6c8-d5a6-4795-bf0a-f9fc6bd88668 tempest-ServersNegativeTestJSON-1601465401 tempest-ServersNegativeTestJSON-1601465401-project-member] Inventory has not changed in ProviderTree for provider: 6570906a-ac37-4859-b1f2-4bbacc48d3f3 {{(pid=67169) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1464.052045] env[67169]: DEBUG nova.scheduler.client.report [None req-1f74b6c8-d5a6-4795-bf0a-f9fc6bd88668 
tempest-ServersNegativeTestJSON-1601465401 tempest-ServersNegativeTestJSON-1601465401-project-member] Inventory has not changed for provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 171, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67169) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1464.067454] env[67169]: DEBUG oslo_concurrency.lockutils [None req-1f74b6c8-d5a6-4795-bf0a-f9fc6bd88668 tempest-ServersNegativeTestJSON-1601465401 tempest-ServersNegativeTestJSON-1601465401-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.309s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1464.068659] env[67169]: DEBUG nova.compute.manager [None req-1f74b6c8-d5a6-4795-bf0a-f9fc6bd88668 tempest-ServersNegativeTestJSON-1601465401 tempest-ServersNegativeTestJSON-1601465401-project-member] [instance: 2e156908-c313-4229-840d-13ed8e6d4074] Start building networks asynchronously for instance. 
{{(pid=67169) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1464.102033] env[67169]: DEBUG nova.compute.utils [None req-1f74b6c8-d5a6-4795-bf0a-f9fc6bd88668 tempest-ServersNegativeTestJSON-1601465401 tempest-ServersNegativeTestJSON-1601465401-project-member] Using /dev/sd instead of None {{(pid=67169) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1464.103731] env[67169]: DEBUG nova.compute.manager [None req-1f74b6c8-d5a6-4795-bf0a-f9fc6bd88668 tempest-ServersNegativeTestJSON-1601465401 tempest-ServersNegativeTestJSON-1601465401-project-member] [instance: 2e156908-c313-4229-840d-13ed8e6d4074] Allocating IP information in the background. {{(pid=67169) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1464.104052] env[67169]: DEBUG nova.network.neutron [None req-1f74b6c8-d5a6-4795-bf0a-f9fc6bd88668 tempest-ServersNegativeTestJSON-1601465401 tempest-ServersNegativeTestJSON-1601465401-project-member] [instance: 2e156908-c313-4229-840d-13ed8e6d4074] allocate_for_instance() {{(pid=67169) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1464.113258] env[67169]: DEBUG nova.compute.manager [None req-1f74b6c8-d5a6-4795-bf0a-f9fc6bd88668 tempest-ServersNegativeTestJSON-1601465401 tempest-ServersNegativeTestJSON-1601465401-project-member] [instance: 2e156908-c313-4229-840d-13ed8e6d4074] Start building block device mappings for instance. {{(pid=67169) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1464.183805] env[67169]: DEBUG nova.compute.manager [None req-1f74b6c8-d5a6-4795-bf0a-f9fc6bd88668 tempest-ServersNegativeTestJSON-1601465401 tempest-ServersNegativeTestJSON-1601465401-project-member] [instance: 2e156908-c313-4229-840d-13ed8e6d4074] Start spawning the instance on the hypervisor. 
{{(pid=67169) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1464.208159] env[67169]: DEBUG nova.virt.hardware [None req-1f74b6c8-d5a6-4795-bf0a-f9fc6bd88668 tempest-ServersNegativeTestJSON-1601465401 tempest-ServersNegativeTestJSON-1601465401-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-03-06T19:55:10Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-03-06T19:54:55Z,direct_url=,disk_format='vmdk',id=285931c9-8b83-4997-8c4d-6a79005e36ba,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c31f6504bb73492890b262ff43fdf9bc',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-03-06T19:54:55Z,virtual_size=,visibility=), allow threads: False {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1464.208421] env[67169]: DEBUG nova.virt.hardware [None req-1f74b6c8-d5a6-4795-bf0a-f9fc6bd88668 tempest-ServersNegativeTestJSON-1601465401 tempest-ServersNegativeTestJSON-1601465401-project-member] Flavor limits 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1464.208584] env[67169]: DEBUG nova.virt.hardware [None req-1f74b6c8-d5a6-4795-bf0a-f9fc6bd88668 tempest-ServersNegativeTestJSON-1601465401 tempest-ServersNegativeTestJSON-1601465401-project-member] Image limits 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1464.208754] env[67169]: DEBUG nova.virt.hardware [None req-1f74b6c8-d5a6-4795-bf0a-f9fc6bd88668 tempest-ServersNegativeTestJSON-1601465401 tempest-ServersNegativeTestJSON-1601465401-project-member] Flavor 
pref 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1464.208893] env[67169]: DEBUG nova.virt.hardware [None req-1f74b6c8-d5a6-4795-bf0a-f9fc6bd88668 tempest-ServersNegativeTestJSON-1601465401 tempest-ServersNegativeTestJSON-1601465401-project-member] Image pref 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1464.209044] env[67169]: DEBUG nova.virt.hardware [None req-1f74b6c8-d5a6-4795-bf0a-f9fc6bd88668 tempest-ServersNegativeTestJSON-1601465401 tempest-ServersNegativeTestJSON-1601465401-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1464.209252] env[67169]: DEBUG nova.virt.hardware [None req-1f74b6c8-d5a6-4795-bf0a-f9fc6bd88668 tempest-ServersNegativeTestJSON-1601465401 tempest-ServersNegativeTestJSON-1601465401-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1464.209712] env[67169]: DEBUG nova.virt.hardware [None req-1f74b6c8-d5a6-4795-bf0a-f9fc6bd88668 tempest-ServersNegativeTestJSON-1601465401 tempest-ServersNegativeTestJSON-1601465401-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67169) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1464.209712] env[67169]: DEBUG nova.virt.hardware [None req-1f74b6c8-d5a6-4795-bf0a-f9fc6bd88668 tempest-ServersNegativeTestJSON-1601465401 tempest-ServersNegativeTestJSON-1601465401-project-member] Got 1 possible topologies {{(pid=67169) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1464.209712] env[67169]: DEBUG nova.virt.hardware [None req-1f74b6c8-d5a6-4795-bf0a-f9fc6bd88668 tempest-ServersNegativeTestJSON-1601465401 
tempest-ServersNegativeTestJSON-1601465401-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1464.209865] env[67169]: DEBUG nova.virt.hardware [None req-1f74b6c8-d5a6-4795-bf0a-f9fc6bd88668 tempest-ServersNegativeTestJSON-1601465401 tempest-ServersNegativeTestJSON-1601465401-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1464.210770] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-207ea648-d3a6-44ca-9221-5bace68c7b81 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1464.214623] env[67169]: DEBUG nova.policy [None req-1f74b6c8-d5a6-4795-bf0a-f9fc6bd88668 tempest-ServersNegativeTestJSON-1601465401 tempest-ServersNegativeTestJSON-1601465401-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '47c8144396cb48b7ab29a514e0d30168', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '50680419dd83401898f1b0ef7e36edd0', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67169) authorize /opt/stack/nova/nova/policy.py:203}} [ 1464.222470] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8da7f561-54f3-4b0a-a55e-bd3d77c4f4a0 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1464.733148] env[67169]: DEBUG nova.network.neutron [None req-1f74b6c8-d5a6-4795-bf0a-f9fc6bd88668 
tempest-ServersNegativeTestJSON-1601465401 tempest-ServersNegativeTestJSON-1601465401-project-member] [instance: 2e156908-c313-4229-840d-13ed8e6d4074] Successfully created port: b0de8f88-74eb-44c7-bb52-79dda2a16a29 {{(pid=67169) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1465.444244] env[67169]: DEBUG nova.compute.manager [req-90372b91-11d9-4d13-8eba-c7adcbc5d262 req-c1b83998-82c1-4d6e-94b5-3bbdb03f26f2 service nova] [instance: 2e156908-c313-4229-840d-13ed8e6d4074] Received event network-vif-plugged-b0de8f88-74eb-44c7-bb52-79dda2a16a29 {{(pid=67169) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1465.444244] env[67169]: DEBUG oslo_concurrency.lockutils [req-90372b91-11d9-4d13-8eba-c7adcbc5d262 req-c1b83998-82c1-4d6e-94b5-3bbdb03f26f2 service nova] Acquiring lock "2e156908-c313-4229-840d-13ed8e6d4074-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1465.444244] env[67169]: DEBUG oslo_concurrency.lockutils [req-90372b91-11d9-4d13-8eba-c7adcbc5d262 req-c1b83998-82c1-4d6e-94b5-3bbdb03f26f2 service nova] Lock "2e156908-c313-4229-840d-13ed8e6d4074-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1465.444244] env[67169]: DEBUG oslo_concurrency.lockutils [req-90372b91-11d9-4d13-8eba-c7adcbc5d262 req-c1b83998-82c1-4d6e-94b5-3bbdb03f26f2 service nova] Lock "2e156908-c313-4229-840d-13ed8e6d4074-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1465.444244] env[67169]: DEBUG nova.compute.manager [req-90372b91-11d9-4d13-8eba-c7adcbc5d262 
req-c1b83998-82c1-4d6e-94b5-3bbdb03f26f2 service nova] [instance: 2e156908-c313-4229-840d-13ed8e6d4074] No waiting events found dispatching network-vif-plugged-b0de8f88-74eb-44c7-bb52-79dda2a16a29 {{(pid=67169) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1465.444244] env[67169]: WARNING nova.compute.manager [req-90372b91-11d9-4d13-8eba-c7adcbc5d262 req-c1b83998-82c1-4d6e-94b5-3bbdb03f26f2 service nova] [instance: 2e156908-c313-4229-840d-13ed8e6d4074] Received unexpected event network-vif-plugged-b0de8f88-74eb-44c7-bb52-79dda2a16a29 for instance with vm_state building and task_state spawning. [ 1465.573140] env[67169]: DEBUG nova.network.neutron [None req-1f74b6c8-d5a6-4795-bf0a-f9fc6bd88668 tempest-ServersNegativeTestJSON-1601465401 tempest-ServersNegativeTestJSON-1601465401-project-member] [instance: 2e156908-c313-4229-840d-13ed8e6d4074] Successfully updated port: b0de8f88-74eb-44c7-bb52-79dda2a16a29 {{(pid=67169) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1465.589509] env[67169]: DEBUG oslo_concurrency.lockutils [None req-1f74b6c8-d5a6-4795-bf0a-f9fc6bd88668 tempest-ServersNegativeTestJSON-1601465401 tempest-ServersNegativeTestJSON-1601465401-project-member] Acquiring lock "refresh_cache-2e156908-c313-4229-840d-13ed8e6d4074" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1465.589509] env[67169]: DEBUG oslo_concurrency.lockutils [None req-1f74b6c8-d5a6-4795-bf0a-f9fc6bd88668 tempest-ServersNegativeTestJSON-1601465401 tempest-ServersNegativeTestJSON-1601465401-project-member] Acquired lock "refresh_cache-2e156908-c313-4229-840d-13ed8e6d4074" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1465.589509] env[67169]: DEBUG nova.network.neutron [None req-1f74b6c8-d5a6-4795-bf0a-f9fc6bd88668 tempest-ServersNegativeTestJSON-1601465401 tempest-ServersNegativeTestJSON-1601465401-project-member] [instance: 
2e156908-c313-4229-840d-13ed8e6d4074] Building network info cache for instance {{(pid=67169) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1465.656266] env[67169]: DEBUG nova.network.neutron [None req-1f74b6c8-d5a6-4795-bf0a-f9fc6bd88668 tempest-ServersNegativeTestJSON-1601465401 tempest-ServersNegativeTestJSON-1601465401-project-member] [instance: 2e156908-c313-4229-840d-13ed8e6d4074] Instance cache missing network info. {{(pid=67169) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1465.825778] env[67169]: DEBUG nova.network.neutron [None req-1f74b6c8-d5a6-4795-bf0a-f9fc6bd88668 tempest-ServersNegativeTestJSON-1601465401 tempest-ServersNegativeTestJSON-1601465401-project-member] [instance: 2e156908-c313-4229-840d-13ed8e6d4074] Updating instance_info_cache with network_info: [{"id": "b0de8f88-74eb-44c7-bb52-79dda2a16a29", "address": "fa:16:3e:6c:9e:fb", "network": {"id": "92c79b42-01f4-473a-8990-6f7cfb917f3b", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-64575008-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "50680419dd83401898f1b0ef7e36edd0", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "205fb402-8eaf-4b61-8f57-8f216024179a", "external-id": "nsx-vlan-transportzone-78", "segmentation_id": 78, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb0de8f88-74", "ovs_interfaceid": "b0de8f88-74eb-44c7-bb52-79dda2a16a29", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": 
true, "meta": {}}] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1465.837176] env[67169]: DEBUG oslo_concurrency.lockutils [None req-1f74b6c8-d5a6-4795-bf0a-f9fc6bd88668 tempest-ServersNegativeTestJSON-1601465401 tempest-ServersNegativeTestJSON-1601465401-project-member] Releasing lock "refresh_cache-2e156908-c313-4229-840d-13ed8e6d4074" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1465.837593] env[67169]: DEBUG nova.compute.manager [None req-1f74b6c8-d5a6-4795-bf0a-f9fc6bd88668 tempest-ServersNegativeTestJSON-1601465401 tempest-ServersNegativeTestJSON-1601465401-project-member] [instance: 2e156908-c313-4229-840d-13ed8e6d4074] Instance network_info: |[{"id": "b0de8f88-74eb-44c7-bb52-79dda2a16a29", "address": "fa:16:3e:6c:9e:fb", "network": {"id": "92c79b42-01f4-473a-8990-6f7cfb917f3b", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-64575008-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "50680419dd83401898f1b0ef7e36edd0", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "205fb402-8eaf-4b61-8f57-8f216024179a", "external-id": "nsx-vlan-transportzone-78", "segmentation_id": 78, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb0de8f88-74", "ovs_interfaceid": "b0de8f88-74eb-44c7-bb52-79dda2a16a29", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67169) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1971}} [ 1465.838421] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-1f74b6c8-d5a6-4795-bf0a-f9fc6bd88668 tempest-ServersNegativeTestJSON-1601465401 tempest-ServersNegativeTestJSON-1601465401-project-member] [instance: 2e156908-c313-4229-840d-13ed8e6d4074] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:6c:9e:fb', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '205fb402-8eaf-4b61-8f57-8f216024179a', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'b0de8f88-74eb-44c7-bb52-79dda2a16a29', 'vif_model': 'vmxnet3'}] {{(pid=67169) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1465.846089] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [None req-1f74b6c8-d5a6-4795-bf0a-f9fc6bd88668 tempest-ServersNegativeTestJSON-1601465401 tempest-ServersNegativeTestJSON-1601465401-project-member] Creating folder: Project (50680419dd83401898f1b0ef7e36edd0). Parent ref: group-v566843. {{(pid=67169) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1465.846618] env[67169]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-a33c842c-b8b7-4a74-ab19-bcbb44d5bb78 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1465.857411] env[67169]: INFO nova.virt.vmwareapi.vm_util [None req-1f74b6c8-d5a6-4795-bf0a-f9fc6bd88668 tempest-ServersNegativeTestJSON-1601465401 tempest-ServersNegativeTestJSON-1601465401-project-member] Created folder: Project (50680419dd83401898f1b0ef7e36edd0) in parent group-v566843. [ 1465.857615] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [None req-1f74b6c8-d5a6-4795-bf0a-f9fc6bd88668 tempest-ServersNegativeTestJSON-1601465401 tempest-ServersNegativeTestJSON-1601465401-project-member] Creating folder: Instances. Parent ref: group-v566931. 
{{(pid=67169) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1465.857812] env[67169]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-35c1fcb6-14f6-4b54-9a3d-2d62642054b5 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1465.866422] env[67169]: INFO nova.virt.vmwareapi.vm_util [None req-1f74b6c8-d5a6-4795-bf0a-f9fc6bd88668 tempest-ServersNegativeTestJSON-1601465401 tempest-ServersNegativeTestJSON-1601465401-project-member] Created folder: Instances in parent group-v566931. [ 1465.866635] env[67169]: DEBUG oslo.service.loopingcall [None req-1f74b6c8-d5a6-4795-bf0a-f9fc6bd88668 tempest-ServersNegativeTestJSON-1601465401 tempest-ServersNegativeTestJSON-1601465401-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67169) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1465.866818] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 2e156908-c313-4229-840d-13ed8e6d4074] Creating VM on the ESX host {{(pid=67169) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1465.867028] env[67169]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-729d2da0-9147-4991-a479-b150d6cef9b6 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1465.885797] env[67169]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1465.885797] env[67169]: value = "task-2819207" [ 1465.885797] env[67169]: _type = "Task" [ 1465.885797] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1465.894149] env[67169]: DEBUG oslo_vmware.api [-] Task: {'id': task-2819207, 'name': CreateVM_Task} progress is 0%. 
{{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1466.395462] env[67169]: DEBUG oslo_vmware.api [-] Task: {'id': task-2819207, 'name': CreateVM_Task, 'duration_secs': 0.369861} completed successfully. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1466.399027] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 2e156908-c313-4229-840d-13ed8e6d4074] Created VM on the ESX host {{(pid=67169) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1466.399027] env[67169]: DEBUG oslo_concurrency.lockutils [None req-1f74b6c8-d5a6-4795-bf0a-f9fc6bd88668 tempest-ServersNegativeTestJSON-1601465401 tempest-ServersNegativeTestJSON-1601465401-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1466.399027] env[67169]: DEBUG oslo_concurrency.lockutils [None req-1f74b6c8-d5a6-4795-bf0a-f9fc6bd88668 tempest-ServersNegativeTestJSON-1601465401 tempest-ServersNegativeTestJSON-1601465401-project-member] Acquired lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1466.399027] env[67169]: DEBUG oslo_concurrency.lockutils [None req-1f74b6c8-d5a6-4795-bf0a-f9fc6bd88668 tempest-ServersNegativeTestJSON-1601465401 tempest-ServersNegativeTestJSON-1601465401-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1466.399027] env[67169]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-cb8940e2-57a6-4612-b4eb-66948c9119ba {{(pid=67169) 
request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1466.402302] env[67169]: DEBUG oslo_vmware.api [None req-1f74b6c8-d5a6-4795-bf0a-f9fc6bd88668 tempest-ServersNegativeTestJSON-1601465401 tempest-ServersNegativeTestJSON-1601465401-project-member] Waiting for the task: (returnval){ [ 1466.402302] env[67169]: value = "session[52518c09-e5f2-035d-4dc1-3e29ddf94015]52e4dcbb-0db3-85e7-f7f4-0f2bb2177680" [ 1466.402302] env[67169]: _type = "Task" [ 1466.402302] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1466.410350] env[67169]: DEBUG oslo_vmware.api [None req-1f74b6c8-d5a6-4795-bf0a-f9fc6bd88668 tempest-ServersNegativeTestJSON-1601465401 tempest-ServersNegativeTestJSON-1601465401-project-member] Task: {'id': session[52518c09-e5f2-035d-4dc1-3e29ddf94015]52e4dcbb-0db3-85e7-f7f4-0f2bb2177680, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1466.912965] env[67169]: DEBUG oslo_concurrency.lockutils [None req-1f74b6c8-d5a6-4795-bf0a-f9fc6bd88668 tempest-ServersNegativeTestJSON-1601465401 tempest-ServersNegativeTestJSON-1601465401-project-member] Releasing lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1466.913355] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-1f74b6c8-d5a6-4795-bf0a-f9fc6bd88668 tempest-ServersNegativeTestJSON-1601465401 tempest-ServersNegativeTestJSON-1601465401-project-member] [instance: 2e156908-c313-4229-840d-13ed8e6d4074] Processing image 285931c9-8b83-4997-8c4d-6a79005e36ba {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1466.913617] env[67169]: DEBUG oslo_concurrency.lockutils [None req-1f74b6c8-d5a6-4795-bf0a-f9fc6bd88668 
tempest-ServersNegativeTestJSON-1601465401 tempest-ServersNegativeTestJSON-1601465401-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1467.473827] env[67169]: DEBUG nova.compute.manager [req-d626e06a-5015-44b9-8f0e-3fbe6675d496 req-42182837-63be-4b1c-a691-032f699bf334 service nova] [instance: 2e156908-c313-4229-840d-13ed8e6d4074] Received event network-changed-b0de8f88-74eb-44c7-bb52-79dda2a16a29 {{(pid=67169) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1467.474056] env[67169]: DEBUG nova.compute.manager [req-d626e06a-5015-44b9-8f0e-3fbe6675d496 req-42182837-63be-4b1c-a691-032f699bf334 service nova] [instance: 2e156908-c313-4229-840d-13ed8e6d4074] Refreshing instance network info cache due to event network-changed-b0de8f88-74eb-44c7-bb52-79dda2a16a29. 
{{(pid=67169) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1467.474269] env[67169]: DEBUG oslo_concurrency.lockutils [req-d626e06a-5015-44b9-8f0e-3fbe6675d496 req-42182837-63be-4b1c-a691-032f699bf334 service nova] Acquiring lock "refresh_cache-2e156908-c313-4229-840d-13ed8e6d4074" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1467.474412] env[67169]: DEBUG oslo_concurrency.lockutils [req-d626e06a-5015-44b9-8f0e-3fbe6675d496 req-42182837-63be-4b1c-a691-032f699bf334 service nova] Acquired lock "refresh_cache-2e156908-c313-4229-840d-13ed8e6d4074" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1467.474571] env[67169]: DEBUG nova.network.neutron [req-d626e06a-5015-44b9-8f0e-3fbe6675d496 req-42182837-63be-4b1c-a691-032f699bf334 service nova] [instance: 2e156908-c313-4229-840d-13ed8e6d4074] Refreshing network info cache for port b0de8f88-74eb-44c7-bb52-79dda2a16a29 {{(pid=67169) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1467.800617] env[67169]: DEBUG nova.network.neutron [req-d626e06a-5015-44b9-8f0e-3fbe6675d496 req-42182837-63be-4b1c-a691-032f699bf334 service nova] [instance: 2e156908-c313-4229-840d-13ed8e6d4074] Updated VIF entry in instance network info cache for port b0de8f88-74eb-44c7-bb52-79dda2a16a29. 
{{(pid=67169) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1467.800992] env[67169]: DEBUG nova.network.neutron [req-d626e06a-5015-44b9-8f0e-3fbe6675d496 req-42182837-63be-4b1c-a691-032f699bf334 service nova] [instance: 2e156908-c313-4229-840d-13ed8e6d4074] Updating instance_info_cache with network_info: [{"id": "b0de8f88-74eb-44c7-bb52-79dda2a16a29", "address": "fa:16:3e:6c:9e:fb", "network": {"id": "92c79b42-01f4-473a-8990-6f7cfb917f3b", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-64575008-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "50680419dd83401898f1b0ef7e36edd0", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "205fb402-8eaf-4b61-8f57-8f216024179a", "external-id": "nsx-vlan-transportzone-78", "segmentation_id": 78, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb0de8f88-74", "ovs_interfaceid": "b0de8f88-74eb-44c7-bb52-79dda2a16a29", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1467.812021] env[67169]: DEBUG oslo_concurrency.lockutils [req-d626e06a-5015-44b9-8f0e-3fbe6675d496 req-42182837-63be-4b1c-a691-032f699bf334 service nova] Releasing lock "refresh_cache-2e156908-c313-4229-840d-13ed8e6d4074" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1468.190750] env[67169]: DEBUG oslo_concurrency.lockutils 
[None req-2517fb03-dad8-414b-bb38-46ee103f363e tempest-ServersNegativeTestJSON-1601465401 tempest-ServersNegativeTestJSON-1601465401-project-member] Acquiring lock "2e156908-c313-4229-840d-13ed8e6d4074" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1471.157688] env[67169]: DEBUG oslo_concurrency.lockutils [None req-1249c8a0-343c-4729-8066-ecc802e7f711 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Acquiring lock "aedbfde6-26e1-410d-a311-e2c344f65062" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1471.157982] env[67169]: DEBUG oslo_concurrency.lockutils [None req-1249c8a0-343c-4729-8066-ecc802e7f711 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Lock "aedbfde6-26e1-410d-a311-e2c344f65062" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1471.183626] env[67169]: DEBUG oslo_concurrency.lockutils [None req-1249c8a0-343c-4729-8066-ecc802e7f711 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Acquiring lock "3d636f4c-c042-428f-be5d-1fbf20c61f0a" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1471.184516] env[67169]: DEBUG oslo_concurrency.lockutils [None req-1249c8a0-343c-4729-8066-ecc802e7f711 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] 
Lock "3d636f4c-c042-428f-be5d-1fbf20c61f0a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1491.666586] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1496.658513] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1496.658825] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Starting heal instance info cache {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1496.658825] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Rebuilding the list of instances to heal {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1496.680699] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 7bf839c0-3ec8-4329-823d-de1fae4833cb] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1496.680854] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: bab5d630-fec0-44e5-8088-12c8855aad66] Skipping network cache update for instance because it is Building. 
{{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1496.680984] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: a86fa702-2040-4e22-9eaa-5d64bc16f036] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1496.681126] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1496.681247] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 7817b417-599c-4619-8bd3-28d2e8236b9f] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1496.681366] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 883a792f-ae72-4475-8592-3076c2c2c2ae] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1496.681483] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 48376572-9e3a-4579-b2d7-b8b63312fab1] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1496.681600] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 74ea66f0-391c-437b-8aee-f784528d7963] Skipping network cache update for instance because it is Building. 
{{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1496.681714] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1496.681828] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 2e156908-c313-4229-840d-13ed8e6d4074] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1496.681946] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Didn't find any instances for network info cache update. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1496.682418] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1496.682561] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=67169) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1497.659172] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1497.659462] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1497.671541] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1497.671759] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1497.671928] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1497.672099] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67169) update_available_resource 
/opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1497.673214] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7fd01137-56ef-4e70-b787-2e187db2b189 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1497.681791] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-25f9df4a-90a1-451e-9dcd-387decdb6b11 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1497.695801] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ebc41383-31e4-47e9-8b6f-f9d7c87849c3 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1497.701805] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3bc7df02-f298-4b1a-9caf-f87c83109ac4 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1497.731623] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181014MB free_disk=171GB free_vcpus=48 pci_devices=None {{(pid=67169) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1497.731771] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1497.731957] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "compute_resources" 
acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1497.853654] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 7bf839c0-3ec8-4329-823d-de1fae4833cb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1497.853888] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance bab5d630-fec0-44e5-8088-12c8855aad66 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1497.854063] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance a86fa702-2040-4e22-9eaa-5d64bc16f036 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1497.854197] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1497.854317] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 7817b417-599c-4619-8bd3-28d2e8236b9f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1497.854436] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 883a792f-ae72-4475-8592-3076c2c2c2ae actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1497.854551] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 48376572-9e3a-4579-b2d7-b8b63312fab1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1497.854663] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 74ea66f0-391c-437b-8aee-f784528d7963 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1497.854776] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 7b7c8f84-c2d4-442e-93d3-60124767d096 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1497.854887] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 2e156908-c313-4229-840d-13ed8e6d4074 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1497.866201] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 2d7d3386-9854-4bf1-a680-5aed0a2329cb has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1497.876019] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance fa24a4a8-895c-4ea6-8e0a-4ed1134beff0 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1497.885248] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 5c675d7d-8915-4962-8bbd-c9b639ae2cb1 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1497.893987] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 04d3ae51-f3f1-427b-ae45-279b02e4b3e6 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1497.902274] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 4e978c21-ae48-422e-9126-a4144c86b86f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1497.911511] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance aedbfde6-26e1-410d-a311-e2c344f65062 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1497.920034] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 3d636f4c-c042-428f-be5d-1fbf20c61f0a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1497.920258] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67169) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1497.920404] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67169) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1497.935737] env[67169]: DEBUG nova.scheduler.client.report [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Refreshing inventories for resource provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 {{(pid=67169) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} [ 1497.949892] env[67169]: DEBUG nova.scheduler.client.report [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Updating ProviderTree inventory for provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 
'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 171, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67169) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} [ 1497.950087] env[67169]: DEBUG nova.compute.provider_tree [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Updating inventory in ProviderTree for provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 171, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67169) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 1497.960512] env[67169]: DEBUG nova.scheduler.client.report [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Refreshing aggregate associations for resource provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3, aggregates: None {{(pid=67169) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} [ 1497.977740] env[67169]: DEBUG nova.scheduler.client.report [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Refreshing trait associations for resource provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3, traits: COMPUTE_NODE,COMPUTE_SAME_HOST_COLD_MIGRATE,COMPUTE_IMAGE_TYPE_VMDK,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_ISO {{(pid=67169) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} [ 1498.167921] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-72b6ae8e-15fd-4310-861b-2c777c8567ad {{(pid=67169) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1498.175978] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d28addc7-d71f-413f-b3b7-5ac3c976af32 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1498.206920] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7ea5dc1a-0c69-4679-9706-966a0685b66e {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1498.214016] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-977928c6-d3b4-4d64-95bd-59cd56184e11 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1498.226972] env[67169]: DEBUG nova.compute.provider_tree [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Inventory has not changed in ProviderTree for provider: 6570906a-ac37-4859-b1f2-4bbacc48d3f3 {{(pid=67169) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1498.235077] env[67169]: DEBUG nova.scheduler.client.report [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Inventory has not changed for provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 171, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67169) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1498.251846] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Compute_service 
record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67169) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1498.252049] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.520s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1499.246690] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1499.247088] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1499.659201] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1500.653544] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1501.659031] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67169) run_periodic_tasks 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1508.974064] env[67169]: WARNING oslo_vmware.rw_handles [None req-0e91d283-0c2c-4495-ad84-ebaaebe9122d tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1508.974064] env[67169]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1508.974064] env[67169]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1508.974064] env[67169]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1508.974064] env[67169]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1508.974064] env[67169]: ERROR oslo_vmware.rw_handles response.begin() [ 1508.974064] env[67169]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1508.974064] env[67169]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1508.974064] env[67169]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1508.974064] env[67169]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1508.974064] env[67169]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1508.974064] env[67169]: ERROR oslo_vmware.rw_handles [ 1508.974064] env[67169]: DEBUG nova.virt.vmwareapi.images [None req-0e91d283-0c2c-4495-ad84-ebaaebe9122d tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 7bf839c0-3ec8-4329-823d-de1fae4833cb] Downloaded image file data 285931c9-8b83-4997-8c4d-6a79005e36ba to 
vmware_temp/cce07b6f-a01f-46a3-b6b4-3f95fc0508a3/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk on the data store datastore2 {{(pid=67169) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1508.975501] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-0e91d283-0c2c-4495-ad84-ebaaebe9122d tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 7bf839c0-3ec8-4329-823d-de1fae4833cb] Caching image {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1508.975809] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [None req-0e91d283-0c2c-4495-ad84-ebaaebe9122d tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Copying Virtual Disk [datastore2] vmware_temp/cce07b6f-a01f-46a3-b6b4-3f95fc0508a3/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk to [datastore2] vmware_temp/cce07b6f-a01f-46a3-b6b4-3f95fc0508a3/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk {{(pid=67169) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1508.976224] env[67169]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-1fbb6e0c-3f78-42e4-ab31-96bef2638c29 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1508.984769] env[67169]: DEBUG oslo_vmware.api [None req-0e91d283-0c2c-4495-ad84-ebaaebe9122d tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Waiting for the task: (returnval){ [ 1508.984769] env[67169]: value = "task-2819208" [ 1508.984769] env[67169]: _type = "Task" [ 1508.984769] env[67169]: } to complete. 
{{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1508.993076] env[67169]: DEBUG oslo_vmware.api [None req-0e91d283-0c2c-4495-ad84-ebaaebe9122d tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Task: {'id': task-2819208, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1509.494717] env[67169]: DEBUG oslo_vmware.exceptions [None req-0e91d283-0c2c-4495-ad84-ebaaebe9122d tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Fault InvalidArgument not matched. {{(pid=67169) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1509.495027] env[67169]: DEBUG oslo_concurrency.lockutils [None req-0e91d283-0c2c-4495-ad84-ebaaebe9122d tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Releasing lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1509.495602] env[67169]: ERROR nova.compute.manager [None req-0e91d283-0c2c-4495-ad84-ebaaebe9122d tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 7bf839c0-3ec8-4329-823d-de1fae4833cb] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1509.495602] env[67169]: Faults: ['InvalidArgument'] [ 1509.495602] env[67169]: ERROR nova.compute.manager [instance: 7bf839c0-3ec8-4329-823d-de1fae4833cb] Traceback (most recent call last): [ 1509.495602] env[67169]: ERROR nova.compute.manager [instance: 7bf839c0-3ec8-4329-823d-de1fae4833cb] File "/opt/stack/nova/nova/compute/manager.py", line 2868, 
in _build_resources [ 1509.495602] env[67169]: ERROR nova.compute.manager [instance: 7bf839c0-3ec8-4329-823d-de1fae4833cb] yield resources [ 1509.495602] env[67169]: ERROR nova.compute.manager [instance: 7bf839c0-3ec8-4329-823d-de1fae4833cb] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1509.495602] env[67169]: ERROR nova.compute.manager [instance: 7bf839c0-3ec8-4329-823d-de1fae4833cb] self.driver.spawn(context, instance, image_meta, [ 1509.495602] env[67169]: ERROR nova.compute.manager [instance: 7bf839c0-3ec8-4329-823d-de1fae4833cb] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1509.495602] env[67169]: ERROR nova.compute.manager [instance: 7bf839c0-3ec8-4329-823d-de1fae4833cb] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1509.495602] env[67169]: ERROR nova.compute.manager [instance: 7bf839c0-3ec8-4329-823d-de1fae4833cb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1509.495602] env[67169]: ERROR nova.compute.manager [instance: 7bf839c0-3ec8-4329-823d-de1fae4833cb] self._fetch_image_if_missing(context, vi) [ 1509.495602] env[67169]: ERROR nova.compute.manager [instance: 7bf839c0-3ec8-4329-823d-de1fae4833cb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1509.495602] env[67169]: ERROR nova.compute.manager [instance: 7bf839c0-3ec8-4329-823d-de1fae4833cb] image_cache(vi, tmp_image_ds_loc) [ 1509.495602] env[67169]: ERROR nova.compute.manager [instance: 7bf839c0-3ec8-4329-823d-de1fae4833cb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1509.495602] env[67169]: ERROR nova.compute.manager [instance: 7bf839c0-3ec8-4329-823d-de1fae4833cb] vm_util.copy_virtual_disk( [ 1509.495602] env[67169]: ERROR nova.compute.manager [instance: 7bf839c0-3ec8-4329-823d-de1fae4833cb] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1509.495602] 
env[67169]: ERROR nova.compute.manager [instance: 7bf839c0-3ec8-4329-823d-de1fae4833cb] session._wait_for_task(vmdk_copy_task) [ 1509.495602] env[67169]: ERROR nova.compute.manager [instance: 7bf839c0-3ec8-4329-823d-de1fae4833cb] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1509.495602] env[67169]: ERROR nova.compute.manager [instance: 7bf839c0-3ec8-4329-823d-de1fae4833cb] return self.wait_for_task(task_ref) [ 1509.495602] env[67169]: ERROR nova.compute.manager [instance: 7bf839c0-3ec8-4329-823d-de1fae4833cb] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1509.495602] env[67169]: ERROR nova.compute.manager [instance: 7bf839c0-3ec8-4329-823d-de1fae4833cb] return evt.wait() [ 1509.495602] env[67169]: ERROR nova.compute.manager [instance: 7bf839c0-3ec8-4329-823d-de1fae4833cb] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1509.495602] env[67169]: ERROR nova.compute.manager [instance: 7bf839c0-3ec8-4329-823d-de1fae4833cb] result = hub.switch() [ 1509.495602] env[67169]: ERROR nova.compute.manager [instance: 7bf839c0-3ec8-4329-823d-de1fae4833cb] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1509.495602] env[67169]: ERROR nova.compute.manager [instance: 7bf839c0-3ec8-4329-823d-de1fae4833cb] return self.greenlet.switch() [ 1509.495602] env[67169]: ERROR nova.compute.manager [instance: 7bf839c0-3ec8-4329-823d-de1fae4833cb] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1509.495602] env[67169]: ERROR nova.compute.manager [instance: 7bf839c0-3ec8-4329-823d-de1fae4833cb] self.f(*self.args, **self.kw) [ 1509.495602] env[67169]: ERROR nova.compute.manager [instance: 7bf839c0-3ec8-4329-823d-de1fae4833cb] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 
1509.495602] env[67169]: ERROR nova.compute.manager [instance: 7bf839c0-3ec8-4329-823d-de1fae4833cb] raise exceptions.translate_fault(task_info.error) [ 1509.495602] env[67169]: ERROR nova.compute.manager [instance: 7bf839c0-3ec8-4329-823d-de1fae4833cb] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1509.495602] env[67169]: ERROR nova.compute.manager [instance: 7bf839c0-3ec8-4329-823d-de1fae4833cb] Faults: ['InvalidArgument'] [ 1509.495602] env[67169]: ERROR nova.compute.manager [instance: 7bf839c0-3ec8-4329-823d-de1fae4833cb] [ 1509.496584] env[67169]: INFO nova.compute.manager [None req-0e91d283-0c2c-4495-ad84-ebaaebe9122d tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 7bf839c0-3ec8-4329-823d-de1fae4833cb] Terminating instance [ 1509.497446] env[67169]: DEBUG oslo_concurrency.lockutils [None req-bc4f14a6-29e8-4fae-84bb-02f11896dc72 tempest-AttachInterfacesV270Test-2005290711 tempest-AttachInterfacesV270Test-2005290711-project-member] Acquired lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1509.497659] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-bc4f14a6-29e8-4fae-84bb-02f11896dc72 tempest-AttachInterfacesV270Test-2005290711 tempest-AttachInterfacesV270Test-2005290711-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1509.497897] env[67169]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-24b4bf3a-145d-4fb7-b34e-123dee225670 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1509.500080] env[67169]: DEBUG nova.compute.manager [None 
req-0e91d283-0c2c-4495-ad84-ebaaebe9122d tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 7bf839c0-3ec8-4329-823d-de1fae4833cb] Start destroying the instance on the hypervisor. {{(pid=67169) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1509.500273] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-0e91d283-0c2c-4495-ad84-ebaaebe9122d tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 7bf839c0-3ec8-4329-823d-de1fae4833cb] Destroying instance {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1509.500982] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-086226c4-1be5-4b5e-8117-fb0268eac353 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1509.508068] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-0e91d283-0c2c-4495-ad84-ebaaebe9122d tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 7bf839c0-3ec8-4329-823d-de1fae4833cb] Unregistering the VM {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1509.508068] env[67169]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-97679411-10dd-4b9b-a98a-ceeb8307746b {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1509.510406] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-bc4f14a6-29e8-4fae-84bb-02f11896dc72 tempest-AttachInterfacesV270Test-2005290711 tempest-AttachInterfacesV270Test-2005290711-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1509.510587] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None 
req-bc4f14a6-29e8-4fae-84bb-02f11896dc72 tempest-AttachInterfacesV270Test-2005290711 tempest-AttachInterfacesV270Test-2005290711-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=67169) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1509.511589] env[67169]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-cf630d72-2d83-4822-a705-377e74913aff {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1509.516352] env[67169]: DEBUG oslo_vmware.api [None req-bc4f14a6-29e8-4fae-84bb-02f11896dc72 tempest-AttachInterfacesV270Test-2005290711 tempest-AttachInterfacesV270Test-2005290711-project-member] Waiting for the task: (returnval){ [ 1509.516352] env[67169]: value = "session[52518c09-e5f2-035d-4dc1-3e29ddf94015]52c6661f-0613-c9c2-0f61-61b08f8639dc" [ 1509.516352] env[67169]: _type = "Task" [ 1509.516352] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1509.523678] env[67169]: DEBUG oslo_vmware.api [None req-bc4f14a6-29e8-4fae-84bb-02f11896dc72 tempest-AttachInterfacesV270Test-2005290711 tempest-AttachInterfacesV270Test-2005290711-project-member] Task: {'id': session[52518c09-e5f2-035d-4dc1-3e29ddf94015]52c6661f-0613-c9c2-0f61-61b08f8639dc, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1509.577800] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-0e91d283-0c2c-4495-ad84-ebaaebe9122d tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 7bf839c0-3ec8-4329-823d-de1fae4833cb] Unregistered the VM {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1509.578048] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-0e91d283-0c2c-4495-ad84-ebaaebe9122d tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 7bf839c0-3ec8-4329-823d-de1fae4833cb] Deleting contents of the VM from datastore datastore2 {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1509.578238] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-0e91d283-0c2c-4495-ad84-ebaaebe9122d tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Deleting the datastore file [datastore2] 7bf839c0-3ec8-4329-823d-de1fae4833cb {{(pid=67169) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1509.578574] env[67169]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-2e753faf-2335-4397-8fcc-e68d9c910525 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1509.587128] env[67169]: DEBUG oslo_vmware.api [None req-0e91d283-0c2c-4495-ad84-ebaaebe9122d tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Waiting for the task: (returnval){ [ 1509.587128] env[67169]: value = "task-2819210" [ 1509.587128] env[67169]: _type = "Task" [ 1509.587128] env[67169]: } to complete. 
{{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1509.594371] env[67169]: DEBUG oslo_vmware.api [None req-0e91d283-0c2c-4495-ad84-ebaaebe9122d tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Task: {'id': task-2819210, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1510.026617] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-bc4f14a6-29e8-4fae-84bb-02f11896dc72 tempest-AttachInterfacesV270Test-2005290711 tempest-AttachInterfacesV270Test-2005290711-project-member] [instance: bab5d630-fec0-44e5-8088-12c8855aad66] Preparing fetch location {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1510.026905] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-bc4f14a6-29e8-4fae-84bb-02f11896dc72 tempest-AttachInterfacesV270Test-2005290711 tempest-AttachInterfacesV270Test-2005290711-project-member] Creating directory with path [datastore2] vmware_temp/45bac9a1-bae6-4ebb-a2f0-5a4c4ea77046/285931c9-8b83-4997-8c4d-6a79005e36ba {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1510.027208] env[67169]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-05c72493-70b6-4733-98b4-57313ec2a053 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1510.038357] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-bc4f14a6-29e8-4fae-84bb-02f11896dc72 tempest-AttachInterfacesV270Test-2005290711 tempest-AttachInterfacesV270Test-2005290711-project-member] Created directory with path [datastore2] vmware_temp/45bac9a1-bae6-4ebb-a2f0-5a4c4ea77046/285931c9-8b83-4997-8c4d-6a79005e36ba {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1510.038357] env[67169]: DEBUG nova.virt.vmwareapi.vmops 
[None req-bc4f14a6-29e8-4fae-84bb-02f11896dc72 tempest-AttachInterfacesV270Test-2005290711 tempest-AttachInterfacesV270Test-2005290711-project-member] [instance: bab5d630-fec0-44e5-8088-12c8855aad66] Fetch image to [datastore2] vmware_temp/45bac9a1-bae6-4ebb-a2f0-5a4c4ea77046/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1510.039092] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-bc4f14a6-29e8-4fae-84bb-02f11896dc72 tempest-AttachInterfacesV270Test-2005290711 tempest-AttachInterfacesV270Test-2005290711-project-member] [instance: bab5d630-fec0-44e5-8088-12c8855aad66] Downloading image file data 285931c9-8b83-4997-8c4d-6a79005e36ba to [datastore2] vmware_temp/45bac9a1-bae6-4ebb-a2f0-5a4c4ea77046/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk on the data store datastore2 {{(pid=67169) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1510.039299] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f16c6780-f62c-47ed-a49a-3022f151f678 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1510.045742] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f750ebf1-620d-40ec-9160-400e111478a4 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1510.054714] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-af5dce16-0339-4308-a27e-5f7dc5527f46 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1510.088546] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-18a202f6-fdd1-406a-8998-7bfefc809a03 {{(pid=67169) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1510.095486] env[67169]: DEBUG oslo_vmware.api [None req-0e91d283-0c2c-4495-ad84-ebaaebe9122d tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Task: {'id': task-2819210, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.065838} completed successfully. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1510.096884] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-0e91d283-0c2c-4495-ad84-ebaaebe9122d tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Deleted the datastore file {{(pid=67169) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1510.097108] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-0e91d283-0c2c-4495-ad84-ebaaebe9122d tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 7bf839c0-3ec8-4329-823d-de1fae4833cb] Deleted contents of the VM from datastore datastore2 {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1510.097298] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-0e91d283-0c2c-4495-ad84-ebaaebe9122d tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 7bf839c0-3ec8-4329-823d-de1fae4833cb] Instance destroyed {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1510.097472] env[67169]: INFO nova.compute.manager [None req-0e91d283-0c2c-4495-ad84-ebaaebe9122d tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 7bf839c0-3ec8-4329-823d-de1fae4833cb] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 1510.099207] env[67169]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-9a00a1e5-4093-42cf-8a4f-4d3f69cec4d8 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1510.101082] env[67169]: DEBUG nova.compute.claims [None req-0e91d283-0c2c-4495-ad84-ebaaebe9122d tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 7bf839c0-3ec8-4329-823d-de1fae4833cb] Aborting claim: {{(pid=67169) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1510.101255] env[67169]: DEBUG oslo_concurrency.lockutils [None req-0e91d283-0c2c-4495-ad84-ebaaebe9122d tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1510.101471] env[67169]: DEBUG oslo_concurrency.lockutils [None req-0e91d283-0c2c-4495-ad84-ebaaebe9122d tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1510.124492] env[67169]: DEBUG nova.virt.vmwareapi.images [None req-bc4f14a6-29e8-4fae-84bb-02f11896dc72 tempest-AttachInterfacesV270Test-2005290711 tempest-AttachInterfacesV270Test-2005290711-project-member] [instance: bab5d630-fec0-44e5-8088-12c8855aad66] Downloading image file data 285931c9-8b83-4997-8c4d-6a79005e36ba to the data store datastore2 {{(pid=67169) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1510.180362] env[67169]: DEBUG oslo_vmware.rw_handles [None 
req-bc4f14a6-29e8-4fae-84bb-02f11896dc72 tempest-AttachInterfacesV270Test-2005290711 tempest-AttachInterfacesV270Test-2005290711-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/45bac9a1-bae6-4ebb-a2f0-5a4c4ea77046/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=67169) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1510.239800] env[67169]: DEBUG oslo_vmware.rw_handles [None req-bc4f14a6-29e8-4fae-84bb-02f11896dc72 tempest-AttachInterfacesV270Test-2005290711 tempest-AttachInterfacesV270Test-2005290711-project-member] Completed reading data from the image iterator. {{(pid=67169) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1510.240015] env[67169]: DEBUG oslo_vmware.rw_handles [None req-bc4f14a6-29e8-4fae-84bb-02f11896dc72 tempest-AttachInterfacesV270Test-2005290711 tempest-AttachInterfacesV270Test-2005290711-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/45bac9a1-bae6-4ebb-a2f0-5a4c4ea77046/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=67169) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1510.381462] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-59b985ca-412b-4005-8daf-2d5d68868606 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1510.389014] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6b8b4a11-a71e-4d19-918f-3b916f93322d {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1510.420677] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ee6f2dbb-ec55-4733-af40-de2e4f238a36 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1510.427451] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fc1e651e-239f-4a94-8983-cb2e42ad708f {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1510.441459] env[67169]: DEBUG nova.compute.provider_tree [None req-0e91d283-0c2c-4495-ad84-ebaaebe9122d tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Inventory has not changed in ProviderTree for provider: 6570906a-ac37-4859-b1f2-4bbacc48d3f3 {{(pid=67169) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1510.449996] env[67169]: DEBUG nova.scheduler.client.report [None req-0e91d283-0c2c-4495-ad84-ebaaebe9122d tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Inventory has not changed for provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 
4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 171, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67169) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1510.463303] env[67169]: DEBUG oslo_concurrency.lockutils [None req-0e91d283-0c2c-4495-ad84-ebaaebe9122d tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.362s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1510.463763] env[67169]: ERROR nova.compute.manager [None req-0e91d283-0c2c-4495-ad84-ebaaebe9122d tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 7bf839c0-3ec8-4329-823d-de1fae4833cb] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1510.463763] env[67169]: Faults: ['InvalidArgument'] [ 1510.463763] env[67169]: ERROR nova.compute.manager [instance: 7bf839c0-3ec8-4329-823d-de1fae4833cb] Traceback (most recent call last): [ 1510.463763] env[67169]: ERROR nova.compute.manager [instance: 7bf839c0-3ec8-4329-823d-de1fae4833cb] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1510.463763] env[67169]: ERROR nova.compute.manager [instance: 7bf839c0-3ec8-4329-823d-de1fae4833cb] self.driver.spawn(context, instance, image_meta, [ 1510.463763] env[67169]: ERROR nova.compute.manager [instance: 7bf839c0-3ec8-4329-823d-de1fae4833cb] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1510.463763] env[67169]: ERROR nova.compute.manager [instance: 7bf839c0-3ec8-4329-823d-de1fae4833cb] self._vmops.spawn(context, 
instance, image_meta, injected_files, [ 1510.463763] env[67169]: ERROR nova.compute.manager [instance: 7bf839c0-3ec8-4329-823d-de1fae4833cb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1510.463763] env[67169]: ERROR nova.compute.manager [instance: 7bf839c0-3ec8-4329-823d-de1fae4833cb] self._fetch_image_if_missing(context, vi) [ 1510.463763] env[67169]: ERROR nova.compute.manager [instance: 7bf839c0-3ec8-4329-823d-de1fae4833cb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1510.463763] env[67169]: ERROR nova.compute.manager [instance: 7bf839c0-3ec8-4329-823d-de1fae4833cb] image_cache(vi, tmp_image_ds_loc) [ 1510.463763] env[67169]: ERROR nova.compute.manager [instance: 7bf839c0-3ec8-4329-823d-de1fae4833cb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1510.463763] env[67169]: ERROR nova.compute.manager [instance: 7bf839c0-3ec8-4329-823d-de1fae4833cb] vm_util.copy_virtual_disk( [ 1510.463763] env[67169]: ERROR nova.compute.manager [instance: 7bf839c0-3ec8-4329-823d-de1fae4833cb] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1510.463763] env[67169]: ERROR nova.compute.manager [instance: 7bf839c0-3ec8-4329-823d-de1fae4833cb] session._wait_for_task(vmdk_copy_task) [ 1510.463763] env[67169]: ERROR nova.compute.manager [instance: 7bf839c0-3ec8-4329-823d-de1fae4833cb] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1510.463763] env[67169]: ERROR nova.compute.manager [instance: 7bf839c0-3ec8-4329-823d-de1fae4833cb] return self.wait_for_task(task_ref) [ 1510.463763] env[67169]: ERROR nova.compute.manager [instance: 7bf839c0-3ec8-4329-823d-de1fae4833cb] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1510.463763] env[67169]: ERROR nova.compute.manager [instance: 7bf839c0-3ec8-4329-823d-de1fae4833cb] return evt.wait() [ 
1510.463763] env[67169]: ERROR nova.compute.manager [instance: 7bf839c0-3ec8-4329-823d-de1fae4833cb] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1510.463763] env[67169]: ERROR nova.compute.manager [instance: 7bf839c0-3ec8-4329-823d-de1fae4833cb] result = hub.switch() [ 1510.463763] env[67169]: ERROR nova.compute.manager [instance: 7bf839c0-3ec8-4329-823d-de1fae4833cb] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1510.463763] env[67169]: ERROR nova.compute.manager [instance: 7bf839c0-3ec8-4329-823d-de1fae4833cb] return self.greenlet.switch() [ 1510.463763] env[67169]: ERROR nova.compute.manager [instance: 7bf839c0-3ec8-4329-823d-de1fae4833cb] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1510.463763] env[67169]: ERROR nova.compute.manager [instance: 7bf839c0-3ec8-4329-823d-de1fae4833cb] self.f(*self.args, **self.kw) [ 1510.463763] env[67169]: ERROR nova.compute.manager [instance: 7bf839c0-3ec8-4329-823d-de1fae4833cb] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1510.463763] env[67169]: ERROR nova.compute.manager [instance: 7bf839c0-3ec8-4329-823d-de1fae4833cb] raise exceptions.translate_fault(task_info.error) [ 1510.463763] env[67169]: ERROR nova.compute.manager [instance: 7bf839c0-3ec8-4329-823d-de1fae4833cb] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1510.463763] env[67169]: ERROR nova.compute.manager [instance: 7bf839c0-3ec8-4329-823d-de1fae4833cb] Faults: ['InvalidArgument'] [ 1510.463763] env[67169]: ERROR nova.compute.manager [instance: 7bf839c0-3ec8-4329-823d-de1fae4833cb] [ 1510.464614] env[67169]: DEBUG nova.compute.utils [None req-0e91d283-0c2c-4495-ad84-ebaaebe9122d tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 
7bf839c0-3ec8-4329-823d-de1fae4833cb] VimFaultException {{(pid=67169) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1510.465823] env[67169]: DEBUG nova.compute.manager [None req-0e91d283-0c2c-4495-ad84-ebaaebe9122d tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 7bf839c0-3ec8-4329-823d-de1fae4833cb] Build of instance 7bf839c0-3ec8-4329-823d-de1fae4833cb was re-scheduled: A specified parameter was not correct: fileType [ 1510.465823] env[67169]: Faults: ['InvalidArgument'] {{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1510.466261] env[67169]: DEBUG nova.compute.manager [None req-0e91d283-0c2c-4495-ad84-ebaaebe9122d tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 7bf839c0-3ec8-4329-823d-de1fae4833cb] Unplugging VIFs for instance {{(pid=67169) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1510.466438] env[67169]: DEBUG nova.compute.manager [None req-0e91d283-0c2c-4495-ad84-ebaaebe9122d tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67169) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1510.466607] env[67169]: DEBUG nova.compute.manager [None req-0e91d283-0c2c-4495-ad84-ebaaebe9122d tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 7bf839c0-3ec8-4329-823d-de1fae4833cb] Deallocating network for instance {{(pid=67169) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1510.466768] env[67169]: DEBUG nova.network.neutron [None req-0e91d283-0c2c-4495-ad84-ebaaebe9122d tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 7bf839c0-3ec8-4329-823d-de1fae4833cb] deallocate_for_instance() {{(pid=67169) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1510.970947] env[67169]: DEBUG nova.network.neutron [None req-0e91d283-0c2c-4495-ad84-ebaaebe9122d tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 7bf839c0-3ec8-4329-823d-de1fae4833cb] Updating instance_info_cache with network_info: [] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1510.984018] env[67169]: INFO nova.compute.manager [None req-0e91d283-0c2c-4495-ad84-ebaaebe9122d tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 7bf839c0-3ec8-4329-823d-de1fae4833cb] Took 0.52 seconds to deallocate network for instance. 
[ 1511.108181] env[67169]: INFO nova.scheduler.client.report [None req-0e91d283-0c2c-4495-ad84-ebaaebe9122d tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Deleted allocations for instance 7bf839c0-3ec8-4329-823d-de1fae4833cb [ 1511.127133] env[67169]: DEBUG oslo_concurrency.lockutils [None req-0e91d283-0c2c-4495-ad84-ebaaebe9122d tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Lock "7bf839c0-3ec8-4329-823d-de1fae4833cb" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 583.085s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1511.128283] env[67169]: DEBUG oslo_concurrency.lockutils [None req-0ae68f49-9f1f-49b1-83c6-d14aaf3be879 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Lock "7bf839c0-3ec8-4329-823d-de1fae4833cb" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 387.299s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1511.128583] env[67169]: DEBUG oslo_concurrency.lockutils [None req-0ae68f49-9f1f-49b1-83c6-d14aaf3be879 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Acquiring lock "7bf839c0-3ec8-4329-823d-de1fae4833cb-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1511.128790] env[67169]: DEBUG oslo_concurrency.lockutils [None req-0ae68f49-9f1f-49b1-83c6-d14aaf3be879 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Lock "7bf839c0-3ec8-4329-823d-de1fae4833cb-events" acquired by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1511.128969] env[67169]: DEBUG oslo_concurrency.lockutils [None req-0ae68f49-9f1f-49b1-83c6-d14aaf3be879 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Lock "7bf839c0-3ec8-4329-823d-de1fae4833cb-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1511.134744] env[67169]: INFO nova.compute.manager [None req-0ae68f49-9f1f-49b1-83c6-d14aaf3be879 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 7bf839c0-3ec8-4329-823d-de1fae4833cb] Terminating instance [ 1511.136655] env[67169]: DEBUG nova.compute.manager [None req-0ae68f49-9f1f-49b1-83c6-d14aaf3be879 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 7bf839c0-3ec8-4329-823d-de1fae4833cb] Start destroying the instance on the hypervisor. 
{{(pid=67169) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1511.136851] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-0ae68f49-9f1f-49b1-83c6-d14aaf3be879 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 7bf839c0-3ec8-4329-823d-de1fae4833cb] Destroying instance {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1511.137386] env[67169]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-6ac38196-c981-4c7c-bba5-345f8a8f6c4c {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1511.142953] env[67169]: DEBUG nova.compute.manager [None req-ae9f4c45-189f-4d85-9ca5-c9ae8ec39dac tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 2d7d3386-9854-4bf1-a680-5aed0a2329cb] Starting instance... {{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1511.149685] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-be323dc8-3a4a-4cfe-af26-e9c5a3620d29 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1511.179648] env[67169]: WARNING nova.virt.vmwareapi.vmops [None req-0ae68f49-9f1f-49b1-83c6-d14aaf3be879 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 7bf839c0-3ec8-4329-823d-de1fae4833cb] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 7bf839c0-3ec8-4329-823d-de1fae4833cb could not be found. 
[ 1511.179858] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-0ae68f49-9f1f-49b1-83c6-d14aaf3be879 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 7bf839c0-3ec8-4329-823d-de1fae4833cb] Instance destroyed {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1511.180045] env[67169]: INFO nova.compute.manager [None req-0ae68f49-9f1f-49b1-83c6-d14aaf3be879 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 7bf839c0-3ec8-4329-823d-de1fae4833cb] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1511.180294] env[67169]: DEBUG oslo.service.loopingcall [None req-0ae68f49-9f1f-49b1-83c6-d14aaf3be879 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=67169) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1511.182473] env[67169]: DEBUG nova.compute.manager [-] [instance: 7bf839c0-3ec8-4329-823d-de1fae4833cb] Deallocating network for instance {{(pid=67169) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1511.182576] env[67169]: DEBUG nova.network.neutron [-] [instance: 7bf839c0-3ec8-4329-823d-de1fae4833cb] deallocate_for_instance() {{(pid=67169) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1511.196481] env[67169]: DEBUG oslo_concurrency.lockutils [None req-ae9f4c45-189f-4d85-9ca5-c9ae8ec39dac tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1511.196647] env[67169]: DEBUG 
oslo_concurrency.lockutils [None req-ae9f4c45-189f-4d85-9ca5-c9ae8ec39dac tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1511.198091] env[67169]: INFO nova.compute.claims [None req-ae9f4c45-189f-4d85-9ca5-c9ae8ec39dac tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 2d7d3386-9854-4bf1-a680-5aed0a2329cb] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1511.223059] env[67169]: DEBUG nova.network.neutron [-] [instance: 7bf839c0-3ec8-4329-823d-de1fae4833cb] Updating instance_info_cache with network_info: [] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1511.255552] env[67169]: INFO nova.compute.manager [-] [instance: 7bf839c0-3ec8-4329-823d-de1fae4833cb] Took 0.07 seconds to deallocate network for instance. 
[ 1511.364829] env[67169]: DEBUG oslo_concurrency.lockutils [None req-0ae68f49-9f1f-49b1-83c6-d14aaf3be879 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Lock "7bf839c0-3ec8-4329-823d-de1fae4833cb" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.236s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1511.365861] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "7bf839c0-3ec8-4329-823d-de1fae4833cb" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 325.665s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1511.366055] env[67169]: INFO nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 7bf839c0-3ec8-4329-823d-de1fae4833cb] During sync_power_state the instance has a pending task (deleting). Skip. 
[ 1511.366261] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "7bf839c0-3ec8-4329-823d-de1fae4833cb" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1511.488015] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3bbf2528-eca4-46bb-8a89-985bb3ad7559 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1511.494360] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-17b73b48-3495-4f6a-90a4-789ada94bcf8 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1511.526701] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fc35970b-d5c6-4529-b3f5-3e51985b47e6 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1511.534556] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-932e6224-f75f-462e-80d1-34247dad12c5 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1511.547967] env[67169]: DEBUG nova.compute.provider_tree [None req-ae9f4c45-189f-4d85-9ca5-c9ae8ec39dac tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Inventory has not changed in ProviderTree for provider: 6570906a-ac37-4859-b1f2-4bbacc48d3f3 {{(pid=67169) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1511.560027] env[67169]: DEBUG nova.scheduler.client.report [None req-ae9f4c45-189f-4d85-9ca5-c9ae8ec39dac 
tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Inventory has not changed for provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 171, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67169) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1511.571987] env[67169]: DEBUG oslo_concurrency.lockutils [None req-ae9f4c45-189f-4d85-9ca5-c9ae8ec39dac tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.375s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1511.572539] env[67169]: DEBUG nova.compute.manager [None req-ae9f4c45-189f-4d85-9ca5-c9ae8ec39dac tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 2d7d3386-9854-4bf1-a680-5aed0a2329cb] Start building networks asynchronously for instance. 
{{(pid=67169) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1511.606597] env[67169]: DEBUG nova.compute.utils [None req-ae9f4c45-189f-4d85-9ca5-c9ae8ec39dac tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Using /dev/sd instead of None {{(pid=67169) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1511.608547] env[67169]: DEBUG nova.compute.manager [None req-ae9f4c45-189f-4d85-9ca5-c9ae8ec39dac tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 2d7d3386-9854-4bf1-a680-5aed0a2329cb] Allocating IP information in the background. {{(pid=67169) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1511.608547] env[67169]: DEBUG nova.network.neutron [None req-ae9f4c45-189f-4d85-9ca5-c9ae8ec39dac tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 2d7d3386-9854-4bf1-a680-5aed0a2329cb] allocate_for_instance() {{(pid=67169) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1511.617189] env[67169]: DEBUG nova.compute.manager [None req-ae9f4c45-189f-4d85-9ca5-c9ae8ec39dac tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 2d7d3386-9854-4bf1-a680-5aed0a2329cb] Start building block device mappings for instance. 
{{(pid=67169) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1511.679939] env[67169]: DEBUG nova.policy [None req-ae9f4c45-189f-4d85-9ca5-c9ae8ec39dac tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'dc8f12a2682c4b79aabc2f87ed8678e6', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'a5d2ec974f664a3a9407f7f3e86b4982', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67169) authorize /opt/stack/nova/nova/policy.py:203}} [ 1511.683058] env[67169]: DEBUG nova.compute.manager [None req-ae9f4c45-189f-4d85-9ca5-c9ae8ec39dac tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 2d7d3386-9854-4bf1-a680-5aed0a2329cb] Start spawning the instance on the hypervisor. 
{{(pid=67169) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1511.709161] env[67169]: DEBUG nova.virt.hardware [None req-ae9f4c45-189f-4d85-9ca5-c9ae8ec39dac tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-03-06T19:55:10Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-03-06T19:54:55Z,direct_url=,disk_format='vmdk',id=285931c9-8b83-4997-8c4d-6a79005e36ba,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c31f6504bb73492890b262ff43fdf9bc',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-03-06T19:54:55Z,virtual_size=,visibility=), allow threads: False {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1511.709413] env[67169]: DEBUG nova.virt.hardware [None req-ae9f4c45-189f-4d85-9ca5-c9ae8ec39dac tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Flavor limits 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1511.709576] env[67169]: DEBUG nova.virt.hardware [None req-ae9f4c45-189f-4d85-9ca5-c9ae8ec39dac tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Image limits 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1511.709756] env[67169]: DEBUG nova.virt.hardware [None req-ae9f4c45-189f-4d85-9ca5-c9ae8ec39dac tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Flavor 
pref 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1511.709902] env[67169]: DEBUG nova.virt.hardware [None req-ae9f4c45-189f-4d85-9ca5-c9ae8ec39dac tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Image pref 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1511.710070] env[67169]: DEBUG nova.virt.hardware [None req-ae9f4c45-189f-4d85-9ca5-c9ae8ec39dac tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1511.710288] env[67169]: DEBUG nova.virt.hardware [None req-ae9f4c45-189f-4d85-9ca5-c9ae8ec39dac tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1511.710448] env[67169]: DEBUG nova.virt.hardware [None req-ae9f4c45-189f-4d85-9ca5-c9ae8ec39dac tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67169) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1511.710614] env[67169]: DEBUG nova.virt.hardware [None req-ae9f4c45-189f-4d85-9ca5-c9ae8ec39dac tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Got 1 possible topologies {{(pid=67169) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1511.710781] env[67169]: DEBUG nova.virt.hardware [None req-ae9f4c45-189f-4d85-9ca5-c9ae8ec39dac tempest-ServerDiskConfigTestJSON-907081631 
tempest-ServerDiskConfigTestJSON-907081631-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1511.710949] env[67169]: DEBUG nova.virt.hardware [None req-ae9f4c45-189f-4d85-9ca5-c9ae8ec39dac tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1511.712105] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-80607f02-1021-41cc-b851-943a886829b7 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1511.721690] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d120be79-0572-496d-af32-45b0c3f3d329 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1512.240219] env[67169]: DEBUG nova.network.neutron [None req-ae9f4c45-189f-4d85-9ca5-c9ae8ec39dac tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 2d7d3386-9854-4bf1-a680-5aed0a2329cb] Successfully created port: 241eda85-73da-4da8-b825-10f9227a1575 {{(pid=67169) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1513.075896] env[67169]: DEBUG nova.compute.manager [req-4d7534d8-5a22-40c8-88ee-e7a4eb28a309 req-9d8ee1a8-78c3-4122-a712-454c1ff228a6 service nova] [instance: 2d7d3386-9854-4bf1-a680-5aed0a2329cb] Received event network-vif-plugged-241eda85-73da-4da8-b825-10f9227a1575 {{(pid=67169) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1513.076171] env[67169]: DEBUG oslo_concurrency.lockutils [req-4d7534d8-5a22-40c8-88ee-e7a4eb28a309 
req-9d8ee1a8-78c3-4122-a712-454c1ff228a6 service nova] Acquiring lock "2d7d3386-9854-4bf1-a680-5aed0a2329cb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1513.076389] env[67169]: DEBUG oslo_concurrency.lockutils [req-4d7534d8-5a22-40c8-88ee-e7a4eb28a309 req-9d8ee1a8-78c3-4122-a712-454c1ff228a6 service nova] Lock "2d7d3386-9854-4bf1-a680-5aed0a2329cb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1513.076561] env[67169]: DEBUG oslo_concurrency.lockutils [req-4d7534d8-5a22-40c8-88ee-e7a4eb28a309 req-9d8ee1a8-78c3-4122-a712-454c1ff228a6 service nova] Lock "2d7d3386-9854-4bf1-a680-5aed0a2329cb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1513.076725] env[67169]: DEBUG nova.compute.manager [req-4d7534d8-5a22-40c8-88ee-e7a4eb28a309 req-9d8ee1a8-78c3-4122-a712-454c1ff228a6 service nova] [instance: 2d7d3386-9854-4bf1-a680-5aed0a2329cb] No waiting events found dispatching network-vif-plugged-241eda85-73da-4da8-b825-10f9227a1575 {{(pid=67169) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1513.076882] env[67169]: WARNING nova.compute.manager [req-4d7534d8-5a22-40c8-88ee-e7a4eb28a309 req-9d8ee1a8-78c3-4122-a712-454c1ff228a6 service nova] [instance: 2d7d3386-9854-4bf1-a680-5aed0a2329cb] Received unexpected event network-vif-plugged-241eda85-73da-4da8-b825-10f9227a1575 for instance with vm_state building and task_state spawning. 
[ 1513.143457] env[67169]: DEBUG nova.network.neutron [None req-ae9f4c45-189f-4d85-9ca5-c9ae8ec39dac tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 2d7d3386-9854-4bf1-a680-5aed0a2329cb] Successfully updated port: 241eda85-73da-4da8-b825-10f9227a1575 {{(pid=67169) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1513.156611] env[67169]: DEBUG oslo_concurrency.lockutils [None req-ae9f4c45-189f-4d85-9ca5-c9ae8ec39dac tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Acquiring lock "refresh_cache-2d7d3386-9854-4bf1-a680-5aed0a2329cb" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1513.156773] env[67169]: DEBUG oslo_concurrency.lockutils [None req-ae9f4c45-189f-4d85-9ca5-c9ae8ec39dac tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Acquired lock "refresh_cache-2d7d3386-9854-4bf1-a680-5aed0a2329cb" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1513.156922] env[67169]: DEBUG nova.network.neutron [None req-ae9f4c45-189f-4d85-9ca5-c9ae8ec39dac tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 2d7d3386-9854-4bf1-a680-5aed0a2329cb] Building network info cache for instance {{(pid=67169) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1513.196722] env[67169]: DEBUG nova.network.neutron [None req-ae9f4c45-189f-4d85-9ca5-c9ae8ec39dac tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 2d7d3386-9854-4bf1-a680-5aed0a2329cb] Instance cache missing network info. 
{{(pid=67169) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1513.420619] env[67169]: DEBUG nova.network.neutron [None req-ae9f4c45-189f-4d85-9ca5-c9ae8ec39dac tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 2d7d3386-9854-4bf1-a680-5aed0a2329cb] Updating instance_info_cache with network_info: [{"id": "241eda85-73da-4da8-b825-10f9227a1575", "address": "fa:16:3e:a6:ea:b6", "network": {"id": "e1c693aa-d783-44b4-bbb3-c6efc6ccfa95", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1841152718-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "a5d2ec974f664a3a9407f7f3e86b4982", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "56398cc0-e39f-410f-8036-8c2a6870e26f", "external-id": "nsx-vlan-transportzone-612", "segmentation_id": 612, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap241eda85-73", "ovs_interfaceid": "241eda85-73da-4da8-b825-10f9227a1575", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1513.431601] env[67169]: DEBUG oslo_concurrency.lockutils [None req-ae9f4c45-189f-4d85-9ca5-c9ae8ec39dac tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Releasing lock "refresh_cache-2d7d3386-9854-4bf1-a680-5aed0a2329cb" {{(pid=67169) lock 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1513.431886] env[67169]: DEBUG nova.compute.manager [None req-ae9f4c45-189f-4d85-9ca5-c9ae8ec39dac tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 2d7d3386-9854-4bf1-a680-5aed0a2329cb] Instance network_info: |[{"id": "241eda85-73da-4da8-b825-10f9227a1575", "address": "fa:16:3e:a6:ea:b6", "network": {"id": "e1c693aa-d783-44b4-bbb3-c6efc6ccfa95", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1841152718-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "a5d2ec974f664a3a9407f7f3e86b4982", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "56398cc0-e39f-410f-8036-8c2a6870e26f", "external-id": "nsx-vlan-transportzone-612", "segmentation_id": 612, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap241eda85-73", "ovs_interfaceid": "241eda85-73da-4da8-b825-10f9227a1575", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67169) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1513.432272] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-ae9f4c45-189f-4d85-9ca5-c9ae8ec39dac tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 2d7d3386-9854-4bf1-a680-5aed0a2329cb] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:a6:ea:b6', 'network_ref': {'type': 
'OpaqueNetwork', 'network-id': '56398cc0-e39f-410f-8036-8c2a6870e26f', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '241eda85-73da-4da8-b825-10f9227a1575', 'vif_model': 'vmxnet3'}] {{(pid=67169) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1513.439914] env[67169]: DEBUG oslo.service.loopingcall [None req-ae9f4c45-189f-4d85-9ca5-c9ae8ec39dac tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67169) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1513.440339] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 2d7d3386-9854-4bf1-a680-5aed0a2329cb] Creating VM on the ESX host {{(pid=67169) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1513.440568] env[67169]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-b8e96c30-9c4a-4d88-a1d2-60bb6492a25b {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1513.461034] env[67169]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1513.461034] env[67169]: value = "task-2819211" [ 1513.461034] env[67169]: _type = "Task" [ 1513.461034] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1513.468886] env[67169]: DEBUG oslo_vmware.api [-] Task: {'id': task-2819211, 'name': CreateVM_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1513.970560] env[67169]: DEBUG oslo_vmware.api [-] Task: {'id': task-2819211, 'name': CreateVM_Task, 'duration_secs': 0.314114} completed successfully. 
{{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1513.970732] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 2d7d3386-9854-4bf1-a680-5aed0a2329cb] Created VM on the ESX host {{(pid=67169) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1513.971421] env[67169]: DEBUG oslo_concurrency.lockutils [None req-ae9f4c45-189f-4d85-9ca5-c9ae8ec39dac tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1513.971584] env[67169]: DEBUG oslo_concurrency.lockutils [None req-ae9f4c45-189f-4d85-9ca5-c9ae8ec39dac tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Acquired lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1513.971904] env[67169]: DEBUG oslo_concurrency.lockutils [None req-ae9f4c45-189f-4d85-9ca5-c9ae8ec39dac tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1513.972189] env[67169]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-f2e67abc-5320-4e8f-b37e-0dc26b1a74c6 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1513.976481] env[67169]: DEBUG oslo_vmware.api [None req-ae9f4c45-189f-4d85-9ca5-c9ae8ec39dac tempest-ServerDiskConfigTestJSON-907081631 
tempest-ServerDiskConfigTestJSON-907081631-project-member] Waiting for the task: (returnval){ [ 1513.976481] env[67169]: value = "session[52518c09-e5f2-035d-4dc1-3e29ddf94015]52ce59a8-d9fa-0805-8c6a-38d980a21acf" [ 1513.976481] env[67169]: _type = "Task" [ 1513.976481] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1513.983650] env[67169]: DEBUG oslo_vmware.api [None req-ae9f4c45-189f-4d85-9ca5-c9ae8ec39dac tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Task: {'id': session[52518c09-e5f2-035d-4dc1-3e29ddf94015]52ce59a8-d9fa-0805-8c6a-38d980a21acf, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1514.486835] env[67169]: DEBUG oslo_concurrency.lockutils [None req-ae9f4c45-189f-4d85-9ca5-c9ae8ec39dac tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Releasing lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1514.487175] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-ae9f4c45-189f-4d85-9ca5-c9ae8ec39dac tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 2d7d3386-9854-4bf1-a680-5aed0a2329cb] Processing image 285931c9-8b83-4997-8c4d-6a79005e36ba {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1514.487343] env[67169]: DEBUG oslo_concurrency.lockutils [None req-ae9f4c45-189f-4d85-9ca5-c9ae8ec39dac tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk" 
{{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1515.125886] env[67169]: DEBUG nova.compute.manager [req-e2da5785-fe6c-4c77-96bb-482b5f40e626 req-4bc8b720-654e-4699-8a3f-9c5db44700a1 service nova] [instance: 2d7d3386-9854-4bf1-a680-5aed0a2329cb] Received event network-changed-241eda85-73da-4da8-b825-10f9227a1575 {{(pid=67169) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1515.126168] env[67169]: DEBUG nova.compute.manager [req-e2da5785-fe6c-4c77-96bb-482b5f40e626 req-4bc8b720-654e-4699-8a3f-9c5db44700a1 service nova] [instance: 2d7d3386-9854-4bf1-a680-5aed0a2329cb] Refreshing instance network info cache due to event network-changed-241eda85-73da-4da8-b825-10f9227a1575. {{(pid=67169) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1515.126410] env[67169]: DEBUG oslo_concurrency.lockutils [req-e2da5785-fe6c-4c77-96bb-482b5f40e626 req-4bc8b720-654e-4699-8a3f-9c5db44700a1 service nova] Acquiring lock "refresh_cache-2d7d3386-9854-4bf1-a680-5aed0a2329cb" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1515.126563] env[67169]: DEBUG oslo_concurrency.lockutils [req-e2da5785-fe6c-4c77-96bb-482b5f40e626 req-4bc8b720-654e-4699-8a3f-9c5db44700a1 service nova] Acquired lock "refresh_cache-2d7d3386-9854-4bf1-a680-5aed0a2329cb" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1515.126747] env[67169]: DEBUG nova.network.neutron [req-e2da5785-fe6c-4c77-96bb-482b5f40e626 req-4bc8b720-654e-4699-8a3f-9c5db44700a1 service nova] [instance: 2d7d3386-9854-4bf1-a680-5aed0a2329cb] Refreshing network info cache for port 241eda85-73da-4da8-b825-10f9227a1575 {{(pid=67169) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1515.594451] env[67169]: DEBUG nova.network.neutron [req-e2da5785-fe6c-4c77-96bb-482b5f40e626 
req-4bc8b720-654e-4699-8a3f-9c5db44700a1 service nova] [instance: 2d7d3386-9854-4bf1-a680-5aed0a2329cb] Updated VIF entry in instance network info cache for port 241eda85-73da-4da8-b825-10f9227a1575. {{(pid=67169) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1515.594818] env[67169]: DEBUG nova.network.neutron [req-e2da5785-fe6c-4c77-96bb-482b5f40e626 req-4bc8b720-654e-4699-8a3f-9c5db44700a1 service nova] [instance: 2d7d3386-9854-4bf1-a680-5aed0a2329cb] Updating instance_info_cache with network_info: [{"id": "241eda85-73da-4da8-b825-10f9227a1575", "address": "fa:16:3e:a6:ea:b6", "network": {"id": "e1c693aa-d783-44b4-bbb3-c6efc6ccfa95", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1841152718-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "a5d2ec974f664a3a9407f7f3e86b4982", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "56398cc0-e39f-410f-8036-8c2a6870e26f", "external-id": "nsx-vlan-transportzone-612", "segmentation_id": 612, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap241eda85-73", "ovs_interfaceid": "241eda85-73da-4da8-b825-10f9227a1575", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1515.604427] env[67169]: DEBUG oslo_concurrency.lockutils [req-e2da5785-fe6c-4c77-96bb-482b5f40e626 req-4bc8b720-654e-4699-8a3f-9c5db44700a1 service nova] Releasing lock 
"refresh_cache-2d7d3386-9854-4bf1-a680-5aed0a2329cb" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1517.431241] env[67169]: DEBUG oslo_concurrency.lockutils [None req-a69770ea-adae-497b-8b48-a68ce9122239 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Acquiring lock "2d7d3386-9854-4bf1-a680-5aed0a2329cb" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1526.564457] env[67169]: DEBUG oslo_concurrency.lockutils [None req-4a94fa74-f862-4bac-971a-1a822064e0a7 tempest-ServersNegativeTestMultiTenantJSON-34809039 tempest-ServersNegativeTestMultiTenantJSON-34809039-project-member] Acquiring lock "c05c3ec2-a68d-41b0-a199-fcfc84bb2deb" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1526.564457] env[67169]: DEBUG oslo_concurrency.lockutils [None req-4a94fa74-f862-4bac-971a-1a822064e0a7 tempest-ServersNegativeTestMultiTenantJSON-34809039 tempest-ServersNegativeTestMultiTenantJSON-34809039-project-member] Lock "c05c3ec2-a68d-41b0-a199-fcfc84bb2deb" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1552.659784] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1557.659672] env[67169]: DEBUG oslo_service.periodic_task [None 
req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1557.659989] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Starting heal instance info cache {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1557.659989] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Rebuilding the list of instances to heal {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1557.681305] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: bab5d630-fec0-44e5-8088-12c8855aad66] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1557.681481] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: a86fa702-2040-4e22-9eaa-5d64bc16f036] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1557.681632] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1557.681764] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 7817b417-599c-4619-8bd3-28d2e8236b9f] Skipping network cache update for instance because it is Building. 
{{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1557.681886] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 883a792f-ae72-4475-8592-3076c2c2c2ae] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1557.682012] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 48376572-9e3a-4579-b2d7-b8b63312fab1] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1557.682135] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 74ea66f0-391c-437b-8aee-f784528d7963] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1557.682252] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1557.682370] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 2e156908-c313-4229-840d-13ed8e6d4074] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1557.682487] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 2d7d3386-9854-4bf1-a680-5aed0a2329cb] Skipping network cache update for instance because it is Building. 
{{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1557.682605] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Didn't find any instances for network info cache update. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1557.683131] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1558.659453] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1558.659981] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1558.660271] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1558.660309] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=67169) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1559.374539] env[67169]: WARNING oslo_vmware.rw_handles [None req-bc4f14a6-29e8-4fae-84bb-02f11896dc72 tempest-AttachInterfacesV270Test-2005290711 tempest-AttachInterfacesV270Test-2005290711-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1559.374539] env[67169]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1559.374539] env[67169]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1559.374539] env[67169]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1559.374539] env[67169]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1559.374539] env[67169]: ERROR oslo_vmware.rw_handles response.begin() [ 1559.374539] env[67169]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1559.374539] env[67169]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1559.374539] env[67169]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1559.374539] env[67169]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1559.374539] env[67169]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1559.374539] env[67169]: ERROR oslo_vmware.rw_handles [ 1559.375013] env[67169]: DEBUG nova.virt.vmwareapi.images [None req-bc4f14a6-29e8-4fae-84bb-02f11896dc72 tempest-AttachInterfacesV270Test-2005290711 tempest-AttachInterfacesV270Test-2005290711-project-member] [instance: bab5d630-fec0-44e5-8088-12c8855aad66] Downloaded image file data 285931c9-8b83-4997-8c4d-6a79005e36ba to 
vmware_temp/45bac9a1-bae6-4ebb-a2f0-5a4c4ea77046/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk on the data store datastore2 {{(pid=67169) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1559.377098] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-bc4f14a6-29e8-4fae-84bb-02f11896dc72 tempest-AttachInterfacesV270Test-2005290711 tempest-AttachInterfacesV270Test-2005290711-project-member] [instance: bab5d630-fec0-44e5-8088-12c8855aad66] Caching image {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1559.377336] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [None req-bc4f14a6-29e8-4fae-84bb-02f11896dc72 tempest-AttachInterfacesV270Test-2005290711 tempest-AttachInterfacesV270Test-2005290711-project-member] Copying Virtual Disk [datastore2] vmware_temp/45bac9a1-bae6-4ebb-a2f0-5a4c4ea77046/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk to [datastore2] vmware_temp/45bac9a1-bae6-4ebb-a2f0-5a4c4ea77046/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk {{(pid=67169) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1559.377631] env[67169]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-d4035b1e-419b-47e6-b0ea-22a9a9c83637 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1559.386617] env[67169]: DEBUG oslo_vmware.api [None req-bc4f14a6-29e8-4fae-84bb-02f11896dc72 tempest-AttachInterfacesV270Test-2005290711 tempest-AttachInterfacesV270Test-2005290711-project-member] Waiting for the task: (returnval){ [ 1559.386617] env[67169]: value = "task-2819212" [ 1559.386617] env[67169]: _type = "Task" [ 1559.386617] env[67169]: } to complete. 
{{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1559.395088] env[67169]: DEBUG oslo_vmware.api [None req-bc4f14a6-29e8-4fae-84bb-02f11896dc72 tempest-AttachInterfacesV270Test-2005290711 tempest-AttachInterfacesV270Test-2005290711-project-member] Task: {'id': task-2819212, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1559.659098] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1559.671334] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1559.671674] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1559.671726] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1559.671899] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Auditing locally available compute resources for cpu-1 (node: 
domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67169) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1559.672952] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8ab44c87-94a2-41f7-beca-ea197aae4263 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1559.681863] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-014f3c61-b8dc-4e6a-92f2-3bf9f20386dd {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1559.695746] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5e4eef60-fbab-4075-9693-cb3ac5ec8443 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1559.701986] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2836d24e-2715-465d-8ee7-fe3dfe42d67f {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1559.733066] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180985MB free_disk=171GB free_vcpus=48 pci_devices=None {{(pid=67169) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1559.733241] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1559.733444] env[67169]: DEBUG 
oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1559.805208] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance bab5d630-fec0-44e5-8088-12c8855aad66 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1559.805497] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance a86fa702-2040-4e22-9eaa-5d64bc16f036 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1559.805723] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1559.805944] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 7817b417-599c-4619-8bd3-28d2e8236b9f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1559.806413] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 883a792f-ae72-4475-8592-3076c2c2c2ae actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1559.806413] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 48376572-9e3a-4579-b2d7-b8b63312fab1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1559.806586] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 74ea66f0-391c-437b-8aee-f784528d7963 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1559.806795] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 7b7c8f84-c2d4-442e-93d3-60124767d096 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1559.807011] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 2e156908-c313-4229-840d-13ed8e6d4074 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1559.807216] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 2d7d3386-9854-4bf1-a680-5aed0a2329cb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1559.819798] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance fa24a4a8-895c-4ea6-8e0a-4ed1134beff0 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1559.831405] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 5c675d7d-8915-4962-8bbd-c9b639ae2cb1 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1559.841860] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 04d3ae51-f3f1-427b-ae45-279b02e4b3e6 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1559.852037] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 4e978c21-ae48-422e-9126-a4144c86b86f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1559.861799] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance aedbfde6-26e1-410d-a311-e2c344f65062 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1559.871128] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 3d636f4c-c042-428f-be5d-1fbf20c61f0a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1559.880222] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance c05c3ec2-a68d-41b0-a199-fcfc84bb2deb has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1559.880445] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67169) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1559.880592] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67169) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1559.896965] env[67169]: DEBUG oslo_vmware.exceptions [None req-bc4f14a6-29e8-4fae-84bb-02f11896dc72 tempest-AttachInterfacesV270Test-2005290711 tempest-AttachInterfacesV270Test-2005290711-project-member] Fault InvalidArgument not matched. 
{{(pid=67169) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1559.897248] env[67169]: DEBUG oslo_concurrency.lockutils [None req-bc4f14a6-29e8-4fae-84bb-02f11896dc72 tempest-AttachInterfacesV270Test-2005290711 tempest-AttachInterfacesV270Test-2005290711-project-member] Releasing lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1559.897804] env[67169]: ERROR nova.compute.manager [None req-bc4f14a6-29e8-4fae-84bb-02f11896dc72 tempest-AttachInterfacesV270Test-2005290711 tempest-AttachInterfacesV270Test-2005290711-project-member] [instance: bab5d630-fec0-44e5-8088-12c8855aad66] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1559.897804] env[67169]: Faults: ['InvalidArgument'] [ 1559.897804] env[67169]: ERROR nova.compute.manager [instance: bab5d630-fec0-44e5-8088-12c8855aad66] Traceback (most recent call last): [ 1559.897804] env[67169]: ERROR nova.compute.manager [instance: bab5d630-fec0-44e5-8088-12c8855aad66] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1559.897804] env[67169]: ERROR nova.compute.manager [instance: bab5d630-fec0-44e5-8088-12c8855aad66] yield resources [ 1559.897804] env[67169]: ERROR nova.compute.manager [instance: bab5d630-fec0-44e5-8088-12c8855aad66] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1559.897804] env[67169]: ERROR nova.compute.manager [instance: bab5d630-fec0-44e5-8088-12c8855aad66] self.driver.spawn(context, instance, image_meta, [ 1559.897804] env[67169]: ERROR nova.compute.manager [instance: bab5d630-fec0-44e5-8088-12c8855aad66] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1559.897804] env[67169]: ERROR nova.compute.manager 
[instance: bab5d630-fec0-44e5-8088-12c8855aad66] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1559.897804] env[67169]: ERROR nova.compute.manager [instance: bab5d630-fec0-44e5-8088-12c8855aad66] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1559.897804] env[67169]: ERROR nova.compute.manager [instance: bab5d630-fec0-44e5-8088-12c8855aad66] self._fetch_image_if_missing(context, vi) [ 1559.897804] env[67169]: ERROR nova.compute.manager [instance: bab5d630-fec0-44e5-8088-12c8855aad66] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1559.897804] env[67169]: ERROR nova.compute.manager [instance: bab5d630-fec0-44e5-8088-12c8855aad66] image_cache(vi, tmp_image_ds_loc) [ 1559.897804] env[67169]: ERROR nova.compute.manager [instance: bab5d630-fec0-44e5-8088-12c8855aad66] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1559.897804] env[67169]: ERROR nova.compute.manager [instance: bab5d630-fec0-44e5-8088-12c8855aad66] vm_util.copy_virtual_disk( [ 1559.897804] env[67169]: ERROR nova.compute.manager [instance: bab5d630-fec0-44e5-8088-12c8855aad66] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1559.897804] env[67169]: ERROR nova.compute.manager [instance: bab5d630-fec0-44e5-8088-12c8855aad66] session._wait_for_task(vmdk_copy_task) [ 1559.897804] env[67169]: ERROR nova.compute.manager [instance: bab5d630-fec0-44e5-8088-12c8855aad66] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1559.897804] env[67169]: ERROR nova.compute.manager [instance: bab5d630-fec0-44e5-8088-12c8855aad66] return self.wait_for_task(task_ref) [ 1559.897804] env[67169]: ERROR nova.compute.manager [instance: bab5d630-fec0-44e5-8088-12c8855aad66] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1559.897804] env[67169]: ERROR 
nova.compute.manager [instance: bab5d630-fec0-44e5-8088-12c8855aad66] return evt.wait() [ 1559.897804] env[67169]: ERROR nova.compute.manager [instance: bab5d630-fec0-44e5-8088-12c8855aad66] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1559.897804] env[67169]: ERROR nova.compute.manager [instance: bab5d630-fec0-44e5-8088-12c8855aad66] result = hub.switch() [ 1559.897804] env[67169]: ERROR nova.compute.manager [instance: bab5d630-fec0-44e5-8088-12c8855aad66] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1559.897804] env[67169]: ERROR nova.compute.manager [instance: bab5d630-fec0-44e5-8088-12c8855aad66] return self.greenlet.switch() [ 1559.897804] env[67169]: ERROR nova.compute.manager [instance: bab5d630-fec0-44e5-8088-12c8855aad66] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1559.897804] env[67169]: ERROR nova.compute.manager [instance: bab5d630-fec0-44e5-8088-12c8855aad66] self.f(*self.args, **self.kw) [ 1559.897804] env[67169]: ERROR nova.compute.manager [instance: bab5d630-fec0-44e5-8088-12c8855aad66] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1559.897804] env[67169]: ERROR nova.compute.manager [instance: bab5d630-fec0-44e5-8088-12c8855aad66] raise exceptions.translate_fault(task_info.error) [ 1559.897804] env[67169]: ERROR nova.compute.manager [instance: bab5d630-fec0-44e5-8088-12c8855aad66] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1559.897804] env[67169]: ERROR nova.compute.manager [instance: bab5d630-fec0-44e5-8088-12c8855aad66] Faults: ['InvalidArgument'] [ 1559.897804] env[67169]: ERROR nova.compute.manager [instance: bab5d630-fec0-44e5-8088-12c8855aad66] [ 1559.898830] env[67169]: INFO nova.compute.manager [None req-bc4f14a6-29e8-4fae-84bb-02f11896dc72 
tempest-AttachInterfacesV270Test-2005290711 tempest-AttachInterfacesV270Test-2005290711-project-member] [instance: bab5d630-fec0-44e5-8088-12c8855aad66] Terminating instance [ 1559.899450] env[67169]: DEBUG oslo_concurrency.lockutils [None req-e3b2eed0-520e-4d7c-949f-bbfa16b0198a tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Acquired lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1559.899656] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-e3b2eed0-520e-4d7c-949f-bbfa16b0198a tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1559.900093] env[67169]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-0476291c-2236-4133-b7ba-d12436334c2c {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1559.904552] env[67169]: DEBUG nova.compute.manager [None req-bc4f14a6-29e8-4fae-84bb-02f11896dc72 tempest-AttachInterfacesV270Test-2005290711 tempest-AttachInterfacesV270Test-2005290711-project-member] [instance: bab5d630-fec0-44e5-8088-12c8855aad66] Start destroying the instance on the hypervisor. 
{{(pid=67169) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1559.904742] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-bc4f14a6-29e8-4fae-84bb-02f11896dc72 tempest-AttachInterfacesV270Test-2005290711 tempest-AttachInterfacesV270Test-2005290711-project-member] [instance: bab5d630-fec0-44e5-8088-12c8855aad66] Destroying instance {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1559.905471] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8146052c-5593-4ebf-88bc-6114848b293d {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1559.912070] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-bc4f14a6-29e8-4fae-84bb-02f11896dc72 tempest-AttachInterfacesV270Test-2005290711 tempest-AttachInterfacesV270Test-2005290711-project-member] [instance: bab5d630-fec0-44e5-8088-12c8855aad66] Unregistering the VM {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1559.912292] env[67169]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-0b20b95d-4b88-4a25-b2d1-79dfedc6b2c4 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1559.914497] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-e3b2eed0-520e-4d7c-949f-bbfa16b0198a tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1559.914701] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-e3b2eed0-520e-4d7c-949f-bbfa16b0198a tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=67169) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1559.915644] env[67169]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-bee79a09-b0ac-403d-9cc4-0bcf64869e08 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1559.922985] env[67169]: DEBUG oslo_vmware.api [None req-e3b2eed0-520e-4d7c-949f-bbfa16b0198a tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Waiting for the task: (returnval){ [ 1559.922985] env[67169]: value = "session[52518c09-e5f2-035d-4dc1-3e29ddf94015]52732482-b59f-4315-017c-115269535977" [ 1559.922985] env[67169]: _type = "Task" [ 1559.922985] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1559.937262] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-e3b2eed0-520e-4d7c-949f-bbfa16b0198a tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: a86fa702-2040-4e22-9eaa-5d64bc16f036] Preparing fetch location {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1559.937501] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-e3b2eed0-520e-4d7c-949f-bbfa16b0198a tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Creating directory with path [datastore2] vmware_temp/75453b4a-714a-48c3-9686-831039ab349c/285931c9-8b83-4997-8c4d-6a79005e36ba {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1559.937720] env[67169]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-f286cfc2-477e-41d1-8a9b-406b6a89fb57 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1559.957923] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None 
req-e3b2eed0-520e-4d7c-949f-bbfa16b0198a tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Created directory with path [datastore2] vmware_temp/75453b4a-714a-48c3-9686-831039ab349c/285931c9-8b83-4997-8c4d-6a79005e36ba {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1559.958661] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-e3b2eed0-520e-4d7c-949f-bbfa16b0198a tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: a86fa702-2040-4e22-9eaa-5d64bc16f036] Fetch image to [datastore2] vmware_temp/75453b4a-714a-48c3-9686-831039ab349c/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1559.958661] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-e3b2eed0-520e-4d7c-949f-bbfa16b0198a tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: a86fa702-2040-4e22-9eaa-5d64bc16f036] Downloading image file data 285931c9-8b83-4997-8c4d-6a79005e36ba to [datastore2] vmware_temp/75453b4a-714a-48c3-9686-831039ab349c/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk on the data store datastore2 {{(pid=67169) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1559.959061] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a7b9c448-8c4e-43df-a235-96e55c672968 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1559.967508] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-965ef6b1-b218-416a-b77b-4def10e109d8 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1559.978891] env[67169]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9d74c3b4-9887-4e81-a01a-6ebc9e3855e2 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1559.986043] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-bc4f14a6-29e8-4fae-84bb-02f11896dc72 tempest-AttachInterfacesV270Test-2005290711 tempest-AttachInterfacesV270Test-2005290711-project-member] [instance: bab5d630-fec0-44e5-8088-12c8855aad66] Unregistered the VM {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1559.986043] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-bc4f14a6-29e8-4fae-84bb-02f11896dc72 tempest-AttachInterfacesV270Test-2005290711 tempest-AttachInterfacesV270Test-2005290711-project-member] [instance: bab5d630-fec0-44e5-8088-12c8855aad66] Deleting contents of the VM from datastore datastore2 {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1559.986043] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-bc4f14a6-29e8-4fae-84bb-02f11896dc72 tempest-AttachInterfacesV270Test-2005290711 tempest-AttachInterfacesV270Test-2005290711-project-member] Deleting the datastore file [datastore2] bab5d630-fec0-44e5-8088-12c8855aad66 {{(pid=67169) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1559.986043] env[67169]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-618c488e-1ef5-49b9-b1c6-495981b69b80 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1560.015526] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f3a777dc-1251-43b6-99e0-3a8e7ed60259 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1560.019460] env[67169]: DEBUG oslo_vmware.api [None req-bc4f14a6-29e8-4fae-84bb-02f11896dc72 
tempest-AttachInterfacesV270Test-2005290711 tempest-AttachInterfacesV270Test-2005290711-project-member] Waiting for the task: (returnval){ [ 1560.019460] env[67169]: value = "task-2819214" [ 1560.019460] env[67169]: _type = "Task" [ 1560.019460] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1560.024159] env[67169]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-2ff89839-7fda-4410-9a61-b9385a7a48bd {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1560.028370] env[67169]: DEBUG oslo_vmware.api [None req-bc4f14a6-29e8-4fae-84bb-02f11896dc72 tempest-AttachInterfacesV270Test-2005290711 tempest-AttachInterfacesV270Test-2005290711-project-member] Task: {'id': task-2819214, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1560.051190] env[67169]: DEBUG nova.virt.vmwareapi.images [None req-e3b2eed0-520e-4d7c-949f-bbfa16b0198a tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: a86fa702-2040-4e22-9eaa-5d64bc16f036] Downloading image file data 285931c9-8b83-4997-8c4d-6a79005e36ba to the data store datastore2 {{(pid=67169) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1560.105139] env[67169]: DEBUG oslo_vmware.rw_handles [None req-e3b2eed0-520e-4d7c-949f-bbfa16b0198a tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/75453b4a-714a-48c3-9686-831039ab349c/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=67169) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1560.164297] env[67169]: DEBUG oslo_vmware.rw_handles [None req-e3b2eed0-520e-4d7c-949f-bbfa16b0198a tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Completed reading data from the image iterator. {{(pid=67169) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1560.164495] env[67169]: DEBUG oslo_vmware.rw_handles [None req-e3b2eed0-520e-4d7c-949f-bbfa16b0198a tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/75453b4a-714a-48c3-9686-831039ab349c/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=67169) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1560.183892] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b05fcd31-6fb2-44d6-8819-0dc2ce24035b {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1560.191186] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e0e9a62d-160d-4979-9c47-e75345af4d9d {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1560.220761] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-44a41345-6d1d-4934-98be-5d6607cbf6ca {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1560.227720] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d1a2bd7f-86bf-4ce3-a9e2-3cde6b77fb1a {{(pid=67169) 
request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1560.241916] env[67169]: DEBUG nova.compute.provider_tree [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Inventory has not changed in ProviderTree for provider: 6570906a-ac37-4859-b1f2-4bbacc48d3f3 {{(pid=67169) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1560.250300] env[67169]: DEBUG nova.scheduler.client.report [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Inventory has not changed for provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 171, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67169) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1560.263390] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67169) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1560.263573] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.530s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1560.533305] env[67169]: DEBUG oslo_vmware.api [None req-bc4f14a6-29e8-4fae-84bb-02f11896dc72 tempest-AttachInterfacesV270Test-2005290711 tempest-AttachInterfacesV270Test-2005290711-project-member] Task: {'id': task-2819214, 'name': DeleteDatastoreFile_Task, 'duration_secs': 
0.069076} completed successfully. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1560.533558] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-bc4f14a6-29e8-4fae-84bb-02f11896dc72 tempest-AttachInterfacesV270Test-2005290711 tempest-AttachInterfacesV270Test-2005290711-project-member] Deleted the datastore file {{(pid=67169) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1560.533742] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-bc4f14a6-29e8-4fae-84bb-02f11896dc72 tempest-AttachInterfacesV270Test-2005290711 tempest-AttachInterfacesV270Test-2005290711-project-member] [instance: bab5d630-fec0-44e5-8088-12c8855aad66] Deleted contents of the VM from datastore datastore2 {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1560.533915] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-bc4f14a6-29e8-4fae-84bb-02f11896dc72 tempest-AttachInterfacesV270Test-2005290711 tempest-AttachInterfacesV270Test-2005290711-project-member] [instance: bab5d630-fec0-44e5-8088-12c8855aad66] Instance destroyed {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1560.534098] env[67169]: INFO nova.compute.manager [None req-bc4f14a6-29e8-4fae-84bb-02f11896dc72 tempest-AttachInterfacesV270Test-2005290711 tempest-AttachInterfacesV270Test-2005290711-project-member] [instance: bab5d630-fec0-44e5-8088-12c8855aad66] Took 0.63 seconds to destroy the instance on the hypervisor. 
[ 1560.536187] env[67169]: DEBUG nova.compute.claims [None req-bc4f14a6-29e8-4fae-84bb-02f11896dc72 tempest-AttachInterfacesV270Test-2005290711 tempest-AttachInterfacesV270Test-2005290711-project-member] [instance: bab5d630-fec0-44e5-8088-12c8855aad66] Aborting claim: {{(pid=67169) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1560.536361] env[67169]: DEBUG oslo_concurrency.lockutils [None req-bc4f14a6-29e8-4fae-84bb-02f11896dc72 tempest-AttachInterfacesV270Test-2005290711 tempest-AttachInterfacesV270Test-2005290711-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1560.536599] env[67169]: DEBUG oslo_concurrency.lockutils [None req-bc4f14a6-29e8-4fae-84bb-02f11896dc72 tempest-AttachInterfacesV270Test-2005290711 tempest-AttachInterfacesV270Test-2005290711-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1560.764954] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6f0b5ef6-4ace-4b26-8c0a-2ab690cfcb50 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1560.772804] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6be895d5-9279-48e1-ab11-a35f9da42834 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1560.802033] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-61c9f5cc-99b8-403c-9a55-83a8d1752f06 {{(pid=67169) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1560.808959] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f2e64987-8470-48f9-8490-1a1e41afc085 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1560.825317] env[67169]: DEBUG nova.compute.provider_tree [None req-bc4f14a6-29e8-4fae-84bb-02f11896dc72 tempest-AttachInterfacesV270Test-2005290711 tempest-AttachInterfacesV270Test-2005290711-project-member] Inventory has not changed in ProviderTree for provider: 6570906a-ac37-4859-b1f2-4bbacc48d3f3 {{(pid=67169) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1560.834154] env[67169]: DEBUG nova.scheduler.client.report [None req-bc4f14a6-29e8-4fae-84bb-02f11896dc72 tempest-AttachInterfacesV270Test-2005290711 tempest-AttachInterfacesV270Test-2005290711-project-member] Inventory has not changed for provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 171, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67169) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1560.849872] env[67169]: DEBUG oslo_concurrency.lockutils [None req-bc4f14a6-29e8-4fae-84bb-02f11896dc72 tempest-AttachInterfacesV270Test-2005290711 tempest-AttachInterfacesV270Test-2005290711-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.313s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1560.850445] env[67169]: ERROR nova.compute.manager [None 
req-bc4f14a6-29e8-4fae-84bb-02f11896dc72 tempest-AttachInterfacesV270Test-2005290711 tempest-AttachInterfacesV270Test-2005290711-project-member] [instance: bab5d630-fec0-44e5-8088-12c8855aad66] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1560.850445] env[67169]: Faults: ['InvalidArgument'] [ 1560.850445] env[67169]: ERROR nova.compute.manager [instance: bab5d630-fec0-44e5-8088-12c8855aad66] Traceback (most recent call last): [ 1560.850445] env[67169]: ERROR nova.compute.manager [instance: bab5d630-fec0-44e5-8088-12c8855aad66] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1560.850445] env[67169]: ERROR nova.compute.manager [instance: bab5d630-fec0-44e5-8088-12c8855aad66] self.driver.spawn(context, instance, image_meta, [ 1560.850445] env[67169]: ERROR nova.compute.manager [instance: bab5d630-fec0-44e5-8088-12c8855aad66] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1560.850445] env[67169]: ERROR nova.compute.manager [instance: bab5d630-fec0-44e5-8088-12c8855aad66] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1560.850445] env[67169]: ERROR nova.compute.manager [instance: bab5d630-fec0-44e5-8088-12c8855aad66] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1560.850445] env[67169]: ERROR nova.compute.manager [instance: bab5d630-fec0-44e5-8088-12c8855aad66] self._fetch_image_if_missing(context, vi) [ 1560.850445] env[67169]: ERROR nova.compute.manager [instance: bab5d630-fec0-44e5-8088-12c8855aad66] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1560.850445] env[67169]: ERROR nova.compute.manager [instance: bab5d630-fec0-44e5-8088-12c8855aad66] image_cache(vi, tmp_image_ds_loc) [ 1560.850445] env[67169]: ERROR nova.compute.manager [instance: bab5d630-fec0-44e5-8088-12c8855aad66] File 
"/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1560.850445] env[67169]: ERROR nova.compute.manager [instance: bab5d630-fec0-44e5-8088-12c8855aad66] vm_util.copy_virtual_disk( [ 1560.850445] env[67169]: ERROR nova.compute.manager [instance: bab5d630-fec0-44e5-8088-12c8855aad66] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1560.850445] env[67169]: ERROR nova.compute.manager [instance: bab5d630-fec0-44e5-8088-12c8855aad66] session._wait_for_task(vmdk_copy_task) [ 1560.850445] env[67169]: ERROR nova.compute.manager [instance: bab5d630-fec0-44e5-8088-12c8855aad66] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1560.850445] env[67169]: ERROR nova.compute.manager [instance: bab5d630-fec0-44e5-8088-12c8855aad66] return self.wait_for_task(task_ref) [ 1560.850445] env[67169]: ERROR nova.compute.manager [instance: bab5d630-fec0-44e5-8088-12c8855aad66] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1560.850445] env[67169]: ERROR nova.compute.manager [instance: bab5d630-fec0-44e5-8088-12c8855aad66] return evt.wait() [ 1560.850445] env[67169]: ERROR nova.compute.manager [instance: bab5d630-fec0-44e5-8088-12c8855aad66] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1560.850445] env[67169]: ERROR nova.compute.manager [instance: bab5d630-fec0-44e5-8088-12c8855aad66] result = hub.switch() [ 1560.850445] env[67169]: ERROR nova.compute.manager [instance: bab5d630-fec0-44e5-8088-12c8855aad66] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1560.850445] env[67169]: ERROR nova.compute.manager [instance: bab5d630-fec0-44e5-8088-12c8855aad66] return self.greenlet.switch() [ 1560.850445] env[67169]: ERROR nova.compute.manager [instance: bab5d630-fec0-44e5-8088-12c8855aad66] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1560.850445] env[67169]: ERROR nova.compute.manager [instance: bab5d630-fec0-44e5-8088-12c8855aad66] self.f(*self.args, **self.kw) [ 1560.850445] env[67169]: ERROR nova.compute.manager [instance: bab5d630-fec0-44e5-8088-12c8855aad66] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1560.850445] env[67169]: ERROR nova.compute.manager [instance: bab5d630-fec0-44e5-8088-12c8855aad66] raise exceptions.translate_fault(task_info.error) [ 1560.850445] env[67169]: ERROR nova.compute.manager [instance: bab5d630-fec0-44e5-8088-12c8855aad66] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1560.850445] env[67169]: ERROR nova.compute.manager [instance: bab5d630-fec0-44e5-8088-12c8855aad66] Faults: ['InvalidArgument'] [ 1560.850445] env[67169]: ERROR nova.compute.manager [instance: bab5d630-fec0-44e5-8088-12c8855aad66] [ 1560.851280] env[67169]: DEBUG nova.compute.utils [None req-bc4f14a6-29e8-4fae-84bb-02f11896dc72 tempest-AttachInterfacesV270Test-2005290711 tempest-AttachInterfacesV270Test-2005290711-project-member] [instance: bab5d630-fec0-44e5-8088-12c8855aad66] VimFaultException {{(pid=67169) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1560.852905] env[67169]: DEBUG nova.compute.manager [None req-bc4f14a6-29e8-4fae-84bb-02f11896dc72 tempest-AttachInterfacesV270Test-2005290711 tempest-AttachInterfacesV270Test-2005290711-project-member] [instance: bab5d630-fec0-44e5-8088-12c8855aad66] Build of instance bab5d630-fec0-44e5-8088-12c8855aad66 was re-scheduled: A specified parameter was not correct: fileType [ 1560.852905] env[67169]: Faults: ['InvalidArgument'] {{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1560.853399] env[67169]: DEBUG nova.compute.manager [None req-bc4f14a6-29e8-4fae-84bb-02f11896dc72 
tempest-AttachInterfacesV270Test-2005290711 tempest-AttachInterfacesV270Test-2005290711-project-member] [instance: bab5d630-fec0-44e5-8088-12c8855aad66] Unplugging VIFs for instance {{(pid=67169) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1560.853576] env[67169]: DEBUG nova.compute.manager [None req-bc4f14a6-29e8-4fae-84bb-02f11896dc72 tempest-AttachInterfacesV270Test-2005290711 tempest-AttachInterfacesV270Test-2005290711-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=67169) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1560.853750] env[67169]: DEBUG nova.compute.manager [None req-bc4f14a6-29e8-4fae-84bb-02f11896dc72 tempest-AttachInterfacesV270Test-2005290711 tempest-AttachInterfacesV270Test-2005290711-project-member] [instance: bab5d630-fec0-44e5-8088-12c8855aad66] Deallocating network for instance {{(pid=67169) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1560.853914] env[67169]: DEBUG nova.network.neutron [None req-bc4f14a6-29e8-4fae-84bb-02f11896dc72 tempest-AttachInterfacesV270Test-2005290711 tempest-AttachInterfacesV270Test-2005290711-project-member] [instance: bab5d630-fec0-44e5-8088-12c8855aad66] deallocate_for_instance() {{(pid=67169) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1561.240282] env[67169]: DEBUG nova.network.neutron [None req-bc4f14a6-29e8-4fae-84bb-02f11896dc72 tempest-AttachInterfacesV270Test-2005290711 tempest-AttachInterfacesV270Test-2005290711-project-member] [instance: bab5d630-fec0-44e5-8088-12c8855aad66] Updating instance_info_cache with network_info: [] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1561.252189] env[67169]: INFO nova.compute.manager [None req-bc4f14a6-29e8-4fae-84bb-02f11896dc72 tempest-AttachInterfacesV270Test-2005290711 
tempest-AttachInterfacesV270Test-2005290711-project-member] [instance: bab5d630-fec0-44e5-8088-12c8855aad66] Took 0.40 seconds to deallocate network for instance. [ 1561.263354] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1561.356314] env[67169]: INFO nova.scheduler.client.report [None req-bc4f14a6-29e8-4fae-84bb-02f11896dc72 tempest-AttachInterfacesV270Test-2005290711 tempest-AttachInterfacesV270Test-2005290711-project-member] Deleted allocations for instance bab5d630-fec0-44e5-8088-12c8855aad66 [ 1561.383100] env[67169]: DEBUG oslo_concurrency.lockutils [None req-bc4f14a6-29e8-4fae-84bb-02f11896dc72 tempest-AttachInterfacesV270Test-2005290711 tempest-AttachInterfacesV270Test-2005290711-project-member] Lock "bab5d630-fec0-44e5-8088-12c8855aad66" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 584.035s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1561.383852] env[67169]: DEBUG oslo_concurrency.lockutils [None req-002e25f7-7c63-44e9-bab9-7e17a862dc86 tempest-AttachInterfacesV270Test-2005290711 tempest-AttachInterfacesV270Test-2005290711-project-member] Lock "bab5d630-fec0-44e5-8088-12c8855aad66" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 387.175s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1561.384230] env[67169]: DEBUG oslo_concurrency.lockutils [None req-002e25f7-7c63-44e9-bab9-7e17a862dc86 tempest-AttachInterfacesV270Test-2005290711 tempest-AttachInterfacesV270Test-2005290711-project-member] Acquiring lock "bab5d630-fec0-44e5-8088-12c8855aad66-events" by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1561.384461] env[67169]: DEBUG oslo_concurrency.lockutils [None req-002e25f7-7c63-44e9-bab9-7e17a862dc86 tempest-AttachInterfacesV270Test-2005290711 tempest-AttachInterfacesV270Test-2005290711-project-member] Lock "bab5d630-fec0-44e5-8088-12c8855aad66-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1561.384644] env[67169]: DEBUG oslo_concurrency.lockutils [None req-002e25f7-7c63-44e9-bab9-7e17a862dc86 tempest-AttachInterfacesV270Test-2005290711 tempest-AttachInterfacesV270Test-2005290711-project-member] Lock "bab5d630-fec0-44e5-8088-12c8855aad66-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1561.389993] env[67169]: INFO nova.compute.manager [None req-002e25f7-7c63-44e9-bab9-7e17a862dc86 tempest-AttachInterfacesV270Test-2005290711 tempest-AttachInterfacesV270Test-2005290711-project-member] [instance: bab5d630-fec0-44e5-8088-12c8855aad66] Terminating instance [ 1561.391035] env[67169]: DEBUG nova.compute.manager [None req-002e25f7-7c63-44e9-bab9-7e17a862dc86 tempest-AttachInterfacesV270Test-2005290711 tempest-AttachInterfacesV270Test-2005290711-project-member] [instance: bab5d630-fec0-44e5-8088-12c8855aad66] Start destroying the instance on the hypervisor. 
{{(pid=67169) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1561.391258] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-002e25f7-7c63-44e9-bab9-7e17a862dc86 tempest-AttachInterfacesV270Test-2005290711 tempest-AttachInterfacesV270Test-2005290711-project-member] [instance: bab5d630-fec0-44e5-8088-12c8855aad66] Destroying instance {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1561.391527] env[67169]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-b70bea69-6292-474a-b6c6-3addb3b382a1 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1561.398893] env[67169]: DEBUG nova.compute.manager [None req-8eb9a17a-f61b-4025-b929-3564b452eb9c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: fa24a4a8-895c-4ea6-8e0a-4ed1134beff0] Starting instance... {{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1561.404270] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a08f7d79-f952-4a9f-a2ba-b318507d6022 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1561.435194] env[67169]: WARNING nova.virt.vmwareapi.vmops [None req-002e25f7-7c63-44e9-bab9-7e17a862dc86 tempest-AttachInterfacesV270Test-2005290711 tempest-AttachInterfacesV270Test-2005290711-project-member] [instance: bab5d630-fec0-44e5-8088-12c8855aad66] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance bab5d630-fec0-44e5-8088-12c8855aad66 could not be found. 
[ 1561.435409] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-002e25f7-7c63-44e9-bab9-7e17a862dc86 tempest-AttachInterfacesV270Test-2005290711 tempest-AttachInterfacesV270Test-2005290711-project-member] [instance: bab5d630-fec0-44e5-8088-12c8855aad66] Instance destroyed {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 1561.435586] env[67169]: INFO nova.compute.manager [None req-002e25f7-7c63-44e9-bab9-7e17a862dc86 tempest-AttachInterfacesV270Test-2005290711 tempest-AttachInterfacesV270Test-2005290711-project-member] [instance: bab5d630-fec0-44e5-8088-12c8855aad66] Took 0.04 seconds to destroy the instance on the hypervisor.
[ 1561.435829] env[67169]: DEBUG oslo.service.loopingcall [None req-002e25f7-7c63-44e9-bab9-7e17a862dc86 tempest-AttachInterfacesV270Test-2005290711 tempest-AttachInterfacesV270Test-2005290711-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=67169) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 1561.438141] env[67169]: DEBUG nova.compute.manager [-] [instance: bab5d630-fec0-44e5-8088-12c8855aad66] Deallocating network for instance {{(pid=67169) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}}
[ 1561.438249] env[67169]: DEBUG nova.network.neutron [-] [instance: bab5d630-fec0-44e5-8088-12c8855aad66] deallocate_for_instance() {{(pid=67169) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 1561.451327] env[67169]: DEBUG oslo_concurrency.lockutils [None req-8eb9a17a-f61b-4025-b929-3564b452eb9c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1561.452100] env[67169]: DEBUG oslo_concurrency.lockutils [None req-8eb9a17a-f61b-4025-b929-3564b452eb9c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1561.453104] env[67169]: INFO nova.compute.claims [None req-8eb9a17a-f61b-4025-b929-3564b452eb9c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: fa24a4a8-895c-4ea6-8e0a-4ed1134beff0] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 1561.473184] env[67169]: DEBUG nova.network.neutron [-] [instance: bab5d630-fec0-44e5-8088-12c8855aad66] Updating instance_info_cache with network_info: [] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1561.487759] env[67169]: INFO nova.compute.manager [-] [instance: bab5d630-fec0-44e5-8088-12c8855aad66] Took 0.05 seconds to deallocate network for instance.
[ 1561.583219] env[67169]: DEBUG oslo_concurrency.lockutils [None req-002e25f7-7c63-44e9-bab9-7e17a862dc86 tempest-AttachInterfacesV270Test-2005290711 tempest-AttachInterfacesV270Test-2005290711-project-member] Lock "bab5d630-fec0-44e5-8088-12c8855aad66" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.199s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1561.586521] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "bab5d630-fec0-44e5-8088-12c8855aad66" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 375.885s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1561.586772] env[67169]: INFO nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: bab5d630-fec0-44e5-8088-12c8855aad66] During sync_power_state the instance has a pending task (deleting). Skip.
[ 1561.586955] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "bab5d630-fec0-44e5-8088-12c8855aad66" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.001s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1561.710872] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-299de9ff-675c-4941-88a3-13ab750043ed {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1561.719594] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-93fadf17-b9a8-46d8-97d2-71b196564007 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1561.748630] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-90a198dd-a361-4554-af25-aa1aadb765b5 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1561.755769] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-20fa546d-ef96-4df2-8269-1efd24a3fea6 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1561.768292] env[67169]: DEBUG nova.compute.provider_tree [None req-8eb9a17a-f61b-4025-b929-3564b452eb9c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Inventory has not changed in ProviderTree for provider: 6570906a-ac37-4859-b1f2-4bbacc48d3f3 {{(pid=67169) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1561.777184] env[67169]: DEBUG nova.scheduler.client.report [None req-8eb9a17a-f61b-4025-b929-3564b452eb9c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Inventory has not changed for provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 171, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67169) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1561.790532] env[67169]: DEBUG oslo_concurrency.lockutils [None req-8eb9a17a-f61b-4025-b929-3564b452eb9c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.339s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1561.790987] env[67169]: DEBUG nova.compute.manager [None req-8eb9a17a-f61b-4025-b929-3564b452eb9c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: fa24a4a8-895c-4ea6-8e0a-4ed1134beff0] Start building networks asynchronously for instance. {{(pid=67169) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}}
[ 1561.822150] env[67169]: DEBUG nova.compute.utils [None req-8eb9a17a-f61b-4025-b929-3564b452eb9c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Using /dev/sd instead of None {{(pid=67169) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 1561.823560] env[67169]: DEBUG nova.compute.manager [None req-8eb9a17a-f61b-4025-b929-3564b452eb9c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: fa24a4a8-895c-4ea6-8e0a-4ed1134beff0] Allocating IP information in the background. {{(pid=67169) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}}
[ 1561.823848] env[67169]: DEBUG nova.network.neutron [None req-8eb9a17a-f61b-4025-b929-3564b452eb9c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: fa24a4a8-895c-4ea6-8e0a-4ed1134beff0] allocate_for_instance() {{(pid=67169) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 1561.832457] env[67169]: DEBUG nova.compute.manager [None req-8eb9a17a-f61b-4025-b929-3564b452eb9c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: fa24a4a8-895c-4ea6-8e0a-4ed1134beff0] Start building block device mappings for instance. {{(pid=67169) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}}
[ 1561.884303] env[67169]: DEBUG nova.policy [None req-8eb9a17a-f61b-4025-b929-3564b452eb9c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '615c1061ae884c3b91ce1b072249717c', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b1162bad4f2e4722aed4ff2c657e9dc9', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67169) authorize /opt/stack/nova/nova/policy.py:203}}
[ 1561.893682] env[67169]: DEBUG nova.compute.manager [None req-8eb9a17a-f61b-4025-b929-3564b452eb9c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: fa24a4a8-895c-4ea6-8e0a-4ed1134beff0] Start spawning the instance on the hypervisor. {{(pid=67169) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}}
[ 1561.920661] env[67169]: DEBUG nova.virt.hardware [None req-8eb9a17a-f61b-4025-b929-3564b452eb9c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-03-06T19:55:10Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-03-06T19:54:55Z,direct_url=,disk_format='vmdk',id=285931c9-8b83-4997-8c4d-6a79005e36ba,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c31f6504bb73492890b262ff43fdf9bc',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-03-06T19:54:55Z,virtual_size=,visibility=), allow threads: False {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 1561.920877] env[67169]: DEBUG nova.virt.hardware [None req-8eb9a17a-f61b-4025-b929-3564b452eb9c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Flavor limits 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 1561.921048] env[67169]: DEBUG nova.virt.hardware [None req-8eb9a17a-f61b-4025-b929-3564b452eb9c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Image limits 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 1561.921244] env[67169]: DEBUG nova.virt.hardware [None req-8eb9a17a-f61b-4025-b929-3564b452eb9c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Flavor pref 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 1561.921397] env[67169]: DEBUG nova.virt.hardware [None req-8eb9a17a-f61b-4025-b929-3564b452eb9c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Image pref 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 1561.921572] env[67169]: DEBUG nova.virt.hardware [None req-8eb9a17a-f61b-4025-b929-3564b452eb9c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 1561.922292] env[67169]: DEBUG nova.virt.hardware [None req-8eb9a17a-f61b-4025-b929-3564b452eb9c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 1561.922381] env[67169]: DEBUG nova.virt.hardware [None req-8eb9a17a-f61b-4025-b929-3564b452eb9c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67169) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 1561.922509] env[67169]: DEBUG nova.virt.hardware [None req-8eb9a17a-f61b-4025-b929-3564b452eb9c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Got 1 possible topologies {{(pid=67169) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 1561.922703] env[67169]: DEBUG nova.virt.hardware [None req-8eb9a17a-f61b-4025-b929-3564b452eb9c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 1561.923018] env[67169]: DEBUG nova.virt.hardware [None req-8eb9a17a-f61b-4025-b929-3564b452eb9c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 1561.923992] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-52d685c6-2c09-45aa-ba40-57a7bc6e33c5 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1561.932224] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d401461d-deb1-4439-b553-e68b8a87feaf {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1562.346381] env[67169]: DEBUG nova.network.neutron [None req-8eb9a17a-f61b-4025-b929-3564b452eb9c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: fa24a4a8-895c-4ea6-8e0a-4ed1134beff0] Successfully created port: 62743d9b-ea5a-40e1-8f73-d45d21506993 {{(pid=67169) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 1563.248617] env[67169]: DEBUG nova.network.neutron [None req-8eb9a17a-f61b-4025-b929-3564b452eb9c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: fa24a4a8-895c-4ea6-8e0a-4ed1134beff0] Successfully updated port: 62743d9b-ea5a-40e1-8f73-d45d21506993 {{(pid=67169) _update_port /opt/stack/nova/nova/network/neutron.py:586}}
[ 1563.262478] env[67169]: DEBUG oslo_concurrency.lockutils [None req-8eb9a17a-f61b-4025-b929-3564b452eb9c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Acquiring lock "refresh_cache-fa24a4a8-895c-4ea6-8e0a-4ed1134beff0" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1563.262649] env[67169]: DEBUG oslo_concurrency.lockutils [None req-8eb9a17a-f61b-4025-b929-3564b452eb9c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Acquired lock "refresh_cache-fa24a4a8-895c-4ea6-8e0a-4ed1134beff0" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1563.262809] env[67169]: DEBUG nova.network.neutron [None req-8eb9a17a-f61b-4025-b929-3564b452eb9c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: fa24a4a8-895c-4ea6-8e0a-4ed1134beff0] Building network info cache for instance {{(pid=67169) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 1563.302469] env[67169]: DEBUG nova.network.neutron [None req-8eb9a17a-f61b-4025-b929-3564b452eb9c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: fa24a4a8-895c-4ea6-8e0a-4ed1134beff0] Instance cache missing network info. {{(pid=67169) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 1563.318523] env[67169]: DEBUG nova.compute.manager [req-d8f7368f-7473-41a5-b92c-633e3f5b4459 req-5b0e0b84-1979-4051-9ce3-828cb4511fd5 service nova] [instance: fa24a4a8-895c-4ea6-8e0a-4ed1134beff0] Received event network-vif-plugged-62743d9b-ea5a-40e1-8f73-d45d21506993 {{(pid=67169) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}}
[ 1563.318821] env[67169]: DEBUG oslo_concurrency.lockutils [req-d8f7368f-7473-41a5-b92c-633e3f5b4459 req-5b0e0b84-1979-4051-9ce3-828cb4511fd5 service nova] Acquiring lock "fa24a4a8-895c-4ea6-8e0a-4ed1134beff0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1563.319044] env[67169]: DEBUG oslo_concurrency.lockutils [req-d8f7368f-7473-41a5-b92c-633e3f5b4459 req-5b0e0b84-1979-4051-9ce3-828cb4511fd5 service nova] Lock "fa24a4a8-895c-4ea6-8e0a-4ed1134beff0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1563.319216] env[67169]: DEBUG oslo_concurrency.lockutils [req-d8f7368f-7473-41a5-b92c-633e3f5b4459 req-5b0e0b84-1979-4051-9ce3-828cb4511fd5 service nova] Lock "fa24a4a8-895c-4ea6-8e0a-4ed1134beff0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1563.319383] env[67169]: DEBUG nova.compute.manager [req-d8f7368f-7473-41a5-b92c-633e3f5b4459 req-5b0e0b84-1979-4051-9ce3-828cb4511fd5 service nova] [instance: fa24a4a8-895c-4ea6-8e0a-4ed1134beff0] No waiting events found dispatching network-vif-plugged-62743d9b-ea5a-40e1-8f73-d45d21506993 {{(pid=67169) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}}
[ 1563.319544] env[67169]: WARNING nova.compute.manager [req-d8f7368f-7473-41a5-b92c-633e3f5b4459 req-5b0e0b84-1979-4051-9ce3-828cb4511fd5 service nova] [instance: fa24a4a8-895c-4ea6-8e0a-4ed1134beff0] Received unexpected event network-vif-plugged-62743d9b-ea5a-40e1-8f73-d45d21506993 for instance with vm_state building and task_state spawning.
[ 1563.319754] env[67169]: DEBUG nova.compute.manager [req-d8f7368f-7473-41a5-b92c-633e3f5b4459 req-5b0e0b84-1979-4051-9ce3-828cb4511fd5 service nova] [instance: fa24a4a8-895c-4ea6-8e0a-4ed1134beff0] Received event network-changed-62743d9b-ea5a-40e1-8f73-d45d21506993 {{(pid=67169) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}}
[ 1563.319854] env[67169]: DEBUG nova.compute.manager [req-d8f7368f-7473-41a5-b92c-633e3f5b4459 req-5b0e0b84-1979-4051-9ce3-828cb4511fd5 service nova] [instance: fa24a4a8-895c-4ea6-8e0a-4ed1134beff0] Refreshing instance network info cache due to event network-changed-62743d9b-ea5a-40e1-8f73-d45d21506993. {{(pid=67169) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}}
[ 1563.320027] env[67169]: DEBUG oslo_concurrency.lockutils [req-d8f7368f-7473-41a5-b92c-633e3f5b4459 req-5b0e0b84-1979-4051-9ce3-828cb4511fd5 service nova] Acquiring lock "refresh_cache-fa24a4a8-895c-4ea6-8e0a-4ed1134beff0" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1563.659114] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1563.715648] env[67169]: DEBUG nova.network.neutron [None req-8eb9a17a-f61b-4025-b929-3564b452eb9c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: fa24a4a8-895c-4ea6-8e0a-4ed1134beff0] Updating instance_info_cache with network_info: [{"id": "62743d9b-ea5a-40e1-8f73-d45d21506993", "address": "fa:16:3e:a0:c2:bd", "network": {"id": "05c41aa5-dcb7-46fa-ba23-2f4b7685b6a9", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1740060268-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "b1162bad4f2e4722aed4ff2c657e9dc9", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "24210a23-d8ac-4f4f-84ac-dc0636de9a72", "external-id": "nsx-vlan-transportzone-257", "segmentation_id": 257, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap62743d9b-ea", "ovs_interfaceid": "62743d9b-ea5a-40e1-8f73-d45d21506993", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1563.726448] env[67169]: DEBUG oslo_concurrency.lockutils [None req-8eb9a17a-f61b-4025-b929-3564b452eb9c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Releasing lock "refresh_cache-fa24a4a8-895c-4ea6-8e0a-4ed1134beff0" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1563.726792] env[67169]: DEBUG nova.compute.manager [None req-8eb9a17a-f61b-4025-b929-3564b452eb9c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: fa24a4a8-895c-4ea6-8e0a-4ed1134beff0] Instance network_info: |[{"id": "62743d9b-ea5a-40e1-8f73-d45d21506993", "address": "fa:16:3e:a0:c2:bd", "network": {"id": "05c41aa5-dcb7-46fa-ba23-2f4b7685b6a9", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1740060268-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "b1162bad4f2e4722aed4ff2c657e9dc9", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "24210a23-d8ac-4f4f-84ac-dc0636de9a72", "external-id": "nsx-vlan-transportzone-257", "segmentation_id": 257, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap62743d9b-ea", "ovs_interfaceid": "62743d9b-ea5a-40e1-8f73-d45d21506993", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67169) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}}
[ 1563.727096] env[67169]: DEBUG oslo_concurrency.lockutils [req-d8f7368f-7473-41a5-b92c-633e3f5b4459 req-5b0e0b84-1979-4051-9ce3-828cb4511fd5 service nova] Acquired lock "refresh_cache-fa24a4a8-895c-4ea6-8e0a-4ed1134beff0" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1563.727275] env[67169]: DEBUG nova.network.neutron [req-d8f7368f-7473-41a5-b92c-633e3f5b4459 req-5b0e0b84-1979-4051-9ce3-828cb4511fd5 service nova] [instance: fa24a4a8-895c-4ea6-8e0a-4ed1134beff0] Refreshing network info cache for port 62743d9b-ea5a-40e1-8f73-d45d21506993 {{(pid=67169) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}}
[ 1563.728269] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-8eb9a17a-f61b-4025-b929-3564b452eb9c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: fa24a4a8-895c-4ea6-8e0a-4ed1134beff0] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:a0:c2:bd', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '24210a23-d8ac-4f4f-84ac-dc0636de9a72', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '62743d9b-ea5a-40e1-8f73-d45d21506993', 'vif_model': 'vmxnet3'}] {{(pid=67169) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}}
[ 1563.738884] env[67169]: DEBUG oslo.service.loopingcall [None req-8eb9a17a-f61b-4025-b929-3564b452eb9c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67169) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 1563.742364] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: fa24a4a8-895c-4ea6-8e0a-4ed1134beff0] Creating VM on the ESX host {{(pid=67169) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}}
[ 1563.742582] env[67169]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-ce80866d-4f2f-4c89-9a26-eb52950d3fed {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1563.762596] env[67169]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){
[ 1563.762596] env[67169]: value = "task-2819215"
[ 1563.762596] env[67169]: _type = "Task"
[ 1563.762596] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1563.769913] env[67169]: DEBUG oslo_vmware.api [-] Task: {'id': task-2819215, 'name': CreateVM_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1564.211153] env[67169]: DEBUG nova.network.neutron [req-d8f7368f-7473-41a5-b92c-633e3f5b4459 req-5b0e0b84-1979-4051-9ce3-828cb4511fd5 service nova] [instance: fa24a4a8-895c-4ea6-8e0a-4ed1134beff0] Updated VIF entry in instance network info cache for port 62743d9b-ea5a-40e1-8f73-d45d21506993. {{(pid=67169) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}}
[ 1564.211528] env[67169]: DEBUG nova.network.neutron [req-d8f7368f-7473-41a5-b92c-633e3f5b4459 req-5b0e0b84-1979-4051-9ce3-828cb4511fd5 service nova] [instance: fa24a4a8-895c-4ea6-8e0a-4ed1134beff0] Updating instance_info_cache with network_info: [{"id": "62743d9b-ea5a-40e1-8f73-d45d21506993", "address": "fa:16:3e:a0:c2:bd", "network": {"id": "05c41aa5-dcb7-46fa-ba23-2f4b7685b6a9", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1740060268-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "b1162bad4f2e4722aed4ff2c657e9dc9", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "24210a23-d8ac-4f4f-84ac-dc0636de9a72", "external-id": "nsx-vlan-transportzone-257", "segmentation_id": 257, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap62743d9b-ea", "ovs_interfaceid": "62743d9b-ea5a-40e1-8f73-d45d21506993", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1564.221494] env[67169]: DEBUG oslo_concurrency.lockutils [req-d8f7368f-7473-41a5-b92c-633e3f5b4459 req-5b0e0b84-1979-4051-9ce3-828cb4511fd5 service nova] Releasing lock "refresh_cache-fa24a4a8-895c-4ea6-8e0a-4ed1134beff0" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1564.273008] env[67169]: DEBUG oslo_vmware.api [-] Task: {'id': task-2819215, 'name': CreateVM_Task, 'duration_secs': 0.291651} completed successfully. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 1564.273335] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: fa24a4a8-895c-4ea6-8e0a-4ed1134beff0] Created VM on the ESX host {{(pid=67169) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}}
[ 1564.273819] env[67169]: DEBUG oslo_concurrency.lockutils [None req-8eb9a17a-f61b-4025-b929-3564b452eb9c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1564.273978] env[67169]: DEBUG oslo_concurrency.lockutils [None req-8eb9a17a-f61b-4025-b929-3564b452eb9c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Acquired lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1564.274344] env[67169]: DEBUG oslo_concurrency.lockutils [None req-8eb9a17a-f61b-4025-b929-3564b452eb9c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}}
[ 1564.274655] env[67169]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-6fadd452-038b-4fb1-9a32-7f42de2c921b {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1564.279011] env[67169]: DEBUG oslo_vmware.api [None req-8eb9a17a-f61b-4025-b929-3564b452eb9c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Waiting for the task: (returnval){
[ 1564.279011] env[67169]: value = "session[52518c09-e5f2-035d-4dc1-3e29ddf94015]529f4b4d-d241-0387-f6d6-868b3d2a8c04"
[ 1564.279011] env[67169]: _type = "Task"
[ 1564.279011] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1564.288521] env[67169]: DEBUG oslo_vmware.api [None req-8eb9a17a-f61b-4025-b929-3564b452eb9c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Task: {'id': session[52518c09-e5f2-035d-4dc1-3e29ddf94015]529f4b4d-d241-0387-f6d6-868b3d2a8c04, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1564.789481] env[67169]: DEBUG oslo_concurrency.lockutils [None req-8eb9a17a-f61b-4025-b929-3564b452eb9c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Releasing lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1564.789637] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-8eb9a17a-f61b-4025-b929-3564b452eb9c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: fa24a4a8-895c-4ea6-8e0a-4ed1134beff0] Processing image 285931c9-8b83-4997-8c4d-6a79005e36ba {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}}
[ 1564.789848] env[67169]: DEBUG oslo_concurrency.lockutils [None req-8eb9a17a-f61b-4025-b929-3564b452eb9c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1572.256228] env[67169]: DEBUG oslo_concurrency.lockutils [None req-23f66c57-c93d-4fe5-9ac1-37aeee8b87ae tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Acquiring lock "fa24a4a8-895c-4ea6-8e0a-4ed1134beff0" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1574.204306] env[67169]: DEBUG oslo_concurrency.lockutils [None req-0a77388e-8384-4a55-b3b4-6cd525a7172b tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Acquiring lock "9435574d-2128-4b20-ba92-ee2aba37d33b" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1574.204625] env[67169]: DEBUG oslo_concurrency.lockutils [None req-0a77388e-8384-4a55-b3b4-6cd525a7172b tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Lock "9435574d-2128-4b20-ba92-ee2aba37d33b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1605.637644] env[67169]: WARNING oslo_vmware.rw_handles [None req-e3b2eed0-520e-4d7c-949f-bbfa16b0198a tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response
[ 1605.637644] env[67169]: ERROR oslo_vmware.rw_handles Traceback (most recent call last):
[ 1605.637644] env[67169]: ERROR oslo_vmware.rw_handles   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close
[ 1605.637644] env[67169]: ERROR oslo_vmware.rw_handles     self._conn.getresponse()
[ 1605.637644] env[67169]: ERROR oslo_vmware.rw_handles   File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse
[ 1605.637644] env[67169]: ERROR oslo_vmware.rw_handles     response.begin()
[ 1605.637644] env[67169]: ERROR oslo_vmware.rw_handles   File "/usr/lib/python3.10/http/client.py", line 318, in begin
[ 1605.637644] env[67169]: ERROR oslo_vmware.rw_handles     version, status, reason = self._read_status()
[ 1605.637644] env[67169]: ERROR oslo_vmware.rw_handles   File "/usr/lib/python3.10/http/client.py", line 287, in _read_status
[ 1605.637644] env[67169]: ERROR oslo_vmware.rw_handles     raise RemoteDisconnected("Remote end closed connection without"
[ 1605.637644] env[67169]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response
[ 1605.637644] env[67169]: ERROR oslo_vmware.rw_handles
[ 1605.638394] env[67169]: DEBUG nova.virt.vmwareapi.images [None req-e3b2eed0-520e-4d7c-949f-bbfa16b0198a tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: a86fa702-2040-4e22-9eaa-5d64bc16f036] Downloaded image file data 285931c9-8b83-4997-8c4d-6a79005e36ba to vmware_temp/75453b4a-714a-48c3-9686-831039ab349c/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk on the data store datastore2 {{(pid=67169) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}}
[ 1605.640144] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-e3b2eed0-520e-4d7c-949f-bbfa16b0198a tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: a86fa702-2040-4e22-9eaa-5d64bc16f036] Caching image {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}}
[ 1605.640382] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [None req-e3b2eed0-520e-4d7c-949f-bbfa16b0198a tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Copying Virtual Disk [datastore2] vmware_temp/75453b4a-714a-48c3-9686-831039ab349c/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk to [datastore2] vmware_temp/75453b4a-714a-48c3-9686-831039ab349c/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk {{(pid=67169) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}}
[ 1605.640672] env[67169]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-8674b8df-22cf-4a75-bec9-b5a603bbfb1b {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1605.649453] env[67169]: DEBUG oslo_vmware.api [None req-e3b2eed0-520e-4d7c-949f-bbfa16b0198a tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Waiting for the task: (returnval){
[ 1605.649453] env[67169]: value = "task-2819216"
[ 1605.649453] env[67169]: _type = "Task"
[ 1605.649453] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1605.657456] env[67169]: DEBUG oslo_vmware.api [None req-e3b2eed0-520e-4d7c-949f-bbfa16b0198a tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Task: {'id': task-2819216, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1606.159300] env[67169]: DEBUG oslo_vmware.exceptions [None req-e3b2eed0-520e-4d7c-949f-bbfa16b0198a tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Fault InvalidArgument not matched.
{{(pid=67169) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1606.159572] env[67169]: DEBUG oslo_concurrency.lockutils [None req-e3b2eed0-520e-4d7c-949f-bbfa16b0198a tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Releasing lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1606.160206] env[67169]: ERROR nova.compute.manager [None req-e3b2eed0-520e-4d7c-949f-bbfa16b0198a tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: a86fa702-2040-4e22-9eaa-5d64bc16f036] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1606.160206] env[67169]: Faults: ['InvalidArgument'] [ 1606.160206] env[67169]: ERROR nova.compute.manager [instance: a86fa702-2040-4e22-9eaa-5d64bc16f036] Traceback (most recent call last): [ 1606.160206] env[67169]: ERROR nova.compute.manager [instance: a86fa702-2040-4e22-9eaa-5d64bc16f036] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1606.160206] env[67169]: ERROR nova.compute.manager [instance: a86fa702-2040-4e22-9eaa-5d64bc16f036] yield resources [ 1606.160206] env[67169]: ERROR nova.compute.manager [instance: a86fa702-2040-4e22-9eaa-5d64bc16f036] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1606.160206] env[67169]: ERROR nova.compute.manager [instance: a86fa702-2040-4e22-9eaa-5d64bc16f036] self.driver.spawn(context, instance, image_meta, [ 1606.160206] env[67169]: ERROR nova.compute.manager [instance: a86fa702-2040-4e22-9eaa-5d64bc16f036] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1606.160206] env[67169]: ERROR nova.compute.manager [instance: 
a86fa702-2040-4e22-9eaa-5d64bc16f036] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1606.160206] env[67169]: ERROR nova.compute.manager [instance: a86fa702-2040-4e22-9eaa-5d64bc16f036] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1606.160206] env[67169]: ERROR nova.compute.manager [instance: a86fa702-2040-4e22-9eaa-5d64bc16f036] self._fetch_image_if_missing(context, vi) [ 1606.160206] env[67169]: ERROR nova.compute.manager [instance: a86fa702-2040-4e22-9eaa-5d64bc16f036] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1606.160206] env[67169]: ERROR nova.compute.manager [instance: a86fa702-2040-4e22-9eaa-5d64bc16f036] image_cache(vi, tmp_image_ds_loc) [ 1606.160206] env[67169]: ERROR nova.compute.manager [instance: a86fa702-2040-4e22-9eaa-5d64bc16f036] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1606.160206] env[67169]: ERROR nova.compute.manager [instance: a86fa702-2040-4e22-9eaa-5d64bc16f036] vm_util.copy_virtual_disk( [ 1606.160206] env[67169]: ERROR nova.compute.manager [instance: a86fa702-2040-4e22-9eaa-5d64bc16f036] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1606.160206] env[67169]: ERROR nova.compute.manager [instance: a86fa702-2040-4e22-9eaa-5d64bc16f036] session._wait_for_task(vmdk_copy_task) [ 1606.160206] env[67169]: ERROR nova.compute.manager [instance: a86fa702-2040-4e22-9eaa-5d64bc16f036] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1606.160206] env[67169]: ERROR nova.compute.manager [instance: a86fa702-2040-4e22-9eaa-5d64bc16f036] return self.wait_for_task(task_ref) [ 1606.160206] env[67169]: ERROR nova.compute.manager [instance: a86fa702-2040-4e22-9eaa-5d64bc16f036] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1606.160206] env[67169]: ERROR nova.compute.manager 
[instance: a86fa702-2040-4e22-9eaa-5d64bc16f036] return evt.wait() [ 1606.160206] env[67169]: ERROR nova.compute.manager [instance: a86fa702-2040-4e22-9eaa-5d64bc16f036] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1606.160206] env[67169]: ERROR nova.compute.manager [instance: a86fa702-2040-4e22-9eaa-5d64bc16f036] result = hub.switch() [ 1606.160206] env[67169]: ERROR nova.compute.manager [instance: a86fa702-2040-4e22-9eaa-5d64bc16f036] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1606.160206] env[67169]: ERROR nova.compute.manager [instance: a86fa702-2040-4e22-9eaa-5d64bc16f036] return self.greenlet.switch() [ 1606.160206] env[67169]: ERROR nova.compute.manager [instance: a86fa702-2040-4e22-9eaa-5d64bc16f036] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1606.160206] env[67169]: ERROR nova.compute.manager [instance: a86fa702-2040-4e22-9eaa-5d64bc16f036] self.f(*self.args, **self.kw) [ 1606.160206] env[67169]: ERROR nova.compute.manager [instance: a86fa702-2040-4e22-9eaa-5d64bc16f036] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1606.160206] env[67169]: ERROR nova.compute.manager [instance: a86fa702-2040-4e22-9eaa-5d64bc16f036] raise exceptions.translate_fault(task_info.error) [ 1606.160206] env[67169]: ERROR nova.compute.manager [instance: a86fa702-2040-4e22-9eaa-5d64bc16f036] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1606.160206] env[67169]: ERROR nova.compute.manager [instance: a86fa702-2040-4e22-9eaa-5d64bc16f036] Faults: ['InvalidArgument'] [ 1606.160206] env[67169]: ERROR nova.compute.manager [instance: a86fa702-2040-4e22-9eaa-5d64bc16f036] [ 1606.161253] env[67169]: INFO nova.compute.manager [None req-e3b2eed0-520e-4d7c-949f-bbfa16b0198a tempest-DeleteServersTestJSON-867121436 
tempest-DeleteServersTestJSON-867121436-project-member] [instance: a86fa702-2040-4e22-9eaa-5d64bc16f036] Terminating instance [ 1606.162078] env[67169]: DEBUG oslo_concurrency.lockutils [None req-ed9e34ee-fd50-4e03-852b-e42a37baf99a tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Acquired lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1606.162300] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-ed9e34ee-fd50-4e03-852b-e42a37baf99a tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1606.162538] env[67169]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-3c03eea7-894a-49a1-8802-5e0a22caee2b {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1606.164955] env[67169]: DEBUG nova.compute.manager [None req-e3b2eed0-520e-4d7c-949f-bbfa16b0198a tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: a86fa702-2040-4e22-9eaa-5d64bc16f036] Start destroying the instance on the hypervisor. 
{{(pid=67169) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1606.165171] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-e3b2eed0-520e-4d7c-949f-bbfa16b0198a tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: a86fa702-2040-4e22-9eaa-5d64bc16f036] Destroying instance {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1606.165877] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-31730865-f2b7-4c23-809c-594a701face4 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1606.172287] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-e3b2eed0-520e-4d7c-949f-bbfa16b0198a tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: a86fa702-2040-4e22-9eaa-5d64bc16f036] Unregistering the VM {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1606.172495] env[67169]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-0c87b37c-7eb3-4b05-8b45-a0757c150276 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1606.174574] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-ed9e34ee-fd50-4e03-852b-e42a37baf99a tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1606.174746] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-ed9e34ee-fd50-4e03-852b-e42a37baf99a tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=67169) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1606.175720] env[67169]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-ca69cf19-0009-44a0-922f-16b3b8b65f39 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1606.180690] env[67169]: DEBUG oslo_vmware.api [None req-ed9e34ee-fd50-4e03-852b-e42a37baf99a tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Waiting for the task: (returnval){ [ 1606.180690] env[67169]: value = "session[52518c09-e5f2-035d-4dc1-3e29ddf94015]52527afc-212b-461f-0868-3747c0eaa7d2" [ 1606.180690] env[67169]: _type = "Task" [ 1606.180690] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1606.194286] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-ed9e34ee-fd50-4e03-852b-e42a37baf99a tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec] Preparing fetch location {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1606.194501] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-ed9e34ee-fd50-4e03-852b-e42a37baf99a tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Creating directory with path [datastore2] vmware_temp/dd9ab342-5bf2-48d5-b4f4-a551902138a4/285931c9-8b83-4997-8c4d-6a79005e36ba {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1606.194706] env[67169]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-37ae7607-d324-4e96-8fa3-9c51cca7e2a2 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1606.214451] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None 
req-ed9e34ee-fd50-4e03-852b-e42a37baf99a tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Created directory with path [datastore2] vmware_temp/dd9ab342-5bf2-48d5-b4f4-a551902138a4/285931c9-8b83-4997-8c4d-6a79005e36ba {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1606.214638] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-ed9e34ee-fd50-4e03-852b-e42a37baf99a tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec] Fetch image to [datastore2] vmware_temp/dd9ab342-5bf2-48d5-b4f4-a551902138a4/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1606.214825] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-ed9e34ee-fd50-4e03-852b-e42a37baf99a tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec] Downloading image file data 285931c9-8b83-4997-8c4d-6a79005e36ba to [datastore2] vmware_temp/dd9ab342-5bf2-48d5-b4f4-a551902138a4/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk on the data store datastore2 {{(pid=67169) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1606.215570] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-349dbf4a-7c8f-41ec-89a9-5e6b8d568891 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1606.222164] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-efd5983d-1f97-4f27-9dc0-6ff8123f17d3 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1606.231115] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-39dfd90b-a74f-4110-9e8e-d10956515c73 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1606.238730] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-e3b2eed0-520e-4d7c-949f-bbfa16b0198a tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: a86fa702-2040-4e22-9eaa-5d64bc16f036] Unregistered the VM {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1606.238930] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-e3b2eed0-520e-4d7c-949f-bbfa16b0198a tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: a86fa702-2040-4e22-9eaa-5d64bc16f036] Deleting contents of the VM from datastore datastore2 {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1606.239130] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-e3b2eed0-520e-4d7c-949f-bbfa16b0198a tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Deleting the datastore file [datastore2] a86fa702-2040-4e22-9eaa-5d64bc16f036 {{(pid=67169) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1606.263841] env[67169]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-f0ab0d88-eb8c-4ec7-97a2-05a942711c6c {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1606.266247] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8a02a064-af83-4e33-9dee-36db27789bcd {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1606.273228] env[67169]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-e1abb9c7-1958-4b5c-950f-cadf69963a30 {{(pid=67169) 
request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1606.275014] env[67169]: DEBUG oslo_vmware.api [None req-e3b2eed0-520e-4d7c-949f-bbfa16b0198a tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Waiting for the task: (returnval){ [ 1606.275014] env[67169]: value = "task-2819218" [ 1606.275014] env[67169]: _type = "Task" [ 1606.275014] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1606.282580] env[67169]: DEBUG oslo_vmware.api [None req-e3b2eed0-520e-4d7c-949f-bbfa16b0198a tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Task: {'id': task-2819218, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1606.296656] env[67169]: DEBUG nova.virt.vmwareapi.images [None req-ed9e34ee-fd50-4e03-852b-e42a37baf99a tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec] Downloading image file data 285931c9-8b83-4997-8c4d-6a79005e36ba to the data store datastore2 {{(pid=67169) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1606.346220] env[67169]: DEBUG oslo_vmware.rw_handles [None req-ed9e34ee-fd50-4e03-852b-e42a37baf99a tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/dd9ab342-5bf2-48d5-b4f4-a551902138a4/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=67169) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1606.406017] env[67169]: DEBUG oslo_vmware.rw_handles [None req-ed9e34ee-fd50-4e03-852b-e42a37baf99a tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Completed reading data from the image iterator. {{(pid=67169) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1606.406237] env[67169]: DEBUG oslo_vmware.rw_handles [None req-ed9e34ee-fd50-4e03-852b-e42a37baf99a tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/dd9ab342-5bf2-48d5-b4f4-a551902138a4/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=67169) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1606.785489] env[67169]: DEBUG oslo_vmware.api [None req-e3b2eed0-520e-4d7c-949f-bbfa16b0198a tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Task: {'id': task-2819218, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.069857} completed successfully. 
{{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1606.785872] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-e3b2eed0-520e-4d7c-949f-bbfa16b0198a tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Deleted the datastore file {{(pid=67169) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1606.785922] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-e3b2eed0-520e-4d7c-949f-bbfa16b0198a tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: a86fa702-2040-4e22-9eaa-5d64bc16f036] Deleted contents of the VM from datastore datastore2 {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1606.786125] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-e3b2eed0-520e-4d7c-949f-bbfa16b0198a tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: a86fa702-2040-4e22-9eaa-5d64bc16f036] Instance destroyed {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1606.786308] env[67169]: INFO nova.compute.manager [None req-e3b2eed0-520e-4d7c-949f-bbfa16b0198a tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: a86fa702-2040-4e22-9eaa-5d64bc16f036] Took 0.62 seconds to destroy the instance on the hypervisor. 
[ 1606.788430] env[67169]: DEBUG nova.compute.claims [None req-e3b2eed0-520e-4d7c-949f-bbfa16b0198a tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: a86fa702-2040-4e22-9eaa-5d64bc16f036] Aborting claim: {{(pid=67169) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1606.788621] env[67169]: DEBUG oslo_concurrency.lockutils [None req-e3b2eed0-520e-4d7c-949f-bbfa16b0198a tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1606.788849] env[67169]: DEBUG oslo_concurrency.lockutils [None req-e3b2eed0-520e-4d7c-949f-bbfa16b0198a tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1607.032969] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4b7d3ea0-cb1c-4b43-a8ef-ffd6c4859427 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1607.040711] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-15e85f9e-f220-4f60-a8de-1f1c0f22209b {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1607.071327] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2fd1c873-70c4-46f0-b3b8-41f45a91d42e {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 
1607.078677] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-09676903-1c19-43a4-91fd-4173f9be9fb6 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1607.091926] env[67169]: DEBUG nova.compute.provider_tree [None req-e3b2eed0-520e-4d7c-949f-bbfa16b0198a tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Inventory has not changed in ProviderTree for provider: 6570906a-ac37-4859-b1f2-4bbacc48d3f3 {{(pid=67169) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1607.102011] env[67169]: DEBUG nova.scheduler.client.report [None req-e3b2eed0-520e-4d7c-949f-bbfa16b0198a tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Inventory has not changed for provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 171, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67169) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1607.115650] env[67169]: DEBUG oslo_concurrency.lockutils [None req-e3b2eed0-520e-4d7c-949f-bbfa16b0198a tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.327s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1607.116220] env[67169]: ERROR nova.compute.manager [None req-e3b2eed0-520e-4d7c-949f-bbfa16b0198a tempest-DeleteServersTestJSON-867121436 
tempest-DeleteServersTestJSON-867121436-project-member] [instance: a86fa702-2040-4e22-9eaa-5d64bc16f036] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1607.116220] env[67169]: Faults: ['InvalidArgument'] [ 1607.116220] env[67169]: ERROR nova.compute.manager [instance: a86fa702-2040-4e22-9eaa-5d64bc16f036] Traceback (most recent call last): [ 1607.116220] env[67169]: ERROR nova.compute.manager [instance: a86fa702-2040-4e22-9eaa-5d64bc16f036] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1607.116220] env[67169]: ERROR nova.compute.manager [instance: a86fa702-2040-4e22-9eaa-5d64bc16f036] self.driver.spawn(context, instance, image_meta, [ 1607.116220] env[67169]: ERROR nova.compute.manager [instance: a86fa702-2040-4e22-9eaa-5d64bc16f036] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1607.116220] env[67169]: ERROR nova.compute.manager [instance: a86fa702-2040-4e22-9eaa-5d64bc16f036] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1607.116220] env[67169]: ERROR nova.compute.manager [instance: a86fa702-2040-4e22-9eaa-5d64bc16f036] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1607.116220] env[67169]: ERROR nova.compute.manager [instance: a86fa702-2040-4e22-9eaa-5d64bc16f036] self._fetch_image_if_missing(context, vi) [ 1607.116220] env[67169]: ERROR nova.compute.manager [instance: a86fa702-2040-4e22-9eaa-5d64bc16f036] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1607.116220] env[67169]: ERROR nova.compute.manager [instance: a86fa702-2040-4e22-9eaa-5d64bc16f036] image_cache(vi, tmp_image_ds_loc) [ 1607.116220] env[67169]: ERROR nova.compute.manager [instance: a86fa702-2040-4e22-9eaa-5d64bc16f036] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1607.116220] env[67169]: ERROR nova.compute.manager 
[instance: a86fa702-2040-4e22-9eaa-5d64bc16f036] vm_util.copy_virtual_disk( [ 1607.116220] env[67169]: ERROR nova.compute.manager [instance: a86fa702-2040-4e22-9eaa-5d64bc16f036] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1607.116220] env[67169]: ERROR nova.compute.manager [instance: a86fa702-2040-4e22-9eaa-5d64bc16f036] session._wait_for_task(vmdk_copy_task) [ 1607.116220] env[67169]: ERROR nova.compute.manager [instance: a86fa702-2040-4e22-9eaa-5d64bc16f036] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1607.116220] env[67169]: ERROR nova.compute.manager [instance: a86fa702-2040-4e22-9eaa-5d64bc16f036] return self.wait_for_task(task_ref) [ 1607.116220] env[67169]: ERROR nova.compute.manager [instance: a86fa702-2040-4e22-9eaa-5d64bc16f036] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1607.116220] env[67169]: ERROR nova.compute.manager [instance: a86fa702-2040-4e22-9eaa-5d64bc16f036] return evt.wait() [ 1607.116220] env[67169]: ERROR nova.compute.manager [instance: a86fa702-2040-4e22-9eaa-5d64bc16f036] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1607.116220] env[67169]: ERROR nova.compute.manager [instance: a86fa702-2040-4e22-9eaa-5d64bc16f036] result = hub.switch() [ 1607.116220] env[67169]: ERROR nova.compute.manager [instance: a86fa702-2040-4e22-9eaa-5d64bc16f036] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1607.116220] env[67169]: ERROR nova.compute.manager [instance: a86fa702-2040-4e22-9eaa-5d64bc16f036] return self.greenlet.switch() [ 1607.116220] env[67169]: ERROR nova.compute.manager [instance: a86fa702-2040-4e22-9eaa-5d64bc16f036] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1607.116220] env[67169]: ERROR nova.compute.manager [instance: 
a86fa702-2040-4e22-9eaa-5d64bc16f036] self.f(*self.args, **self.kw) [ 1607.116220] env[67169]: ERROR nova.compute.manager [instance: a86fa702-2040-4e22-9eaa-5d64bc16f036] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1607.116220] env[67169]: ERROR nova.compute.manager [instance: a86fa702-2040-4e22-9eaa-5d64bc16f036] raise exceptions.translate_fault(task_info.error) [ 1607.116220] env[67169]: ERROR nova.compute.manager [instance: a86fa702-2040-4e22-9eaa-5d64bc16f036] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1607.116220] env[67169]: ERROR nova.compute.manager [instance: a86fa702-2040-4e22-9eaa-5d64bc16f036] Faults: ['InvalidArgument'] [ 1607.116220] env[67169]: ERROR nova.compute.manager [instance: a86fa702-2040-4e22-9eaa-5d64bc16f036] [ 1607.117152] env[67169]: DEBUG nova.compute.utils [None req-e3b2eed0-520e-4d7c-949f-bbfa16b0198a tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: a86fa702-2040-4e22-9eaa-5d64bc16f036] VimFaultException {{(pid=67169) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1607.118483] env[67169]: DEBUG nova.compute.manager [None req-e3b2eed0-520e-4d7c-949f-bbfa16b0198a tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: a86fa702-2040-4e22-9eaa-5d64bc16f036] Build of instance a86fa702-2040-4e22-9eaa-5d64bc16f036 was re-scheduled: A specified parameter was not correct: fileType [ 1607.118483] env[67169]: Faults: ['InvalidArgument'] {{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1607.118853] env[67169]: DEBUG nova.compute.manager [None req-e3b2eed0-520e-4d7c-949f-bbfa16b0198a tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: a86fa702-2040-4e22-9eaa-5d64bc16f036] Unplugging VIFs for instance {{(pid=67169) 
_cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1607.119055] env[67169]: DEBUG nova.compute.manager [None req-e3b2eed0-520e-4d7c-949f-bbfa16b0198a tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=67169) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1607.119233] env[67169]: DEBUG nova.compute.manager [None req-e3b2eed0-520e-4d7c-949f-bbfa16b0198a tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: a86fa702-2040-4e22-9eaa-5d64bc16f036] Deallocating network for instance {{(pid=67169) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1607.119462] env[67169]: DEBUG nova.network.neutron [None req-e3b2eed0-520e-4d7c-949f-bbfa16b0198a tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: a86fa702-2040-4e22-9eaa-5d64bc16f036] deallocate_for_instance() {{(pid=67169) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1607.600656] env[67169]: DEBUG nova.network.neutron [None req-e3b2eed0-520e-4d7c-949f-bbfa16b0198a tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: a86fa702-2040-4e22-9eaa-5d64bc16f036] Updating instance_info_cache with network_info: [] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1607.611423] env[67169]: INFO nova.compute.manager [None req-e3b2eed0-520e-4d7c-949f-bbfa16b0198a tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: a86fa702-2040-4e22-9eaa-5d64bc16f036] Took 0.49 seconds to deallocate network for instance. 
[ 1607.710259] env[67169]: INFO nova.scheduler.client.report [None req-e3b2eed0-520e-4d7c-949f-bbfa16b0198a tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Deleted allocations for instance a86fa702-2040-4e22-9eaa-5d64bc16f036 [ 1607.733661] env[67169]: DEBUG oslo_concurrency.lockutils [None req-e3b2eed0-520e-4d7c-949f-bbfa16b0198a tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Lock "a86fa702-2040-4e22-9eaa-5d64bc16f036" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 625.978s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1607.734854] env[67169]: DEBUG oslo_concurrency.lockutils [None req-01a0cbac-e2cb-4d1a-98ca-95aa894d437a tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Lock "a86fa702-2040-4e22-9eaa-5d64bc16f036" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 429.365s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1607.735186] env[67169]: DEBUG oslo_concurrency.lockutils [None req-01a0cbac-e2cb-4d1a-98ca-95aa894d437a tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Acquiring lock "a86fa702-2040-4e22-9eaa-5d64bc16f036-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1607.735405] env[67169]: DEBUG oslo_concurrency.lockutils [None req-01a0cbac-e2cb-4d1a-98ca-95aa894d437a tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Lock "a86fa702-2040-4e22-9eaa-5d64bc16f036-events" acquired by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1607.735572] env[67169]: DEBUG oslo_concurrency.lockutils [None req-01a0cbac-e2cb-4d1a-98ca-95aa894d437a tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Lock "a86fa702-2040-4e22-9eaa-5d64bc16f036-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1607.738197] env[67169]: INFO nova.compute.manager [None req-01a0cbac-e2cb-4d1a-98ca-95aa894d437a tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: a86fa702-2040-4e22-9eaa-5d64bc16f036] Terminating instance [ 1607.739715] env[67169]: DEBUG nova.compute.manager [None req-01a0cbac-e2cb-4d1a-98ca-95aa894d437a tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: a86fa702-2040-4e22-9eaa-5d64bc16f036] Start destroying the instance on the hypervisor. 
{{(pid=67169) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1607.739920] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-01a0cbac-e2cb-4d1a-98ca-95aa894d437a tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: a86fa702-2040-4e22-9eaa-5d64bc16f036] Destroying instance {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1607.740399] env[67169]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-d5831f35-0174-4b59-8be5-8b5f312a2843 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1607.750509] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4c188597-7fbe-4b68-92d0-b81d48e15541 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1607.761684] env[67169]: DEBUG nova.compute.manager [None req-5b0ce0d8-354b-4a12-bc67-271d95a08bb6 tempest-AttachVolumeTestJSON-1669563252 tempest-AttachVolumeTestJSON-1669563252-project-member] [instance: 5c675d7d-8915-4962-8bbd-c9b639ae2cb1] Starting instance... {{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1607.782054] env[67169]: WARNING nova.virt.vmwareapi.vmops [None req-01a0cbac-e2cb-4d1a-98ca-95aa894d437a tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: a86fa702-2040-4e22-9eaa-5d64bc16f036] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance a86fa702-2040-4e22-9eaa-5d64bc16f036 could not be found. 
[ 1607.782284] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-01a0cbac-e2cb-4d1a-98ca-95aa894d437a tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: a86fa702-2040-4e22-9eaa-5d64bc16f036] Instance destroyed {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1607.782460] env[67169]: INFO nova.compute.manager [None req-01a0cbac-e2cb-4d1a-98ca-95aa894d437a tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: a86fa702-2040-4e22-9eaa-5d64bc16f036] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1607.782693] env[67169]: DEBUG oslo.service.loopingcall [None req-01a0cbac-e2cb-4d1a-98ca-95aa894d437a tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=67169) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1607.782910] env[67169]: DEBUG nova.compute.manager [-] [instance: a86fa702-2040-4e22-9eaa-5d64bc16f036] Deallocating network for instance {{(pid=67169) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1607.783016] env[67169]: DEBUG nova.network.neutron [-] [instance: a86fa702-2040-4e22-9eaa-5d64bc16f036] deallocate_for_instance() {{(pid=67169) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1607.785675] env[67169]: DEBUG nova.compute.manager [None req-5b0ce0d8-354b-4a12-bc67-271d95a08bb6 tempest-AttachVolumeTestJSON-1669563252 tempest-AttachVolumeTestJSON-1669563252-project-member] [instance: 5c675d7d-8915-4962-8bbd-c9b639ae2cb1] Instance disappeared before build. 
{{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1607.807110] env[67169]: DEBUG oslo_concurrency.lockutils [None req-5b0ce0d8-354b-4a12-bc67-271d95a08bb6 tempest-AttachVolumeTestJSON-1669563252 tempest-AttachVolumeTestJSON-1669563252-project-member] Lock "5c675d7d-8915-4962-8bbd-c9b639ae2cb1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 205.083s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1607.811692] env[67169]: DEBUG nova.network.neutron [-] [instance: a86fa702-2040-4e22-9eaa-5d64bc16f036] Updating instance_info_cache with network_info: [] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1607.817841] env[67169]: DEBUG nova.compute.manager [None req-dad53a36-9ec2-447b-b9da-6ae8c0b39c52 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 04d3ae51-f3f1-427b-ae45-279b02e4b3e6] Starting instance... {{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1607.820629] env[67169]: INFO nova.compute.manager [-] [instance: a86fa702-2040-4e22-9eaa-5d64bc16f036] Took 0.04 seconds to deallocate network for instance. 
[ 1607.868190] env[67169]: DEBUG oslo_concurrency.lockutils [None req-dad53a36-9ec2-447b-b9da-6ae8c0b39c52 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1607.868446] env[67169]: DEBUG oslo_concurrency.lockutils [None req-dad53a36-9ec2-447b-b9da-6ae8c0b39c52 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1607.869987] env[67169]: INFO nova.compute.claims [None req-dad53a36-9ec2-447b-b9da-6ae8c0b39c52 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 04d3ae51-f3f1-427b-ae45-279b02e4b3e6] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1607.923909] env[67169]: DEBUG oslo_concurrency.lockutils [None req-01a0cbac-e2cb-4d1a-98ca-95aa894d437a tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Lock "a86fa702-2040-4e22-9eaa-5d64bc16f036" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.189s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1607.924808] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "a86fa702-2040-4e22-9eaa-5d64bc16f036" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 422.223s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1607.925016] 
env[67169]: INFO nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: a86fa702-2040-4e22-9eaa-5d64bc16f036] During sync_power_state the instance has a pending task (deleting). Skip. [ 1607.925219] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "a86fa702-2040-4e22-9eaa-5d64bc16f036" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1608.115366] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a8318eb3-94d8-4d2a-91d1-7e21ea91c24f {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1608.124336] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f54528e5-fe5d-432e-82d1-82e87d60d400 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1608.155066] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-21c2025c-4620-4489-8136-0a49a441f11c {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1608.162081] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f283bcc6-4543-4ec5-91e9-97978e3fb5af {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1608.175212] env[67169]: DEBUG nova.compute.provider_tree [None req-dad53a36-9ec2-447b-b9da-6ae8c0b39c52 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Inventory has not changed in ProviderTree for provider: 6570906a-ac37-4859-b1f2-4bbacc48d3f3 
{{(pid=67169) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1608.183836] env[67169]: DEBUG nova.scheduler.client.report [None req-dad53a36-9ec2-447b-b9da-6ae8c0b39c52 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Inventory has not changed for provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 171, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67169) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1608.197346] env[67169]: DEBUG oslo_concurrency.lockutils [None req-dad53a36-9ec2-447b-b9da-6ae8c0b39c52 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.329s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1608.197809] env[67169]: DEBUG nova.compute.manager [None req-dad53a36-9ec2-447b-b9da-6ae8c0b39c52 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 04d3ae51-f3f1-427b-ae45-279b02e4b3e6] Start building networks asynchronously for instance. 
{{(pid=67169) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1608.232412] env[67169]: DEBUG nova.compute.utils [None req-dad53a36-9ec2-447b-b9da-6ae8c0b39c52 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Using /dev/sd instead of None {{(pid=67169) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1608.234807] env[67169]: DEBUG nova.compute.manager [None req-dad53a36-9ec2-447b-b9da-6ae8c0b39c52 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 04d3ae51-f3f1-427b-ae45-279b02e4b3e6] Allocating IP information in the background. {{(pid=67169) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1608.235084] env[67169]: DEBUG nova.network.neutron [None req-dad53a36-9ec2-447b-b9da-6ae8c0b39c52 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 04d3ae51-f3f1-427b-ae45-279b02e4b3e6] allocate_for_instance() {{(pid=67169) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1608.243336] env[67169]: DEBUG nova.compute.manager [None req-dad53a36-9ec2-447b-b9da-6ae8c0b39c52 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 04d3ae51-f3f1-427b-ae45-279b02e4b3e6] Start building block device mappings for instance. 
{{(pid=67169) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1608.299608] env[67169]: DEBUG nova.policy [None req-dad53a36-9ec2-447b-b9da-6ae8c0b39c52 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4d789ec14c2b4d62be952753fb47f0f7', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '00d358bc61014b5cb3ddcdab7785e7e8', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67169) authorize /opt/stack/nova/nova/policy.py:203}} [ 1608.308404] env[67169]: DEBUG nova.compute.manager [None req-dad53a36-9ec2-447b-b9da-6ae8c0b39c52 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 04d3ae51-f3f1-427b-ae45-279b02e4b3e6] Start spawning the instance on the hypervisor. 
{{(pid=67169) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1608.334086] env[67169]: DEBUG nova.virt.hardware [None req-dad53a36-9ec2-447b-b9da-6ae8c0b39c52 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-03-06T19:55:10Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-03-06T19:54:55Z,direct_url=,disk_format='vmdk',id=285931c9-8b83-4997-8c4d-6a79005e36ba,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c31f6504bb73492890b262ff43fdf9bc',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-03-06T19:54:55Z,virtual_size=,visibility=), allow threads: False {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1608.334482] env[67169]: DEBUG nova.virt.hardware [None req-dad53a36-9ec2-447b-b9da-6ae8c0b39c52 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Flavor limits 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1608.334482] env[67169]: DEBUG nova.virt.hardware [None req-dad53a36-9ec2-447b-b9da-6ae8c0b39c52 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Image limits 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1608.334668] env[67169]: DEBUG nova.virt.hardware [None req-dad53a36-9ec2-447b-b9da-6ae8c0b39c52 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Flavor pref 0:0:0 {{(pid=67169) get_cpu_topology_constraints 
/opt/stack/nova/nova/virt/hardware.py:388}} [ 1608.334813] env[67169]: DEBUG nova.virt.hardware [None req-dad53a36-9ec2-447b-b9da-6ae8c0b39c52 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Image pref 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1608.334960] env[67169]: DEBUG nova.virt.hardware [None req-dad53a36-9ec2-447b-b9da-6ae8c0b39c52 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1608.335185] env[67169]: DEBUG nova.virt.hardware [None req-dad53a36-9ec2-447b-b9da-6ae8c0b39c52 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1608.335413] env[67169]: DEBUG nova.virt.hardware [None req-dad53a36-9ec2-447b-b9da-6ae8c0b39c52 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67169) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1608.335504] env[67169]: DEBUG nova.virt.hardware [None req-dad53a36-9ec2-447b-b9da-6ae8c0b39c52 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Got 1 possible topologies {{(pid=67169) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1608.335663] env[67169]: DEBUG nova.virt.hardware [None req-dad53a36-9ec2-447b-b9da-6ae8c0b39c52 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67169) _get_desirable_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:575}} [ 1608.335833] env[67169]: DEBUG nova.virt.hardware [None req-dad53a36-9ec2-447b-b9da-6ae8c0b39c52 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1608.336700] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c1d552a9-40fc-4157-8fee-95c03036242a {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1608.344668] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8751db7a-f08b-4887-8866-b5345a650938 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1608.769684] env[67169]: DEBUG nova.network.neutron [None req-dad53a36-9ec2-447b-b9da-6ae8c0b39c52 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 04d3ae51-f3f1-427b-ae45-279b02e4b3e6] Successfully created port: 07aab849-dec4-4445-b78a-e9f583540343 {{(pid=67169) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1609.574262] env[67169]: DEBUG nova.network.neutron [None req-dad53a36-9ec2-447b-b9da-6ae8c0b39c52 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 04d3ae51-f3f1-427b-ae45-279b02e4b3e6] Successfully updated port: 07aab849-dec4-4445-b78a-e9f583540343 {{(pid=67169) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1609.586306] env[67169]: DEBUG oslo_concurrency.lockutils [None req-dad53a36-9ec2-447b-b9da-6ae8c0b39c52 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Acquiring lock "refresh_cache-04d3ae51-f3f1-427b-ae45-279b02e4b3e6" {{(pid=67169) lock 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1609.586306] env[67169]: DEBUG oslo_concurrency.lockutils [None req-dad53a36-9ec2-447b-b9da-6ae8c0b39c52 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Acquired lock "refresh_cache-04d3ae51-f3f1-427b-ae45-279b02e4b3e6" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1609.586477] env[67169]: DEBUG nova.network.neutron [None req-dad53a36-9ec2-447b-b9da-6ae8c0b39c52 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 04d3ae51-f3f1-427b-ae45-279b02e4b3e6] Building network info cache for instance {{(pid=67169) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1609.643255] env[67169]: DEBUG nova.network.neutron [None req-dad53a36-9ec2-447b-b9da-6ae8c0b39c52 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 04d3ae51-f3f1-427b-ae45-279b02e4b3e6] Instance cache missing network info. 
{{(pid=67169) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1609.647271] env[67169]: DEBUG nova.compute.manager [req-8a183ef2-97e4-4da9-bbda-9a05e599830c req-32486e05-9057-4f54-9ea4-e75dd3d0fcfa service nova] [instance: 04d3ae51-f3f1-427b-ae45-279b02e4b3e6] Received event network-vif-plugged-07aab849-dec4-4445-b78a-e9f583540343 {{(pid=67169) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1609.647363] env[67169]: DEBUG oslo_concurrency.lockutils [req-8a183ef2-97e4-4da9-bbda-9a05e599830c req-32486e05-9057-4f54-9ea4-e75dd3d0fcfa service nova] Acquiring lock "04d3ae51-f3f1-427b-ae45-279b02e4b3e6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1609.647566] env[67169]: DEBUG oslo_concurrency.lockutils [req-8a183ef2-97e4-4da9-bbda-9a05e599830c req-32486e05-9057-4f54-9ea4-e75dd3d0fcfa service nova] Lock "04d3ae51-f3f1-427b-ae45-279b02e4b3e6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1609.647736] env[67169]: DEBUG oslo_concurrency.lockutils [req-8a183ef2-97e4-4da9-bbda-9a05e599830c req-32486e05-9057-4f54-9ea4-e75dd3d0fcfa service nova] Lock "04d3ae51-f3f1-427b-ae45-279b02e4b3e6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1609.647901] env[67169]: DEBUG nova.compute.manager [req-8a183ef2-97e4-4da9-bbda-9a05e599830c req-32486e05-9057-4f54-9ea4-e75dd3d0fcfa service nova] [instance: 04d3ae51-f3f1-427b-ae45-279b02e4b3e6] No waiting events found dispatching network-vif-plugged-07aab849-dec4-4445-b78a-e9f583540343 {{(pid=67169) pop_instance_event 
/opt/stack/nova/nova/compute/manager.py:320}} [ 1609.648082] env[67169]: WARNING nova.compute.manager [req-8a183ef2-97e4-4da9-bbda-9a05e599830c req-32486e05-9057-4f54-9ea4-e75dd3d0fcfa service nova] [instance: 04d3ae51-f3f1-427b-ae45-279b02e4b3e6] Received unexpected event network-vif-plugged-07aab849-dec4-4445-b78a-e9f583540343 for instance with vm_state building and task_state spawning. [ 1609.648247] env[67169]: DEBUG nova.compute.manager [req-8a183ef2-97e4-4da9-bbda-9a05e599830c req-32486e05-9057-4f54-9ea4-e75dd3d0fcfa service nova] [instance: 04d3ae51-f3f1-427b-ae45-279b02e4b3e6] Received event network-changed-07aab849-dec4-4445-b78a-e9f583540343 {{(pid=67169) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1609.648403] env[67169]: DEBUG nova.compute.manager [req-8a183ef2-97e4-4da9-bbda-9a05e599830c req-32486e05-9057-4f54-9ea4-e75dd3d0fcfa service nova] [instance: 04d3ae51-f3f1-427b-ae45-279b02e4b3e6] Refreshing instance network info cache due to event network-changed-07aab849-dec4-4445-b78a-e9f583540343. 
{{(pid=67169) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1609.648568] env[67169]: DEBUG oslo_concurrency.lockutils [req-8a183ef2-97e4-4da9-bbda-9a05e599830c req-32486e05-9057-4f54-9ea4-e75dd3d0fcfa service nova] Acquiring lock "refresh_cache-04d3ae51-f3f1-427b-ae45-279b02e4b3e6" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1609.873517] env[67169]: DEBUG nova.network.neutron [None req-dad53a36-9ec2-447b-b9da-6ae8c0b39c52 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 04d3ae51-f3f1-427b-ae45-279b02e4b3e6] Updating instance_info_cache with network_info: [{"id": "07aab849-dec4-4445-b78a-e9f583540343", "address": "fa:16:3e:bd:25:64", "network": {"id": "ee7bdc29-2aab-4fc5-9b52-cee22ee0f249", "bridge": "br-int", "label": "tempest-ImagesTestJSON-634733000-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "00d358bc61014b5cb3ddcdab7785e7e8", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "20e3f794-c7a3-4696-9488-ecf34c570ef9", "external-id": "nsx-vlan-transportzone-509", "segmentation_id": 509, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap07aab849-de", "ovs_interfaceid": "07aab849-dec4-4445-b78a-e9f583540343", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1609.890881] env[67169]: DEBUG 
oslo_concurrency.lockutils [None req-dad53a36-9ec2-447b-b9da-6ae8c0b39c52 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Releasing lock "refresh_cache-04d3ae51-f3f1-427b-ae45-279b02e4b3e6" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1609.891417] env[67169]: DEBUG nova.compute.manager [None req-dad53a36-9ec2-447b-b9da-6ae8c0b39c52 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 04d3ae51-f3f1-427b-ae45-279b02e4b3e6] Instance network_info: |[{"id": "07aab849-dec4-4445-b78a-e9f583540343", "address": "fa:16:3e:bd:25:64", "network": {"id": "ee7bdc29-2aab-4fc5-9b52-cee22ee0f249", "bridge": "br-int", "label": "tempest-ImagesTestJSON-634733000-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "00d358bc61014b5cb3ddcdab7785e7e8", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "20e3f794-c7a3-4696-9488-ecf34c570ef9", "external-id": "nsx-vlan-transportzone-509", "segmentation_id": 509, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap07aab849-de", "ovs_interfaceid": "07aab849-dec4-4445-b78a-e9f583540343", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67169) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1609.891858] env[67169]: DEBUG oslo_concurrency.lockutils [req-8a183ef2-97e4-4da9-bbda-9a05e599830c req-32486e05-9057-4f54-9ea4-e75dd3d0fcfa service 
nova] Acquired lock "refresh_cache-04d3ae51-f3f1-427b-ae45-279b02e4b3e6" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1609.892239] env[67169]: DEBUG nova.network.neutron [req-8a183ef2-97e4-4da9-bbda-9a05e599830c req-32486e05-9057-4f54-9ea4-e75dd3d0fcfa service nova] [instance: 04d3ae51-f3f1-427b-ae45-279b02e4b3e6] Refreshing network info cache for port 07aab849-dec4-4445-b78a-e9f583540343 {{(pid=67169) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1609.894098] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-dad53a36-9ec2-447b-b9da-6ae8c0b39c52 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 04d3ae51-f3f1-427b-ae45-279b02e4b3e6] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:bd:25:64', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '20e3f794-c7a3-4696-9488-ecf34c570ef9', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '07aab849-dec4-4445-b78a-e9f583540343', 'vif_model': 'vmxnet3'}] {{(pid=67169) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1609.903480] env[67169]: DEBUG oslo.service.loopingcall [None req-dad53a36-9ec2-447b-b9da-6ae8c0b39c52 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=67169) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1609.906961] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 04d3ae51-f3f1-427b-ae45-279b02e4b3e6] Creating VM on the ESX host {{(pid=67169) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1609.908276] env[67169]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-3e3acebd-43bc-458f-8140-ce65e911210a {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1609.930506] env[67169]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1609.930506] env[67169]: value = "task-2819219" [ 1609.930506] env[67169]: _type = "Task" [ 1609.930506] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1609.938498] env[67169]: DEBUG oslo_vmware.api [-] Task: {'id': task-2819219, 'name': CreateVM_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1610.150804] env[67169]: DEBUG nova.network.neutron [req-8a183ef2-97e4-4da9-bbda-9a05e599830c req-32486e05-9057-4f54-9ea4-e75dd3d0fcfa service nova] [instance: 04d3ae51-f3f1-427b-ae45-279b02e4b3e6] Updated VIF entry in instance network info cache for port 07aab849-dec4-4445-b78a-e9f583540343. 
{{(pid=67169) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1610.151391] env[67169]: DEBUG nova.network.neutron [req-8a183ef2-97e4-4da9-bbda-9a05e599830c req-32486e05-9057-4f54-9ea4-e75dd3d0fcfa service nova] [instance: 04d3ae51-f3f1-427b-ae45-279b02e4b3e6] Updating instance_info_cache with network_info: [{"id": "07aab849-dec4-4445-b78a-e9f583540343", "address": "fa:16:3e:bd:25:64", "network": {"id": "ee7bdc29-2aab-4fc5-9b52-cee22ee0f249", "bridge": "br-int", "label": "tempest-ImagesTestJSON-634733000-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "00d358bc61014b5cb3ddcdab7785e7e8", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "20e3f794-c7a3-4696-9488-ecf34c570ef9", "external-id": "nsx-vlan-transportzone-509", "segmentation_id": 509, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap07aab849-de", "ovs_interfaceid": "07aab849-dec4-4445-b78a-e9f583540343", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1610.161302] env[67169]: DEBUG oslo_concurrency.lockutils [req-8a183ef2-97e4-4da9-bbda-9a05e599830c req-32486e05-9057-4f54-9ea4-e75dd3d0fcfa service nova] Releasing lock "refresh_cache-04d3ae51-f3f1-427b-ae45-279b02e4b3e6" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1610.441405] env[67169]: DEBUG oslo_vmware.api [-] Task: {'id': 
task-2819219, 'name': CreateVM_Task, 'duration_secs': 0.281318} completed successfully. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1610.441586] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 04d3ae51-f3f1-427b-ae45-279b02e4b3e6] Created VM on the ESX host {{(pid=67169) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1610.442273] env[67169]: DEBUG oslo_concurrency.lockutils [None req-dad53a36-9ec2-447b-b9da-6ae8c0b39c52 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1610.442463] env[67169]: DEBUG oslo_concurrency.lockutils [None req-dad53a36-9ec2-447b-b9da-6ae8c0b39c52 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Acquired lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1610.442831] env[67169]: DEBUG oslo_concurrency.lockutils [None req-dad53a36-9ec2-447b-b9da-6ae8c0b39c52 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1610.443023] env[67169]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-cf1e539c-08b0-4dc6-9150-e82d20d707df {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1610.447797] env[67169]: DEBUG oslo_vmware.api [None req-dad53a36-9ec2-447b-b9da-6ae8c0b39c52 tempest-ImagesTestJSON-1590952821 
tempest-ImagesTestJSON-1590952821-project-member] Waiting for the task: (returnval){ [ 1610.447797] env[67169]: value = "session[52518c09-e5f2-035d-4dc1-3e29ddf94015]526abf57-9bec-29a3-971a-26885551667b" [ 1610.447797] env[67169]: _type = "Task" [ 1610.447797] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1610.454976] env[67169]: DEBUG oslo_vmware.api [None req-dad53a36-9ec2-447b-b9da-6ae8c0b39c52 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Task: {'id': session[52518c09-e5f2-035d-4dc1-3e29ddf94015]526abf57-9bec-29a3-971a-26885551667b, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1610.957722] env[67169]: DEBUG oslo_concurrency.lockutils [None req-dad53a36-9ec2-447b-b9da-6ae8c0b39c52 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Releasing lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1610.958105] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-dad53a36-9ec2-447b-b9da-6ae8c0b39c52 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 04d3ae51-f3f1-427b-ae45-279b02e4b3e6] Processing image 285931c9-8b83-4997-8c4d-6a79005e36ba {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1610.958211] env[67169]: DEBUG oslo_concurrency.lockutils [None req-dad53a36-9ec2-447b-b9da-6ae8c0b39c52 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk" {{(pid=67169) lock 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1613.658510] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1615.448213] env[67169]: DEBUG oslo_concurrency.lockutils [None req-00955648-a283-4a29-b129-a3849ea10a3d tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Acquiring lock "04d3ae51-f3f1-427b-ae45-279b02e4b3e6" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1617.659155] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1617.659448] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Starting heal instance info cache {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1617.659500] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Rebuilding the list of instances to heal {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1617.679871] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec] Skipping network cache update for instance because it is Building. 
{{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1617.680040] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 7817b417-599c-4619-8bd3-28d2e8236b9f] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1617.680176] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 883a792f-ae72-4475-8592-3076c2c2c2ae] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1617.680304] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 48376572-9e3a-4579-b2d7-b8b63312fab1] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1617.680425] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 74ea66f0-391c-437b-8aee-f784528d7963] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1617.680543] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1617.680660] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 2e156908-c313-4229-840d-13ed8e6d4074] Skipping network cache update for instance because it is Building. 
{{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1617.680774] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 2d7d3386-9854-4bf1-a680-5aed0a2329cb] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1617.680888] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: fa24a4a8-895c-4ea6-8e0a-4ed1134beff0] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1617.681016] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 04d3ae51-f3f1-427b-ae45-279b02e4b3e6] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1617.681163] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Didn't find any instances for network info cache update. 
{{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1618.659803] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1618.660137] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1618.660332] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1618.660482] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=67169) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1619.659198] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1620.659057] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1620.659057] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1620.669803] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1620.670041] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1620.670223] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67169) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1620.670387] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67169) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1620.671582] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f2a2f2a3-7a69-41d2-89ba-b4b8a82b10f5 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1620.680469] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0bfa5f41-0b92-48ee-87cd-ba0267ca51e2 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1620.694269] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2b14412a-85da-47dc-89bf-cb157826577e {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1620.700716] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4acca6bf-1294-4788-844e-ddc3a4a44c5b {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1620.729104] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181033MB free_disk=171GB free_vcpus=48 pci_devices=None {{(pid=67169) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1620.729247] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] 
Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1620.729585] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1620.801836] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1620.801998] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 7817b417-599c-4619-8bd3-28d2e8236b9f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1620.802181] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 883a792f-ae72-4475-8592-3076c2c2c2ae actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1620.802295] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 48376572-9e3a-4579-b2d7-b8b63312fab1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1620.802402] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 74ea66f0-391c-437b-8aee-f784528d7963 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1620.802511] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 7b7c8f84-c2d4-442e-93d3-60124767d096 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1620.802757] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 2e156908-c313-4229-840d-13ed8e6d4074 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1620.802757] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 2d7d3386-9854-4bf1-a680-5aed0a2329cb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1620.803155] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance fa24a4a8-895c-4ea6-8e0a-4ed1134beff0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1620.803155] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 04d3ae51-f3f1-427b-ae45-279b02e4b3e6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1620.817677] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 4e978c21-ae48-422e-9126-a4144c86b86f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1620.829451] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance aedbfde6-26e1-410d-a311-e2c344f65062 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1620.842261] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 3d636f4c-c042-428f-be5d-1fbf20c61f0a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1620.852094] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance c05c3ec2-a68d-41b0-a199-fcfc84bb2deb has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1620.862469] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 9435574d-2128-4b20-ba92-ee2aba37d33b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1620.862707] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67169) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1620.862857] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67169) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1621.043376] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-31e9d202-25ab-466a-b497-89a109f69f75 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1621.051061] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d9118c01-bfcb-4d71-b420-07ac1a5021b6 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1621.080290] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-db58031b-4f77-46c9-a286-9ce99773b5a9 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1621.087385] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2406ca17-44e1-4050-b814-d514acc808bf {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1621.100173] env[67169]: DEBUG 
nova.compute.provider_tree [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Inventory has not changed in ProviderTree for provider: 6570906a-ac37-4859-b1f2-4bbacc48d3f3 {{(pid=67169) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1621.108771] env[67169]: DEBUG nova.scheduler.client.report [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Inventory has not changed for provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 171, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67169) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1621.123374] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67169) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1621.123577] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.394s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1625.120712] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1625.141573] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 
None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1651.040379] env[67169]: DEBUG oslo_concurrency.lockutils [None req-1e846be1-02cb-41a5-811a-2b9f52cbbee0 tempest-InstanceActionsNegativeTestJSON-1517850368 tempest-InstanceActionsNegativeTestJSON-1517850368-project-member] Acquiring lock "6663b166-0d24-45a7-8c2c-e4e68dbe0005" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1651.040379] env[67169]: DEBUG oslo_concurrency.lockutils [None req-1e846be1-02cb-41a5-811a-2b9f52cbbee0 tempest-InstanceActionsNegativeTestJSON-1517850368 tempest-InstanceActionsNegativeTestJSON-1517850368-project-member] Lock "6663b166-0d24-45a7-8c2c-e4e68dbe0005" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1652.082690] env[67169]: WARNING oslo_vmware.rw_handles [None req-ed9e34ee-fd50-4e03-852b-e42a37baf99a tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1652.082690] env[67169]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1652.082690] env[67169]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1652.082690] env[67169]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1652.082690] env[67169]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1652.082690] env[67169]: ERROR 
oslo_vmware.rw_handles response.begin() [ 1652.082690] env[67169]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1652.082690] env[67169]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1652.082690] env[67169]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1652.082690] env[67169]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1652.082690] env[67169]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1652.082690] env[67169]: ERROR oslo_vmware.rw_handles [ 1652.083478] env[67169]: DEBUG nova.virt.vmwareapi.images [None req-ed9e34ee-fd50-4e03-852b-e42a37baf99a tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec] Downloaded image file data 285931c9-8b83-4997-8c4d-6a79005e36ba to vmware_temp/dd9ab342-5bf2-48d5-b4f4-a551902138a4/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk on the data store datastore2 {{(pid=67169) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1652.085012] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-ed9e34ee-fd50-4e03-852b-e42a37baf99a tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec] Caching image {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1652.085261] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [None req-ed9e34ee-fd50-4e03-852b-e42a37baf99a tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Copying Virtual Disk [datastore2] vmware_temp/dd9ab342-5bf2-48d5-b4f4-a551902138a4/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk to [datastore2] 
vmware_temp/dd9ab342-5bf2-48d5-b4f4-a551902138a4/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk {{(pid=67169) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1652.085945] env[67169]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-4c0b29c0-b923-4f04-b2e7-1aac4b3d0c55 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1652.093707] env[67169]: DEBUG oslo_vmware.api [None req-ed9e34ee-fd50-4e03-852b-e42a37baf99a tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Waiting for the task: (returnval){ [ 1652.093707] env[67169]: value = "task-2819220" [ 1652.093707] env[67169]: _type = "Task" [ 1652.093707] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1652.101428] env[67169]: DEBUG oslo_vmware.api [None req-ed9e34ee-fd50-4e03-852b-e42a37baf99a tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Task: {'id': task-2819220, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1652.603899] env[67169]: DEBUG oslo_vmware.exceptions [None req-ed9e34ee-fd50-4e03-852b-e42a37baf99a tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Fault InvalidArgument not matched. 
{{(pid=67169) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1652.604202] env[67169]: DEBUG oslo_concurrency.lockutils [None req-ed9e34ee-fd50-4e03-852b-e42a37baf99a tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Releasing lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1652.604829] env[67169]: ERROR nova.compute.manager [None req-ed9e34ee-fd50-4e03-852b-e42a37baf99a tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1652.604829] env[67169]: Faults: ['InvalidArgument'] [ 1652.604829] env[67169]: ERROR nova.compute.manager [instance: 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec] Traceback (most recent call last): [ 1652.604829] env[67169]: ERROR nova.compute.manager [instance: 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1652.604829] env[67169]: ERROR nova.compute.manager [instance: 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec] yield resources [ 1652.604829] env[67169]: ERROR nova.compute.manager [instance: 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1652.604829] env[67169]: ERROR nova.compute.manager [instance: 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec] self.driver.spawn(context, instance, image_meta, [ 1652.604829] env[67169]: ERROR nova.compute.manager [instance: 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1652.604829] env[67169]: ERROR nova.compute.manager [instance: 
37d7b647-f1ab-494a-8b5a-8e25eec0b9ec] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1652.604829] env[67169]: ERROR nova.compute.manager [instance: 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1652.604829] env[67169]: ERROR nova.compute.manager [instance: 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec] self._fetch_image_if_missing(context, vi) [ 1652.604829] env[67169]: ERROR nova.compute.manager [instance: 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1652.604829] env[67169]: ERROR nova.compute.manager [instance: 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec] image_cache(vi, tmp_image_ds_loc) [ 1652.604829] env[67169]: ERROR nova.compute.manager [instance: 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1652.604829] env[67169]: ERROR nova.compute.manager [instance: 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec] vm_util.copy_virtual_disk( [ 1652.604829] env[67169]: ERROR nova.compute.manager [instance: 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1652.604829] env[67169]: ERROR nova.compute.manager [instance: 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec] session._wait_for_task(vmdk_copy_task) [ 1652.604829] env[67169]: ERROR nova.compute.manager [instance: 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1652.604829] env[67169]: ERROR nova.compute.manager [instance: 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec] return self.wait_for_task(task_ref) [ 1652.604829] env[67169]: ERROR nova.compute.manager [instance: 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1652.604829] env[67169]: ERROR nova.compute.manager 
[instance: 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec] return evt.wait() [ 1652.604829] env[67169]: ERROR nova.compute.manager [instance: 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1652.604829] env[67169]: ERROR nova.compute.manager [instance: 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec] result = hub.switch() [ 1652.604829] env[67169]: ERROR nova.compute.manager [instance: 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1652.604829] env[67169]: ERROR nova.compute.manager [instance: 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec] return self.greenlet.switch() [ 1652.604829] env[67169]: ERROR nova.compute.manager [instance: 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1652.604829] env[67169]: ERROR nova.compute.manager [instance: 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec] self.f(*self.args, **self.kw) [ 1652.604829] env[67169]: ERROR nova.compute.manager [instance: 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1652.604829] env[67169]: ERROR nova.compute.manager [instance: 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec] raise exceptions.translate_fault(task_info.error) [ 1652.604829] env[67169]: ERROR nova.compute.manager [instance: 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1652.604829] env[67169]: ERROR nova.compute.manager [instance: 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec] Faults: ['InvalidArgument'] [ 1652.604829] env[67169]: ERROR nova.compute.manager [instance: 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec] [ 1652.605853] env[67169]: INFO nova.compute.manager [None req-ed9e34ee-fd50-4e03-852b-e42a37baf99a tempest-ImagesTestJSON-1590952821 
tempest-ImagesTestJSON-1590952821-project-member] [instance: 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec] Terminating instance [ 1652.606640] env[67169]: DEBUG oslo_concurrency.lockutils [None req-32caf906-c820-4c4a-a535-1175960d4c45 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Acquired lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1652.606848] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-32caf906-c820-4c4a-a535-1175960d4c45 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1652.607101] env[67169]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-5113f505-e99c-44f7-a65a-b0043ebe601a {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1652.609404] env[67169]: DEBUG nova.compute.manager [None req-ed9e34ee-fd50-4e03-852b-e42a37baf99a tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec] Start destroying the instance on the hypervisor. 
{{(pid=67169) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1652.609595] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-ed9e34ee-fd50-4e03-852b-e42a37baf99a tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec] Destroying instance {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1652.610307] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-22145eca-4521-4a0e-a0ff-52d0731c9e53 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1652.616897] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-ed9e34ee-fd50-4e03-852b-e42a37baf99a tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec] Unregistering the VM {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1652.617847] env[67169]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-21a2e2e5-fc2b-429c-82cc-9ee14f56b99c {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1652.619186] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-32caf906-c820-4c4a-a535-1175960d4c45 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1652.619357] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-32caf906-c820-4c4a-a535-1175960d4c45 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=67169) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1652.620036] env[67169]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-0153e215-6209-4323-814d-4a3b9b0307f7 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1652.625324] env[67169]: DEBUG oslo_vmware.api [None req-32caf906-c820-4c4a-a535-1175960d4c45 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Waiting for the task: (returnval){ [ 1652.625324] env[67169]: value = "session[52518c09-e5f2-035d-4dc1-3e29ddf94015]52f5fbe6-00a2-d0e8-fe6f-552294070f8c" [ 1652.625324] env[67169]: _type = "Task" [ 1652.625324] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1652.633784] env[67169]: DEBUG oslo_vmware.api [None req-32caf906-c820-4c4a-a535-1175960d4c45 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Task: {'id': session[52518c09-e5f2-035d-4dc1-3e29ddf94015]52f5fbe6-00a2-d0e8-fe6f-552294070f8c, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1652.697212] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-ed9e34ee-fd50-4e03-852b-e42a37baf99a tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec] Unregistered the VM {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1652.697505] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-ed9e34ee-fd50-4e03-852b-e42a37baf99a tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec] Deleting contents of the VM from datastore datastore2 {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1652.697739] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-ed9e34ee-fd50-4e03-852b-e42a37baf99a tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Deleting the datastore file [datastore2] 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec {{(pid=67169) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1652.698023] env[67169]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-6b27befb-05dc-4b00-ae85-ea7f828889de {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1652.704250] env[67169]: DEBUG oslo_vmware.api [None req-ed9e34ee-fd50-4e03-852b-e42a37baf99a tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Waiting for the task: (returnval){ [ 1652.704250] env[67169]: value = "task-2819222" [ 1652.704250] env[67169]: _type = "Task" [ 1652.704250] env[67169]: } to complete. 
{{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1652.711834] env[67169]: DEBUG oslo_vmware.api [None req-ed9e34ee-fd50-4e03-852b-e42a37baf99a tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Task: {'id': task-2819222, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1653.135671] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-32caf906-c820-4c4a-a535-1175960d4c45 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] [instance: 7817b417-599c-4619-8bd3-28d2e8236b9f] Preparing fetch location {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1653.136078] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-32caf906-c820-4c4a-a535-1175960d4c45 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Creating directory with path [datastore2] vmware_temp/40fd0b20-2302-49af-9a4a-b6c4d2e52f85/285931c9-8b83-4997-8c4d-6a79005e36ba {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1653.136232] env[67169]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-4a9c7e47-0d02-4aa9-a446-d7489aea203f {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1653.147351] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-32caf906-c820-4c4a-a535-1175960d4c45 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Created directory with path [datastore2] vmware_temp/40fd0b20-2302-49af-9a4a-b6c4d2e52f85/285931c9-8b83-4997-8c4d-6a79005e36ba {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1653.147601] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-32caf906-c820-4c4a-a535-1175960d4c45 
tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] [instance: 7817b417-599c-4619-8bd3-28d2e8236b9f] Fetch image to [datastore2] vmware_temp/40fd0b20-2302-49af-9a4a-b6c4d2e52f85/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1653.147830] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-32caf906-c820-4c4a-a535-1175960d4c45 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] [instance: 7817b417-599c-4619-8bd3-28d2e8236b9f] Downloading image file data 285931c9-8b83-4997-8c4d-6a79005e36ba to [datastore2] vmware_temp/40fd0b20-2302-49af-9a4a-b6c4d2e52f85/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk on the data store datastore2 {{(pid=67169) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1653.148574] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-32f9068c-6c19-4505-99fe-b3b9cc26c12b {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1653.155121] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d98f5501-174a-408b-8ac8-4b2cdd3391ec {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1653.164015] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b2373b7d-b34a-43db-9d74-9e62bcbf542a {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1653.195476] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d622bb85-5e94-4fb4-992f-39c44f812a9e {{(pid=67169) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1653.201373] env[67169]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-2848ad52-ef06-474d-b399-8207a0fb7e78 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1653.212735] env[67169]: DEBUG oslo_vmware.api [None req-ed9e34ee-fd50-4e03-852b-e42a37baf99a tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Task: {'id': task-2819222, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.066847} completed successfully. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1653.212967] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-ed9e34ee-fd50-4e03-852b-e42a37baf99a tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Deleted the datastore file {{(pid=67169) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1653.213179] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-ed9e34ee-fd50-4e03-852b-e42a37baf99a tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec] Deleted contents of the VM from datastore datastore2 {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1653.213358] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-ed9e34ee-fd50-4e03-852b-e42a37baf99a tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec] Instance destroyed {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1653.213599] env[67169]: INFO nova.compute.manager [None req-ed9e34ee-fd50-4e03-852b-e42a37baf99a tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec] Took 0.60 
seconds to destroy the instance on the hypervisor. [ 1653.215610] env[67169]: DEBUG nova.compute.claims [None req-ed9e34ee-fd50-4e03-852b-e42a37baf99a tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec] Aborting claim: {{(pid=67169) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1653.215783] env[67169]: DEBUG oslo_concurrency.lockutils [None req-ed9e34ee-fd50-4e03-852b-e42a37baf99a tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1653.216013] env[67169]: DEBUG oslo_concurrency.lockutils [None req-ed9e34ee-fd50-4e03-852b-e42a37baf99a tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1653.224967] env[67169]: DEBUG nova.virt.vmwareapi.images [None req-32caf906-c820-4c4a-a535-1175960d4c45 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] [instance: 7817b417-599c-4619-8bd3-28d2e8236b9f] Downloading image file data 285931c9-8b83-4997-8c4d-6a79005e36ba to the data store datastore2 {{(pid=67169) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1653.274850] env[67169]: DEBUG oslo_vmware.rw_handles [None req-32caf906-c820-4c4a-a535-1175960d4c45 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = 
https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/40fd0b20-2302-49af-9a4a-b6c4d2e52f85/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=67169) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1653.336237] env[67169]: DEBUG oslo_vmware.rw_handles [None req-32caf906-c820-4c4a-a535-1175960d4c45 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Completed reading data from the image iterator. {{(pid=67169) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1653.336437] env[67169]: DEBUG oslo_vmware.rw_handles [None req-32caf906-c820-4c4a-a535-1175960d4c45 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/40fd0b20-2302-49af-9a4a-b6c4d2e52f85/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=67169) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1653.483238] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8f7a3224-5d52-41a0-afa0-f8dc855bf351 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1653.490874] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-89d50b9f-7941-49f2-974f-719545ab1cf0 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1653.519912] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dc5aa337-93ab-4420-8682-3082c69e9d07 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1653.526487] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-db5fd5bc-2a6f-400e-b532-dadc4a6dbe39 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1653.539085] env[67169]: DEBUG nova.compute.provider_tree [None req-ed9e34ee-fd50-4e03-852b-e42a37baf99a tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Inventory has not changed in ProviderTree for provider: 6570906a-ac37-4859-b1f2-4bbacc48d3f3 {{(pid=67169) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1653.547695] env[67169]: DEBUG nova.scheduler.client.report [None req-ed9e34ee-fd50-4e03-852b-e42a37baf99a tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Inventory has not changed for provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 
'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 171, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67169) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1653.561972] env[67169]: DEBUG oslo_concurrency.lockutils [None req-ed9e34ee-fd50-4e03-852b-e42a37baf99a tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.346s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1653.562543] env[67169]: ERROR nova.compute.manager [None req-ed9e34ee-fd50-4e03-852b-e42a37baf99a tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1653.562543] env[67169]: Faults: ['InvalidArgument'] [ 1653.562543] env[67169]: ERROR nova.compute.manager [instance: 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec] Traceback (most recent call last): [ 1653.562543] env[67169]: ERROR nova.compute.manager [instance: 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1653.562543] env[67169]: ERROR nova.compute.manager [instance: 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec] self.driver.spawn(context, instance, image_meta, [ 1653.562543] env[67169]: ERROR nova.compute.manager [instance: 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1653.562543] env[67169]: ERROR nova.compute.manager [instance: 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1653.562543] env[67169]: ERROR 
nova.compute.manager [instance: 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1653.562543] env[67169]: ERROR nova.compute.manager [instance: 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec] self._fetch_image_if_missing(context, vi) [ 1653.562543] env[67169]: ERROR nova.compute.manager [instance: 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1653.562543] env[67169]: ERROR nova.compute.manager [instance: 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec] image_cache(vi, tmp_image_ds_loc) [ 1653.562543] env[67169]: ERROR nova.compute.manager [instance: 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1653.562543] env[67169]: ERROR nova.compute.manager [instance: 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec] vm_util.copy_virtual_disk( [ 1653.562543] env[67169]: ERROR nova.compute.manager [instance: 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1653.562543] env[67169]: ERROR nova.compute.manager [instance: 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec] session._wait_for_task(vmdk_copy_task) [ 1653.562543] env[67169]: ERROR nova.compute.manager [instance: 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1653.562543] env[67169]: ERROR nova.compute.manager [instance: 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec] return self.wait_for_task(task_ref) [ 1653.562543] env[67169]: ERROR nova.compute.manager [instance: 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1653.562543] env[67169]: ERROR nova.compute.manager [instance: 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec] return evt.wait() [ 1653.562543] env[67169]: ERROR nova.compute.manager [instance: 
37d7b647-f1ab-494a-8b5a-8e25eec0b9ec] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1653.562543] env[67169]: ERROR nova.compute.manager [instance: 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec] result = hub.switch() [ 1653.562543] env[67169]: ERROR nova.compute.manager [instance: 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1653.562543] env[67169]: ERROR nova.compute.manager [instance: 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec] return self.greenlet.switch() [ 1653.562543] env[67169]: ERROR nova.compute.manager [instance: 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1653.562543] env[67169]: ERROR nova.compute.manager [instance: 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec] self.f(*self.args, **self.kw) [ 1653.562543] env[67169]: ERROR nova.compute.manager [instance: 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1653.562543] env[67169]: ERROR nova.compute.manager [instance: 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec] raise exceptions.translate_fault(task_info.error) [ 1653.562543] env[67169]: ERROR nova.compute.manager [instance: 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1653.562543] env[67169]: ERROR nova.compute.manager [instance: 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec] Faults: ['InvalidArgument'] [ 1653.562543] env[67169]: ERROR nova.compute.manager [instance: 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec] [ 1653.563443] env[67169]: DEBUG nova.compute.utils [None req-ed9e34ee-fd50-4e03-852b-e42a37baf99a tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec] VimFaultException {{(pid=67169) 
notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1653.564558] env[67169]: DEBUG nova.compute.manager [None req-ed9e34ee-fd50-4e03-852b-e42a37baf99a tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec] Build of instance 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec was re-scheduled: A specified parameter was not correct: fileType [ 1653.564558] env[67169]: Faults: ['InvalidArgument'] {{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1653.564932] env[67169]: DEBUG nova.compute.manager [None req-ed9e34ee-fd50-4e03-852b-e42a37baf99a tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec] Unplugging VIFs for instance {{(pid=67169) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1653.565125] env[67169]: DEBUG nova.compute.manager [None req-ed9e34ee-fd50-4e03-852b-e42a37baf99a tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67169) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1653.565301] env[67169]: DEBUG nova.compute.manager [None req-ed9e34ee-fd50-4e03-852b-e42a37baf99a tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec] Deallocating network for instance {{(pid=67169) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1653.565479] env[67169]: DEBUG nova.network.neutron [None req-ed9e34ee-fd50-4e03-852b-e42a37baf99a tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec] deallocate_for_instance() {{(pid=67169) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1653.936292] env[67169]: DEBUG nova.network.neutron [None req-ed9e34ee-fd50-4e03-852b-e42a37baf99a tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec] Updating instance_info_cache with network_info: [] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1653.949426] env[67169]: INFO nova.compute.manager [None req-ed9e34ee-fd50-4e03-852b-e42a37baf99a tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec] Took 0.38 seconds to deallocate network for instance. 
[ 1654.059532] env[67169]: INFO nova.scheduler.client.report [None req-ed9e34ee-fd50-4e03-852b-e42a37baf99a tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Deleted allocations for instance 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec [ 1654.081370] env[67169]: DEBUG oslo_concurrency.lockutils [None req-ed9e34ee-fd50-4e03-852b-e42a37baf99a tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Lock "37d7b647-f1ab-494a-8b5a-8e25eec0b9ec" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 629.388s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1654.082604] env[67169]: DEBUG oslo_concurrency.lockutils [None req-192fb336-cf94-4f7d-b677-3d97ac548bab tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Lock "37d7b647-f1ab-494a-8b5a-8e25eec0b9ec" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 433.065s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1654.082890] env[67169]: DEBUG oslo_concurrency.lockutils [None req-192fb336-cf94-4f7d-b677-3d97ac548bab tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Acquiring lock "37d7b647-f1ab-494a-8b5a-8e25eec0b9ec-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1654.083113] env[67169]: DEBUG oslo_concurrency.lockutils [None req-192fb336-cf94-4f7d-b677-3d97ac548bab tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Lock "37d7b647-f1ab-494a-8b5a-8e25eec0b9ec-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67169) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1654.083291] env[67169]: DEBUG oslo_concurrency.lockutils [None req-192fb336-cf94-4f7d-b677-3d97ac548bab tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Lock "37d7b647-f1ab-494a-8b5a-8e25eec0b9ec-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1654.085236] env[67169]: INFO nova.compute.manager [None req-192fb336-cf94-4f7d-b677-3d97ac548bab tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec] Terminating instance [ 1654.087521] env[67169]: DEBUG nova.compute.manager [None req-192fb336-cf94-4f7d-b677-3d97ac548bab tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec] Start destroying the instance on the hypervisor. 
{{(pid=67169) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1654.087521] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-192fb336-cf94-4f7d-b677-3d97ac548bab tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec] Destroying instance {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1654.087903] env[67169]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-61a89413-d7fd-4dc4-b2d9-721fb7fc914f {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1654.097580] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4e6dfa97-51ae-45e4-ad87-55ece5e1bd2a {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1654.108012] env[67169]: DEBUG nova.compute.manager [None req-63dec5b3-3efd-41a9-be63-b4fd3bdaef13 tempest-AttachVolumeNegativeTest-2045904794 tempest-AttachVolumeNegativeTest-2045904794-project-member] [instance: 4e978c21-ae48-422e-9126-a4144c86b86f] Starting instance... {{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1654.130381] env[67169]: WARNING nova.virt.vmwareapi.vmops [None req-192fb336-cf94-4f7d-b677-3d97ac548bab tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec could not be found. 
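The WARNING above shows that a VM already gone from the backend does not fail the terminate path: `InstanceNotFound` is downgraded to a warning and the destroy continues. A small sketch of that tolerance, under assumed names (the callbacks and `destroy_instance` are illustrative, not `nova.virt.vmwareapi.vmops`):

```python
# Hypothetical sketch: destroying an instance treats an already-missing
# backend VM as success, matching the "Instance does not exist on backend"
# WARNING followed by "Instance destroyed" in the log above.

class InstanceNotFound(Exception):
    """Raised when the VM no longer exists on the hypervisor."""

def destroy_instance(find_vm, destroy_vm, warn):
    """Destroy a VM; an already-missing VM still counts as destroyed."""
    try:
        destroy_vm(find_vm())
    except InstanceNotFound as exc:
        # Downgrade to a warning rather than failing the terminate flow.
        warn(f"Instance does not exist on backend: {exc}")
    return "destroyed"
```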
[ 1654.130589] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-192fb336-cf94-4f7d-b677-3d97ac548bab tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec] Instance destroyed {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1654.130769] env[67169]: INFO nova.compute.manager [None req-192fb336-cf94-4f7d-b677-3d97ac548bab tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1654.131014] env[67169]: DEBUG oslo.service.loopingcall [None req-192fb336-cf94-4f7d-b677-3d97ac548bab tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=67169) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1654.131251] env[67169]: DEBUG nova.compute.manager [-] [instance: 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec] Deallocating network for instance {{(pid=67169) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1654.131348] env[67169]: DEBUG nova.network.neutron [-] [instance: 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec] deallocate_for_instance() {{(pid=67169) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1654.134635] env[67169]: DEBUG nova.compute.manager [None req-63dec5b3-3efd-41a9-be63-b4fd3bdaef13 tempest-AttachVolumeNegativeTest-2045904794 tempest-AttachVolumeNegativeTest-2045904794-project-member] [instance: 4e978c21-ae48-422e-9126-a4144c86b86f] Instance disappeared before build. 
{{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1654.158188] env[67169]: DEBUG oslo_concurrency.lockutils [None req-63dec5b3-3efd-41a9-be63-b4fd3bdaef13 tempest-AttachVolumeNegativeTest-2045904794 tempest-AttachVolumeNegativeTest-2045904794-project-member] Lock "4e978c21-ae48-422e-9126-a4144c86b86f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 202.939s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1654.161633] env[67169]: DEBUG nova.network.neutron [-] [instance: 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec] Updating instance_info_cache with network_info: [] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1654.168160] env[67169]: DEBUG nova.compute.manager [None req-1249c8a0-343c-4729-8066-ecc802e7f711 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] [instance: aedbfde6-26e1-410d-a311-e2c344f65062] Starting instance... {{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1654.170936] env[67169]: INFO nova.compute.manager [-] [instance: 37d7b647-f1ab-494a-8b5a-8e25eec0b9ec] Took 0.04 seconds to deallocate network for instance. 
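The `oslo_concurrency.lockutils` DEBUG lines throughout this log report two durations per lock: how long the caller waited to acquire it and how long it was held. A toy wrapper reproducing that bookkeeping (purely illustrative; the real implementation lives in `oslo_concurrency/lockutils.py`) could be:

```python
# Toy version of the waited/held accounting in the lockutils DEBUG lines:
# measure time spent blocked on acquire, then time spent inside the
# critical section, and report both.

import threading
import time

class TimedLock:
    def __init__(self, name):
        self._lock = threading.Lock()
        self.name = name

    def run(self, fn, caller):
        """Run fn under the lock, logging waited/held times like lockutils."""
        t0 = time.monotonic()
        with self._lock:
            waited = time.monotonic() - t0   # time blocked on acquire
            print(f'Lock "{self.name}" acquired by "{caller}" '
                  f':: waited {waited:.3f}s')
            t1 = time.monotonic()
            result = fn()
            held = time.monotonic() - t1     # time inside the critical section
        print(f'Lock "{self.name}" "released" by "{caller}" '
              f':: held {held:.3f}s')
        return result
```

With a single uncontended caller the waited time is near zero, which is why most entries above show `waited 0.000s` while held times vary.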
[ 1654.217119] env[67169]: DEBUG oslo_concurrency.lockutils [None req-1249c8a0-343c-4729-8066-ecc802e7f711 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1654.217373] env[67169]: DEBUG oslo_concurrency.lockutils [None req-1249c8a0-343c-4729-8066-ecc802e7f711 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1654.218762] env[67169]: INFO nova.compute.claims [None req-1249c8a0-343c-4729-8066-ecc802e7f711 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] [instance: aedbfde6-26e1-410d-a311-e2c344f65062] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1654.264633] env[67169]: DEBUG oslo_concurrency.lockutils [None req-192fb336-cf94-4f7d-b677-3d97ac548bab tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Lock "37d7b647-f1ab-494a-8b5a-8e25eec0b9ec" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.182s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1654.421173] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-df4d81f9-09c1-4222-9c18-783f25c28ed8 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1654.428751] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-b141024b-7c2f-423c-90fc-d23922ac9002 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1654.458872] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1190b312-fe6f-4558-86f4-7bc7f044f21e {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1654.465803] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-84e937ae-1418-41fa-8009-f5895aea595a {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1654.478513] env[67169]: DEBUG nova.compute.provider_tree [None req-1249c8a0-343c-4729-8066-ecc802e7f711 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Inventory has not changed in ProviderTree for provider: 6570906a-ac37-4859-b1f2-4bbacc48d3f3 {{(pid=67169) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1654.486903] env[67169]: DEBUG nova.scheduler.client.report [None req-1249c8a0-343c-4729-8066-ecc802e7f711 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Inventory has not changed for provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 171, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67169) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1654.502110] env[67169]: DEBUG oslo_concurrency.lockutils [None req-1249c8a0-343c-4729-8066-ecc802e7f711 tempest-MultipleCreateTestJSON-80480106 
tempest-MultipleCreateTestJSON-80480106-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.285s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1654.502564] env[67169]: DEBUG nova.compute.manager [None req-1249c8a0-343c-4729-8066-ecc802e7f711 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] [instance: aedbfde6-26e1-410d-a311-e2c344f65062] Start building networks asynchronously for instance. {{(pid=67169) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1654.533409] env[67169]: DEBUG nova.compute.utils [None req-1249c8a0-343c-4729-8066-ecc802e7f711 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Using /dev/sd instead of None {{(pid=67169) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1654.534866] env[67169]: DEBUG nova.compute.manager [None req-1249c8a0-343c-4729-8066-ecc802e7f711 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] [instance: aedbfde6-26e1-410d-a311-e2c344f65062] Allocating IP information in the background. {{(pid=67169) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1654.535130] env[67169]: DEBUG nova.network.neutron [None req-1249c8a0-343c-4729-8066-ecc802e7f711 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] [instance: aedbfde6-26e1-410d-a311-e2c344f65062] allocate_for_instance() {{(pid=67169) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1654.545215] env[67169]: DEBUG nova.compute.manager [None req-1249c8a0-343c-4729-8066-ecc802e7f711 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] [instance: aedbfde6-26e1-410d-a311-e2c344f65062] Start building block device mappings for instance. 
{{(pid=67169) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1654.606438] env[67169]: DEBUG nova.policy [None req-1249c8a0-343c-4729-8066-ecc802e7f711 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1cd77c2a5f07460da364f0ec256c5f1f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '37492586ebba45c7893955c459766b5d', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67169) authorize /opt/stack/nova/nova/policy.py:203}} [ 1654.617351] env[67169]: DEBUG nova.compute.manager [None req-1249c8a0-343c-4729-8066-ecc802e7f711 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] [instance: aedbfde6-26e1-410d-a311-e2c344f65062] Start spawning the instance on the hypervisor. 
{{(pid=67169) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1654.645451] env[67169]: DEBUG nova.virt.hardware [None req-1249c8a0-343c-4729-8066-ecc802e7f711 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-03-06T19:55:10Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-03-06T19:54:55Z,direct_url=,disk_format='vmdk',id=285931c9-8b83-4997-8c4d-6a79005e36ba,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c31f6504bb73492890b262ff43fdf9bc',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-03-06T19:54:55Z,virtual_size=,visibility=), allow threads: False {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1654.645812] env[67169]: DEBUG nova.virt.hardware [None req-1249c8a0-343c-4729-8066-ecc802e7f711 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Flavor limits 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1654.646075] env[67169]: DEBUG nova.virt.hardware [None req-1249c8a0-343c-4729-8066-ecc802e7f711 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Image limits 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1654.646343] env[67169]: DEBUG nova.virt.hardware [None req-1249c8a0-343c-4729-8066-ecc802e7f711 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Flavor pref 0:0:0 {{(pid=67169) 
get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1654.646555] env[67169]: DEBUG nova.virt.hardware [None req-1249c8a0-343c-4729-8066-ecc802e7f711 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Image pref 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1654.646770] env[67169]: DEBUG nova.virt.hardware [None req-1249c8a0-343c-4729-8066-ecc802e7f711 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1654.647070] env[67169]: DEBUG nova.virt.hardware [None req-1249c8a0-343c-4729-8066-ecc802e7f711 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1654.647302] env[67169]: DEBUG nova.virt.hardware [None req-1249c8a0-343c-4729-8066-ecc802e7f711 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67169) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1654.647579] env[67169]: DEBUG nova.virt.hardware [None req-1249c8a0-343c-4729-8066-ecc802e7f711 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Got 1 possible topologies {{(pid=67169) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1654.647817] env[67169]: DEBUG nova.virt.hardware [None req-1249c8a0-343c-4729-8066-ecc802e7f711 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Possible topologies 
[VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1654.648089] env[67169]: DEBUG nova.virt.hardware [None req-1249c8a0-343c-4729-8066-ecc802e7f711 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1654.649348] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-329896fa-7c82-45a4-963e-1f14f08dd285 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1654.660875] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3103452d-cbc3-4490-94e6-51e1cf14021a {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1655.024812] env[67169]: DEBUG nova.network.neutron [None req-1249c8a0-343c-4729-8066-ecc802e7f711 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] [instance: aedbfde6-26e1-410d-a311-e2c344f65062] Successfully created port: 336c8a2d-ede1-4b7e-93f7-9e9d161be209 {{(pid=67169) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1655.994170] env[67169]: DEBUG nova.compute.manager [req-1c487591-bf67-4574-be22-8ef8a37d8a61 req-e055b7d5-0874-4b6b-aa9b-29cfe11878a3 service nova] [instance: aedbfde6-26e1-410d-a311-e2c344f65062] Received event network-vif-plugged-336c8a2d-ede1-4b7e-93f7-9e9d161be209 {{(pid=67169) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1655.994485] env[67169]: DEBUG oslo_concurrency.lockutils [req-1c487591-bf67-4574-be22-8ef8a37d8a61 req-e055b7d5-0874-4b6b-aa9b-29cfe11878a3 service nova] Acquiring lock 
"aedbfde6-26e1-410d-a311-e2c344f65062-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1655.994714] env[67169]: DEBUG oslo_concurrency.lockutils [req-1c487591-bf67-4574-be22-8ef8a37d8a61 req-e055b7d5-0874-4b6b-aa9b-29cfe11878a3 service nova] Lock "aedbfde6-26e1-410d-a311-e2c344f65062-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1655.994925] env[67169]: DEBUG oslo_concurrency.lockutils [req-1c487591-bf67-4574-be22-8ef8a37d8a61 req-e055b7d5-0874-4b6b-aa9b-29cfe11878a3 service nova] Lock "aedbfde6-26e1-410d-a311-e2c344f65062-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1655.995167] env[67169]: DEBUG nova.compute.manager [req-1c487591-bf67-4574-be22-8ef8a37d8a61 req-e055b7d5-0874-4b6b-aa9b-29cfe11878a3 service nova] [instance: aedbfde6-26e1-410d-a311-e2c344f65062] No waiting events found dispatching network-vif-plugged-336c8a2d-ede1-4b7e-93f7-9e9d161be209 {{(pid=67169) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1655.995272] env[67169]: WARNING nova.compute.manager [req-1c487591-bf67-4574-be22-8ef8a37d8a61 req-e055b7d5-0874-4b6b-aa9b-29cfe11878a3 service nova] [instance: aedbfde6-26e1-410d-a311-e2c344f65062] Received unexpected event network-vif-plugged-336c8a2d-ede1-4b7e-93f7-9e9d161be209 for instance with vm_state building and task_state spawning. 
[ 1656.154725] env[67169]: DEBUG nova.network.neutron [None req-1249c8a0-343c-4729-8066-ecc802e7f711 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] [instance: aedbfde6-26e1-410d-a311-e2c344f65062] Successfully updated port: 336c8a2d-ede1-4b7e-93f7-9e9d161be209 {{(pid=67169) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1656.169403] env[67169]: DEBUG oslo_concurrency.lockutils [None req-1249c8a0-343c-4729-8066-ecc802e7f711 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Acquiring lock "refresh_cache-aedbfde6-26e1-410d-a311-e2c344f65062" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1656.169557] env[67169]: DEBUG oslo_concurrency.lockutils [None req-1249c8a0-343c-4729-8066-ecc802e7f711 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Acquired lock "refresh_cache-aedbfde6-26e1-410d-a311-e2c344f65062" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1656.169708] env[67169]: DEBUG nova.network.neutron [None req-1249c8a0-343c-4729-8066-ecc802e7f711 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] [instance: aedbfde6-26e1-410d-a311-e2c344f65062] Building network info cache for instance {{(pid=67169) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1656.234121] env[67169]: DEBUG nova.network.neutron [None req-1249c8a0-343c-4729-8066-ecc802e7f711 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] [instance: aedbfde6-26e1-410d-a311-e2c344f65062] Instance cache missing network info. 
{{(pid=67169) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1656.402138] env[67169]: DEBUG nova.network.neutron [None req-1249c8a0-343c-4729-8066-ecc802e7f711 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] [instance: aedbfde6-26e1-410d-a311-e2c344f65062] Updating instance_info_cache with network_info: [{"id": "336c8a2d-ede1-4b7e-93f7-9e9d161be209", "address": "fa:16:3e:23:6e:ab", "network": {"id": "b61be2af-391d-401b-8e5f-b343a30fd98f", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-771864181-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "37492586ebba45c7893955c459766b5d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "56834f67-27a8-43dc-bbc6-a74aaa08959b", "external-id": "nsx-vlan-transportzone-949", "segmentation_id": 949, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap336c8a2d-ed", "ovs_interfaceid": "336c8a2d-ede1-4b7e-93f7-9e9d161be209", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1656.414637] env[67169]: DEBUG oslo_concurrency.lockutils [None req-1249c8a0-343c-4729-8066-ecc802e7f711 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Releasing lock "refresh_cache-aedbfde6-26e1-410d-a311-e2c344f65062" {{(pid=67169) lock 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1656.414921] env[67169]: DEBUG nova.compute.manager [None req-1249c8a0-343c-4729-8066-ecc802e7f711 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] [instance: aedbfde6-26e1-410d-a311-e2c344f65062] Instance network_info: |[{"id": "336c8a2d-ede1-4b7e-93f7-9e9d161be209", "address": "fa:16:3e:23:6e:ab", "network": {"id": "b61be2af-391d-401b-8e5f-b343a30fd98f", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-771864181-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "37492586ebba45c7893955c459766b5d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "56834f67-27a8-43dc-bbc6-a74aaa08959b", "external-id": "nsx-vlan-transportzone-949", "segmentation_id": 949, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap336c8a2d-ed", "ovs_interfaceid": "336c8a2d-ede1-4b7e-93f7-9e9d161be209", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67169) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1656.415321] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-1249c8a0-343c-4729-8066-ecc802e7f711 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] [instance: aedbfde6-26e1-410d-a311-e2c344f65062] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:23:6e:ab', 'network_ref': {'type': 'OpaqueNetwork', 
'network-id': '56834f67-27a8-43dc-bbc6-a74aaa08959b', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '336c8a2d-ede1-4b7e-93f7-9e9d161be209', 'vif_model': 'vmxnet3'}] {{(pid=67169) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1656.423113] env[67169]: DEBUG oslo.service.loopingcall [None req-1249c8a0-343c-4729-8066-ecc802e7f711 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67169) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1656.423601] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: aedbfde6-26e1-410d-a311-e2c344f65062] Creating VM on the ESX host {{(pid=67169) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1656.423848] env[67169]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-fca498ca-93dc-4b49-b684-c47d4ff448ba {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1656.443784] env[67169]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1656.443784] env[67169]: value = "task-2819223" [ 1656.443784] env[67169]: _type = "Task" [ 1656.443784] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1656.451030] env[67169]: DEBUG oslo_vmware.api [-] Task: {'id': task-2819223, 'name': CreateVM_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1656.954737] env[67169]: DEBUG oslo_vmware.api [-] Task: {'id': task-2819223, 'name': CreateVM_Task, 'duration_secs': 0.287942} completed successfully. 
{{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1656.954921] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: aedbfde6-26e1-410d-a311-e2c344f65062] Created VM on the ESX host {{(pid=67169) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1656.955619] env[67169]: DEBUG oslo_concurrency.lockutils [None req-1249c8a0-343c-4729-8066-ecc802e7f711 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1656.955796] env[67169]: DEBUG oslo_concurrency.lockutils [None req-1249c8a0-343c-4729-8066-ecc802e7f711 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Acquired lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1656.956115] env[67169]: DEBUG oslo_concurrency.lockutils [None req-1249c8a0-343c-4729-8066-ecc802e7f711 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1656.956361] env[67169]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-322f3417-8f4c-4fb1-896e-fcda90be131a {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1656.960678] env[67169]: DEBUG oslo_vmware.api [None req-1249c8a0-343c-4729-8066-ecc802e7f711 tempest-MultipleCreateTestJSON-80480106 
tempest-MultipleCreateTestJSON-80480106-project-member] Waiting for the task: (returnval){ [ 1656.960678] env[67169]: value = "session[52518c09-e5f2-035d-4dc1-3e29ddf94015]5223f2cd-7fcf-75d3-76b3-5c043f9c0604" [ 1656.960678] env[67169]: _type = "Task" [ 1656.960678] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1656.970101] env[67169]: DEBUG oslo_vmware.api [None req-1249c8a0-343c-4729-8066-ecc802e7f711 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Task: {'id': session[52518c09-e5f2-035d-4dc1-3e29ddf94015]5223f2cd-7fcf-75d3-76b3-5c043f9c0604, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1657.472102] env[67169]: DEBUG oslo_concurrency.lockutils [None req-1249c8a0-343c-4729-8066-ecc802e7f711 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Releasing lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1657.472102] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-1249c8a0-343c-4729-8066-ecc802e7f711 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] [instance: aedbfde6-26e1-410d-a311-e2c344f65062] Processing image 285931c9-8b83-4997-8c4d-6a79005e36ba {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1657.472102] env[67169]: DEBUG oslo_concurrency.lockutils [None req-1249c8a0-343c-4729-8066-ecc802e7f711 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk" {{(pid=67169) lock 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1658.019367] env[67169]: DEBUG nova.compute.manager [req-9b3290ca-87e8-4ea8-9809-75ab7b4a4bae req-f225ee5c-6fe3-47e1-9c59-0076f8bd27e8 service nova] [instance: aedbfde6-26e1-410d-a311-e2c344f65062] Received event network-changed-336c8a2d-ede1-4b7e-93f7-9e9d161be209 {{(pid=67169) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1658.019588] env[67169]: DEBUG nova.compute.manager [req-9b3290ca-87e8-4ea8-9809-75ab7b4a4bae req-f225ee5c-6fe3-47e1-9c59-0076f8bd27e8 service nova] [instance: aedbfde6-26e1-410d-a311-e2c344f65062] Refreshing instance network info cache due to event network-changed-336c8a2d-ede1-4b7e-93f7-9e9d161be209. {{(pid=67169) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1658.019804] env[67169]: DEBUG oslo_concurrency.lockutils [req-9b3290ca-87e8-4ea8-9809-75ab7b4a4bae req-f225ee5c-6fe3-47e1-9c59-0076f8bd27e8 service nova] Acquiring lock "refresh_cache-aedbfde6-26e1-410d-a311-e2c344f65062" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1658.019950] env[67169]: DEBUG oslo_concurrency.lockutils [req-9b3290ca-87e8-4ea8-9809-75ab7b4a4bae req-f225ee5c-6fe3-47e1-9c59-0076f8bd27e8 service nova] Acquired lock "refresh_cache-aedbfde6-26e1-410d-a311-e2c344f65062" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1658.020533] env[67169]: DEBUG nova.network.neutron [req-9b3290ca-87e8-4ea8-9809-75ab7b4a4bae req-f225ee5c-6fe3-47e1-9c59-0076f8bd27e8 service nova] [instance: aedbfde6-26e1-410d-a311-e2c344f65062] Refreshing network info cache for port 336c8a2d-ede1-4b7e-93f7-9e9d161be209 {{(pid=67169) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1658.314352] env[67169]: DEBUG nova.network.neutron [req-9b3290ca-87e8-4ea8-9809-75ab7b4a4bae req-f225ee5c-6fe3-47e1-9c59-0076f8bd27e8 service 
nova] [instance: aedbfde6-26e1-410d-a311-e2c344f65062] Updated VIF entry in instance network info cache for port 336c8a2d-ede1-4b7e-93f7-9e9d161be209. {{(pid=67169) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1658.314713] env[67169]: DEBUG nova.network.neutron [req-9b3290ca-87e8-4ea8-9809-75ab7b4a4bae req-f225ee5c-6fe3-47e1-9c59-0076f8bd27e8 service nova] [instance: aedbfde6-26e1-410d-a311-e2c344f65062] Updating instance_info_cache with network_info: [{"id": "336c8a2d-ede1-4b7e-93f7-9e9d161be209", "address": "fa:16:3e:23:6e:ab", "network": {"id": "b61be2af-391d-401b-8e5f-b343a30fd98f", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-771864181-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "37492586ebba45c7893955c459766b5d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "56834f67-27a8-43dc-bbc6-a74aaa08959b", "external-id": "nsx-vlan-transportzone-949", "segmentation_id": 949, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap336c8a2d-ed", "ovs_interfaceid": "336c8a2d-ede1-4b7e-93f7-9e9d161be209", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1658.323893] env[67169]: DEBUG oslo_concurrency.lockutils [req-9b3290ca-87e8-4ea8-9809-75ab7b4a4bae req-f225ee5c-6fe3-47e1-9c59-0076f8bd27e8 service nova] Releasing lock "refresh_cache-aedbfde6-26e1-410d-a311-e2c344f65062" 
{{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1667.158615] env[67169]: DEBUG oslo_concurrency.lockutils [None req-0ae0cbde-3013-42f9-bb62-e35148824698 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Acquiring lock "aedbfde6-26e1-410d-a311-e2c344f65062" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1674.661847] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1677.660519] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1677.660908] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Starting heal instance info cache {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1677.660908] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Rebuilding the list of instances to heal {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1677.682611] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 7817b417-599c-4619-8bd3-28d2e8236b9f] Skipping network cache update for instance because it is Building. 
{{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1677.682827] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 883a792f-ae72-4475-8592-3076c2c2c2ae] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1677.682959] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 48376572-9e3a-4579-b2d7-b8b63312fab1] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1677.683143] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 74ea66f0-391c-437b-8aee-f784528d7963] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1677.683312] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1677.683470] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 2e156908-c313-4229-840d-13ed8e6d4074] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1677.683627] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 2d7d3386-9854-4bf1-a680-5aed0a2329cb] Skipping network cache update for instance because it is Building. 
{{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1677.683783] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: fa24a4a8-895c-4ea6-8e0a-4ed1134beff0] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1677.683938] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 04d3ae51-f3f1-427b-ae45-279b02e4b3e6] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1677.684112] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: aedbfde6-26e1-410d-a311-e2c344f65062] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1677.684266] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Didn't find any instances for network info cache update. 
{{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1679.658623] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1679.658997] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1679.659087] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=67169) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1680.653564] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1680.658237] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1682.658600] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1682.658989] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task 
ComputeManager.update_available_resource {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1682.670867] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1682.671099] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1682.671273] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1682.671430] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67169) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1682.672547] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-15027912-ef2f-46bf-8b60-92f960a3c95f {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1682.681184] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2296f922-2aa3-4508-a690-17dca970c9d3 {{(pid=67169) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1682.694663] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4875f8ed-0144-437d-b91e-e73cb9436ad3 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1682.700735] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-84205a1e-3116-4f60-90ad-ca5d5a4c25e6 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1682.729191] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181046MB free_disk=171GB free_vcpus=48 pci_devices=None {{(pid=67169) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1682.729337] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1682.729528] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1682.800631] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 7817b417-599c-4619-8bd3-28d2e8236b9f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1682.800790] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 883a792f-ae72-4475-8592-3076c2c2c2ae actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1682.800922] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 48376572-9e3a-4579-b2d7-b8b63312fab1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1682.801059] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 74ea66f0-391c-437b-8aee-f784528d7963 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1682.801186] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 7b7c8f84-c2d4-442e-93d3-60124767d096 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1682.801303] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 2e156908-c313-4229-840d-13ed8e6d4074 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1682.801418] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 2d7d3386-9854-4bf1-a680-5aed0a2329cb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1682.801533] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance fa24a4a8-895c-4ea6-8e0a-4ed1134beff0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1682.801645] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 04d3ae51-f3f1-427b-ae45-279b02e4b3e6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1682.801757] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance aedbfde6-26e1-410d-a311-e2c344f65062 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1682.812248] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance c05c3ec2-a68d-41b0-a199-fcfc84bb2deb has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1682.822074] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 9435574d-2128-4b20-ba92-ee2aba37d33b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1682.831186] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 6663b166-0d24-45a7-8c2c-e4e68dbe0005 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1682.831404] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67169) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1682.831551] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67169) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1682.973400] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-662a1f0c-652a-41b3-9e37-1e4289c81f1b {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1682.981051] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9c309896-4063-462e-9dcc-0957b67dff25 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1683.010187] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-852d1f2f-826b-477f-93fd-c752a4830dfa {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1683.017233] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d9dd3d0d-78dd-40ef-b3da-12954f945b5d {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1683.030223] env[67169]: DEBUG nova.compute.provider_tree [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Inventory has 
not changed in ProviderTree for provider: 6570906a-ac37-4859-b1f2-4bbacc48d3f3 {{(pid=67169) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1683.038631] env[67169]: DEBUG nova.scheduler.client.report [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Inventory has not changed for provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 171, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67169) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1683.054831] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67169) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1683.055241] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.325s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1687.056075] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1699.420884] env[67169]: WARNING oslo_vmware.rw_handles [None req-32caf906-c820-4c4a-a535-1175960d4c45 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Error 
occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1699.420884] env[67169]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1699.420884] env[67169]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1699.420884] env[67169]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1699.420884] env[67169]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1699.420884] env[67169]: ERROR oslo_vmware.rw_handles response.begin() [ 1699.420884] env[67169]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1699.420884] env[67169]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1699.420884] env[67169]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1699.420884] env[67169]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1699.420884] env[67169]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1699.420884] env[67169]: ERROR oslo_vmware.rw_handles [ 1699.421641] env[67169]: DEBUG nova.virt.vmwareapi.images [None req-32caf906-c820-4c4a-a535-1175960d4c45 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] [instance: 7817b417-599c-4619-8bd3-28d2e8236b9f] Downloaded image file data 285931c9-8b83-4997-8c4d-6a79005e36ba to vmware_temp/40fd0b20-2302-49af-9a4a-b6c4d2e52f85/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk on the data store datastore2 {{(pid=67169) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1699.423553] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-32caf906-c820-4c4a-a535-1175960d4c45 tempest-MultipleCreateTestJSON-80480106 
tempest-MultipleCreateTestJSON-80480106-project-member] [instance: 7817b417-599c-4619-8bd3-28d2e8236b9f] Caching image {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1699.423813] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [None req-32caf906-c820-4c4a-a535-1175960d4c45 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Copying Virtual Disk [datastore2] vmware_temp/40fd0b20-2302-49af-9a4a-b6c4d2e52f85/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk to [datastore2] vmware_temp/40fd0b20-2302-49af-9a4a-b6c4d2e52f85/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk {{(pid=67169) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1699.424539] env[67169]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-69847aac-fc79-4587-a0e8-1f1f9048ff13 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1699.432974] env[67169]: DEBUG oslo_vmware.api [None req-32caf906-c820-4c4a-a535-1175960d4c45 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Waiting for the task: (returnval){ [ 1699.432974] env[67169]: value = "task-2819224" [ 1699.432974] env[67169]: _type = "Task" [ 1699.432974] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1699.441950] env[67169]: DEBUG oslo_vmware.api [None req-32caf906-c820-4c4a-a535-1175960d4c45 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Task: {'id': task-2819224, 'name': CopyVirtualDisk_Task} progress is 0%. 
{{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1699.943976] env[67169]: DEBUG oslo_vmware.exceptions [None req-32caf906-c820-4c4a-a535-1175960d4c45 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Fault InvalidArgument not matched. {{(pid=67169) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1699.944275] env[67169]: DEBUG oslo_concurrency.lockutils [None req-32caf906-c820-4c4a-a535-1175960d4c45 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Releasing lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1699.944846] env[67169]: ERROR nova.compute.manager [None req-32caf906-c820-4c4a-a535-1175960d4c45 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] [instance: 7817b417-599c-4619-8bd3-28d2e8236b9f] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1699.944846] env[67169]: Faults: ['InvalidArgument'] [ 1699.944846] env[67169]: ERROR nova.compute.manager [instance: 7817b417-599c-4619-8bd3-28d2e8236b9f] Traceback (most recent call last): [ 1699.944846] env[67169]: ERROR nova.compute.manager [instance: 7817b417-599c-4619-8bd3-28d2e8236b9f] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1699.944846] env[67169]: ERROR nova.compute.manager [instance: 7817b417-599c-4619-8bd3-28d2e8236b9f] yield resources [ 1699.944846] env[67169]: ERROR nova.compute.manager [instance: 7817b417-599c-4619-8bd3-28d2e8236b9f] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1699.944846] env[67169]: ERROR nova.compute.manager [instance: 
7817b417-599c-4619-8bd3-28d2e8236b9f] self.driver.spawn(context, instance, image_meta, [ 1699.944846] env[67169]: ERROR nova.compute.manager [instance: 7817b417-599c-4619-8bd3-28d2e8236b9f] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1699.944846] env[67169]: ERROR nova.compute.manager [instance: 7817b417-599c-4619-8bd3-28d2e8236b9f] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1699.944846] env[67169]: ERROR nova.compute.manager [instance: 7817b417-599c-4619-8bd3-28d2e8236b9f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1699.944846] env[67169]: ERROR nova.compute.manager [instance: 7817b417-599c-4619-8bd3-28d2e8236b9f] self._fetch_image_if_missing(context, vi) [ 1699.944846] env[67169]: ERROR nova.compute.manager [instance: 7817b417-599c-4619-8bd3-28d2e8236b9f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1699.944846] env[67169]: ERROR nova.compute.manager [instance: 7817b417-599c-4619-8bd3-28d2e8236b9f] image_cache(vi, tmp_image_ds_loc) [ 1699.944846] env[67169]: ERROR nova.compute.manager [instance: 7817b417-599c-4619-8bd3-28d2e8236b9f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1699.944846] env[67169]: ERROR nova.compute.manager [instance: 7817b417-599c-4619-8bd3-28d2e8236b9f] vm_util.copy_virtual_disk( [ 1699.944846] env[67169]: ERROR nova.compute.manager [instance: 7817b417-599c-4619-8bd3-28d2e8236b9f] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1699.944846] env[67169]: ERROR nova.compute.manager [instance: 7817b417-599c-4619-8bd3-28d2e8236b9f] session._wait_for_task(vmdk_copy_task) [ 1699.944846] env[67169]: ERROR nova.compute.manager [instance: 7817b417-599c-4619-8bd3-28d2e8236b9f] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1699.944846] env[67169]: ERROR nova.compute.manager [instance: 
7817b417-599c-4619-8bd3-28d2e8236b9f] return self.wait_for_task(task_ref) [ 1699.944846] env[67169]: ERROR nova.compute.manager [instance: 7817b417-599c-4619-8bd3-28d2e8236b9f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1699.944846] env[67169]: ERROR nova.compute.manager [instance: 7817b417-599c-4619-8bd3-28d2e8236b9f] return evt.wait() [ 1699.944846] env[67169]: ERROR nova.compute.manager [instance: 7817b417-599c-4619-8bd3-28d2e8236b9f] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1699.944846] env[67169]: ERROR nova.compute.manager [instance: 7817b417-599c-4619-8bd3-28d2e8236b9f] result = hub.switch() [ 1699.944846] env[67169]: ERROR nova.compute.manager [instance: 7817b417-599c-4619-8bd3-28d2e8236b9f] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1699.944846] env[67169]: ERROR nova.compute.manager [instance: 7817b417-599c-4619-8bd3-28d2e8236b9f] return self.greenlet.switch() [ 1699.944846] env[67169]: ERROR nova.compute.manager [instance: 7817b417-599c-4619-8bd3-28d2e8236b9f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1699.944846] env[67169]: ERROR nova.compute.manager [instance: 7817b417-599c-4619-8bd3-28d2e8236b9f] self.f(*self.args, **self.kw) [ 1699.944846] env[67169]: ERROR nova.compute.manager [instance: 7817b417-599c-4619-8bd3-28d2e8236b9f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1699.944846] env[67169]: ERROR nova.compute.manager [instance: 7817b417-599c-4619-8bd3-28d2e8236b9f] raise exceptions.translate_fault(task_info.error) [ 1699.944846] env[67169]: ERROR nova.compute.manager [instance: 7817b417-599c-4619-8bd3-28d2e8236b9f] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1699.944846] env[67169]: ERROR 
nova.compute.manager [instance: 7817b417-599c-4619-8bd3-28d2e8236b9f] Faults: ['InvalidArgument'] [ 1699.944846] env[67169]: ERROR nova.compute.manager [instance: 7817b417-599c-4619-8bd3-28d2e8236b9f] [ 1699.946014] env[67169]: INFO nova.compute.manager [None req-32caf906-c820-4c4a-a535-1175960d4c45 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] [instance: 7817b417-599c-4619-8bd3-28d2e8236b9f] Terminating instance [ 1699.946768] env[67169]: DEBUG oslo_concurrency.lockutils [None req-4835c22e-883d-43d2-82df-7b3fc294d6b4 tempest-ImagesOneServerNegativeTestJSON-20346663 tempest-ImagesOneServerNegativeTestJSON-20346663-project-member] Acquired lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1699.946984] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-4835c22e-883d-43d2-82df-7b3fc294d6b4 tempest-ImagesOneServerNegativeTestJSON-20346663 tempest-ImagesOneServerNegativeTestJSON-20346663-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1699.947238] env[67169]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-e10761e1-0f8f-4e85-a712-1ad9fe8a472b {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1699.949816] env[67169]: DEBUG nova.compute.manager [None req-32caf906-c820-4c4a-a535-1175960d4c45 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] [instance: 7817b417-599c-4619-8bd3-28d2e8236b9f] Start destroying the instance on the hypervisor. 
{{(pid=67169) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1699.949999] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-32caf906-c820-4c4a-a535-1175960d4c45 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] [instance: 7817b417-599c-4619-8bd3-28d2e8236b9f] Destroying instance {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1699.950714] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3b0cf85f-21bf-4f4b-9e25-38b406f69bb7 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1699.957069] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-32caf906-c820-4c4a-a535-1175960d4c45 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] [instance: 7817b417-599c-4619-8bd3-28d2e8236b9f] Unregistering the VM {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1699.957298] env[67169]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-5b7ab24d-9eab-49e7-9fe8-b446fb95bc9d {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1699.959390] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-4835c22e-883d-43d2-82df-7b3fc294d6b4 tempest-ImagesOneServerNegativeTestJSON-20346663 tempest-ImagesOneServerNegativeTestJSON-20346663-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1699.959561] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-4835c22e-883d-43d2-82df-7b3fc294d6b4 tempest-ImagesOneServerNegativeTestJSON-20346663 tempest-ImagesOneServerNegativeTestJSON-20346663-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=67169) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1699.960498] env[67169]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-b408259f-b43d-4a7e-9012-16fcb4d225ef {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1699.965501] env[67169]: DEBUG oslo_vmware.api [None req-4835c22e-883d-43d2-82df-7b3fc294d6b4 tempest-ImagesOneServerNegativeTestJSON-20346663 tempest-ImagesOneServerNegativeTestJSON-20346663-project-member] Waiting for the task: (returnval){ [ 1699.965501] env[67169]: value = "session[52518c09-e5f2-035d-4dc1-3e29ddf94015]520d1b1d-e941-d432-3d8e-14d29fdb1897" [ 1699.965501] env[67169]: _type = "Task" [ 1699.965501] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1699.972228] env[67169]: DEBUG oslo_vmware.api [None req-4835c22e-883d-43d2-82df-7b3fc294d6b4 tempest-ImagesOneServerNegativeTestJSON-20346663 tempest-ImagesOneServerNegativeTestJSON-20346663-project-member] Task: {'id': session[52518c09-e5f2-035d-4dc1-3e29ddf94015]520d1b1d-e941-d432-3d8e-14d29fdb1897, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1700.211546] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-32caf906-c820-4c4a-a535-1175960d4c45 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] [instance: 7817b417-599c-4619-8bd3-28d2e8236b9f] Unregistered the VM {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1700.211776] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-32caf906-c820-4c4a-a535-1175960d4c45 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] [instance: 7817b417-599c-4619-8bd3-28d2e8236b9f] Deleting contents of the VM from datastore datastore2 {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1700.211890] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-32caf906-c820-4c4a-a535-1175960d4c45 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Deleting the datastore file [datastore2] 7817b417-599c-4619-8bd3-28d2e8236b9f {{(pid=67169) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1700.212199] env[67169]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-c1270ca1-51f8-489d-9e6a-abf2a5698dcd {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1700.218718] env[67169]: DEBUG oslo_vmware.api [None req-32caf906-c820-4c4a-a535-1175960d4c45 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Waiting for the task: (returnval){ [ 1700.218718] env[67169]: value = "task-2819226" [ 1700.218718] env[67169]: _type = "Task" [ 1700.218718] env[67169]: } to complete. 
{{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1700.226024] env[67169]: DEBUG oslo_vmware.api [None req-32caf906-c820-4c4a-a535-1175960d4c45 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Task: {'id': task-2819226, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1700.475667] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-4835c22e-883d-43d2-82df-7b3fc294d6b4 tempest-ImagesOneServerNegativeTestJSON-20346663 tempest-ImagesOneServerNegativeTestJSON-20346663-project-member] [instance: 883a792f-ae72-4475-8592-3076c2c2c2ae] Preparing fetch location {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1700.476064] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-4835c22e-883d-43d2-82df-7b3fc294d6b4 tempest-ImagesOneServerNegativeTestJSON-20346663 tempest-ImagesOneServerNegativeTestJSON-20346663-project-member] Creating directory with path [datastore2] vmware_temp/1d45171f-e527-4a78-889c-2eb39dd0fc58/285931c9-8b83-4997-8c4d-6a79005e36ba {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1700.476371] env[67169]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-b36f2ad4-5cbb-4fa3-9492-e0088ccac3ed {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1700.487671] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-4835c22e-883d-43d2-82df-7b3fc294d6b4 tempest-ImagesOneServerNegativeTestJSON-20346663 tempest-ImagesOneServerNegativeTestJSON-20346663-project-member] Created directory with path [datastore2] vmware_temp/1d45171f-e527-4a78-889c-2eb39dd0fc58/285931c9-8b83-4997-8c4d-6a79005e36ba {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1700.487932] env[67169]: DEBUG 
nova.virt.vmwareapi.vmops [None req-4835c22e-883d-43d2-82df-7b3fc294d6b4 tempest-ImagesOneServerNegativeTestJSON-20346663 tempest-ImagesOneServerNegativeTestJSON-20346663-project-member] [instance: 883a792f-ae72-4475-8592-3076c2c2c2ae] Fetch image to [datastore2] vmware_temp/1d45171f-e527-4a78-889c-2eb39dd0fc58/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1700.488155] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-4835c22e-883d-43d2-82df-7b3fc294d6b4 tempest-ImagesOneServerNegativeTestJSON-20346663 tempest-ImagesOneServerNegativeTestJSON-20346663-project-member] [instance: 883a792f-ae72-4475-8592-3076c2c2c2ae] Downloading image file data 285931c9-8b83-4997-8c4d-6a79005e36ba to [datastore2] vmware_temp/1d45171f-e527-4a78-889c-2eb39dd0fc58/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk on the data store datastore2 {{(pid=67169) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1700.488896] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3f2368ee-9fda-4177-8e60-c9dd609f7cba {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1700.495604] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3f40ce28-9a0f-490f-844e-121cc721b871 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1700.504405] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0a263391-7c65-4568-ae28-4d9dbcae482d {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1700.534670] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-e4fb773d-d5db-4cf8-93ba-0a234ca402f0 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1700.540554] env[67169]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-64cc5962-707b-4102-9ad2-b4ab944d01f3 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1700.561394] env[67169]: DEBUG nova.virt.vmwareapi.images [None req-4835c22e-883d-43d2-82df-7b3fc294d6b4 tempest-ImagesOneServerNegativeTestJSON-20346663 tempest-ImagesOneServerNegativeTestJSON-20346663-project-member] [instance: 883a792f-ae72-4475-8592-3076c2c2c2ae] Downloading image file data 285931c9-8b83-4997-8c4d-6a79005e36ba to the data store datastore2 {{(pid=67169) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1700.664489] env[67169]: DEBUG oslo_vmware.rw_handles [None req-4835c22e-883d-43d2-82df-7b3fc294d6b4 tempest-ImagesOneServerNegativeTestJSON-20346663 tempest-ImagesOneServerNegativeTestJSON-20346663-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/1d45171f-e527-4a78-889c-2eb39dd0fc58/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=67169) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1700.726197] env[67169]: DEBUG oslo_vmware.rw_handles [None req-4835c22e-883d-43d2-82df-7b3fc294d6b4 tempest-ImagesOneServerNegativeTestJSON-20346663 tempest-ImagesOneServerNegativeTestJSON-20346663-project-member] Completed reading data from the image iterator. 
{{(pid=67169) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1700.726366] env[67169]: DEBUG oslo_vmware.rw_handles [None req-4835c22e-883d-43d2-82df-7b3fc294d6b4 tempest-ImagesOneServerNegativeTestJSON-20346663 tempest-ImagesOneServerNegativeTestJSON-20346663-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/1d45171f-e527-4a78-889c-2eb39dd0fc58/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=67169) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1700.730444] env[67169]: DEBUG oslo_vmware.api [None req-32caf906-c820-4c4a-a535-1175960d4c45 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Task: {'id': task-2819226, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.066985} completed successfully. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1700.730682] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-32caf906-c820-4c4a-a535-1175960d4c45 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Deleted the datastore file {{(pid=67169) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1700.730865] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-32caf906-c820-4c4a-a535-1175960d4c45 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] [instance: 7817b417-599c-4619-8bd3-28d2e8236b9f] Deleted contents of the VM from datastore datastore2 {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1700.731050] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-32caf906-c820-4c4a-a535-1175960d4c45 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] [instance: 7817b417-599c-4619-8bd3-28d2e8236b9f] 
Instance destroyed {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1700.731232] env[67169]: INFO nova.compute.manager [None req-32caf906-c820-4c4a-a535-1175960d4c45 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] [instance: 7817b417-599c-4619-8bd3-28d2e8236b9f] Took 0.78 seconds to destroy the instance on the hypervisor. [ 1700.733369] env[67169]: DEBUG nova.compute.claims [None req-32caf906-c820-4c4a-a535-1175960d4c45 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] [instance: 7817b417-599c-4619-8bd3-28d2e8236b9f] Aborting claim: {{(pid=67169) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1700.733539] env[67169]: DEBUG oslo_concurrency.lockutils [None req-32caf906-c820-4c4a-a535-1175960d4c45 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1700.733765] env[67169]: DEBUG oslo_concurrency.lockutils [None req-32caf906-c820-4c4a-a535-1175960d4c45 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1700.921402] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-df9f5b70-22d7-4618-b7f9-8233160b399a {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1700.928567] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-9467a406-b261-41a8-928e-d53cd51c969d {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1700.961025] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4af46537-138c-4289-bc9d-efcba47f62cc {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1700.969372] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9be08f15-cc91-4f1d-9c21-50c1fb74633f {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1700.982597] env[67169]: DEBUG nova.compute.provider_tree [None req-32caf906-c820-4c4a-a535-1175960d4c45 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Inventory has not changed in ProviderTree for provider: 6570906a-ac37-4859-b1f2-4bbacc48d3f3 {{(pid=67169) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1700.992826] env[67169]: DEBUG nova.scheduler.client.report [None req-32caf906-c820-4c4a-a535-1175960d4c45 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Inventory has not changed for provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 171, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67169) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1701.005974] env[67169]: DEBUG oslo_concurrency.lockutils [None req-32caf906-c820-4c4a-a535-1175960d4c45 tempest-MultipleCreateTestJSON-80480106 
tempest-MultipleCreateTestJSON-80480106-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.272s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1701.006591] env[67169]: ERROR nova.compute.manager [None req-32caf906-c820-4c4a-a535-1175960d4c45 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] [instance: 7817b417-599c-4619-8bd3-28d2e8236b9f] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1701.006591] env[67169]: Faults: ['InvalidArgument'] [ 1701.006591] env[67169]: ERROR nova.compute.manager [instance: 7817b417-599c-4619-8bd3-28d2e8236b9f] Traceback (most recent call last): [ 1701.006591] env[67169]: ERROR nova.compute.manager [instance: 7817b417-599c-4619-8bd3-28d2e8236b9f] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1701.006591] env[67169]: ERROR nova.compute.manager [instance: 7817b417-599c-4619-8bd3-28d2e8236b9f] self.driver.spawn(context, instance, image_meta, [ 1701.006591] env[67169]: ERROR nova.compute.manager [instance: 7817b417-599c-4619-8bd3-28d2e8236b9f] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1701.006591] env[67169]: ERROR nova.compute.manager [instance: 7817b417-599c-4619-8bd3-28d2e8236b9f] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1701.006591] env[67169]: ERROR nova.compute.manager [instance: 7817b417-599c-4619-8bd3-28d2e8236b9f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1701.006591] env[67169]: ERROR nova.compute.manager [instance: 7817b417-599c-4619-8bd3-28d2e8236b9f] self._fetch_image_if_missing(context, vi) [ 1701.006591] env[67169]: ERROR nova.compute.manager [instance: 7817b417-599c-4619-8bd3-28d2e8236b9f] File 
"/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1701.006591] env[67169]: ERROR nova.compute.manager [instance: 7817b417-599c-4619-8bd3-28d2e8236b9f] image_cache(vi, tmp_image_ds_loc) [ 1701.006591] env[67169]: ERROR nova.compute.manager [instance: 7817b417-599c-4619-8bd3-28d2e8236b9f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1701.006591] env[67169]: ERROR nova.compute.manager [instance: 7817b417-599c-4619-8bd3-28d2e8236b9f] vm_util.copy_virtual_disk( [ 1701.006591] env[67169]: ERROR nova.compute.manager [instance: 7817b417-599c-4619-8bd3-28d2e8236b9f] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1701.006591] env[67169]: ERROR nova.compute.manager [instance: 7817b417-599c-4619-8bd3-28d2e8236b9f] session._wait_for_task(vmdk_copy_task) [ 1701.006591] env[67169]: ERROR nova.compute.manager [instance: 7817b417-599c-4619-8bd3-28d2e8236b9f] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1701.006591] env[67169]: ERROR nova.compute.manager [instance: 7817b417-599c-4619-8bd3-28d2e8236b9f] return self.wait_for_task(task_ref) [ 1701.006591] env[67169]: ERROR nova.compute.manager [instance: 7817b417-599c-4619-8bd3-28d2e8236b9f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1701.006591] env[67169]: ERROR nova.compute.manager [instance: 7817b417-599c-4619-8bd3-28d2e8236b9f] return evt.wait() [ 1701.006591] env[67169]: ERROR nova.compute.manager [instance: 7817b417-599c-4619-8bd3-28d2e8236b9f] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1701.006591] env[67169]: ERROR nova.compute.manager [instance: 7817b417-599c-4619-8bd3-28d2e8236b9f] result = hub.switch() [ 1701.006591] env[67169]: ERROR nova.compute.manager [instance: 7817b417-599c-4619-8bd3-28d2e8236b9f] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1701.006591] env[67169]: ERROR nova.compute.manager [instance: 7817b417-599c-4619-8bd3-28d2e8236b9f] return self.greenlet.switch() [ 1701.006591] env[67169]: ERROR nova.compute.manager [instance: 7817b417-599c-4619-8bd3-28d2e8236b9f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1701.006591] env[67169]: ERROR nova.compute.manager [instance: 7817b417-599c-4619-8bd3-28d2e8236b9f] self.f(*self.args, **self.kw) [ 1701.006591] env[67169]: ERROR nova.compute.manager [instance: 7817b417-599c-4619-8bd3-28d2e8236b9f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1701.006591] env[67169]: ERROR nova.compute.manager [instance: 7817b417-599c-4619-8bd3-28d2e8236b9f] raise exceptions.translate_fault(task_info.error) [ 1701.006591] env[67169]: ERROR nova.compute.manager [instance: 7817b417-599c-4619-8bd3-28d2e8236b9f] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1701.006591] env[67169]: ERROR nova.compute.manager [instance: 7817b417-599c-4619-8bd3-28d2e8236b9f] Faults: ['InvalidArgument'] [ 1701.006591] env[67169]: ERROR nova.compute.manager [instance: 7817b417-599c-4619-8bd3-28d2e8236b9f] [ 1701.007462] env[67169]: DEBUG nova.compute.utils [None req-32caf906-c820-4c4a-a535-1175960d4c45 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] [instance: 7817b417-599c-4619-8bd3-28d2e8236b9f] VimFaultException {{(pid=67169) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1701.008822] env[67169]: DEBUG nova.compute.manager [None req-32caf906-c820-4c4a-a535-1175960d4c45 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] [instance: 7817b417-599c-4619-8bd3-28d2e8236b9f] Build of instance 
7817b417-599c-4619-8bd3-28d2e8236b9f was re-scheduled: A specified parameter was not correct: fileType [ 1701.008822] env[67169]: Faults: ['InvalidArgument'] {{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1701.009202] env[67169]: DEBUG nova.compute.manager [None req-32caf906-c820-4c4a-a535-1175960d4c45 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] [instance: 7817b417-599c-4619-8bd3-28d2e8236b9f] Unplugging VIFs for instance {{(pid=67169) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1701.009408] env[67169]: DEBUG nova.compute.manager [None req-32caf906-c820-4c4a-a535-1175960d4c45 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=67169) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1701.009586] env[67169]: DEBUG nova.compute.manager [None req-32caf906-c820-4c4a-a535-1175960d4c45 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] [instance: 7817b417-599c-4619-8bd3-28d2e8236b9f] Deallocating network for instance {{(pid=67169) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1701.009757] env[67169]: DEBUG nova.network.neutron [None req-32caf906-c820-4c4a-a535-1175960d4c45 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] [instance: 7817b417-599c-4619-8bd3-28d2e8236b9f] deallocate_for_instance() {{(pid=67169) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1701.277859] env[67169]: DEBUG nova.network.neutron [None req-32caf906-c820-4c4a-a535-1175960d4c45 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] [instance: 7817b417-599c-4619-8bd3-28d2e8236b9f] Updating instance_info_cache 
with network_info: [] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1701.292029] env[67169]: INFO nova.compute.manager [None req-32caf906-c820-4c4a-a535-1175960d4c45 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] [instance: 7817b417-599c-4619-8bd3-28d2e8236b9f] Took 0.28 seconds to deallocate network for instance. [ 1701.388724] env[67169]: INFO nova.scheduler.client.report [None req-32caf906-c820-4c4a-a535-1175960d4c45 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Deleted allocations for instance 7817b417-599c-4619-8bd3-28d2e8236b9f [ 1701.407996] env[67169]: DEBUG oslo_concurrency.lockutils [None req-32caf906-c820-4c4a-a535-1175960d4c45 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Lock "7817b417-599c-4619-8bd3-28d2e8236b9f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 623.215s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1701.409220] env[67169]: DEBUG oslo_concurrency.lockutils [None req-1c5ca8ef-6753-4abe-bf14-c238954b63af tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Lock "7817b417-599c-4619-8bd3-28d2e8236b9f" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 426.667s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1701.409385] env[67169]: DEBUG oslo_concurrency.lockutils [None req-1c5ca8ef-6753-4abe-bf14-c238954b63af tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Acquiring lock "7817b417-599c-4619-8bd3-28d2e8236b9f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67169) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1701.409587] env[67169]: DEBUG oslo_concurrency.lockutils [None req-1c5ca8ef-6753-4abe-bf14-c238954b63af tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Lock "7817b417-599c-4619-8bd3-28d2e8236b9f-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1701.409755] env[67169]: DEBUG oslo_concurrency.lockutils [None req-1c5ca8ef-6753-4abe-bf14-c238954b63af tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Lock "7817b417-599c-4619-8bd3-28d2e8236b9f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1701.411653] env[67169]: INFO nova.compute.manager [None req-1c5ca8ef-6753-4abe-bf14-c238954b63af tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] [instance: 7817b417-599c-4619-8bd3-28d2e8236b9f] Terminating instance [ 1701.413224] env[67169]: DEBUG nova.compute.manager [None req-1c5ca8ef-6753-4abe-bf14-c238954b63af tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] [instance: 7817b417-599c-4619-8bd3-28d2e8236b9f] Start destroying the instance on the hypervisor. 
{{(pid=67169) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1701.413417] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-1c5ca8ef-6753-4abe-bf14-c238954b63af tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] [instance: 7817b417-599c-4619-8bd3-28d2e8236b9f] Destroying instance {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1701.413864] env[67169]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-5dc35a7d-d9c4-4170-98dd-76e38472b487 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1701.420717] env[67169]: DEBUG nova.compute.manager [None req-1249c8a0-343c-4729-8066-ecc802e7f711 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] [instance: 3d636f4c-c042-428f-be5d-1fbf20c61f0a] Starting instance... {{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1701.426769] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6df62aee-543a-4db8-9bab-57d6294adf6b {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1701.445089] env[67169]: DEBUG nova.compute.manager [None req-1249c8a0-343c-4729-8066-ecc802e7f711 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] [instance: 3d636f4c-c042-428f-be5d-1fbf20c61f0a] Instance disappeared before build. 
{{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1701.456201] env[67169]: WARNING nova.virt.vmwareapi.vmops [None req-1c5ca8ef-6753-4abe-bf14-c238954b63af tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] [instance: 7817b417-599c-4619-8bd3-28d2e8236b9f] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 7817b417-599c-4619-8bd3-28d2e8236b9f could not be found. [ 1701.456395] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-1c5ca8ef-6753-4abe-bf14-c238954b63af tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] [instance: 7817b417-599c-4619-8bd3-28d2e8236b9f] Instance destroyed {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1701.456571] env[67169]: INFO nova.compute.manager [None req-1c5ca8ef-6753-4abe-bf14-c238954b63af tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] [instance: 7817b417-599c-4619-8bd3-28d2e8236b9f] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1701.456807] env[67169]: DEBUG oslo.service.loopingcall [None req-1c5ca8ef-6753-4abe-bf14-c238954b63af tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=67169) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1701.458927] env[67169]: DEBUG nova.compute.manager [-] [instance: 7817b417-599c-4619-8bd3-28d2e8236b9f] Deallocating network for instance {{(pid=67169) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1701.459062] env[67169]: DEBUG nova.network.neutron [-] [instance: 7817b417-599c-4619-8bd3-28d2e8236b9f] deallocate_for_instance() {{(pid=67169) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1701.469244] env[67169]: DEBUG oslo_concurrency.lockutils [None req-1249c8a0-343c-4729-8066-ecc802e7f711 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Lock "3d636f4c-c042-428f-be5d-1fbf20c61f0a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 230.285s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1701.479946] env[67169]: DEBUG nova.compute.manager [None req-4a94fa74-f862-4bac-971a-1a822064e0a7 tempest-ServersNegativeTestMultiTenantJSON-34809039 tempest-ServersNegativeTestMultiTenantJSON-34809039-project-member] [instance: c05c3ec2-a68d-41b0-a199-fcfc84bb2deb] Starting instance... {{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1701.486732] env[67169]: DEBUG nova.network.neutron [-] [instance: 7817b417-599c-4619-8bd3-28d2e8236b9f] Updating instance_info_cache with network_info: [] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1701.499297] env[67169]: INFO nova.compute.manager [-] [instance: 7817b417-599c-4619-8bd3-28d2e8236b9f] Took 0.04 seconds to deallocate network for instance. 
[ 1701.541398] env[67169]: DEBUG oslo_concurrency.lockutils [None req-4a94fa74-f862-4bac-971a-1a822064e0a7 tempest-ServersNegativeTestMultiTenantJSON-34809039 tempest-ServersNegativeTestMultiTenantJSON-34809039-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1701.541646] env[67169]: DEBUG oslo_concurrency.lockutils [None req-4a94fa74-f862-4bac-971a-1a822064e0a7 tempest-ServersNegativeTestMultiTenantJSON-34809039 tempest-ServersNegativeTestMultiTenantJSON-34809039-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1701.543941] env[67169]: INFO nova.compute.claims [None req-4a94fa74-f862-4bac-971a-1a822064e0a7 tempest-ServersNegativeTestMultiTenantJSON-34809039 tempest-ServersNegativeTestMultiTenantJSON-34809039-project-member] [instance: c05c3ec2-a68d-41b0-a199-fcfc84bb2deb] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1701.598530] env[67169]: DEBUG oslo_concurrency.lockutils [None req-1c5ca8ef-6753-4abe-bf14-c238954b63af tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Lock "7817b417-599c-4619-8bd3-28d2e8236b9f" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.189s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1701.732980] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f6b55dac-6562-4b50-a999-d789b95ce1a3 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1701.741185] env[67169]: DEBUG 
oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-518720ef-163e-4abe-94a4-b0133c77a756 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1701.772875] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b93951f7-30f6-468c-89e4-0d2146a48466 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1701.780022] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2b4c05d2-ae01-4b9f-8f77-e9b3581dd16e {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1701.793882] env[67169]: DEBUG nova.compute.provider_tree [None req-4a94fa74-f862-4bac-971a-1a822064e0a7 tempest-ServersNegativeTestMultiTenantJSON-34809039 tempest-ServersNegativeTestMultiTenantJSON-34809039-project-member] Inventory has not changed in ProviderTree for provider: 6570906a-ac37-4859-b1f2-4bbacc48d3f3 {{(pid=67169) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1701.803210] env[67169]: DEBUG nova.scheduler.client.report [None req-4a94fa74-f862-4bac-971a-1a822064e0a7 tempest-ServersNegativeTestMultiTenantJSON-34809039 tempest-ServersNegativeTestMultiTenantJSON-34809039-project-member] Inventory has not changed for provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 171, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67169) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1701.818788] env[67169]: DEBUG 
oslo_concurrency.lockutils [None req-4a94fa74-f862-4bac-971a-1a822064e0a7 tempest-ServersNegativeTestMultiTenantJSON-34809039 tempest-ServersNegativeTestMultiTenantJSON-34809039-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.277s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1701.819292] env[67169]: DEBUG nova.compute.manager [None req-4a94fa74-f862-4bac-971a-1a822064e0a7 tempest-ServersNegativeTestMultiTenantJSON-34809039 tempest-ServersNegativeTestMultiTenantJSON-34809039-project-member] [instance: c05c3ec2-a68d-41b0-a199-fcfc84bb2deb] Start building networks asynchronously for instance. {{(pid=67169) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1701.853040] env[67169]: DEBUG nova.compute.utils [None req-4a94fa74-f862-4bac-971a-1a822064e0a7 tempest-ServersNegativeTestMultiTenantJSON-34809039 tempest-ServersNegativeTestMultiTenantJSON-34809039-project-member] Using /dev/sd instead of None {{(pid=67169) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1701.854870] env[67169]: DEBUG nova.compute.manager [None req-4a94fa74-f862-4bac-971a-1a822064e0a7 tempest-ServersNegativeTestMultiTenantJSON-34809039 tempest-ServersNegativeTestMultiTenantJSON-34809039-project-member] [instance: c05c3ec2-a68d-41b0-a199-fcfc84bb2deb] Allocating IP information in the background. 
{{(pid=67169) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1701.855206] env[67169]: DEBUG nova.network.neutron [None req-4a94fa74-f862-4bac-971a-1a822064e0a7 tempest-ServersNegativeTestMultiTenantJSON-34809039 tempest-ServersNegativeTestMultiTenantJSON-34809039-project-member] [instance: c05c3ec2-a68d-41b0-a199-fcfc84bb2deb] allocate_for_instance() {{(pid=67169) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1701.864894] env[67169]: DEBUG nova.compute.manager [None req-4a94fa74-f862-4bac-971a-1a822064e0a7 tempest-ServersNegativeTestMultiTenantJSON-34809039 tempest-ServersNegativeTestMultiTenantJSON-34809039-project-member] [instance: c05c3ec2-a68d-41b0-a199-fcfc84bb2deb] Start building block device mappings for instance. {{(pid=67169) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1701.934617] env[67169]: DEBUG nova.compute.manager [None req-4a94fa74-f862-4bac-971a-1a822064e0a7 tempest-ServersNegativeTestMultiTenantJSON-34809039 tempest-ServersNegativeTestMultiTenantJSON-34809039-project-member] [instance: c05c3ec2-a68d-41b0-a199-fcfc84bb2deb] Start spawning the instance on the hypervisor. 
{{(pid=67169) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1701.944336] env[67169]: DEBUG nova.policy [None req-4a94fa74-f862-4bac-971a-1a822064e0a7 tempest-ServersNegativeTestMultiTenantJSON-34809039 tempest-ServersNegativeTestMultiTenantJSON-34809039-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '97a8ba7ba7034f73b7597555339dd1e4', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '6e3ba2643d1b4703b58c458e842cb13e', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67169) authorize /opt/stack/nova/nova/policy.py:203}} [ 1701.962108] env[67169]: DEBUG nova.virt.hardware [None req-4a94fa74-f862-4bac-971a-1a822064e0a7 tempest-ServersNegativeTestMultiTenantJSON-34809039 tempest-ServersNegativeTestMultiTenantJSON-34809039-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-03-06T19:55:10Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-03-06T19:54:55Z,direct_url=,disk_format='vmdk',id=285931c9-8b83-4997-8c4d-6a79005e36ba,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c31f6504bb73492890b262ff43fdf9bc',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-03-06T19:54:55Z,virtual_size=,visibility=), allow threads: False {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1701.962372] env[67169]: DEBUG 
nova.virt.hardware [None req-4a94fa74-f862-4bac-971a-1a822064e0a7 tempest-ServersNegativeTestMultiTenantJSON-34809039 tempest-ServersNegativeTestMultiTenantJSON-34809039-project-member] Flavor limits 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1701.962543] env[67169]: DEBUG nova.virt.hardware [None req-4a94fa74-f862-4bac-971a-1a822064e0a7 tempest-ServersNegativeTestMultiTenantJSON-34809039 tempest-ServersNegativeTestMultiTenantJSON-34809039-project-member] Image limits 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1701.962733] env[67169]: DEBUG nova.virt.hardware [None req-4a94fa74-f862-4bac-971a-1a822064e0a7 tempest-ServersNegativeTestMultiTenantJSON-34809039 tempest-ServersNegativeTestMultiTenantJSON-34809039-project-member] Flavor pref 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1701.962881] env[67169]: DEBUG nova.virt.hardware [None req-4a94fa74-f862-4bac-971a-1a822064e0a7 tempest-ServersNegativeTestMultiTenantJSON-34809039 tempest-ServersNegativeTestMultiTenantJSON-34809039-project-member] Image pref 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1701.963087] env[67169]: DEBUG nova.virt.hardware [None req-4a94fa74-f862-4bac-971a-1a822064e0a7 tempest-ServersNegativeTestMultiTenantJSON-34809039 tempest-ServersNegativeTestMultiTenantJSON-34809039-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1701.963330] env[67169]: DEBUG nova.virt.hardware [None req-4a94fa74-f862-4bac-971a-1a822064e0a7 tempest-ServersNegativeTestMultiTenantJSON-34809039 tempest-ServersNegativeTestMultiTenantJSON-34809039-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum 
VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1701.963509] env[67169]: DEBUG nova.virt.hardware [None req-4a94fa74-f862-4bac-971a-1a822064e0a7 tempest-ServersNegativeTestMultiTenantJSON-34809039 tempest-ServersNegativeTestMultiTenantJSON-34809039-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67169) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1701.964202] env[67169]: DEBUG nova.virt.hardware [None req-4a94fa74-f862-4bac-971a-1a822064e0a7 tempest-ServersNegativeTestMultiTenantJSON-34809039 tempest-ServersNegativeTestMultiTenantJSON-34809039-project-member] Got 1 possible topologies {{(pid=67169) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1701.964433] env[67169]: DEBUG nova.virt.hardware [None req-4a94fa74-f862-4bac-971a-1a822064e0a7 tempest-ServersNegativeTestMultiTenantJSON-34809039 tempest-ServersNegativeTestMultiTenantJSON-34809039-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1701.964895] env[67169]: DEBUG nova.virt.hardware [None req-4a94fa74-f862-4bac-971a-1a822064e0a7 tempest-ServersNegativeTestMultiTenantJSON-34809039 tempest-ServersNegativeTestMultiTenantJSON-34809039-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1701.966126] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2740d246-60e1-4de6-a482-aee5f99cbbd7 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1701.975299] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-9c013635-687a-4a76-856a-a58cc4dad09a {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1702.309195] env[67169]: DEBUG nova.network.neutron [None req-4a94fa74-f862-4bac-971a-1a822064e0a7 tempest-ServersNegativeTestMultiTenantJSON-34809039 tempest-ServersNegativeTestMultiTenantJSON-34809039-project-member] [instance: c05c3ec2-a68d-41b0-a199-fcfc84bb2deb] Successfully created port: a8fc2376-e290-477c-96d8-3b94ead2fcd2 {{(pid=67169) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1703.371823] env[67169]: DEBUG nova.network.neutron [None req-4a94fa74-f862-4bac-971a-1a822064e0a7 tempest-ServersNegativeTestMultiTenantJSON-34809039 tempest-ServersNegativeTestMultiTenantJSON-34809039-project-member] [instance: c05c3ec2-a68d-41b0-a199-fcfc84bb2deb] Successfully updated port: a8fc2376-e290-477c-96d8-3b94ead2fcd2 {{(pid=67169) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1703.387629] env[67169]: DEBUG oslo_concurrency.lockutils [None req-4a94fa74-f862-4bac-971a-1a822064e0a7 tempest-ServersNegativeTestMultiTenantJSON-34809039 tempest-ServersNegativeTestMultiTenantJSON-34809039-project-member] Acquiring lock "refresh_cache-c05c3ec2-a68d-41b0-a199-fcfc84bb2deb" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1703.387767] env[67169]: DEBUG oslo_concurrency.lockutils [None req-4a94fa74-f862-4bac-971a-1a822064e0a7 tempest-ServersNegativeTestMultiTenantJSON-34809039 tempest-ServersNegativeTestMultiTenantJSON-34809039-project-member] Acquired lock "refresh_cache-c05c3ec2-a68d-41b0-a199-fcfc84bb2deb" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1703.387912] env[67169]: DEBUG nova.network.neutron [None req-4a94fa74-f862-4bac-971a-1a822064e0a7 tempest-ServersNegativeTestMultiTenantJSON-34809039 
tempest-ServersNegativeTestMultiTenantJSON-34809039-project-member] [instance: c05c3ec2-a68d-41b0-a199-fcfc84bb2deb] Building network info cache for instance {{(pid=67169) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1703.420280] env[67169]: DEBUG nova.compute.manager [req-57c5331a-e1aa-4b64-b99e-882633faecb0 req-d72f3a21-c694-4268-9cc4-9f049a1ce755 service nova] [instance: c05c3ec2-a68d-41b0-a199-fcfc84bb2deb] Received event network-vif-plugged-a8fc2376-e290-477c-96d8-3b94ead2fcd2 {{(pid=67169) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1703.420495] env[67169]: DEBUG oslo_concurrency.lockutils [req-57c5331a-e1aa-4b64-b99e-882633faecb0 req-d72f3a21-c694-4268-9cc4-9f049a1ce755 service nova] Acquiring lock "c05c3ec2-a68d-41b0-a199-fcfc84bb2deb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1703.420696] env[67169]: DEBUG oslo_concurrency.lockutils [req-57c5331a-e1aa-4b64-b99e-882633faecb0 req-d72f3a21-c694-4268-9cc4-9f049a1ce755 service nova] Lock "c05c3ec2-a68d-41b0-a199-fcfc84bb2deb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1703.420863] env[67169]: DEBUG oslo_concurrency.lockutils [req-57c5331a-e1aa-4b64-b99e-882633faecb0 req-d72f3a21-c694-4268-9cc4-9f049a1ce755 service nova] Lock "c05c3ec2-a68d-41b0-a199-fcfc84bb2deb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1703.421039] env[67169]: DEBUG nova.compute.manager [req-57c5331a-e1aa-4b64-b99e-882633faecb0 req-d72f3a21-c694-4268-9cc4-9f049a1ce755 service nova] [instance: 
c05c3ec2-a68d-41b0-a199-fcfc84bb2deb] No waiting events found dispatching network-vif-plugged-a8fc2376-e290-477c-96d8-3b94ead2fcd2 {{(pid=67169) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1703.421219] env[67169]: WARNING nova.compute.manager [req-57c5331a-e1aa-4b64-b99e-882633faecb0 req-d72f3a21-c694-4268-9cc4-9f049a1ce755 service nova] [instance: c05c3ec2-a68d-41b0-a199-fcfc84bb2deb] Received unexpected event network-vif-plugged-a8fc2376-e290-477c-96d8-3b94ead2fcd2 for instance with vm_state building and task_state spawning. [ 1703.451229] env[67169]: DEBUG nova.network.neutron [None req-4a94fa74-f862-4bac-971a-1a822064e0a7 tempest-ServersNegativeTestMultiTenantJSON-34809039 tempest-ServersNegativeTestMultiTenantJSON-34809039-project-member] [instance: c05c3ec2-a68d-41b0-a199-fcfc84bb2deb] Instance cache missing network info. {{(pid=67169) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1703.619731] env[67169]: DEBUG nova.network.neutron [None req-4a94fa74-f862-4bac-971a-1a822064e0a7 tempest-ServersNegativeTestMultiTenantJSON-34809039 tempest-ServersNegativeTestMultiTenantJSON-34809039-project-member] [instance: c05c3ec2-a68d-41b0-a199-fcfc84bb2deb] Updating instance_info_cache with network_info: [{"id": "a8fc2376-e290-477c-96d8-3b94ead2fcd2", "address": "fa:16:3e:8a:c4:01", "network": {"id": "1a63e5d0-4019-4a72-881f-7cbce08a568e", "bridge": "br-int", "label": "tempest-ServersNegativeTestMultiTenantJSON-900087547-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "6e3ba2643d1b4703b58c458e842cb13e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": 
{"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "a69ed1dd-213a-4e30-992a-466735188bf6", "external-id": "nsx-vlan-transportzone-102", "segmentation_id": 102, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa8fc2376-e2", "ovs_interfaceid": "a8fc2376-e290-477c-96d8-3b94ead2fcd2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1703.632826] env[67169]: DEBUG oslo_concurrency.lockutils [None req-4a94fa74-f862-4bac-971a-1a822064e0a7 tempest-ServersNegativeTestMultiTenantJSON-34809039 tempest-ServersNegativeTestMultiTenantJSON-34809039-project-member] Releasing lock "refresh_cache-c05c3ec2-a68d-41b0-a199-fcfc84bb2deb" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1703.633136] env[67169]: DEBUG nova.compute.manager [None req-4a94fa74-f862-4bac-971a-1a822064e0a7 tempest-ServersNegativeTestMultiTenantJSON-34809039 tempest-ServersNegativeTestMultiTenantJSON-34809039-project-member] [instance: c05c3ec2-a68d-41b0-a199-fcfc84bb2deb] Instance network_info: |[{"id": "a8fc2376-e290-477c-96d8-3b94ead2fcd2", "address": "fa:16:3e:8a:c4:01", "network": {"id": "1a63e5d0-4019-4a72-881f-7cbce08a568e", "bridge": "br-int", "label": "tempest-ServersNegativeTestMultiTenantJSON-900087547-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "6e3ba2643d1b4703b58c458e842cb13e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": 
"l2", "port_filter": true, "nsx-logical-switch-id": "a69ed1dd-213a-4e30-992a-466735188bf6", "external-id": "nsx-vlan-transportzone-102", "segmentation_id": 102, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa8fc2376-e2", "ovs_interfaceid": "a8fc2376-e290-477c-96d8-3b94ead2fcd2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67169) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1703.633531] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-4a94fa74-f862-4bac-971a-1a822064e0a7 tempest-ServersNegativeTestMultiTenantJSON-34809039 tempest-ServersNegativeTestMultiTenantJSON-34809039-project-member] [instance: c05c3ec2-a68d-41b0-a199-fcfc84bb2deb] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:8a:c4:01', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'a69ed1dd-213a-4e30-992a-466735188bf6', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'a8fc2376-e290-477c-96d8-3b94ead2fcd2', 'vif_model': 'vmxnet3'}] {{(pid=67169) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1703.641391] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [None req-4a94fa74-f862-4bac-971a-1a822064e0a7 tempest-ServersNegativeTestMultiTenantJSON-34809039 tempest-ServersNegativeTestMultiTenantJSON-34809039-project-member] Creating folder: Project (6e3ba2643d1b4703b58c458e842cb13e). Parent ref: group-v566843. 
{{(pid=67169) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1703.641898] env[67169]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-e2050952-514e-40b6-9949-4d34d99bb573 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1703.652388] env[67169]: INFO nova.virt.vmwareapi.vm_util [None req-4a94fa74-f862-4bac-971a-1a822064e0a7 tempest-ServersNegativeTestMultiTenantJSON-34809039 tempest-ServersNegativeTestMultiTenantJSON-34809039-project-member] Created folder: Project (6e3ba2643d1b4703b58c458e842cb13e) in parent group-v566843. [ 1703.652574] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [None req-4a94fa74-f862-4bac-971a-1a822064e0a7 tempest-ServersNegativeTestMultiTenantJSON-34809039 tempest-ServersNegativeTestMultiTenantJSON-34809039-project-member] Creating folder: Instances. Parent ref: group-v566938. {{(pid=67169) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1703.652796] env[67169]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-7963bebf-a067-4fac-b766-28258fa65920 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1703.661291] env[67169]: INFO nova.virt.vmwareapi.vm_util [None req-4a94fa74-f862-4bac-971a-1a822064e0a7 tempest-ServersNegativeTestMultiTenantJSON-34809039 tempest-ServersNegativeTestMultiTenantJSON-34809039-project-member] Created folder: Instances in parent group-v566938. [ 1703.661524] env[67169]: DEBUG oslo.service.loopingcall [None req-4a94fa74-f862-4bac-971a-1a822064e0a7 tempest-ServersNegativeTestMultiTenantJSON-34809039 tempest-ServersNegativeTestMultiTenantJSON-34809039-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=67169) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1703.661701] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: c05c3ec2-a68d-41b0-a199-fcfc84bb2deb] Creating VM on the ESX host {{(pid=67169) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1703.661891] env[67169]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-9d330dc5-9e98-4d7a-975e-07bd8db87814 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1703.680503] env[67169]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1703.680503] env[67169]: value = "task-2819229" [ 1703.680503] env[67169]: _type = "Task" [ 1703.680503] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1703.687756] env[67169]: DEBUG oslo_vmware.api [-] Task: {'id': task-2819229, 'name': CreateVM_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1704.190878] env[67169]: DEBUG oslo_vmware.api [-] Task: {'id': task-2819229, 'name': CreateVM_Task} progress is 99%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1704.691545] env[67169]: DEBUG oslo_vmware.api [-] Task: {'id': task-2819229, 'name': CreateVM_Task} progress is 99%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1705.192283] env[67169]: DEBUG oslo_vmware.api [-] Task: {'id': task-2819229, 'name': CreateVM_Task, 'duration_secs': 1.269455} completed successfully. 
{{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1705.192478] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: c05c3ec2-a68d-41b0-a199-fcfc84bb2deb] Created VM on the ESX host {{(pid=67169) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1705.193135] env[67169]: DEBUG oslo_concurrency.lockutils [None req-4a94fa74-f862-4bac-971a-1a822064e0a7 tempest-ServersNegativeTestMultiTenantJSON-34809039 tempest-ServersNegativeTestMultiTenantJSON-34809039-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1705.193309] env[67169]: DEBUG oslo_concurrency.lockutils [None req-4a94fa74-f862-4bac-971a-1a822064e0a7 tempest-ServersNegativeTestMultiTenantJSON-34809039 tempest-ServersNegativeTestMultiTenantJSON-34809039-project-member] Acquired lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1705.193692] env[67169]: DEBUG oslo_concurrency.lockutils [None req-4a94fa74-f862-4bac-971a-1a822064e0a7 tempest-ServersNegativeTestMultiTenantJSON-34809039 tempest-ServersNegativeTestMultiTenantJSON-34809039-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1705.193946] env[67169]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-700a63f5-8711-43b5-985f-47dfabeb5a17 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1705.198131] env[67169]: DEBUG oslo_vmware.api [None req-4a94fa74-f862-4bac-971a-1a822064e0a7 
tempest-ServersNegativeTestMultiTenantJSON-34809039 tempest-ServersNegativeTestMultiTenantJSON-34809039-project-member] Waiting for the task: (returnval){ [ 1705.198131] env[67169]: value = "session[52518c09-e5f2-035d-4dc1-3e29ddf94015]529068fe-7f01-af2a-6a24-f9d60b6df16f" [ 1705.198131] env[67169]: _type = "Task" [ 1705.198131] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1705.205238] env[67169]: DEBUG oslo_vmware.api [None req-4a94fa74-f862-4bac-971a-1a822064e0a7 tempest-ServersNegativeTestMultiTenantJSON-34809039 tempest-ServersNegativeTestMultiTenantJSON-34809039-project-member] Task: {'id': session[52518c09-e5f2-035d-4dc1-3e29ddf94015]529068fe-7f01-af2a-6a24-f9d60b6df16f, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1705.452625] env[67169]: DEBUG nova.compute.manager [req-6b0da7b8-ee9c-479a-872d-29df1c9f109e req-90fadf7a-5bda-4862-a868-cd8657d1ab6b service nova] [instance: c05c3ec2-a68d-41b0-a199-fcfc84bb2deb] Received event network-changed-a8fc2376-e290-477c-96d8-3b94ead2fcd2 {{(pid=67169) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1705.452779] env[67169]: DEBUG nova.compute.manager [req-6b0da7b8-ee9c-479a-872d-29df1c9f109e req-90fadf7a-5bda-4862-a868-cd8657d1ab6b service nova] [instance: c05c3ec2-a68d-41b0-a199-fcfc84bb2deb] Refreshing instance network info cache due to event network-changed-a8fc2376-e290-477c-96d8-3b94ead2fcd2. 
{{(pid=67169) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1705.452999] env[67169]: DEBUG oslo_concurrency.lockutils [req-6b0da7b8-ee9c-479a-872d-29df1c9f109e req-90fadf7a-5bda-4862-a868-cd8657d1ab6b service nova] Acquiring lock "refresh_cache-c05c3ec2-a68d-41b0-a199-fcfc84bb2deb" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1705.453162] env[67169]: DEBUG oslo_concurrency.lockutils [req-6b0da7b8-ee9c-479a-872d-29df1c9f109e req-90fadf7a-5bda-4862-a868-cd8657d1ab6b service nova] Acquired lock "refresh_cache-c05c3ec2-a68d-41b0-a199-fcfc84bb2deb" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1705.453453] env[67169]: DEBUG nova.network.neutron [req-6b0da7b8-ee9c-479a-872d-29df1c9f109e req-90fadf7a-5bda-4862-a868-cd8657d1ab6b service nova] [instance: c05c3ec2-a68d-41b0-a199-fcfc84bb2deb] Refreshing network info cache for port a8fc2376-e290-477c-96d8-3b94ead2fcd2 {{(pid=67169) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1705.708469] env[67169]: DEBUG oslo_concurrency.lockutils [None req-4a94fa74-f862-4bac-971a-1a822064e0a7 tempest-ServersNegativeTestMultiTenantJSON-34809039 tempest-ServersNegativeTestMultiTenantJSON-34809039-project-member] Releasing lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1705.708779] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-4a94fa74-f862-4bac-971a-1a822064e0a7 tempest-ServersNegativeTestMultiTenantJSON-34809039 tempest-ServersNegativeTestMultiTenantJSON-34809039-project-member] [instance: c05c3ec2-a68d-41b0-a199-fcfc84bb2deb] Processing image 285931c9-8b83-4997-8c4d-6a79005e36ba {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1705.708892] env[67169]: DEBUG 
oslo_concurrency.lockutils [None req-4a94fa74-f862-4bac-971a-1a822064e0a7 tempest-ServersNegativeTestMultiTenantJSON-34809039 tempest-ServersNegativeTestMultiTenantJSON-34809039-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1705.809815] env[67169]: DEBUG nova.network.neutron [req-6b0da7b8-ee9c-479a-872d-29df1c9f109e req-90fadf7a-5bda-4862-a868-cd8657d1ab6b service nova] [instance: c05c3ec2-a68d-41b0-a199-fcfc84bb2deb] Updated VIF entry in instance network info cache for port a8fc2376-e290-477c-96d8-3b94ead2fcd2. {{(pid=67169) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1705.810187] env[67169]: DEBUG nova.network.neutron [req-6b0da7b8-ee9c-479a-872d-29df1c9f109e req-90fadf7a-5bda-4862-a868-cd8657d1ab6b service nova] [instance: c05c3ec2-a68d-41b0-a199-fcfc84bb2deb] Updating instance_info_cache with network_info: [{"id": "a8fc2376-e290-477c-96d8-3b94ead2fcd2", "address": "fa:16:3e:8a:c4:01", "network": {"id": "1a63e5d0-4019-4a72-881f-7cbce08a568e", "bridge": "br-int", "label": "tempest-ServersNegativeTestMultiTenantJSON-900087547-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "6e3ba2643d1b4703b58c458e842cb13e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "a69ed1dd-213a-4e30-992a-466735188bf6", "external-id": "nsx-vlan-transportzone-102", "segmentation_id": 102, "bound_drivers": {"0": 
"nsxv3"}}, "devname": "tapa8fc2376-e2", "ovs_interfaceid": "a8fc2376-e290-477c-96d8-3b94ead2fcd2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1705.819384] env[67169]: DEBUG oslo_concurrency.lockutils [req-6b0da7b8-ee9c-479a-872d-29df1c9f109e req-90fadf7a-5bda-4862-a868-cd8657d1ab6b service nova] Releasing lock "refresh_cache-c05c3ec2-a68d-41b0-a199-fcfc84bb2deb" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1714.124962] env[67169]: DEBUG oslo_concurrency.lockutils [None req-ca15473a-8330-4e53-b6cb-311183da5250 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Acquiring lock "115d6c00-4259-4e87-aa00-90b576a63535" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1714.125301] env[67169]: DEBUG oslo_concurrency.lockutils [None req-ca15473a-8330-4e53-b6cb-311183da5250 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Lock "115d6c00-4259-4e87-aa00-90b576a63535" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1723.163794] env[67169]: DEBUG oslo_concurrency.lockutils [None req-72da6616-256b-4aa9-b649-96703e0180c4 tempest-ServersNegativeTestMultiTenantJSON-34809039 tempest-ServersNegativeTestMultiTenantJSON-34809039-project-member] Acquiring lock "c05c3ec2-a68d-41b0-a199-fcfc84bb2deb" by 
"nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1736.659026] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1737.659869] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1738.666368] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1738.666704] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Starting heal instance info cache {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1738.666704] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Rebuilding the list of instances to heal {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1738.688365] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 883a792f-ae72-4475-8592-3076c2c2c2ae] Skipping network cache update for instance because it is Building. 
{{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1738.688533] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 48376572-9e3a-4579-b2d7-b8b63312fab1] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1738.688667] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 74ea66f0-391c-437b-8aee-f784528d7963] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1738.688792] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1738.688912] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 2e156908-c313-4229-840d-13ed8e6d4074] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1738.689042] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 2d7d3386-9854-4bf1-a680-5aed0a2329cb] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1738.689162] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: fa24a4a8-895c-4ea6-8e0a-4ed1134beff0] Skipping network cache update for instance because it is Building. 
{{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1738.689282] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 04d3ae51-f3f1-427b-ae45-279b02e4b3e6] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1738.689406] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: aedbfde6-26e1-410d-a311-e2c344f65062] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1738.689521] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: c05c3ec2-a68d-41b0-a199-fcfc84bb2deb] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1738.689638] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Didn't find any instances for network info cache update. 
{{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1740.659717] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1740.660035] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1740.661245] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=67169) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1741.653898] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1742.659778] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1742.660095] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1743.658674] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task 
ComputeManager.update_available_resource {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1743.673333] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1743.673622] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1743.673782] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1743.673975] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67169) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1743.675272] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-afe970f0-1f47-4299-b255-286da98135fa {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1743.684989] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5be4af16-1256-47ec-95fa-7525a773da3f {{(pid=67169) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1743.699196] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3e651953-80ad-4a6c-b5e1-1f9b303e43c2 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1743.705707] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a00d87fe-72c5-4e3c-bd10-bce480f2aa3f {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1743.735924] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181027MB free_disk=171GB free_vcpus=48 pci_devices=None {{(pid=67169) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1743.736092] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1743.736283] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1743.806882] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 883a792f-ae72-4475-8592-3076c2c2c2ae actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1743.807058] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 48376572-9e3a-4579-b2d7-b8b63312fab1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1743.807198] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 74ea66f0-391c-437b-8aee-f784528d7963 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1743.807325] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 7b7c8f84-c2d4-442e-93d3-60124767d096 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1743.807448] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 2e156908-c313-4229-840d-13ed8e6d4074 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1743.807568] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 2d7d3386-9854-4bf1-a680-5aed0a2329cb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1743.807684] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance fa24a4a8-895c-4ea6-8e0a-4ed1134beff0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1743.807800] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 04d3ae51-f3f1-427b-ae45-279b02e4b3e6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1743.807914] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance aedbfde6-26e1-410d-a311-e2c344f65062 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1743.808043] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance c05c3ec2-a68d-41b0-a199-fcfc84bb2deb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1743.819563] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 9435574d-2128-4b20-ba92-ee2aba37d33b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1743.830050] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 6663b166-0d24-45a7-8c2c-e4e68dbe0005 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1743.840787] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 115d6c00-4259-4e87-aa00-90b576a63535 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1743.841012] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67169) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1743.841167] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67169) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1743.988118] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4ea33f08-6195-46a6-95a2-204cf913db42 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1743.995556] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7051309d-a79b-40fc-a5c9-7348d6a237e1 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1744.026855] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a61ef332-0304-415f-ae28-25e268d2d578 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1744.034357] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9c539527-1d4c-46fa-82d6-c4c6e6a95d82 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1744.048161] env[67169]: DEBUG nova.compute.provider_tree [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Inventory has 
not changed in ProviderTree for provider: 6570906a-ac37-4859-b1f2-4bbacc48d3f3 {{(pid=67169) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1744.056649] env[67169]: DEBUG nova.scheduler.client.report [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Inventory has not changed for provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 171, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67169) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1744.069880] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67169) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1744.070086] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.334s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1744.070310] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1744.070461] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Cleaning up deleted instances {{(pid=67169) _run_pending_deletes 
/opt/stack/nova/nova/compute/manager.py:11198}} [ 1744.078063] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] There are 0 instances to clean {{(pid=67169) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11207}} [ 1747.072833] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1747.094881] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1750.693925] env[67169]: WARNING oslo_vmware.rw_handles [None req-4835c22e-883d-43d2-82df-7b3fc294d6b4 tempest-ImagesOneServerNegativeTestJSON-20346663 tempest-ImagesOneServerNegativeTestJSON-20346663-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1750.693925] env[67169]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1750.693925] env[67169]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1750.693925] env[67169]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1750.693925] env[67169]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1750.693925] env[67169]: ERROR oslo_vmware.rw_handles response.begin() [ 1750.693925] env[67169]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1750.693925] env[67169]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1750.693925] env[67169]: 
ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1750.693925] env[67169]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1750.693925] env[67169]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1750.693925] env[67169]: ERROR oslo_vmware.rw_handles [ 1750.694801] env[67169]: DEBUG nova.virt.vmwareapi.images [None req-4835c22e-883d-43d2-82df-7b3fc294d6b4 tempest-ImagesOneServerNegativeTestJSON-20346663 tempest-ImagesOneServerNegativeTestJSON-20346663-project-member] [instance: 883a792f-ae72-4475-8592-3076c2c2c2ae] Downloaded image file data 285931c9-8b83-4997-8c4d-6a79005e36ba to vmware_temp/1d45171f-e527-4a78-889c-2eb39dd0fc58/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk on the data store datastore2 {{(pid=67169) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1750.696627] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-4835c22e-883d-43d2-82df-7b3fc294d6b4 tempest-ImagesOneServerNegativeTestJSON-20346663 tempest-ImagesOneServerNegativeTestJSON-20346663-project-member] [instance: 883a792f-ae72-4475-8592-3076c2c2c2ae] Caching image {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1750.696898] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [None req-4835c22e-883d-43d2-82df-7b3fc294d6b4 tempest-ImagesOneServerNegativeTestJSON-20346663 tempest-ImagesOneServerNegativeTestJSON-20346663-project-member] Copying Virtual Disk [datastore2] vmware_temp/1d45171f-e527-4a78-889c-2eb39dd0fc58/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk to [datastore2] vmware_temp/1d45171f-e527-4a78-889c-2eb39dd0fc58/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk {{(pid=67169) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1750.697228] env[67169]: DEBUG oslo_vmware.service [-] Invoking 
VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-78458a26-1bde-4842-a45b-cf3b7d0974e8 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1750.705145] env[67169]: DEBUG oslo_vmware.api [None req-4835c22e-883d-43d2-82df-7b3fc294d6b4 tempest-ImagesOneServerNegativeTestJSON-20346663 tempest-ImagesOneServerNegativeTestJSON-20346663-project-member] Waiting for the task: (returnval){ [ 1750.705145] env[67169]: value = "task-2819230" [ 1750.705145] env[67169]: _type = "Task" [ 1750.705145] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1750.712811] env[67169]: DEBUG oslo_vmware.api [None req-4835c22e-883d-43d2-82df-7b3fc294d6b4 tempest-ImagesOneServerNegativeTestJSON-20346663 tempest-ImagesOneServerNegativeTestJSON-20346663-project-member] Task: {'id': task-2819230, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1751.215495] env[67169]: DEBUG oslo_vmware.exceptions [None req-4835c22e-883d-43d2-82df-7b3fc294d6b4 tempest-ImagesOneServerNegativeTestJSON-20346663 tempest-ImagesOneServerNegativeTestJSON-20346663-project-member] Fault InvalidArgument not matched. 
{{(pid=67169) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1751.215768] env[67169]: DEBUG oslo_concurrency.lockutils [None req-4835c22e-883d-43d2-82df-7b3fc294d6b4 tempest-ImagesOneServerNegativeTestJSON-20346663 tempest-ImagesOneServerNegativeTestJSON-20346663-project-member] Releasing lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1751.216346] env[67169]: ERROR nova.compute.manager [None req-4835c22e-883d-43d2-82df-7b3fc294d6b4 tempest-ImagesOneServerNegativeTestJSON-20346663 tempest-ImagesOneServerNegativeTestJSON-20346663-project-member] [instance: 883a792f-ae72-4475-8592-3076c2c2c2ae] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1751.216346] env[67169]: Faults: ['InvalidArgument'] [ 1751.216346] env[67169]: ERROR nova.compute.manager [instance: 883a792f-ae72-4475-8592-3076c2c2c2ae] Traceback (most recent call last): [ 1751.216346] env[67169]: ERROR nova.compute.manager [instance: 883a792f-ae72-4475-8592-3076c2c2c2ae] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1751.216346] env[67169]: ERROR nova.compute.manager [instance: 883a792f-ae72-4475-8592-3076c2c2c2ae] yield resources [ 1751.216346] env[67169]: ERROR nova.compute.manager [instance: 883a792f-ae72-4475-8592-3076c2c2c2ae] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1751.216346] env[67169]: ERROR nova.compute.manager [instance: 883a792f-ae72-4475-8592-3076c2c2c2ae] self.driver.spawn(context, instance, image_meta, [ 1751.216346] env[67169]: ERROR nova.compute.manager [instance: 883a792f-ae72-4475-8592-3076c2c2c2ae] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1751.216346] env[67169]: ERROR 
nova.compute.manager [instance: 883a792f-ae72-4475-8592-3076c2c2c2ae] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1751.216346] env[67169]: ERROR nova.compute.manager [instance: 883a792f-ae72-4475-8592-3076c2c2c2ae] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1751.216346] env[67169]: ERROR nova.compute.manager [instance: 883a792f-ae72-4475-8592-3076c2c2c2ae] self._fetch_image_if_missing(context, vi) [ 1751.216346] env[67169]: ERROR nova.compute.manager [instance: 883a792f-ae72-4475-8592-3076c2c2c2ae] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1751.216346] env[67169]: ERROR nova.compute.manager [instance: 883a792f-ae72-4475-8592-3076c2c2c2ae] image_cache(vi, tmp_image_ds_loc) [ 1751.216346] env[67169]: ERROR nova.compute.manager [instance: 883a792f-ae72-4475-8592-3076c2c2c2ae] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1751.216346] env[67169]: ERROR nova.compute.manager [instance: 883a792f-ae72-4475-8592-3076c2c2c2ae] vm_util.copy_virtual_disk( [ 1751.216346] env[67169]: ERROR nova.compute.manager [instance: 883a792f-ae72-4475-8592-3076c2c2c2ae] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1751.216346] env[67169]: ERROR nova.compute.manager [instance: 883a792f-ae72-4475-8592-3076c2c2c2ae] session._wait_for_task(vmdk_copy_task) [ 1751.216346] env[67169]: ERROR nova.compute.manager [instance: 883a792f-ae72-4475-8592-3076c2c2c2ae] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1751.216346] env[67169]: ERROR nova.compute.manager [instance: 883a792f-ae72-4475-8592-3076c2c2c2ae] return self.wait_for_task(task_ref) [ 1751.216346] env[67169]: ERROR nova.compute.manager [instance: 883a792f-ae72-4475-8592-3076c2c2c2ae] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1751.216346] env[67169]: 
ERROR nova.compute.manager [instance: 883a792f-ae72-4475-8592-3076c2c2c2ae] return evt.wait() [ 1751.216346] env[67169]: ERROR nova.compute.manager [instance: 883a792f-ae72-4475-8592-3076c2c2c2ae] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1751.216346] env[67169]: ERROR nova.compute.manager [instance: 883a792f-ae72-4475-8592-3076c2c2c2ae] result = hub.switch() [ 1751.216346] env[67169]: ERROR nova.compute.manager [instance: 883a792f-ae72-4475-8592-3076c2c2c2ae] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1751.216346] env[67169]: ERROR nova.compute.manager [instance: 883a792f-ae72-4475-8592-3076c2c2c2ae] return self.greenlet.switch() [ 1751.216346] env[67169]: ERROR nova.compute.manager [instance: 883a792f-ae72-4475-8592-3076c2c2c2ae] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1751.216346] env[67169]: ERROR nova.compute.manager [instance: 883a792f-ae72-4475-8592-3076c2c2c2ae] self.f(*self.args, **self.kw) [ 1751.216346] env[67169]: ERROR nova.compute.manager [instance: 883a792f-ae72-4475-8592-3076c2c2c2ae] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1751.216346] env[67169]: ERROR nova.compute.manager [instance: 883a792f-ae72-4475-8592-3076c2c2c2ae] raise exceptions.translate_fault(task_info.error) [ 1751.216346] env[67169]: ERROR nova.compute.manager [instance: 883a792f-ae72-4475-8592-3076c2c2c2ae] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1751.216346] env[67169]: ERROR nova.compute.manager [instance: 883a792f-ae72-4475-8592-3076c2c2c2ae] Faults: ['InvalidArgument'] [ 1751.216346] env[67169]: ERROR nova.compute.manager [instance: 883a792f-ae72-4475-8592-3076c2c2c2ae] [ 1751.217418] env[67169]: INFO nova.compute.manager [None req-4835c22e-883d-43d2-82df-7b3fc294d6b4 
tempest-ImagesOneServerNegativeTestJSON-20346663 tempest-ImagesOneServerNegativeTestJSON-20346663-project-member] [instance: 883a792f-ae72-4475-8592-3076c2c2c2ae] Terminating instance [ 1751.218228] env[67169]: DEBUG oslo_concurrency.lockutils [None req-69385497-98b5-454b-af8f-b152ac80f6f3 tempest-ServerExternalEventsTest-1384768552 tempest-ServerExternalEventsTest-1384768552-project-member] Acquired lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1751.218519] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-69385497-98b5-454b-af8f-b152ac80f6f3 tempest-ServerExternalEventsTest-1384768552 tempest-ServerExternalEventsTest-1384768552-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1751.218796] env[67169]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-141701fe-3dca-4e0e-9cae-15c82d5de5d6 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1751.221244] env[67169]: DEBUG nova.compute.manager [None req-4835c22e-883d-43d2-82df-7b3fc294d6b4 tempest-ImagesOneServerNegativeTestJSON-20346663 tempest-ImagesOneServerNegativeTestJSON-20346663-project-member] [instance: 883a792f-ae72-4475-8592-3076c2c2c2ae] Start destroying the instance on the hypervisor. 
{{(pid=67169) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1751.221454] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-4835c22e-883d-43d2-82df-7b3fc294d6b4 tempest-ImagesOneServerNegativeTestJSON-20346663 tempest-ImagesOneServerNegativeTestJSON-20346663-project-member] [instance: 883a792f-ae72-4475-8592-3076c2c2c2ae] Destroying instance {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1751.222231] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eb2f406a-3598-47e5-a2bc-3c7298d1d92e {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1751.229074] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-4835c22e-883d-43d2-82df-7b3fc294d6b4 tempest-ImagesOneServerNegativeTestJSON-20346663 tempest-ImagesOneServerNegativeTestJSON-20346663-project-member] [instance: 883a792f-ae72-4475-8592-3076c2c2c2ae] Unregistering the VM {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1751.229277] env[67169]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-0c87f25c-23ed-40b5-bee4-8817f8166613 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1751.231476] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-69385497-98b5-454b-af8f-b152ac80f6f3 tempest-ServerExternalEventsTest-1384768552 tempest-ServerExternalEventsTest-1384768552-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1751.231647] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-69385497-98b5-454b-af8f-b152ac80f6f3 tempest-ServerExternalEventsTest-1384768552 tempest-ServerExternalEventsTest-1384768552-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=67169) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1751.232601] env[67169]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-dab3b148-914a-42a9-a751-f7b92568fd7b {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1751.237206] env[67169]: DEBUG oslo_vmware.api [None req-69385497-98b5-454b-af8f-b152ac80f6f3 tempest-ServerExternalEventsTest-1384768552 tempest-ServerExternalEventsTest-1384768552-project-member] Waiting for the task: (returnval){ [ 1751.237206] env[67169]: value = "session[52518c09-e5f2-035d-4dc1-3e29ddf94015]52c6e813-bee6-1372-64cd-8262ed6c8053" [ 1751.237206] env[67169]: _type = "Task" [ 1751.237206] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1751.251508] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-69385497-98b5-454b-af8f-b152ac80f6f3 tempest-ServerExternalEventsTest-1384768552 tempest-ServerExternalEventsTest-1384768552-project-member] [instance: 74ea66f0-391c-437b-8aee-f784528d7963] Preparing fetch location {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1751.251731] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-69385497-98b5-454b-af8f-b152ac80f6f3 tempest-ServerExternalEventsTest-1384768552 tempest-ServerExternalEventsTest-1384768552-project-member] Creating directory with path [datastore2] vmware_temp/2ff600f3-188a-45f1-959e-bbadda6d695c/285931c9-8b83-4997-8c4d-6a79005e36ba {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1751.251940] env[67169]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-b4e3a02c-f569-49cd-aaaf-7c0e2bcdb5ee {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1751.272535] env[67169]: DEBUG 
nova.virt.vmwareapi.ds_util [None req-69385497-98b5-454b-af8f-b152ac80f6f3 tempest-ServerExternalEventsTest-1384768552 tempest-ServerExternalEventsTest-1384768552-project-member] Created directory with path [datastore2] vmware_temp/2ff600f3-188a-45f1-959e-bbadda6d695c/285931c9-8b83-4997-8c4d-6a79005e36ba {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1751.272725] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-69385497-98b5-454b-af8f-b152ac80f6f3 tempest-ServerExternalEventsTest-1384768552 tempest-ServerExternalEventsTest-1384768552-project-member] [instance: 74ea66f0-391c-437b-8aee-f784528d7963] Fetch image to [datastore2] vmware_temp/2ff600f3-188a-45f1-959e-bbadda6d695c/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1751.272891] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-69385497-98b5-454b-af8f-b152ac80f6f3 tempest-ServerExternalEventsTest-1384768552 tempest-ServerExternalEventsTest-1384768552-project-member] [instance: 74ea66f0-391c-437b-8aee-f784528d7963] Downloading image file data 285931c9-8b83-4997-8c4d-6a79005e36ba to [datastore2] vmware_temp/2ff600f3-188a-45f1-959e-bbadda6d695c/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk on the data store datastore2 {{(pid=67169) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1751.273662] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b5eecb53-681d-4179-8aa1-c39ff1c05b17 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1751.280364] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-65995db0-ed77-4e0d-be57-60c60d083d92 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1751.289276] env[67169]: DEBUG 
oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-acbbea72-d149-4e31-a752-58ea6259a503 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1751.321637] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-55f0d33d-b477-4eb9-ac35-1e76091db650 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1751.324258] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-4835c22e-883d-43d2-82df-7b3fc294d6b4 tempest-ImagesOneServerNegativeTestJSON-20346663 tempest-ImagesOneServerNegativeTestJSON-20346663-project-member] [instance: 883a792f-ae72-4475-8592-3076c2c2c2ae] Unregistered the VM {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1751.324447] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-4835c22e-883d-43d2-82df-7b3fc294d6b4 tempest-ImagesOneServerNegativeTestJSON-20346663 tempest-ImagesOneServerNegativeTestJSON-20346663-project-member] [instance: 883a792f-ae72-4475-8592-3076c2c2c2ae] Deleting contents of the VM from datastore datastore2 {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1751.324619] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-4835c22e-883d-43d2-82df-7b3fc294d6b4 tempest-ImagesOneServerNegativeTestJSON-20346663 tempest-ImagesOneServerNegativeTestJSON-20346663-project-member] Deleting the datastore file [datastore2] 883a792f-ae72-4475-8592-3076c2c2c2ae {{(pid=67169) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1751.324848] env[67169]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-ea268c25-f230-4a7c-8fb2-a4538d736b2a {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1751.329471] env[67169]: DEBUG oslo_vmware.service [-] 
Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-3854591e-c76f-4dfd-ad9c-4cbed99e68fd {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1751.332212] env[67169]: DEBUG oslo_vmware.api [None req-4835c22e-883d-43d2-82df-7b3fc294d6b4 tempest-ImagesOneServerNegativeTestJSON-20346663 tempest-ImagesOneServerNegativeTestJSON-20346663-project-member] Waiting for the task: (returnval){ [ 1751.332212] env[67169]: value = "task-2819232" [ 1751.332212] env[67169]: _type = "Task" [ 1751.332212] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1751.339383] env[67169]: DEBUG oslo_vmware.api [None req-4835c22e-883d-43d2-82df-7b3fc294d6b4 tempest-ImagesOneServerNegativeTestJSON-20346663 tempest-ImagesOneServerNegativeTestJSON-20346663-project-member] Task: {'id': task-2819232, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1751.350975] env[67169]: DEBUG nova.virt.vmwareapi.images [None req-69385497-98b5-454b-af8f-b152ac80f6f3 tempest-ServerExternalEventsTest-1384768552 tempest-ServerExternalEventsTest-1384768552-project-member] [instance: 74ea66f0-391c-437b-8aee-f784528d7963] Downloading image file data 285931c9-8b83-4997-8c4d-6a79005e36ba to the data store datastore2 {{(pid=67169) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1751.404407] env[67169]: DEBUG oslo_vmware.rw_handles [None req-69385497-98b5-454b-af8f-b152ac80f6f3 tempest-ServerExternalEventsTest-1384768552 tempest-ServerExternalEventsTest-1384768552-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/2ff600f3-188a-45f1-959e-bbadda6d695c/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=67169) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1751.465395] env[67169]: DEBUG oslo_vmware.rw_handles [None req-69385497-98b5-454b-af8f-b152ac80f6f3 tempest-ServerExternalEventsTest-1384768552 tempest-ServerExternalEventsTest-1384768552-project-member] Completed reading data from the image iterator. {{(pid=67169) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1751.465579] env[67169]: DEBUG oslo_vmware.rw_handles [None req-69385497-98b5-454b-af8f-b152ac80f6f3 tempest-ServerExternalEventsTest-1384768552 tempest-ServerExternalEventsTest-1384768552-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/2ff600f3-188a-45f1-959e-bbadda6d695c/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=67169) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1751.842593] env[67169]: DEBUG oslo_vmware.api [None req-4835c22e-883d-43d2-82df-7b3fc294d6b4 tempest-ImagesOneServerNegativeTestJSON-20346663 tempest-ImagesOneServerNegativeTestJSON-20346663-project-member] Task: {'id': task-2819232, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.107629} completed successfully. 
{{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1751.842993] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-4835c22e-883d-43d2-82df-7b3fc294d6b4 tempest-ImagesOneServerNegativeTestJSON-20346663 tempest-ImagesOneServerNegativeTestJSON-20346663-project-member] Deleted the datastore file {{(pid=67169) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1751.843065] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-4835c22e-883d-43d2-82df-7b3fc294d6b4 tempest-ImagesOneServerNegativeTestJSON-20346663 tempest-ImagesOneServerNegativeTestJSON-20346663-project-member] [instance: 883a792f-ae72-4475-8592-3076c2c2c2ae] Deleted contents of the VM from datastore datastore2 {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1751.843200] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-4835c22e-883d-43d2-82df-7b3fc294d6b4 tempest-ImagesOneServerNegativeTestJSON-20346663 tempest-ImagesOneServerNegativeTestJSON-20346663-project-member] [instance: 883a792f-ae72-4475-8592-3076c2c2c2ae] Instance destroyed {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1751.843371] env[67169]: INFO nova.compute.manager [None req-4835c22e-883d-43d2-82df-7b3fc294d6b4 tempest-ImagesOneServerNegativeTestJSON-20346663 tempest-ImagesOneServerNegativeTestJSON-20346663-project-member] [instance: 883a792f-ae72-4475-8592-3076c2c2c2ae] Took 0.62 seconds to destroy the instance on the hypervisor. 
[ 1751.845640] env[67169]: DEBUG nova.compute.claims [None req-4835c22e-883d-43d2-82df-7b3fc294d6b4 tempest-ImagesOneServerNegativeTestJSON-20346663 tempest-ImagesOneServerNegativeTestJSON-20346663-project-member] [instance: 883a792f-ae72-4475-8592-3076c2c2c2ae] Aborting claim: {{(pid=67169) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1751.845813] env[67169]: DEBUG oslo_concurrency.lockutils [None req-4835c22e-883d-43d2-82df-7b3fc294d6b4 tempest-ImagesOneServerNegativeTestJSON-20346663 tempest-ImagesOneServerNegativeTestJSON-20346663-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1751.846063] env[67169]: DEBUG oslo_concurrency.lockutils [None req-4835c22e-883d-43d2-82df-7b3fc294d6b4 tempest-ImagesOneServerNegativeTestJSON-20346663 tempest-ImagesOneServerNegativeTestJSON-20346663-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1752.071924] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-65db2a6e-2471-4af8-b4d2-4b06d0919172 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1752.079716] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dc9e5301-9862-48c2-bcfc-e584bbcb2505 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1752.110131] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1293c828-d96d-450a-b82c-e5ffd70c5cbe {{(pid=67169) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1752.118527] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-118f58da-1bb3-4358-8174-128d7da6e12f {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1752.135163] env[67169]: DEBUG nova.compute.provider_tree [None req-4835c22e-883d-43d2-82df-7b3fc294d6b4 tempest-ImagesOneServerNegativeTestJSON-20346663 tempest-ImagesOneServerNegativeTestJSON-20346663-project-member] Inventory has not changed in ProviderTree for provider: 6570906a-ac37-4859-b1f2-4bbacc48d3f3 {{(pid=67169) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1752.144052] env[67169]: DEBUG nova.scheduler.client.report [None req-4835c22e-883d-43d2-82df-7b3fc294d6b4 tempest-ImagesOneServerNegativeTestJSON-20346663 tempest-ImagesOneServerNegativeTestJSON-20346663-project-member] Inventory has not changed for provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 171, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67169) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1752.161170] env[67169]: DEBUG oslo_concurrency.lockutils [None req-4835c22e-883d-43d2-82df-7b3fc294d6b4 tempest-ImagesOneServerNegativeTestJSON-20346663 tempest-ImagesOneServerNegativeTestJSON-20346663-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.315s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1752.161760] env[67169]: 
ERROR nova.compute.manager [None req-4835c22e-883d-43d2-82df-7b3fc294d6b4 tempest-ImagesOneServerNegativeTestJSON-20346663 tempest-ImagesOneServerNegativeTestJSON-20346663-project-member] [instance: 883a792f-ae72-4475-8592-3076c2c2c2ae] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1752.161760] env[67169]: Faults: ['InvalidArgument'] [ 1752.161760] env[67169]: ERROR nova.compute.manager [instance: 883a792f-ae72-4475-8592-3076c2c2c2ae] Traceback (most recent call last): [ 1752.161760] env[67169]: ERROR nova.compute.manager [instance: 883a792f-ae72-4475-8592-3076c2c2c2ae] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1752.161760] env[67169]: ERROR nova.compute.manager [instance: 883a792f-ae72-4475-8592-3076c2c2c2ae] self.driver.spawn(context, instance, image_meta, [ 1752.161760] env[67169]: ERROR nova.compute.manager [instance: 883a792f-ae72-4475-8592-3076c2c2c2ae] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1752.161760] env[67169]: ERROR nova.compute.manager [instance: 883a792f-ae72-4475-8592-3076c2c2c2ae] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1752.161760] env[67169]: ERROR nova.compute.manager [instance: 883a792f-ae72-4475-8592-3076c2c2c2ae] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1752.161760] env[67169]: ERROR nova.compute.manager [instance: 883a792f-ae72-4475-8592-3076c2c2c2ae] self._fetch_image_if_missing(context, vi) [ 1752.161760] env[67169]: ERROR nova.compute.manager [instance: 883a792f-ae72-4475-8592-3076c2c2c2ae] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1752.161760] env[67169]: ERROR nova.compute.manager [instance: 883a792f-ae72-4475-8592-3076c2c2c2ae] image_cache(vi, tmp_image_ds_loc) [ 1752.161760] env[67169]: ERROR nova.compute.manager [instance: 883a792f-ae72-4475-8592-3076c2c2c2ae] File 
"/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1752.161760] env[67169]: ERROR nova.compute.manager [instance: 883a792f-ae72-4475-8592-3076c2c2c2ae] vm_util.copy_virtual_disk( [ 1752.161760] env[67169]: ERROR nova.compute.manager [instance: 883a792f-ae72-4475-8592-3076c2c2c2ae] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1752.161760] env[67169]: ERROR nova.compute.manager [instance: 883a792f-ae72-4475-8592-3076c2c2c2ae] session._wait_for_task(vmdk_copy_task) [ 1752.161760] env[67169]: ERROR nova.compute.manager [instance: 883a792f-ae72-4475-8592-3076c2c2c2ae] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1752.161760] env[67169]: ERROR nova.compute.manager [instance: 883a792f-ae72-4475-8592-3076c2c2c2ae] return self.wait_for_task(task_ref) [ 1752.161760] env[67169]: ERROR nova.compute.manager [instance: 883a792f-ae72-4475-8592-3076c2c2c2ae] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1752.161760] env[67169]: ERROR nova.compute.manager [instance: 883a792f-ae72-4475-8592-3076c2c2c2ae] return evt.wait() [ 1752.161760] env[67169]: ERROR nova.compute.manager [instance: 883a792f-ae72-4475-8592-3076c2c2c2ae] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1752.161760] env[67169]: ERROR nova.compute.manager [instance: 883a792f-ae72-4475-8592-3076c2c2c2ae] result = hub.switch() [ 1752.161760] env[67169]: ERROR nova.compute.manager [instance: 883a792f-ae72-4475-8592-3076c2c2c2ae] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1752.161760] env[67169]: ERROR nova.compute.manager [instance: 883a792f-ae72-4475-8592-3076c2c2c2ae] return self.greenlet.switch() [ 1752.161760] env[67169]: ERROR nova.compute.manager [instance: 883a792f-ae72-4475-8592-3076c2c2c2ae] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1752.161760] env[67169]: ERROR nova.compute.manager [instance: 883a792f-ae72-4475-8592-3076c2c2c2ae] self.f(*self.args, **self.kw) [ 1752.161760] env[67169]: ERROR nova.compute.manager [instance: 883a792f-ae72-4475-8592-3076c2c2c2ae] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1752.161760] env[67169]: ERROR nova.compute.manager [instance: 883a792f-ae72-4475-8592-3076c2c2c2ae] raise exceptions.translate_fault(task_info.error) [ 1752.161760] env[67169]: ERROR nova.compute.manager [instance: 883a792f-ae72-4475-8592-3076c2c2c2ae] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1752.161760] env[67169]: ERROR nova.compute.manager [instance: 883a792f-ae72-4475-8592-3076c2c2c2ae] Faults: ['InvalidArgument'] [ 1752.161760] env[67169]: ERROR nova.compute.manager [instance: 883a792f-ae72-4475-8592-3076c2c2c2ae] [ 1752.162742] env[67169]: DEBUG nova.compute.utils [None req-4835c22e-883d-43d2-82df-7b3fc294d6b4 tempest-ImagesOneServerNegativeTestJSON-20346663 tempest-ImagesOneServerNegativeTestJSON-20346663-project-member] [instance: 883a792f-ae72-4475-8592-3076c2c2c2ae] VimFaultException {{(pid=67169) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1752.164193] env[67169]: DEBUG nova.compute.manager [None req-4835c22e-883d-43d2-82df-7b3fc294d6b4 tempest-ImagesOneServerNegativeTestJSON-20346663 tempest-ImagesOneServerNegativeTestJSON-20346663-project-member] [instance: 883a792f-ae72-4475-8592-3076c2c2c2ae] Build of instance 883a792f-ae72-4475-8592-3076c2c2c2ae was re-scheduled: A specified parameter was not correct: fileType [ 1752.164193] env[67169]: Faults: ['InvalidArgument'] {{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1752.164581] env[67169]: DEBUG nova.compute.manager [None 
req-4835c22e-883d-43d2-82df-7b3fc294d6b4 tempest-ImagesOneServerNegativeTestJSON-20346663 tempest-ImagesOneServerNegativeTestJSON-20346663-project-member] [instance: 883a792f-ae72-4475-8592-3076c2c2c2ae] Unplugging VIFs for instance {{(pid=67169) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1752.164753] env[67169]: DEBUG nova.compute.manager [None req-4835c22e-883d-43d2-82df-7b3fc294d6b4 tempest-ImagesOneServerNegativeTestJSON-20346663 tempest-ImagesOneServerNegativeTestJSON-20346663-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=67169) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1752.164926] env[67169]: DEBUG nova.compute.manager [None req-4835c22e-883d-43d2-82df-7b3fc294d6b4 tempest-ImagesOneServerNegativeTestJSON-20346663 tempest-ImagesOneServerNegativeTestJSON-20346663-project-member] [instance: 883a792f-ae72-4475-8592-3076c2c2c2ae] Deallocating network for instance {{(pid=67169) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1752.165101] env[67169]: DEBUG nova.network.neutron [None req-4835c22e-883d-43d2-82df-7b3fc294d6b4 tempest-ImagesOneServerNegativeTestJSON-20346663 tempest-ImagesOneServerNegativeTestJSON-20346663-project-member] [instance: 883a792f-ae72-4475-8592-3076c2c2c2ae] deallocate_for_instance() {{(pid=67169) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1752.690522] env[67169]: DEBUG nova.network.neutron [None req-4835c22e-883d-43d2-82df-7b3fc294d6b4 tempest-ImagesOneServerNegativeTestJSON-20346663 tempest-ImagesOneServerNegativeTestJSON-20346663-project-member] [instance: 883a792f-ae72-4475-8592-3076c2c2c2ae] Updating instance_info_cache with network_info: [] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1752.704110] env[67169]: INFO nova.compute.manager [None req-4835c22e-883d-43d2-82df-7b3fc294d6b4 
tempest-ImagesOneServerNegativeTestJSON-20346663 tempest-ImagesOneServerNegativeTestJSON-20346663-project-member] [instance: 883a792f-ae72-4475-8592-3076c2c2c2ae] Took 0.54 seconds to deallocate network for instance. [ 1752.802458] env[67169]: INFO nova.scheduler.client.report [None req-4835c22e-883d-43d2-82df-7b3fc294d6b4 tempest-ImagesOneServerNegativeTestJSON-20346663 tempest-ImagesOneServerNegativeTestJSON-20346663-project-member] Deleted allocations for instance 883a792f-ae72-4475-8592-3076c2c2c2ae [ 1752.823104] env[67169]: DEBUG oslo_concurrency.lockutils [None req-4835c22e-883d-43d2-82df-7b3fc294d6b4 tempest-ImagesOneServerNegativeTestJSON-20346663 tempest-ImagesOneServerNegativeTestJSON-20346663-project-member] Lock "883a792f-ae72-4475-8592-3076c2c2c2ae" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 626.579s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1752.824475] env[67169]: DEBUG oslo_concurrency.lockutils [None req-faad8e12-a376-4beb-b3bb-9456f3e9bcbb tempest-ImagesOneServerNegativeTestJSON-20346663 tempest-ImagesOneServerNegativeTestJSON-20346663-project-member] Lock "883a792f-ae72-4475-8592-3076c2c2c2ae" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 430.941s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1752.824614] env[67169]: DEBUG oslo_concurrency.lockutils [None req-faad8e12-a376-4beb-b3bb-9456f3e9bcbb tempest-ImagesOneServerNegativeTestJSON-20346663 tempest-ImagesOneServerNegativeTestJSON-20346663-project-member] Acquiring lock "883a792f-ae72-4475-8592-3076c2c2c2ae-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1752.824742] env[67169]: DEBUG 
oslo_concurrency.lockutils [None req-faad8e12-a376-4beb-b3bb-9456f3e9bcbb tempest-ImagesOneServerNegativeTestJSON-20346663 tempest-ImagesOneServerNegativeTestJSON-20346663-project-member] Lock "883a792f-ae72-4475-8592-3076c2c2c2ae-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1752.825188] env[67169]: DEBUG oslo_concurrency.lockutils [None req-faad8e12-a376-4beb-b3bb-9456f3e9bcbb tempest-ImagesOneServerNegativeTestJSON-20346663 tempest-ImagesOneServerNegativeTestJSON-20346663-project-member] Lock "883a792f-ae72-4475-8592-3076c2c2c2ae-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1752.827283] env[67169]: INFO nova.compute.manager [None req-faad8e12-a376-4beb-b3bb-9456f3e9bcbb tempest-ImagesOneServerNegativeTestJSON-20346663 tempest-ImagesOneServerNegativeTestJSON-20346663-project-member] [instance: 883a792f-ae72-4475-8592-3076c2c2c2ae] Terminating instance [ 1752.829063] env[67169]: DEBUG nova.compute.manager [None req-faad8e12-a376-4beb-b3bb-9456f3e9bcbb tempest-ImagesOneServerNegativeTestJSON-20346663 tempest-ImagesOneServerNegativeTestJSON-20346663-project-member] [instance: 883a792f-ae72-4475-8592-3076c2c2c2ae] Start destroying the instance on the hypervisor. 
{{(pid=67169) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1752.829266] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-faad8e12-a376-4beb-b3bb-9456f3e9bcbb tempest-ImagesOneServerNegativeTestJSON-20346663 tempest-ImagesOneServerNegativeTestJSON-20346663-project-member] [instance: 883a792f-ae72-4475-8592-3076c2c2c2ae] Destroying instance {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1752.829770] env[67169]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-cc66c56a-fe88-4f06-814f-efe6b8facc3a {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1752.840486] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c8840e30-4a67-4404-91e7-01b16a4b0a05 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1752.851125] env[67169]: DEBUG nova.compute.manager [None req-0a77388e-8384-4a55-b3b4-6cd525a7172b tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: 9435574d-2128-4b20-ba92-ee2aba37d33b] Starting instance... {{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1752.872993] env[67169]: WARNING nova.virt.vmwareapi.vmops [None req-faad8e12-a376-4beb-b3bb-9456f3e9bcbb tempest-ImagesOneServerNegativeTestJSON-20346663 tempest-ImagesOneServerNegativeTestJSON-20346663-project-member] [instance: 883a792f-ae72-4475-8592-3076c2c2c2ae] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 883a792f-ae72-4475-8592-3076c2c2c2ae could not be found. 
[ 1752.873176] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-faad8e12-a376-4beb-b3bb-9456f3e9bcbb tempest-ImagesOneServerNegativeTestJSON-20346663 tempest-ImagesOneServerNegativeTestJSON-20346663-project-member] [instance: 883a792f-ae72-4475-8592-3076c2c2c2ae] Instance destroyed {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1752.873367] env[67169]: INFO nova.compute.manager [None req-faad8e12-a376-4beb-b3bb-9456f3e9bcbb tempest-ImagesOneServerNegativeTestJSON-20346663 tempest-ImagesOneServerNegativeTestJSON-20346663-project-member] [instance: 883a792f-ae72-4475-8592-3076c2c2c2ae] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1752.873617] env[67169]: DEBUG oslo.service.loopingcall [None req-faad8e12-a376-4beb-b3bb-9456f3e9bcbb tempest-ImagesOneServerNegativeTestJSON-20346663 tempest-ImagesOneServerNegativeTestJSON-20346663-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=67169) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1752.873885] env[67169]: DEBUG nova.compute.manager [-] [instance: 883a792f-ae72-4475-8592-3076c2c2c2ae] Deallocating network for instance {{(pid=67169) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1752.874009] env[67169]: DEBUG nova.network.neutron [-] [instance: 883a792f-ae72-4475-8592-3076c2c2c2ae] deallocate_for_instance() {{(pid=67169) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1752.906608] env[67169]: DEBUG nova.network.neutron [-] [instance: 883a792f-ae72-4475-8592-3076c2c2c2ae] Updating instance_info_cache with network_info: [] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1752.910787] env[67169]: DEBUG oslo_concurrency.lockutils [None req-0a77388e-8384-4a55-b3b4-6cd525a7172b tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1752.911049] env[67169]: DEBUG oslo_concurrency.lockutils [None req-0a77388e-8384-4a55-b3b4-6cd525a7172b tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1752.912641] env[67169]: INFO nova.compute.claims [None req-0a77388e-8384-4a55-b3b4-6cd525a7172b tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: 9435574d-2128-4b20-ba92-ee2aba37d33b] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1752.916112] env[67169]: 
INFO nova.compute.manager [-] [instance: 883a792f-ae72-4475-8592-3076c2c2c2ae] Took 0.04 seconds to deallocate network for instance. [ 1753.014474] env[67169]: DEBUG oslo_concurrency.lockutils [None req-faad8e12-a376-4beb-b3bb-9456f3e9bcbb tempest-ImagesOneServerNegativeTestJSON-20346663 tempest-ImagesOneServerNegativeTestJSON-20346663-project-member] Lock "883a792f-ae72-4475-8592-3076c2c2c2ae" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.190s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1753.153155] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fccdfec6-ee03-4908-b601-73e8ac40dddd {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1753.161091] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e9348aee-c6f9-435d-b3d6-6db39539ef75 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1753.191147] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7cb2d8da-fc07-4054-b03e-c1f48de18143 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1753.198991] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d95d501c-3fd6-42cd-81a8-85c817d630b0 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1753.211974] env[67169]: DEBUG nova.compute.provider_tree [None req-0a77388e-8384-4a55-b3b4-6cd525a7172b tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Inventory has not changed in ProviderTree for provider: 
6570906a-ac37-4859-b1f2-4bbacc48d3f3 {{(pid=67169) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1753.221857] env[67169]: DEBUG nova.scheduler.client.report [None req-0a77388e-8384-4a55-b3b4-6cd525a7172b tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Inventory has not changed for provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 171, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67169) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1753.238511] env[67169]: DEBUG oslo_concurrency.lockutils [None req-0a77388e-8384-4a55-b3b4-6cd525a7172b tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.327s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1753.239045] env[67169]: DEBUG nova.compute.manager [None req-0a77388e-8384-4a55-b3b4-6cd525a7172b tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: 9435574d-2128-4b20-ba92-ee2aba37d33b] Start building networks asynchronously for instance. 
{{(pid=67169) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1753.275300] env[67169]: DEBUG nova.compute.utils [None req-0a77388e-8384-4a55-b3b4-6cd525a7172b tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Using /dev/sd instead of None {{(pid=67169) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1753.276489] env[67169]: DEBUG nova.compute.manager [None req-0a77388e-8384-4a55-b3b4-6cd525a7172b tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: 9435574d-2128-4b20-ba92-ee2aba37d33b] Allocating IP information in the background. {{(pid=67169) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1753.276658] env[67169]: DEBUG nova.network.neutron [None req-0a77388e-8384-4a55-b3b4-6cd525a7172b tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: 9435574d-2128-4b20-ba92-ee2aba37d33b] allocate_for_instance() {{(pid=67169) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1753.286752] env[67169]: DEBUG nova.compute.manager [None req-0a77388e-8384-4a55-b3b4-6cd525a7172b tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: 9435574d-2128-4b20-ba92-ee2aba37d33b] Start building block device mappings for instance. {{(pid=67169) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1753.354928] env[67169]: DEBUG nova.compute.manager [None req-0a77388e-8384-4a55-b3b4-6cd525a7172b tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: 9435574d-2128-4b20-ba92-ee2aba37d33b] Start spawning the instance on the hypervisor. 
{{(pid=67169) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1753.368489] env[67169]: DEBUG nova.policy [None req-0a77388e-8384-4a55-b3b4-6cd525a7172b tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ce57069286b34b5da298e9b01f4bd39e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '3d3275803e654637b85c8f15583e2e25', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67169) authorize /opt/stack/nova/nova/policy.py:203}} [ 1753.382963] env[67169]: DEBUG nova.virt.hardware [None req-0a77388e-8384-4a55-b3b4-6cd525a7172b tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-03-06T19:55:10Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-03-06T19:54:55Z,direct_url=,disk_format='vmdk',id=285931c9-8b83-4997-8c4d-6a79005e36ba,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c31f6504bb73492890b262ff43fdf9bc',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-03-06T19:54:55Z,virtual_size=,visibility=), allow threads: False {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1753.382963] env[67169]: DEBUG nova.virt.hardware [None 
req-0a77388e-8384-4a55-b3b4-6cd525a7172b tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Flavor limits 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1753.382963] env[67169]: DEBUG nova.virt.hardware [None req-0a77388e-8384-4a55-b3b4-6cd525a7172b tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Image limits 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1753.382963] env[67169]: DEBUG nova.virt.hardware [None req-0a77388e-8384-4a55-b3b4-6cd525a7172b tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Flavor pref 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1753.382963] env[67169]: DEBUG nova.virt.hardware [None req-0a77388e-8384-4a55-b3b4-6cd525a7172b tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Image pref 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1753.382963] env[67169]: DEBUG nova.virt.hardware [None req-0a77388e-8384-4a55-b3b4-6cd525a7172b tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1753.382963] env[67169]: DEBUG nova.virt.hardware [None req-0a77388e-8384-4a55-b3b4-6cd525a7172b tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1753.382963] 
env[67169]: DEBUG nova.virt.hardware [None req-0a77388e-8384-4a55-b3b4-6cd525a7172b tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67169) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1753.382963] env[67169]: DEBUG nova.virt.hardware [None req-0a77388e-8384-4a55-b3b4-6cd525a7172b tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Got 1 possible topologies {{(pid=67169) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1753.382963] env[67169]: DEBUG nova.virt.hardware [None req-0a77388e-8384-4a55-b3b4-6cd525a7172b tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1753.382963] env[67169]: DEBUG nova.virt.hardware [None req-0a77388e-8384-4a55-b3b4-6cd525a7172b tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1753.383415] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0650546e-c211-4671-8ad0-9c81dd773b8c {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1753.391328] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7da03cad-b454-4c07-bc2b-c92663002ad7 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1753.659016] env[67169]: DEBUG oslo_service.periodic_task [None 
req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1753.659220] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Cleaning up deleted instances with incomplete migration {{(pid=67169) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11236}} [ 1753.741305] env[67169]: DEBUG nova.network.neutron [None req-0a77388e-8384-4a55-b3b4-6cd525a7172b tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: 9435574d-2128-4b20-ba92-ee2aba37d33b] Successfully created port: 5844372a-1732-403c-98a9-7b3161c818a4 {{(pid=67169) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1754.733189] env[67169]: DEBUG nova.compute.manager [req-ef87fc82-567a-4155-b0e7-99309a9ef4a7 req-76309830-f521-45b3-90ea-8fea7829e848 service nova] [instance: 9435574d-2128-4b20-ba92-ee2aba37d33b] Received event network-vif-plugged-5844372a-1732-403c-98a9-7b3161c818a4 {{(pid=67169) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1754.733475] env[67169]: DEBUG oslo_concurrency.lockutils [req-ef87fc82-567a-4155-b0e7-99309a9ef4a7 req-76309830-f521-45b3-90ea-8fea7829e848 service nova] Acquiring lock "9435574d-2128-4b20-ba92-ee2aba37d33b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1754.733702] env[67169]: DEBUG oslo_concurrency.lockutils [req-ef87fc82-567a-4155-b0e7-99309a9ef4a7 req-76309830-f521-45b3-90ea-8fea7829e848 service nova] Lock "9435574d-2128-4b20-ba92-ee2aba37d33b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67169) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1754.733939] env[67169]: DEBUG oslo_concurrency.lockutils [req-ef87fc82-567a-4155-b0e7-99309a9ef4a7 req-76309830-f521-45b3-90ea-8fea7829e848 service nova] Lock "9435574d-2128-4b20-ba92-ee2aba37d33b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1754.734231] env[67169]: DEBUG nova.compute.manager [req-ef87fc82-567a-4155-b0e7-99309a9ef4a7 req-76309830-f521-45b3-90ea-8fea7829e848 service nova] [instance: 9435574d-2128-4b20-ba92-ee2aba37d33b] No waiting events found dispatching network-vif-plugged-5844372a-1732-403c-98a9-7b3161c818a4 {{(pid=67169) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1754.734448] env[67169]: WARNING nova.compute.manager [req-ef87fc82-567a-4155-b0e7-99309a9ef4a7 req-76309830-f521-45b3-90ea-8fea7829e848 service nova] [instance: 9435574d-2128-4b20-ba92-ee2aba37d33b] Received unexpected event network-vif-plugged-5844372a-1732-403c-98a9-7b3161c818a4 for instance with vm_state building and task_state spawning. 
[ 1754.736988] env[67169]: DEBUG nova.network.neutron [None req-0a77388e-8384-4a55-b3b4-6cd525a7172b tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: 9435574d-2128-4b20-ba92-ee2aba37d33b] Successfully updated port: 5844372a-1732-403c-98a9-7b3161c818a4 {{(pid=67169) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1754.750295] env[67169]: DEBUG oslo_concurrency.lockutils [None req-0a77388e-8384-4a55-b3b4-6cd525a7172b tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Acquiring lock "refresh_cache-9435574d-2128-4b20-ba92-ee2aba37d33b" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1754.750439] env[67169]: DEBUG oslo_concurrency.lockutils [None req-0a77388e-8384-4a55-b3b4-6cd525a7172b tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Acquired lock "refresh_cache-9435574d-2128-4b20-ba92-ee2aba37d33b" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1754.750583] env[67169]: DEBUG nova.network.neutron [None req-0a77388e-8384-4a55-b3b4-6cd525a7172b tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: 9435574d-2128-4b20-ba92-ee2aba37d33b] Building network info cache for instance {{(pid=67169) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1754.798205] env[67169]: DEBUG nova.network.neutron [None req-0a77388e-8384-4a55-b3b4-6cd525a7172b tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: 9435574d-2128-4b20-ba92-ee2aba37d33b] Instance cache missing network info. 
{{(pid=67169) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1754.996561] env[67169]: DEBUG nova.network.neutron [None req-0a77388e-8384-4a55-b3b4-6cd525a7172b tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: 9435574d-2128-4b20-ba92-ee2aba37d33b] Updating instance_info_cache with network_info: [{"id": "5844372a-1732-403c-98a9-7b3161c818a4", "address": "fa:16:3e:29:d2:9d", "network": {"id": "4e24bc87-3a15-4231-a607-f93bb9122dca", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-93817792-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "3d3275803e654637b85c8f15583e2e25", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "298bb8ef-4765-494c-b157-7a349218bd1e", "external-id": "nsx-vlan-transportzone-905", "segmentation_id": 905, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap5844372a-17", "ovs_interfaceid": "5844372a-1732-403c-98a9-7b3161c818a4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1755.010323] env[67169]: DEBUG oslo_concurrency.lockutils [None req-0a77388e-8384-4a55-b3b4-6cd525a7172b tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Releasing lock "refresh_cache-9435574d-2128-4b20-ba92-ee2aba37d33b" {{(pid=67169) lock 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1755.010777] env[67169]: DEBUG nova.compute.manager [None req-0a77388e-8384-4a55-b3b4-6cd525a7172b tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: 9435574d-2128-4b20-ba92-ee2aba37d33b] Instance network_info: |[{"id": "5844372a-1732-403c-98a9-7b3161c818a4", "address": "fa:16:3e:29:d2:9d", "network": {"id": "4e24bc87-3a15-4231-a607-f93bb9122dca", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-93817792-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "3d3275803e654637b85c8f15583e2e25", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "298bb8ef-4765-494c-b157-7a349218bd1e", "external-id": "nsx-vlan-transportzone-905", "segmentation_id": 905, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap5844372a-17", "ovs_interfaceid": "5844372a-1732-403c-98a9-7b3161c818a4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67169) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1755.011064] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-0a77388e-8384-4a55-b3b4-6cd525a7172b tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: 9435574d-2128-4b20-ba92-ee2aba37d33b] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:29:d2:9d', 'network_ref': {'type': 
'OpaqueNetwork', 'network-id': '298bb8ef-4765-494c-b157-7a349218bd1e', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '5844372a-1732-403c-98a9-7b3161c818a4', 'vif_model': 'vmxnet3'}] {{(pid=67169) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1755.018731] env[67169]: DEBUG oslo.service.loopingcall [None req-0a77388e-8384-4a55-b3b4-6cd525a7172b tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67169) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1755.019308] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 9435574d-2128-4b20-ba92-ee2aba37d33b] Creating VM on the ESX host {{(pid=67169) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1755.019570] env[67169]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-0eb27051-d101-4e8f-be3c-eab825664a65 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1755.039685] env[67169]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1755.039685] env[67169]: value = "task-2819233" [ 1755.039685] env[67169]: _type = "Task" [ 1755.039685] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1755.047258] env[67169]: DEBUG oslo_vmware.api [-] Task: {'id': task-2819233, 'name': CreateVM_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1755.550252] env[67169]: DEBUG oslo_vmware.api [-] Task: {'id': task-2819233, 'name': CreateVM_Task, 'duration_secs': 0.291071} completed successfully. 
{{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1755.550436] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 9435574d-2128-4b20-ba92-ee2aba37d33b] Created VM on the ESX host {{(pid=67169) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1755.551126] env[67169]: DEBUG oslo_concurrency.lockutils [None req-0a77388e-8384-4a55-b3b4-6cd525a7172b tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1755.551304] env[67169]: DEBUG oslo_concurrency.lockutils [None req-0a77388e-8384-4a55-b3b4-6cd525a7172b tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Acquired lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1755.551627] env[67169]: DEBUG oslo_concurrency.lockutils [None req-0a77388e-8384-4a55-b3b4-6cd525a7172b tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1755.551880] env[67169]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-2347e769-970d-4af0-86f9-0ded18cc92ed {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1755.556164] env[67169]: DEBUG oslo_vmware.api [None req-0a77388e-8384-4a55-b3b4-6cd525a7172b tempest-AttachInterfacesTestJSON-1789409843 
tempest-AttachInterfacesTestJSON-1789409843-project-member] Waiting for the task: (returnval){ [ 1755.556164] env[67169]: value = "session[52518c09-e5f2-035d-4dc1-3e29ddf94015]5263fbe9-e524-ab4b-eabc-e24d16ae4c73" [ 1755.556164] env[67169]: _type = "Task" [ 1755.556164] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1755.563545] env[67169]: DEBUG oslo_vmware.api [None req-0a77388e-8384-4a55-b3b4-6cd525a7172b tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Task: {'id': session[52518c09-e5f2-035d-4dc1-3e29ddf94015]5263fbe9-e524-ab4b-eabc-e24d16ae4c73, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1756.066712] env[67169]: DEBUG oslo_concurrency.lockutils [None req-0a77388e-8384-4a55-b3b4-6cd525a7172b tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Releasing lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1756.067055] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-0a77388e-8384-4a55-b3b4-6cd525a7172b tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: 9435574d-2128-4b20-ba92-ee2aba37d33b] Processing image 285931c9-8b83-4997-8c4d-6a79005e36ba {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1756.067239] env[67169]: DEBUG oslo_concurrency.lockutils [None req-0a77388e-8384-4a55-b3b4-6cd525a7172b tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk" 
{{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1756.758894] env[67169]: DEBUG nova.compute.manager [req-a741be56-46df-446b-a180-ab8966227c21 req-63e285a2-1654-4823-8454-be4162898857 service nova] [instance: 9435574d-2128-4b20-ba92-ee2aba37d33b] Received event network-changed-5844372a-1732-403c-98a9-7b3161c818a4 {{(pid=67169) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1756.759132] env[67169]: DEBUG nova.compute.manager [req-a741be56-46df-446b-a180-ab8966227c21 req-63e285a2-1654-4823-8454-be4162898857 service nova] [instance: 9435574d-2128-4b20-ba92-ee2aba37d33b] Refreshing instance network info cache due to event network-changed-5844372a-1732-403c-98a9-7b3161c818a4. {{(pid=67169) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1756.759331] env[67169]: DEBUG oslo_concurrency.lockutils [req-a741be56-46df-446b-a180-ab8966227c21 req-63e285a2-1654-4823-8454-be4162898857 service nova] Acquiring lock "refresh_cache-9435574d-2128-4b20-ba92-ee2aba37d33b" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1756.759484] env[67169]: DEBUG oslo_concurrency.lockutils [req-a741be56-46df-446b-a180-ab8966227c21 req-63e285a2-1654-4823-8454-be4162898857 service nova] Acquired lock "refresh_cache-9435574d-2128-4b20-ba92-ee2aba37d33b" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1756.759648] env[67169]: DEBUG nova.network.neutron [req-a741be56-46df-446b-a180-ab8966227c21 req-63e285a2-1654-4823-8454-be4162898857 service nova] [instance: 9435574d-2128-4b20-ba92-ee2aba37d33b] Refreshing network info cache for port 5844372a-1732-403c-98a9-7b3161c818a4 {{(pid=67169) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1757.008308] env[67169]: DEBUG nova.network.neutron [req-a741be56-46df-446b-a180-ab8966227c21 
req-63e285a2-1654-4823-8454-be4162898857 service nova] [instance: 9435574d-2128-4b20-ba92-ee2aba37d33b] Updated VIF entry in instance network info cache for port 5844372a-1732-403c-98a9-7b3161c818a4. {{(pid=67169) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1757.008743] env[67169]: DEBUG nova.network.neutron [req-a741be56-46df-446b-a180-ab8966227c21 req-63e285a2-1654-4823-8454-be4162898857 service nova] [instance: 9435574d-2128-4b20-ba92-ee2aba37d33b] Updating instance_info_cache with network_info: [{"id": "5844372a-1732-403c-98a9-7b3161c818a4", "address": "fa:16:3e:29:d2:9d", "network": {"id": "4e24bc87-3a15-4231-a607-f93bb9122dca", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-93817792-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "3d3275803e654637b85c8f15583e2e25", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "298bb8ef-4765-494c-b157-7a349218bd1e", "external-id": "nsx-vlan-transportzone-905", "segmentation_id": 905, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap5844372a-17", "ovs_interfaceid": "5844372a-1732-403c-98a9-7b3161c818a4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1757.018079] env[67169]: DEBUG oslo_concurrency.lockutils [req-a741be56-46df-446b-a180-ab8966227c21 req-63e285a2-1654-4823-8454-be4162898857 service nova] Releasing lock 
"refresh_cache-9435574d-2128-4b20-ba92-ee2aba37d33b" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1768.863595] env[67169]: DEBUG oslo_concurrency.lockutils [None req-1e4472df-f6c8-4cd8-9c36-fe0c36e8321c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Acquiring lock "220daf5b-b4fd-49b0-9098-c1f846d6e552" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1768.864127] env[67169]: DEBUG oslo_concurrency.lockutils [None req-1e4472df-f6c8-4cd8-9c36-fe0c36e8321c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Lock "220daf5b-b4fd-49b0-9098-c1f846d6e552" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1770.225676] env[67169]: DEBUG oslo_concurrency.lockutils [None req-5368871f-3c60-4f1a-9c77-d999af4747d6 tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Acquiring lock "9435574d-2128-4b20-ba92-ee2aba37d33b" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1796.668452] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1798.659542] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task 
ComputeManager._heal_instance_info_cache {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1798.659970] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Starting heal instance info cache {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1798.659970] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Rebuilding the list of instances to heal {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1798.683351] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 48376572-9e3a-4579-b2d7-b8b63312fab1] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1798.683654] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 74ea66f0-391c-437b-8aee-f784528d7963] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1798.683820] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1798.683952] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 2e156908-c313-4229-840d-13ed8e6d4074] Skipping network cache update for instance because it is Building. 
{{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1798.684093] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 2d7d3386-9854-4bf1-a680-5aed0a2329cb] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1798.684218] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: fa24a4a8-895c-4ea6-8e0a-4ed1134beff0] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1798.684339] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 04d3ae51-f3f1-427b-ae45-279b02e4b3e6] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1798.684457] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: aedbfde6-26e1-410d-a311-e2c344f65062] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1798.684573] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: c05c3ec2-a68d-41b0-a199-fcfc84bb2deb] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1798.684685] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 9435574d-2128-4b20-ba92-ee2aba37d33b] Skipping network cache update for instance because it is Building. 
{{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1798.684836] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Didn't find any instances for network info cache update. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1800.710951] env[67169]: WARNING oslo_vmware.rw_handles [None req-69385497-98b5-454b-af8f-b152ac80f6f3 tempest-ServerExternalEventsTest-1384768552 tempest-ServerExternalEventsTest-1384768552-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1800.710951] env[67169]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1800.710951] env[67169]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1800.710951] env[67169]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1800.710951] env[67169]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1800.710951] env[67169]: ERROR oslo_vmware.rw_handles response.begin() [ 1800.710951] env[67169]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1800.710951] env[67169]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1800.710951] env[67169]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1800.710951] env[67169]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1800.710951] env[67169]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1800.710951] env[67169]: ERROR oslo_vmware.rw_handles [ 1800.711656] env[67169]: DEBUG nova.virt.vmwareapi.images [None req-69385497-98b5-454b-af8f-b152ac80f6f3 
tempest-ServerExternalEventsTest-1384768552 tempest-ServerExternalEventsTest-1384768552-project-member] [instance: 74ea66f0-391c-437b-8aee-f784528d7963] Downloaded image file data 285931c9-8b83-4997-8c4d-6a79005e36ba to vmware_temp/2ff600f3-188a-45f1-959e-bbadda6d695c/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk on the data store datastore2 {{(pid=67169) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1800.713207] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-69385497-98b5-454b-af8f-b152ac80f6f3 tempest-ServerExternalEventsTest-1384768552 tempest-ServerExternalEventsTest-1384768552-project-member] [instance: 74ea66f0-391c-437b-8aee-f784528d7963] Caching image {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1800.713441] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [None req-69385497-98b5-454b-af8f-b152ac80f6f3 tempest-ServerExternalEventsTest-1384768552 tempest-ServerExternalEventsTest-1384768552-project-member] Copying Virtual Disk [datastore2] vmware_temp/2ff600f3-188a-45f1-959e-bbadda6d695c/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk to [datastore2] vmware_temp/2ff600f3-188a-45f1-959e-bbadda6d695c/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk {{(pid=67169) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1800.713729] env[67169]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-dc0be1cc-7dce-406f-958e-fd00797661dd {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1800.723392] env[67169]: DEBUG oslo_vmware.api [None req-69385497-98b5-454b-af8f-b152ac80f6f3 tempest-ServerExternalEventsTest-1384768552 tempest-ServerExternalEventsTest-1384768552-project-member] Waiting for the task: (returnval){ [ 1800.723392] env[67169]: value = "task-2819234" [ 1800.723392] env[67169]: _type = "Task" [ 1800.723392] env[67169]: } 
to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1800.731132] env[67169]: DEBUG oslo_vmware.api [None req-69385497-98b5-454b-af8f-b152ac80f6f3 tempest-ServerExternalEventsTest-1384768552 tempest-ServerExternalEventsTest-1384768552-project-member] Task: {'id': task-2819234, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1801.233724] env[67169]: DEBUG oslo_vmware.exceptions [None req-69385497-98b5-454b-af8f-b152ac80f6f3 tempest-ServerExternalEventsTest-1384768552 tempest-ServerExternalEventsTest-1384768552-project-member] Fault InvalidArgument not matched. {{(pid=67169) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1801.234038] env[67169]: DEBUG oslo_concurrency.lockutils [None req-69385497-98b5-454b-af8f-b152ac80f6f3 tempest-ServerExternalEventsTest-1384768552 tempest-ServerExternalEventsTest-1384768552-project-member] Releasing lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1801.234636] env[67169]: ERROR nova.compute.manager [None req-69385497-98b5-454b-af8f-b152ac80f6f3 tempest-ServerExternalEventsTest-1384768552 tempest-ServerExternalEventsTest-1384768552-project-member] [instance: 74ea66f0-391c-437b-8aee-f784528d7963] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1801.234636] env[67169]: Faults: ['InvalidArgument'] [ 1801.234636] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] Traceback (most recent call last): [ 1801.234636] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] File 
"/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1801.234636] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] yield resources [ 1801.234636] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1801.234636] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] self.driver.spawn(context, instance, image_meta, [ 1801.234636] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1801.234636] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1801.234636] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1801.234636] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] self._fetch_image_if_missing(context, vi) [ 1801.234636] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1801.234636] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] image_cache(vi, tmp_image_ds_loc) [ 1801.234636] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1801.234636] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] vm_util.copy_virtual_disk( [ 1801.234636] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] File 
"/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1801.234636] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] session._wait_for_task(vmdk_copy_task) [ 1801.234636] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1801.234636] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] return self.wait_for_task(task_ref) [ 1801.234636] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1801.234636] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] return evt.wait() [ 1801.234636] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1801.234636] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] result = hub.switch() [ 1801.234636] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1801.234636] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] return self.greenlet.switch() [ 1801.234636] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1801.234636] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] self.f(*self.args, **self.kw) [ 1801.234636] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1801.234636] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] raise exceptions.translate_fault(task_info.error) [ 1801.234636] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1801.234636] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] Faults: ['InvalidArgument'] [ 1801.234636] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] [ 1801.235637] env[67169]: INFO nova.compute.manager [None req-69385497-98b5-454b-af8f-b152ac80f6f3 tempest-ServerExternalEventsTest-1384768552 tempest-ServerExternalEventsTest-1384768552-project-member] [instance: 74ea66f0-391c-437b-8aee-f784528d7963] Terminating instance [ 1801.236589] env[67169]: DEBUG oslo_concurrency.lockutils [None req-5ef31fa3-d243-488f-9847-4509615e7a51 tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Acquired lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1801.236802] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-5ef31fa3-d243-488f-9847-4509615e7a51 tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1801.237050] env[67169]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-9d9551a1-7bc1-45b7-83cc-fffdaaed28b9 {{(pid=67169) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1801.239187] env[67169]: DEBUG nova.compute.manager [None req-69385497-98b5-454b-af8f-b152ac80f6f3 tempest-ServerExternalEventsTest-1384768552 tempest-ServerExternalEventsTest-1384768552-project-member] [instance: 74ea66f0-391c-437b-8aee-f784528d7963] Start destroying the instance on the hypervisor. {{(pid=67169) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1801.239379] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-69385497-98b5-454b-af8f-b152ac80f6f3 tempest-ServerExternalEventsTest-1384768552 tempest-ServerExternalEventsTest-1384768552-project-member] [instance: 74ea66f0-391c-437b-8aee-f784528d7963] Destroying instance {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1801.240101] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d2416981-be1d-4879-9e67-e5ed2ed72a05 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1801.246730] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-69385497-98b5-454b-af8f-b152ac80f6f3 tempest-ServerExternalEventsTest-1384768552 tempest-ServerExternalEventsTest-1384768552-project-member] [instance: 74ea66f0-391c-437b-8aee-f784528d7963] Unregistering the VM {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1801.246943] env[67169]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-9a0c7410-2b84-49c4-b2ec-e7bbbb627015 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1801.248971] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-5ef31fa3-d243-488f-9847-4509615e7a51 tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Created directory with path [datastore2] devstack-image-cache_base 
{{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1801.249169] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-5ef31fa3-d243-488f-9847-4509615e7a51 tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=67169) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1801.250123] env[67169]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-3126c6f8-c637-4e03-8123-368211ac7237 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1801.254450] env[67169]: DEBUG oslo_vmware.api [None req-5ef31fa3-d243-488f-9847-4509615e7a51 tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Waiting for the task: (returnval){ [ 1801.254450] env[67169]: value = "session[52518c09-e5f2-035d-4dc1-3e29ddf94015]522e75b3-3480-23ed-fbeb-b5fa96a6376f" [ 1801.254450] env[67169]: _type = "Task" [ 1801.254450] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1801.261356] env[67169]: DEBUG oslo_vmware.api [None req-5ef31fa3-d243-488f-9847-4509615e7a51 tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Task: {'id': session[52518c09-e5f2-035d-4dc1-3e29ddf94015]522e75b3-3480-23ed-fbeb-b5fa96a6376f, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1801.325200] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-69385497-98b5-454b-af8f-b152ac80f6f3 tempest-ServerExternalEventsTest-1384768552 tempest-ServerExternalEventsTest-1384768552-project-member] [instance: 74ea66f0-391c-437b-8aee-f784528d7963] Unregistered the VM {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1801.325412] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-69385497-98b5-454b-af8f-b152ac80f6f3 tempest-ServerExternalEventsTest-1384768552 tempest-ServerExternalEventsTest-1384768552-project-member] [instance: 74ea66f0-391c-437b-8aee-f784528d7963] Deleting contents of the VM from datastore datastore2 {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1801.325591] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-69385497-98b5-454b-af8f-b152ac80f6f3 tempest-ServerExternalEventsTest-1384768552 tempest-ServerExternalEventsTest-1384768552-project-member] Deleting the datastore file [datastore2] 74ea66f0-391c-437b-8aee-f784528d7963 {{(pid=67169) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1801.325853] env[67169]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-2a030cf2-d1d6-4415-9d18-b3586c54dcb9 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1801.331941] env[67169]: DEBUG oslo_vmware.api [None req-69385497-98b5-454b-af8f-b152ac80f6f3 tempest-ServerExternalEventsTest-1384768552 tempest-ServerExternalEventsTest-1384768552-project-member] Waiting for the task: (returnval){ [ 1801.331941] env[67169]: value = "task-2819236" [ 1801.331941] env[67169]: _type = "Task" [ 1801.331941] env[67169]: } to complete. 
{{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1801.340063] env[67169]: DEBUG oslo_vmware.api [None req-69385497-98b5-454b-af8f-b152ac80f6f3 tempest-ServerExternalEventsTest-1384768552 tempest-ServerExternalEventsTest-1384768552-project-member] Task: {'id': task-2819236, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1801.659106] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1801.769397] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-5ef31fa3-d243-488f-9847-4509615e7a51 tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: 48376572-9e3a-4579-b2d7-b8b63312fab1] Preparing fetch location {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1801.769830] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-5ef31fa3-d243-488f-9847-4509615e7a51 tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Creating directory with path [datastore2] vmware_temp/a4a4ee95-b009-400c-8ee7-10784e088f5c/285931c9-8b83-4997-8c4d-6a79005e36ba {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1801.770274] env[67169]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-411aee0d-196f-43e9-84a5-8adfcd26549b {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1801.783479] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-5ef31fa3-d243-488f-9847-4509615e7a51 tempest-AttachInterfacesTestJSON-1789409843 
tempest-AttachInterfacesTestJSON-1789409843-project-member] Created directory with path [datastore2] vmware_temp/a4a4ee95-b009-400c-8ee7-10784e088f5c/285931c9-8b83-4997-8c4d-6a79005e36ba {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1801.783775] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-5ef31fa3-d243-488f-9847-4509615e7a51 tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: 48376572-9e3a-4579-b2d7-b8b63312fab1] Fetch image to [datastore2] vmware_temp/a4a4ee95-b009-400c-8ee7-10784e088f5c/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1801.784115] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-5ef31fa3-d243-488f-9847-4509615e7a51 tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: 48376572-9e3a-4579-b2d7-b8b63312fab1] Downloading image file data 285931c9-8b83-4997-8c4d-6a79005e36ba to [datastore2] vmware_temp/a4a4ee95-b009-400c-8ee7-10784e088f5c/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk on the data store datastore2 {{(pid=67169) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1801.785326] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1456c63a-666b-4dd5-ab4d-a855eab52a6f {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1801.794009] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2208a955-6008-4e20-b9a4-14265bd17630 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1801.803232] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-059d021d-7082-4e02-be4a-118292e95ef0 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1801.832933] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ebf29db3-fedc-438a-bd9d-26416e89382d {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1801.844457] env[67169]: DEBUG oslo_vmware.api [None req-69385497-98b5-454b-af8f-b152ac80f6f3 tempest-ServerExternalEventsTest-1384768552 tempest-ServerExternalEventsTest-1384768552-project-member] Task: {'id': task-2819236, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.085235} completed successfully. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1801.844657] env[67169]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-8deab556-59d9-423b-809f-bf4ad8ad40b2 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1801.846303] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-69385497-98b5-454b-af8f-b152ac80f6f3 tempest-ServerExternalEventsTest-1384768552 tempest-ServerExternalEventsTest-1384768552-project-member] Deleted the datastore file {{(pid=67169) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1801.846493] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-69385497-98b5-454b-af8f-b152ac80f6f3 tempest-ServerExternalEventsTest-1384768552 tempest-ServerExternalEventsTest-1384768552-project-member] [instance: 74ea66f0-391c-437b-8aee-f784528d7963] Deleted contents of the VM from datastore datastore2 {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1801.846664] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-69385497-98b5-454b-af8f-b152ac80f6f3 tempest-ServerExternalEventsTest-1384768552 
tempest-ServerExternalEventsTest-1384768552-project-member] [instance: 74ea66f0-391c-437b-8aee-f784528d7963] Instance destroyed {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1801.846840] env[67169]: INFO nova.compute.manager [None req-69385497-98b5-454b-af8f-b152ac80f6f3 tempest-ServerExternalEventsTest-1384768552 tempest-ServerExternalEventsTest-1384768552-project-member] [instance: 74ea66f0-391c-437b-8aee-f784528d7963] Took 0.61 seconds to destroy the instance on the hypervisor. [ 1801.848965] env[67169]: DEBUG nova.compute.claims [None req-69385497-98b5-454b-af8f-b152ac80f6f3 tempest-ServerExternalEventsTest-1384768552 tempest-ServerExternalEventsTest-1384768552-project-member] [instance: 74ea66f0-391c-437b-8aee-f784528d7963] Aborting claim: {{(pid=67169) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1801.849158] env[67169]: DEBUG oslo_concurrency.lockutils [None req-69385497-98b5-454b-af8f-b152ac80f6f3 tempest-ServerExternalEventsTest-1384768552 tempest-ServerExternalEventsTest-1384768552-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1801.849380] env[67169]: DEBUG oslo_concurrency.lockutils [None req-69385497-98b5-454b-af8f-b152ac80f6f3 tempest-ServerExternalEventsTest-1384768552 tempest-ServerExternalEventsTest-1384768552-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1801.870663] env[67169]: DEBUG nova.virt.vmwareapi.images [None req-5ef31fa3-d243-488f-9847-4509615e7a51 tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: 48376572-9e3a-4579-b2d7-b8b63312fab1] Downloading image file 
data 285931c9-8b83-4997-8c4d-6a79005e36ba to the data store datastore2 {{(pid=67169) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1801.927561] env[67169]: DEBUG oslo_vmware.rw_handles [None req-5ef31fa3-d243-488f-9847-4509615e7a51 tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/a4a4ee95-b009-400c-8ee7-10784e088f5c/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=67169) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1801.990081] env[67169]: DEBUG oslo_vmware.rw_handles [None req-5ef31fa3-d243-488f-9847-4509615e7a51 tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Completed reading data from the image iterator. {{(pid=67169) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1801.990356] env[67169]: DEBUG oslo_vmware.rw_handles [None req-5ef31fa3-d243-488f-9847-4509615e7a51 tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/a4a4ee95-b009-400c-8ee7-10784e088f5c/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=67169) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1801.995133] env[67169]: DEBUG nova.scheduler.client.report [None req-69385497-98b5-454b-af8f-b152ac80f6f3 tempest-ServerExternalEventsTest-1384768552 tempest-ServerExternalEventsTest-1384768552-project-member] Refreshing inventories for resource provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 {{(pid=67169) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} [ 1802.009679] env[67169]: DEBUG nova.scheduler.client.report [None req-69385497-98b5-454b-af8f-b152ac80f6f3 tempest-ServerExternalEventsTest-1384768552 tempest-ServerExternalEventsTest-1384768552-project-member] Updating ProviderTree inventory for provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 171, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67169) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} [ 1802.009923] env[67169]: DEBUG nova.compute.provider_tree [None req-69385497-98b5-454b-af8f-b152ac80f6f3 tempest-ServerExternalEventsTest-1384768552 tempest-ServerExternalEventsTest-1384768552-project-member] Updating inventory in ProviderTree for provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 171, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67169) update_inventory 
/opt/stack/nova/nova/compute/provider_tree.py:176}} [ 1802.021512] env[67169]: DEBUG nova.scheduler.client.report [None req-69385497-98b5-454b-af8f-b152ac80f6f3 tempest-ServerExternalEventsTest-1384768552 tempest-ServerExternalEventsTest-1384768552-project-member] Refreshing aggregate associations for resource provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3, aggregates: None {{(pid=67169) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} [ 1802.039063] env[67169]: DEBUG nova.scheduler.client.report [None req-69385497-98b5-454b-af8f-b152ac80f6f3 tempest-ServerExternalEventsTest-1384768552 tempest-ServerExternalEventsTest-1384768552-project-member] Refreshing trait associations for resource provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3, traits: COMPUTE_NODE,COMPUTE_SAME_HOST_COLD_MIGRATE,COMPUTE_IMAGE_TYPE_VMDK,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_ISO {{(pid=67169) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} [ 1802.190974] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d05681d8-a6f1-4e99-b2d1-94b8ed97a0d9 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1802.197765] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d4d1f509-90d2-4e13-9152-76ab80c75ff2 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1802.228666] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-97a38068-b45f-4091-9e85-f3ef4a2e7436 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1802.235761] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ff5bbc10-c824-47c8-b784-0834120bdf30 {{(pid=67169) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1802.248536] env[67169]: DEBUG nova.compute.provider_tree [None req-69385497-98b5-454b-af8f-b152ac80f6f3 tempest-ServerExternalEventsTest-1384768552 tempest-ServerExternalEventsTest-1384768552-project-member] Inventory has not changed in ProviderTree for provider: 6570906a-ac37-4859-b1f2-4bbacc48d3f3 {{(pid=67169) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1802.279983] env[67169]: DEBUG nova.scheduler.client.report [None req-69385497-98b5-454b-af8f-b152ac80f6f3 tempest-ServerExternalEventsTest-1384768552 tempest-ServerExternalEventsTest-1384768552-project-member] Inventory has not changed for provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 171, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67169) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1802.294715] env[67169]: DEBUG oslo_concurrency.lockutils [None req-69385497-98b5-454b-af8f-b152ac80f6f3 tempest-ServerExternalEventsTest-1384768552 tempest-ServerExternalEventsTest-1384768552-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.445s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1802.295302] env[67169]: ERROR nova.compute.manager [None req-69385497-98b5-454b-af8f-b152ac80f6f3 tempest-ServerExternalEventsTest-1384768552 tempest-ServerExternalEventsTest-1384768552-project-member] [instance: 74ea66f0-391c-437b-8aee-f784528d7963] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A 
specified parameter was not correct: fileType [ 1802.295302] env[67169]: Faults: ['InvalidArgument'] [ 1802.295302] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] Traceback (most recent call last): [ 1802.295302] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1802.295302] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] self.driver.spawn(context, instance, image_meta, [ 1802.295302] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1802.295302] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1802.295302] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1802.295302] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] self._fetch_image_if_missing(context, vi) [ 1802.295302] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1802.295302] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] image_cache(vi, tmp_image_ds_loc) [ 1802.295302] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1802.295302] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] vm_util.copy_virtual_disk( [ 1802.295302] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] 
File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1802.295302] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] session._wait_for_task(vmdk_copy_task) [ 1802.295302] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1802.295302] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] return self.wait_for_task(task_ref) [ 1802.295302] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1802.295302] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] return evt.wait() [ 1802.295302] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1802.295302] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] result = hub.switch() [ 1802.295302] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1802.295302] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] return self.greenlet.switch() [ 1802.295302] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1802.295302] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] self.f(*self.args, **self.kw) [ 1802.295302] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1802.295302] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] raise exceptions.translate_fault(task_info.error) [ 1802.295302] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1802.295302] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] Faults: ['InvalidArgument'] [ 1802.295302] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] [ 1802.296302] env[67169]: DEBUG nova.compute.utils [None req-69385497-98b5-454b-af8f-b152ac80f6f3 tempest-ServerExternalEventsTest-1384768552 tempest-ServerExternalEventsTest-1384768552-project-member] [instance: 74ea66f0-391c-437b-8aee-f784528d7963] VimFaultException {{(pid=67169) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1802.297462] env[67169]: DEBUG nova.compute.manager [None req-69385497-98b5-454b-af8f-b152ac80f6f3 tempest-ServerExternalEventsTest-1384768552 tempest-ServerExternalEventsTest-1384768552-project-member] [instance: 74ea66f0-391c-437b-8aee-f784528d7963] Build of instance 74ea66f0-391c-437b-8aee-f784528d7963 was re-scheduled: A specified parameter was not correct: fileType [ 1802.297462] env[67169]: Faults: ['InvalidArgument'] {{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1802.297831] env[67169]: DEBUG nova.compute.manager [None req-69385497-98b5-454b-af8f-b152ac80f6f3 tempest-ServerExternalEventsTest-1384768552 tempest-ServerExternalEventsTest-1384768552-project-member] [instance: 74ea66f0-391c-437b-8aee-f784528d7963] Unplugging VIFs for instance {{(pid=67169) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1802.298009] env[67169]: DEBUG nova.compute.manager [None 
req-69385497-98b5-454b-af8f-b152ac80f6f3 tempest-ServerExternalEventsTest-1384768552 tempest-ServerExternalEventsTest-1384768552-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=67169) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1802.298254] env[67169]: DEBUG nova.compute.manager [None req-69385497-98b5-454b-af8f-b152ac80f6f3 tempest-ServerExternalEventsTest-1384768552 tempest-ServerExternalEventsTest-1384768552-project-member] [instance: 74ea66f0-391c-437b-8aee-f784528d7963] Deallocating network for instance {{(pid=67169) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1802.298443] env[67169]: DEBUG nova.network.neutron [None req-69385497-98b5-454b-af8f-b152ac80f6f3 tempest-ServerExternalEventsTest-1384768552 tempest-ServerExternalEventsTest-1384768552-project-member] [instance: 74ea66f0-391c-437b-8aee-f784528d7963] deallocate_for_instance() {{(pid=67169) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1802.408612] env[67169]: DEBUG neutronclient.v2_0.client [None req-69385497-98b5-454b-af8f-b152ac80f6f3 tempest-ServerExternalEventsTest-1384768552 tempest-ServerExternalEventsTest-1384768552-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=67169) _handle_fault_response /opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py:262}} [ 1802.409716] env[67169]: ERROR nova.compute.manager [None req-69385497-98b5-454b-af8f-b152ac80f6f3 tempest-ServerExternalEventsTest-1384768552 tempest-ServerExternalEventsTest-1384768552-project-member] [instance: 74ea66f0-391c-437b-8aee-f784528d7963] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized. 
[ 1802.409716] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] Traceback (most recent call last): [ 1802.409716] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1802.409716] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] self.driver.spawn(context, instance, image_meta, [ 1802.409716] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1802.409716] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1802.409716] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1802.409716] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] self._fetch_image_if_missing(context, vi) [ 1802.409716] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1802.409716] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] image_cache(vi, tmp_image_ds_loc) [ 1802.409716] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1802.409716] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] vm_util.copy_virtual_disk( [ 1802.409716] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1802.409716] 
env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] session._wait_for_task(vmdk_copy_task) [ 1802.409716] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1802.409716] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] return self.wait_for_task(task_ref) [ 1802.409716] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1802.409716] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] return evt.wait() [ 1802.409716] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1802.409716] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] result = hub.switch() [ 1802.409716] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1802.409716] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] return self.greenlet.switch() [ 1802.409716] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1802.409716] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] self.f(*self.args, **self.kw) [ 1802.409716] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 
1802.409716] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] raise exceptions.translate_fault(task_info.error) [ 1802.409716] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1802.409716] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] Faults: ['InvalidArgument'] [ 1802.409716] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] [ 1802.409716] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] During handling of the above exception, another exception occurred: [ 1802.409716] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] [ 1802.409716] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] Traceback (most recent call last): [ 1802.409716] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] File "/opt/stack/nova/nova/compute/manager.py", line 2430, in _do_build_and_run_instance [ 1802.410653] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] self._build_and_run_instance(context, instance, image, [ 1802.410653] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] File "/opt/stack/nova/nova/compute/manager.py", line 2722, in _build_and_run_instance [ 1802.410653] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] raise exception.RescheduledException( [ 1802.410653] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] nova.exception.RescheduledException: Build of instance 74ea66f0-391c-437b-8aee-f784528d7963 was re-scheduled: A specified parameter was not correct: fileType [ 1802.410653] env[67169]: ERROR nova.compute.manager [instance: 
74ea66f0-391c-437b-8aee-f784528d7963] Faults: ['InvalidArgument'] [ 1802.410653] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] [ 1802.410653] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] During handling of the above exception, another exception occurred: [ 1802.410653] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] [ 1802.410653] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] Traceback (most recent call last): [ 1802.410653] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1802.410653] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] ret = obj(*args, **kwargs) [ 1802.410653] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1802.410653] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] exception_handler_v20(status_code, error_body) [ 1802.410653] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1802.410653] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] raise client_exc(message=error_message, [ 1802.410653] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1802.410653] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] 
Neutron server returns request_ids: ['req-718d5f34-9374-42f3-8bc4-b3f561f471fc'] [ 1802.410653] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] [ 1802.410653] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] During handling of the above exception, another exception occurred: [ 1802.410653] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] [ 1802.410653] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] Traceback (most recent call last): [ 1802.410653] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] File "/opt/stack/nova/nova/compute/manager.py", line 3019, in _cleanup_allocated_networks [ 1802.410653] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] self._deallocate_network(context, instance, requested_networks) [ 1802.410653] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] File "/opt/stack/nova/nova/compute/manager.py", line 2265, in _deallocate_network [ 1802.410653] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] self.network_api.deallocate_for_instance( [ 1802.410653] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1802.410653] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] data = neutron.list_ports(**search_opts) [ 1802.410653] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1802.410653] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] ret = obj(*args, **kwargs) [ 1802.410653] env[67169]: ERROR nova.compute.manager [instance: 
74ea66f0-391c-437b-8aee-f784528d7963] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1802.410653] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] return self.list('ports', self.ports_path, retrieve_all, [ 1802.410653] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1802.410653] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] ret = obj(*args, **kwargs) [ 1802.410653] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1802.412109] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] for r in self._pagination(collection, path, **params): [ 1802.412109] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1802.412109] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] res = self.get(path, params=params) [ 1802.412109] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1802.412109] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] ret = obj(*args, **kwargs) [ 1802.412109] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1802.412109] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] return self.retry_request("GET", action, body=body, [ 
1802.412109] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1802.412109] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] ret = obj(*args, **kwargs) [ 1802.412109] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1802.412109] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] return self.do_request(method, action, body=body, [ 1802.412109] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1802.412109] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] ret = obj(*args, **kwargs) [ 1802.412109] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1802.412109] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] self._handle_fault_response(status_code, replybody, resp) [ 1802.412109] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper [ 1802.412109] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] raise exception.Unauthorized() [ 1802.412109] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] nova.exception.Unauthorized: Not authorized. 
[ 1802.412109] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] [ 1802.463591] env[67169]: INFO nova.scheduler.client.report [None req-69385497-98b5-454b-af8f-b152ac80f6f3 tempest-ServerExternalEventsTest-1384768552 tempest-ServerExternalEventsTest-1384768552-project-member] Deleted allocations for instance 74ea66f0-391c-437b-8aee-f784528d7963 [ 1802.481528] env[67169]: DEBUG oslo_concurrency.lockutils [None req-69385497-98b5-454b-af8f-b152ac80f6f3 tempest-ServerExternalEventsTest-1384768552 tempest-ServerExternalEventsTest-1384768552-project-member] Lock "74ea66f0-391c-437b-8aee-f784528d7963" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 608.767s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1802.482547] env[67169]: DEBUG oslo_concurrency.lockutils [None req-f8f562ca-4368-425e-8b80-e5b3c615553d tempest-ServerExternalEventsTest-1384768552 tempest-ServerExternalEventsTest-1384768552-project-member] Lock "74ea66f0-391c-437b-8aee-f784528d7963" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 413.098s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1802.482762] env[67169]: DEBUG oslo_concurrency.lockutils [None req-f8f562ca-4368-425e-8b80-e5b3c615553d tempest-ServerExternalEventsTest-1384768552 tempest-ServerExternalEventsTest-1384768552-project-member] Acquiring lock "74ea66f0-391c-437b-8aee-f784528d7963-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1802.482970] env[67169]: DEBUG oslo_concurrency.lockutils [None req-f8f562ca-4368-425e-8b80-e5b3c615553d tempest-ServerExternalEventsTest-1384768552 
tempest-ServerExternalEventsTest-1384768552-project-member] Lock "74ea66f0-391c-437b-8aee-f784528d7963-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1802.483151] env[67169]: DEBUG oslo_concurrency.lockutils [None req-f8f562ca-4368-425e-8b80-e5b3c615553d tempest-ServerExternalEventsTest-1384768552 tempest-ServerExternalEventsTest-1384768552-project-member] Lock "74ea66f0-391c-437b-8aee-f784528d7963-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1802.484958] env[67169]: INFO nova.compute.manager [None req-f8f562ca-4368-425e-8b80-e5b3c615553d tempest-ServerExternalEventsTest-1384768552 tempest-ServerExternalEventsTest-1384768552-project-member] [instance: 74ea66f0-391c-437b-8aee-f784528d7963] Terminating instance [ 1802.486598] env[67169]: DEBUG nova.compute.manager [None req-f8f562ca-4368-425e-8b80-e5b3c615553d tempest-ServerExternalEventsTest-1384768552 tempest-ServerExternalEventsTest-1384768552-project-member] [instance: 74ea66f0-391c-437b-8aee-f784528d7963] Start destroying the instance on the hypervisor. 
{{(pid=67169) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1802.486797] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-f8f562ca-4368-425e-8b80-e5b3c615553d tempest-ServerExternalEventsTest-1384768552 tempest-ServerExternalEventsTest-1384768552-project-member] [instance: 74ea66f0-391c-437b-8aee-f784528d7963] Destroying instance {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1802.487293] env[67169]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-84102008-844b-44e5-b8fa-cddb6c4819e9 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1802.492377] env[67169]: DEBUG nova.compute.manager [None req-1e846be1-02cb-41a5-811a-2b9f52cbbee0 tempest-InstanceActionsNegativeTestJSON-1517850368 tempest-InstanceActionsNegativeTestJSON-1517850368-project-member] [instance: 6663b166-0d24-45a7-8c2c-e4e68dbe0005] Starting instance... {{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1802.498820] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-27170a3a-c350-4d0b-a745-c1d32103d178 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1802.527900] env[67169]: WARNING nova.virt.vmwareapi.vmops [None req-f8f562ca-4368-425e-8b80-e5b3c615553d tempest-ServerExternalEventsTest-1384768552 tempest-ServerExternalEventsTest-1384768552-project-member] [instance: 74ea66f0-391c-437b-8aee-f784528d7963] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 74ea66f0-391c-437b-8aee-f784528d7963 could not be found. 
[ 1802.528127] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-f8f562ca-4368-425e-8b80-e5b3c615553d tempest-ServerExternalEventsTest-1384768552 tempest-ServerExternalEventsTest-1384768552-project-member] [instance: 74ea66f0-391c-437b-8aee-f784528d7963] Instance destroyed {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1802.528309] env[67169]: INFO nova.compute.manager [None req-f8f562ca-4368-425e-8b80-e5b3c615553d tempest-ServerExternalEventsTest-1384768552 tempest-ServerExternalEventsTest-1384768552-project-member] [instance: 74ea66f0-391c-437b-8aee-f784528d7963] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1802.528549] env[67169]: DEBUG oslo.service.loopingcall [None req-f8f562ca-4368-425e-8b80-e5b3c615553d tempest-ServerExternalEventsTest-1384768552 tempest-ServerExternalEventsTest-1384768552-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=67169) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1802.530752] env[67169]: DEBUG nova.compute.manager [-] [instance: 74ea66f0-391c-437b-8aee-f784528d7963] Deallocating network for instance {{(pid=67169) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1802.530871] env[67169]: DEBUG nova.network.neutron [-] [instance: 74ea66f0-391c-437b-8aee-f784528d7963] deallocate_for_instance() {{(pid=67169) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1802.544016] env[67169]: DEBUG oslo_concurrency.lockutils [None req-1e846be1-02cb-41a5-811a-2b9f52cbbee0 tempest-InstanceActionsNegativeTestJSON-1517850368 tempest-InstanceActionsNegativeTestJSON-1517850368-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1802.544260] 
env[67169]: DEBUG oslo_concurrency.lockutils [None req-1e846be1-02cb-41a5-811a-2b9f52cbbee0 tempest-InstanceActionsNegativeTestJSON-1517850368 tempest-InstanceActionsNegativeTestJSON-1517850368-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1802.545659] env[67169]: INFO nova.compute.claims [None req-1e846be1-02cb-41a5-811a-2b9f52cbbee0 tempest-InstanceActionsNegativeTestJSON-1517850368 tempest-InstanceActionsNegativeTestJSON-1517850368-project-member] [instance: 6663b166-0d24-45a7-8c2c-e4e68dbe0005] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1802.623243] env[67169]: DEBUG neutronclient.v2_0.client [-] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=67169) _handle_fault_response /opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py:262}} [ 1802.623243] env[67169]: ERROR nova.network.neutron [-] Neutron client was not able to generate a valid admin token, please verify Neutron admin credential located in nova.conf: neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1802.623658] env[67169]: ERROR oslo.service.loopingcall [-] Dynamic interval looping call 'oslo_service.loopingcall.RetryDecorator.__call__.._func' failed: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
[ 1802.623658] env[67169]: ERROR oslo.service.loopingcall Traceback (most recent call last): [ 1802.623658] env[67169]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1802.623658] env[67169]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1802.623658] env[67169]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1802.623658] env[67169]: ERROR oslo.service.loopingcall exception_handler_v20(status_code, error_body) [ 1802.623658] env[67169]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1802.623658] env[67169]: ERROR oslo.service.loopingcall raise client_exc(message=error_message, [ 1802.623658] env[67169]: ERROR oslo.service.loopingcall neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1802.623658] env[67169]: ERROR oslo.service.loopingcall Neutron server returns request_ids: ['req-df91011b-7ae4-4109-864b-05dbca6bec85'] [ 1802.623658] env[67169]: ERROR oslo.service.loopingcall [ 1802.623658] env[67169]: ERROR oslo.service.loopingcall During handling of the above exception, another exception occurred: [ 1802.623658] env[67169]: ERROR oslo.service.loopingcall [ 1802.623658] env[67169]: ERROR oslo.service.loopingcall Traceback (most recent call last): [ 1802.623658] env[67169]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 1802.623658] env[67169]: ERROR oslo.service.loopingcall result = func(*self.args, **self.kw) [ 1802.623658] env[67169]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 
1802.623658] env[67169]: ERROR oslo.service.loopingcall result = f(*args, **kwargs) [ 1802.623658] env[67169]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/compute/manager.py", line 3045, in _deallocate_network_with_retries [ 1802.623658] env[67169]: ERROR oslo.service.loopingcall self._deallocate_network( [ 1802.623658] env[67169]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/compute/manager.py", line 2265, in _deallocate_network [ 1802.623658] env[67169]: ERROR oslo.service.loopingcall self.network_api.deallocate_for_instance( [ 1802.623658] env[67169]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1802.623658] env[67169]: ERROR oslo.service.loopingcall data = neutron.list_ports(**search_opts) [ 1802.623658] env[67169]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1802.623658] env[67169]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1802.623658] env[67169]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1802.623658] env[67169]: ERROR oslo.service.loopingcall return self.list('ports', self.ports_path, retrieve_all, [ 1802.623658] env[67169]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1802.623658] env[67169]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1802.623658] env[67169]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1802.623658] env[67169]: ERROR oslo.service.loopingcall for r in self._pagination(collection, path, **params): [ 1802.623658] env[67169]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1802.623658] env[67169]: ERROR 
oslo.service.loopingcall res = self.get(path, params=params) [ 1802.623658] env[67169]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1802.623658] env[67169]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1802.623658] env[67169]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1802.623658] env[67169]: ERROR oslo.service.loopingcall return self.retry_request("GET", action, body=body, [ 1802.623658] env[67169]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1802.623658] env[67169]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1802.623658] env[67169]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1802.623658] env[67169]: ERROR oslo.service.loopingcall return self.do_request(method, action, body=body, [ 1802.623658] env[67169]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1802.623658] env[67169]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1802.623658] env[67169]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1802.625278] env[67169]: ERROR oslo.service.loopingcall self._handle_fault_response(status_code, replybody, resp) [ 1802.625278] env[67169]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 1802.625278] env[67169]: ERROR oslo.service.loopingcall raise exception.NeutronAdminCredentialConfigurationInvalid() [ 1802.625278] env[67169]: ERROR oslo.service.loopingcall nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
[ 1802.625278] env[67169]: ERROR oslo.service.loopingcall [ 1802.625278] env[67169]: ERROR nova.compute.manager [None req-f8f562ca-4368-425e-8b80-e5b3c615553d tempest-ServerExternalEventsTest-1384768552 tempest-ServerExternalEventsTest-1384768552-project-member] [instance: 74ea66f0-391c-437b-8aee-f784528d7963] Failed to deallocate network for instance. Error: Networking client is experiencing an unauthorized exception.: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1802.658607] env[67169]: ERROR nova.compute.manager [None req-f8f562ca-4368-425e-8b80-e5b3c615553d tempest-ServerExternalEventsTest-1384768552 tempest-ServerExternalEventsTest-1384768552-project-member] [instance: 74ea66f0-391c-437b-8aee-f784528d7963] Setting instance vm_state to ERROR: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1802.658607] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] Traceback (most recent call last): [ 1802.658607] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1802.658607] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] ret = obj(*args, **kwargs) [ 1802.658607] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1802.658607] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] exception_handler_v20(status_code, error_body) [ 1802.658607] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 
1802.658607] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] raise client_exc(message=error_message, [ 1802.658607] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1802.658607] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] Neutron server returns request_ids: ['req-df91011b-7ae4-4109-864b-05dbca6bec85'] [ 1802.658607] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] [ 1802.658607] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] During handling of the above exception, another exception occurred: [ 1802.658607] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] [ 1802.658607] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] Traceback (most recent call last): [ 1802.658607] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] File "/opt/stack/nova/nova/compute/manager.py", line 3315, in do_terminate_instance [ 1802.658607] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] self._delete_instance(context, instance, bdms) [ 1802.658607] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] File "/opt/stack/nova/nova/compute/manager.py", line 3250, in _delete_instance [ 1802.658607] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] self._shutdown_instance(context, instance, bdms) [ 1802.658607] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] File "/opt/stack/nova/nova/compute/manager.py", line 3144, in _shutdown_instance [ 1802.658607] env[67169]: ERROR 
nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] self._try_deallocate_network(context, instance, requested_networks) [ 1802.658607] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] File "/opt/stack/nova/nova/compute/manager.py", line 3058, in _try_deallocate_network [ 1802.658607] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] with excutils.save_and_reraise_exception(): [ 1802.658607] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1802.658607] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] self.force_reraise() [ 1802.658607] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1802.658607] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] raise self.value [ 1802.658607] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] File "/opt/stack/nova/nova/compute/manager.py", line 3056, in _try_deallocate_network [ 1802.658607] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] _deallocate_network_with_retries() [ 1802.658607] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 436, in func [ 1802.658607] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] return evt.wait() [ 1802.658607] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 
1802.658607] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] result = hub.switch() [ 1802.659837] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1802.659837] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] return self.greenlet.switch() [ 1802.659837] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 1802.659837] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] result = func(*self.args, **self.kw) [ 1802.659837] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 1802.659837] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] result = f(*args, **kwargs) [ 1802.659837] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] File "/opt/stack/nova/nova/compute/manager.py", line 3045, in _deallocate_network_with_retries [ 1802.659837] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] self._deallocate_network( [ 1802.659837] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] File "/opt/stack/nova/nova/compute/manager.py", line 2265, in _deallocate_network [ 1802.659837] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] self.network_api.deallocate_for_instance( [ 1802.659837] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 
1802.659837] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] data = neutron.list_ports(**search_opts) [ 1802.659837] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1802.659837] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] ret = obj(*args, **kwargs) [ 1802.659837] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1802.659837] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] return self.list('ports', self.ports_path, retrieve_all, [ 1802.659837] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1802.659837] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] ret = obj(*args, **kwargs) [ 1802.659837] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1802.659837] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] for r in self._pagination(collection, path, **params): [ 1802.659837] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1802.659837] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] res = self.get(path, params=params) [ 1802.659837] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] File "/opt/stack/nova/nova/network/neutron.py", line 
196, in wrapper [ 1802.659837] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] ret = obj(*args, **kwargs) [ 1802.659837] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1802.659837] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] return self.retry_request("GET", action, body=body, [ 1802.659837] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1802.659837] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] ret = obj(*args, **kwargs) [ 1802.659837] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1802.659837] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] return self.do_request(method, action, body=body, [ 1802.659837] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1802.659837] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] ret = obj(*args, **kwargs) [ 1802.659837] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1802.660947] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] self._handle_fault_response(status_code, replybody, resp) [ 1802.660947] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] File 
"/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 1802.660947] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] raise exception.NeutronAdminCredentialConfigurationInvalid() [ 1802.660947] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1802.660947] env[67169]: ERROR nova.compute.manager [instance: 74ea66f0-391c-437b-8aee-f784528d7963] [ 1802.660947] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1802.661666] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1802.662051] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=67169) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1802.686220] env[67169]: DEBUG oslo_concurrency.lockutils [None req-f8f562ca-4368-425e-8b80-e5b3c615553d tempest-ServerExternalEventsTest-1384768552 tempest-ServerExternalEventsTest-1384768552-project-member] Lock "74ea66f0-391c-437b-8aee-f784528d7963" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.204s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1802.737118] env[67169]: INFO nova.compute.manager [None req-f8f562ca-4368-425e-8b80-e5b3c615553d tempest-ServerExternalEventsTest-1384768552 tempest-ServerExternalEventsTest-1384768552-project-member] [instance: 74ea66f0-391c-437b-8aee-f784528d7963] Successfully reverted task state from None on failure for instance. [ 1802.740584] env[67169]: ERROR oslo_messaging.rpc.server [None req-f8f562ca-4368-425e-8b80-e5b3c615553d tempest-ServerExternalEventsTest-1384768552 tempest-ServerExternalEventsTest-1384768552-project-member] Exception during message handling: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
[ 1802.740584] env[67169]: ERROR oslo_messaging.rpc.server Traceback (most recent call last):
[ 1802.740584] env[67169]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1802.740584] env[67169]: ERROR oslo_messaging.rpc.server     ret = obj(*args, **kwargs)
[ 1802.740584] env[67169]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response
[ 1802.740584] env[67169]: ERROR oslo_messaging.rpc.server     exception_handler_v20(status_code, error_body)
[ 1802.740584] env[67169]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20
[ 1802.740584] env[67169]: ERROR oslo_messaging.rpc.server     raise client_exc(message=error_message,
[ 1802.740584] env[67169]: ERROR oslo_messaging.rpc.server neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}}
[ 1802.740584] env[67169]: ERROR oslo_messaging.rpc.server Neutron server returns request_ids: ['req-df91011b-7ae4-4109-864b-05dbca6bec85']
[ 1802.740584] env[67169]: ERROR oslo_messaging.rpc.server
[ 1802.740584] env[67169]: ERROR oslo_messaging.rpc.server During handling of the above exception, another exception occurred:
[ 1802.740584] env[67169]: ERROR oslo_messaging.rpc.server
[ 1802.740584] env[67169]: ERROR oslo_messaging.rpc.server Traceback (most recent call last):
[ 1802.740584] env[67169]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/server.py", line 165, in _process_incoming
[ 1802.740584] env[67169]: ERROR oslo_messaging.rpc.server     res = self.dispatcher.dispatch(message)
[ 1802.740584] env[67169]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/dispatcher.py", line 309, in dispatch
[ 1802.740584] env[67169]: ERROR oslo_messaging.rpc.server     return self._do_dispatch(endpoint, method, ctxt, args)
[ 1802.740584] env[67169]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/dispatcher.py", line 229, in _do_dispatch
[ 1802.740584] env[67169]: ERROR oslo_messaging.rpc.server     result = func(ctxt, **new_args)
[ 1802.740584] env[67169]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/exception_wrapper.py", line 65, in wrapped
[ 1802.740584] env[67169]: ERROR oslo_messaging.rpc.server     with excutils.save_and_reraise_exception():
[ 1802.740584] env[67169]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 1802.740584] env[67169]: ERROR oslo_messaging.rpc.server     self.force_reraise()
[ 1802.740584] env[67169]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 1802.740584] env[67169]: ERROR oslo_messaging.rpc.server     raise self.value
[ 1802.740584] env[67169]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/exception_wrapper.py", line 63, in wrapped
[ 1802.740584] env[67169]: ERROR oslo_messaging.rpc.server     return f(self, context, *args, **kw)
[ 1802.740584] env[67169]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/compute/manager.py", line 166, in decorated_function
[ 1802.740584] env[67169]: ERROR oslo_messaging.rpc.server     with excutils.save_and_reraise_exception():
[ 1802.740584] env[67169]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 1802.740584] env[67169]: ERROR oslo_messaging.rpc.server     self.force_reraise()
[ 1802.740584] env[67169]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 1802.740584] env[67169]: ERROR oslo_messaging.rpc.server     raise self.value
[ 1802.740584] env[67169]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/compute/manager.py", line 157, in decorated_function
[ 1802.740584] env[67169]: ERROR oslo_messaging.rpc.server     return function(self, context, *args, **kwargs)
[ 1802.740584] env[67169]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/compute/utils.py", line 1439, in decorated_function
[ 1802.740584] env[67169]: ERROR oslo_messaging.rpc.server     return function(self, context, *args, **kwargs)
[ 1802.740584] env[67169]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/compute/manager.py", line 213, in decorated_function
[ 1802.740584] env[67169]: ERROR oslo_messaging.rpc.server     with excutils.save_and_reraise_exception():
[ 1802.740584] env[67169]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 1802.740584] env[67169]: ERROR oslo_messaging.rpc.server     self.force_reraise()
[ 1802.740584] env[67169]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 1802.740584] env[67169]: ERROR oslo_messaging.rpc.server     raise self.value
[ 1802.742240] env[67169]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/compute/manager.py", line 203, in decorated_function
[ 1802.742240] env[67169]: ERROR oslo_messaging.rpc.server     return function(self, context, *args, **kwargs)
[ 1802.742240] env[67169]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/compute/manager.py", line 3327, in terminate_instance
[ 1802.742240] env[67169]: ERROR oslo_messaging.rpc.server     do_terminate_instance(instance, bdms)
[ 1802.742240] env[67169]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py", line 414, in inner
[ 1802.742240] env[67169]: ERROR oslo_messaging.rpc.server     return f(*args, **kwargs)
[ 1802.742240] env[67169]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/compute/manager.py", line 3322, in do_terminate_instance
[ 1802.742240] env[67169]: ERROR oslo_messaging.rpc.server     with excutils.save_and_reraise_exception():
[ 1802.742240] env[67169]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 1802.742240] env[67169]: ERROR oslo_messaging.rpc.server     self.force_reraise()
[ 1802.742240] env[67169]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 1802.742240] env[67169]: ERROR oslo_messaging.rpc.server     raise self.value
[ 1802.742240] env[67169]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/compute/manager.py", line 3315, in do_terminate_instance
[ 1802.742240] env[67169]: ERROR oslo_messaging.rpc.server     self._delete_instance(context, instance, bdms)
[ 1802.742240] env[67169]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/compute/manager.py", line 3250, in _delete_instance
[ 1802.742240] env[67169]: ERROR oslo_messaging.rpc.server     self._shutdown_instance(context, instance, bdms)
[ 1802.742240] env[67169]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/compute/manager.py", line 3144, in _shutdown_instance
[ 1802.742240] env[67169]: ERROR oslo_messaging.rpc.server     self._try_deallocate_network(context, instance, requested_networks)
[ 1802.742240] env[67169]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/compute/manager.py", line 3058, in _try_deallocate_network
[ 1802.742240] env[67169]: ERROR oslo_messaging.rpc.server     with excutils.save_and_reraise_exception():
[ 1802.742240] env[67169]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 1802.742240] env[67169]: ERROR oslo_messaging.rpc.server     self.force_reraise()
[ 1802.742240] env[67169]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 1802.742240] env[67169]: ERROR oslo_messaging.rpc.server     raise self.value
[ 1802.742240] env[67169]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/compute/manager.py", line 3056, in _try_deallocate_network
[ 1802.742240] env[67169]: ERROR oslo_messaging.rpc.server     _deallocate_network_with_retries()
[ 1802.742240] env[67169]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 436, in func
[ 1802.742240] env[67169]: ERROR oslo_messaging.rpc.server     return evt.wait()
[ 1802.742240] env[67169]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 1802.742240] env[67169]: ERROR oslo_messaging.rpc.server     result = hub.switch()
[ 1802.742240] env[67169]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 1802.742240] env[67169]: ERROR oslo_messaging.rpc.server     return self.greenlet.switch()
[ 1802.742240] env[67169]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop
[ 1802.742240] env[67169]: ERROR oslo_messaging.rpc.server     result = func(*self.args, **self.kw)
[ 1802.742240] env[67169]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func
[ 1802.742240] env[67169]: ERROR oslo_messaging.rpc.server     result = f(*args, **kwargs)
[ 1802.742240] env[67169]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/compute/manager.py", line 3045, in _deallocate_network_with_retries
[ 1802.742240] env[67169]: ERROR oslo_messaging.rpc.server     self._deallocate_network(
[ 1802.742240] env[67169]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/compute/manager.py", line 2265, in _deallocate_network
[ 1802.742240] env[67169]: ERROR oslo_messaging.rpc.server     self.network_api.deallocate_for_instance(
[ 1802.742240] env[67169]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance
[ 1802.742240] env[67169]: ERROR oslo_messaging.rpc.server     data = neutron.list_ports(**search_opts)
[ 1802.742240] env[67169]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1802.742240] env[67169]: ERROR oslo_messaging.rpc.server     ret = obj(*args, **kwargs)
[ 1802.742240] env[67169]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports
[ 1802.742240] env[67169]: ERROR oslo_messaging.rpc.server     return self.list('ports', self.ports_path, retrieve_all,
[ 1802.742240] env[67169]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1802.744064] env[67169]: ERROR oslo_messaging.rpc.server     ret = obj(*args, **kwargs)
[ 1802.744064] env[67169]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list
[ 1802.744064] env[67169]: ERROR oslo_messaging.rpc.server     for r in self._pagination(collection, path, **params):
[ 1802.744064] env[67169]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination
[ 1802.744064] env[67169]: ERROR oslo_messaging.rpc.server     res = self.get(path, params=params)
[ 1802.744064] env[67169]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1802.744064] env[67169]: ERROR oslo_messaging.rpc.server     ret = obj(*args, **kwargs)
[ 1802.744064] env[67169]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get
[ 1802.744064] env[67169]: ERROR oslo_messaging.rpc.server     return self.retry_request("GET", action, body=body,
[ 1802.744064] env[67169]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1802.744064] env[67169]: ERROR oslo_messaging.rpc.server     ret = obj(*args, **kwargs)
[ 1802.744064] env[67169]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request
[ 1802.744064] env[67169]: ERROR oslo_messaging.rpc.server     return self.do_request(method, action, body=body,
[ 1802.744064] env[67169]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1802.744064] env[67169]: ERROR oslo_messaging.rpc.server     ret = obj(*args, **kwargs)
[ 1802.744064] env[67169]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request
[ 1802.744064] env[67169]: ERROR oslo_messaging.rpc.server     self._handle_fault_response(status_code, replybody, resp)
[ 1802.744064] env[67169]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper
[ 1802.744064] env[67169]: ERROR oslo_messaging.rpc.server     raise exception.NeutronAdminCredentialConfigurationInvalid()
[ 1802.744064] env[67169]: ERROR oslo_messaging.rpc.server nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception.
[ 1802.744064] env[67169]: ERROR oslo_messaging.rpc.server
[ 1802.745359] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-28892d05-b323-4164-ad09-aa812f2d83db {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1802.753015] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2480a112-aa64-4593-b1bb-a8269c8c3889 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1802.783055] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-56940bf9-1904-4289-a255-9182cbaed9ac {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1802.789820] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f95e973c-bfd3-487e-b207-4d8f1fd3728b {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1802.802362] env[67169]: DEBUG nova.compute.provider_tree [None req-1e846be1-02cb-41a5-811a-2b9f52cbbee0 tempest-InstanceActionsNegativeTestJSON-1517850368 tempest-InstanceActionsNegativeTestJSON-1517850368-project-member] Inventory has not changed in ProviderTree for provider: 6570906a-ac37-4859-b1f2-4bbacc48d3f3 {{(pid=67169) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1802.811676] env[67169]: DEBUG nova.scheduler.client.report [None req-1e846be1-02cb-41a5-811a-2b9f52cbbee0 tempest-InstanceActionsNegativeTestJSON-1517850368 tempest-InstanceActionsNegativeTestJSON-1517850368-project-member] Inventory has not changed for provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 171, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67169) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1802.826055] env[67169]: DEBUG oslo_concurrency.lockutils [None req-1e846be1-02cb-41a5-811a-2b9f52cbbee0 tempest-InstanceActionsNegativeTestJSON-1517850368 tempest-InstanceActionsNegativeTestJSON-1517850368-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.282s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1802.826505] env[67169]: DEBUG nova.compute.manager [None req-1e846be1-02cb-41a5-811a-2b9f52cbbee0 tempest-InstanceActionsNegativeTestJSON-1517850368 tempest-InstanceActionsNegativeTestJSON-1517850368-project-member] [instance: 6663b166-0d24-45a7-8c2c-e4e68dbe0005] Start building networks asynchronously for instance. {{(pid=67169) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}}
[ 1802.856495] env[67169]: DEBUG nova.compute.utils [None req-1e846be1-02cb-41a5-811a-2b9f52cbbee0 tempest-InstanceActionsNegativeTestJSON-1517850368 tempest-InstanceActionsNegativeTestJSON-1517850368-project-member] Using /dev/sd instead of None {{(pid=67169) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 1802.857654] env[67169]: DEBUG nova.compute.manager [None req-1e846be1-02cb-41a5-811a-2b9f52cbbee0 tempest-InstanceActionsNegativeTestJSON-1517850368 tempest-InstanceActionsNegativeTestJSON-1517850368-project-member] [instance: 6663b166-0d24-45a7-8c2c-e4e68dbe0005] Allocating IP information in the background. {{(pid=67169) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}}
[ 1802.857823] env[67169]: DEBUG nova.network.neutron [None req-1e846be1-02cb-41a5-811a-2b9f52cbbee0 tempest-InstanceActionsNegativeTestJSON-1517850368 tempest-InstanceActionsNegativeTestJSON-1517850368-project-member] [instance: 6663b166-0d24-45a7-8c2c-e4e68dbe0005] allocate_for_instance() {{(pid=67169) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 1802.868593] env[67169]: DEBUG nova.compute.manager [None req-1e846be1-02cb-41a5-811a-2b9f52cbbee0 tempest-InstanceActionsNegativeTestJSON-1517850368 tempest-InstanceActionsNegativeTestJSON-1517850368-project-member] [instance: 6663b166-0d24-45a7-8c2c-e4e68dbe0005] Start building block device mappings for instance. {{(pid=67169) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}}
[ 1802.927204] env[67169]: DEBUG nova.policy [None req-1e846be1-02cb-41a5-811a-2b9f52cbbee0 tempest-InstanceActionsNegativeTestJSON-1517850368 tempest-InstanceActionsNegativeTestJSON-1517850368-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '0774a8c3e4b047db8ea4103a26e66ff6', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '270103faa5c446cd8db3a5908a04a3eb', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67169) authorize /opt/stack/nova/nova/policy.py:203}}
[ 1802.931044] env[67169]: DEBUG nova.compute.manager [None req-1e846be1-02cb-41a5-811a-2b9f52cbbee0 tempest-InstanceActionsNegativeTestJSON-1517850368 tempest-InstanceActionsNegativeTestJSON-1517850368-project-member] [instance: 6663b166-0d24-45a7-8c2c-e4e68dbe0005] Start spawning the instance on the hypervisor. {{(pid=67169) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}}
[ 1802.955766] env[67169]: DEBUG nova.virt.hardware [None req-1e846be1-02cb-41a5-811a-2b9f52cbbee0 tempest-InstanceActionsNegativeTestJSON-1517850368 tempest-InstanceActionsNegativeTestJSON-1517850368-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-03-06T19:55:10Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-03-06T19:54:55Z,direct_url=,disk_format='vmdk',id=285931c9-8b83-4997-8c4d-6a79005e36ba,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c31f6504bb73492890b262ff43fdf9bc',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-03-06T19:54:55Z,virtual_size=,visibility=), allow threads: False {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 1802.955999] env[67169]: DEBUG nova.virt.hardware [None req-1e846be1-02cb-41a5-811a-2b9f52cbbee0 tempest-InstanceActionsNegativeTestJSON-1517850368 tempest-InstanceActionsNegativeTestJSON-1517850368-project-member] Flavor limits 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 1802.956172] env[67169]: DEBUG nova.virt.hardware [None req-1e846be1-02cb-41a5-811a-2b9f52cbbee0 tempest-InstanceActionsNegativeTestJSON-1517850368 tempest-InstanceActionsNegativeTestJSON-1517850368-project-member] Image limits 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 1802.956354] env[67169]: DEBUG nova.virt.hardware [None req-1e846be1-02cb-41a5-811a-2b9f52cbbee0 tempest-InstanceActionsNegativeTestJSON-1517850368 tempest-InstanceActionsNegativeTestJSON-1517850368-project-member] Flavor pref 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 1802.956497] env[67169]: DEBUG nova.virt.hardware [None req-1e846be1-02cb-41a5-811a-2b9f52cbbee0 tempest-InstanceActionsNegativeTestJSON-1517850368 tempest-InstanceActionsNegativeTestJSON-1517850368-project-member] Image pref 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 1802.956640] env[67169]: DEBUG nova.virt.hardware [None req-1e846be1-02cb-41a5-811a-2b9f52cbbee0 tempest-InstanceActionsNegativeTestJSON-1517850368 tempest-InstanceActionsNegativeTestJSON-1517850368-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 1802.956841] env[67169]: DEBUG nova.virt.hardware [None req-1e846be1-02cb-41a5-811a-2b9f52cbbee0 tempest-InstanceActionsNegativeTestJSON-1517850368 tempest-InstanceActionsNegativeTestJSON-1517850368-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 1802.956999] env[67169]: DEBUG nova.virt.hardware [None req-1e846be1-02cb-41a5-811a-2b9f52cbbee0 tempest-InstanceActionsNegativeTestJSON-1517850368 tempest-InstanceActionsNegativeTestJSON-1517850368-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67169) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 1802.957181] env[67169]: DEBUG nova.virt.hardware [None req-1e846be1-02cb-41a5-811a-2b9f52cbbee0 tempest-InstanceActionsNegativeTestJSON-1517850368 tempest-InstanceActionsNegativeTestJSON-1517850368-project-member] Got 1 possible topologies {{(pid=67169) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 1802.957339] env[67169]: DEBUG nova.virt.hardware [None req-1e846be1-02cb-41a5-811a-2b9f52cbbee0 tempest-InstanceActionsNegativeTestJSON-1517850368 tempest-InstanceActionsNegativeTestJSON-1517850368-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 1802.957504] env[67169]: DEBUG nova.virt.hardware [None req-1e846be1-02cb-41a5-811a-2b9f52cbbee0 tempest-InstanceActionsNegativeTestJSON-1517850368 tempest-InstanceActionsNegativeTestJSON-1517850368-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 1802.958368] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6dc9b650-edd7-4ea6-ae39-592e704850be {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1802.966062] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3ba0a311-abd2-409f-8fbe-cfcd596c4e26 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1803.267083] env[67169]: DEBUG nova.network.neutron [None req-1e846be1-02cb-41a5-811a-2b9f52cbbee0 tempest-InstanceActionsNegativeTestJSON-1517850368 tempest-InstanceActionsNegativeTestJSON-1517850368-project-member] [instance: 6663b166-0d24-45a7-8c2c-e4e68dbe0005] Successfully created port: d87d9719-c5f6-4ac6-bd6a-31c3210d90f6 {{(pid=67169) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 1803.630176] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._sync_power_states {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1803.662255] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Getting list of instances from cluster (obj){
[ 1803.662255] env[67169]:   value = "domain-c8"
[ 1803.662255] env[67169]:   _type = "ClusterComputeResource"
[ 1803.662255] env[67169]: } {{(pid=67169) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}}
[ 1803.663631] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0a350692-90bc-4897-9d08-9d25d60e47ca {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1803.682766] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Got total of 9 instances {{(pid=67169) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}}
[ 1803.682956] env[67169]: WARNING nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] While synchronizing instance power states, found 10 instances in the database and 9 instances on the hypervisor.
[ 1803.683123] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Triggering sync for uuid 48376572-9e3a-4579-b2d7-b8b63312fab1 {{(pid=67169) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}}
[ 1803.683335] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Triggering sync for uuid 7b7c8f84-c2d4-442e-93d3-60124767d096 {{(pid=67169) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}}
[ 1803.683498] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Triggering sync for uuid 2e156908-c313-4229-840d-13ed8e6d4074 {{(pid=67169) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}}
[ 1803.683903] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Triggering sync for uuid 2d7d3386-9854-4bf1-a680-5aed0a2329cb {{(pid=67169) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}}
[ 1803.684132] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Triggering sync for uuid fa24a4a8-895c-4ea6-8e0a-4ed1134beff0 {{(pid=67169) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}}
[ 1803.684257] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Triggering sync for uuid 04d3ae51-f3f1-427b-ae45-279b02e4b3e6 {{(pid=67169) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}}
[ 1803.684426] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Triggering sync for uuid aedbfde6-26e1-410d-a311-e2c344f65062 {{(pid=67169) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}}
[ 1803.684576] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Triggering sync for uuid c05c3ec2-a68d-41b0-a199-fcfc84bb2deb {{(pid=67169) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}}
[ 1803.684781] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Triggering sync for uuid 9435574d-2128-4b20-ba92-ee2aba37d33b {{(pid=67169) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}}
[ 1803.684934] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Triggering sync for uuid 6663b166-0d24-45a7-8c2c-e4e68dbe0005 {{(pid=67169) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}}
[ 1803.685369] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Acquiring lock "48376572-9e3a-4579-b2d7-b8b63312fab1" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1803.685643] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Acquiring lock "7b7c8f84-c2d4-442e-93d3-60124767d096" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1803.685868] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Acquiring lock "2e156908-c313-4229-840d-13ed8e6d4074" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1803.686335] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Acquiring lock "2d7d3386-9854-4bf1-a680-5aed0a2329cb" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1803.686335] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Acquiring lock "fa24a4a8-895c-4ea6-8e0a-4ed1134beff0" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1803.686544] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Acquiring lock "04d3ae51-f3f1-427b-ae45-279b02e4b3e6" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1803.686804] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Acquiring lock "aedbfde6-26e1-410d-a311-e2c344f65062" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1803.687011] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Acquiring lock "c05c3ec2-a68d-41b0-a199-fcfc84bb2deb" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1803.687221] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Acquiring lock "9435574d-2128-4b20-ba92-ee2aba37d33b" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1803.687416] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Acquiring lock "6663b166-0d24-45a7-8c2c-e4e68dbe0005" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1803.710689] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1804.008833] env[67169]: DEBUG nova.compute.manager [req-0b29313f-6b6e-459f-878b-117992e6aef5 req-f9405aad-c25b-4952-bda6-d64583938d90 service nova] [instance: 6663b166-0d24-45a7-8c2c-e4e68dbe0005] Received event network-vif-plugged-d87d9719-c5f6-4ac6-bd6a-31c3210d90f6 {{(pid=67169) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}}
[ 1804.009146] env[67169]: DEBUG oslo_concurrency.lockutils [req-0b29313f-6b6e-459f-878b-117992e6aef5 req-f9405aad-c25b-4952-bda6-d64583938d90 service nova] Acquiring lock "6663b166-0d24-45a7-8c2c-e4e68dbe0005-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1804.009339] env[67169]: DEBUG oslo_concurrency.lockutils [req-0b29313f-6b6e-459f-878b-117992e6aef5 req-f9405aad-c25b-4952-bda6-d64583938d90 service nova] Lock "6663b166-0d24-45a7-8c2c-e4e68dbe0005-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1804.009517] env[67169]: DEBUG oslo_concurrency.lockutils [req-0b29313f-6b6e-459f-878b-117992e6aef5 req-f9405aad-c25b-4952-bda6-d64583938d90 service nova] Lock "6663b166-0d24-45a7-8c2c-e4e68dbe0005-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1804.009684] env[67169]: DEBUG nova.compute.manager [req-0b29313f-6b6e-459f-878b-117992e6aef5 req-f9405aad-c25b-4952-bda6-d64583938d90 service nova] [instance: 6663b166-0d24-45a7-8c2c-e4e68dbe0005] No waiting events found dispatching network-vif-plugged-d87d9719-c5f6-4ac6-bd6a-31c3210d90f6 {{(pid=67169) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}}
[ 1804.009846] env[67169]: WARNING nova.compute.manager [req-0b29313f-6b6e-459f-878b-117992e6aef5 req-f9405aad-c25b-4952-bda6-d64583938d90 service nova] [instance: 6663b166-0d24-45a7-8c2c-e4e68dbe0005] Received unexpected event network-vif-plugged-d87d9719-c5f6-4ac6-bd6a-31c3210d90f6 for instance with vm_state building and task_state spawning.
[ 1804.084020] env[67169]: DEBUG nova.network.neutron [None req-1e846be1-02cb-41a5-811a-2b9f52cbbee0 tempest-InstanceActionsNegativeTestJSON-1517850368 tempest-InstanceActionsNegativeTestJSON-1517850368-project-member] [instance: 6663b166-0d24-45a7-8c2c-e4e68dbe0005] Successfully updated port: d87d9719-c5f6-4ac6-bd6a-31c3210d90f6 {{(pid=67169) _update_port /opt/stack/nova/nova/network/neutron.py:586}}
[ 1804.094958] env[67169]: DEBUG oslo_concurrency.lockutils [None req-1e846be1-02cb-41a5-811a-2b9f52cbbee0 tempest-InstanceActionsNegativeTestJSON-1517850368 tempest-InstanceActionsNegativeTestJSON-1517850368-project-member] Acquiring lock "refresh_cache-6663b166-0d24-45a7-8c2c-e4e68dbe0005" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1804.095135] env[67169]: DEBUG oslo_concurrency.lockutils [None req-1e846be1-02cb-41a5-811a-2b9f52cbbee0 tempest-InstanceActionsNegativeTestJSON-1517850368 tempest-InstanceActionsNegativeTestJSON-1517850368-project-member] Acquired lock "refresh_cache-6663b166-0d24-45a7-8c2c-e4e68dbe0005" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1804.095291] env[67169]: DEBUG nova.network.neutron [None req-1e846be1-02cb-41a5-811a-2b9f52cbbee0 tempest-InstanceActionsNegativeTestJSON-1517850368 tempest-InstanceActionsNegativeTestJSON-1517850368-project-member] [instance: 6663b166-0d24-45a7-8c2c-e4e68dbe0005] Building network info cache for instance {{(pid=67169) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 1804.152787] env[67169]: DEBUG nova.network.neutron [None req-1e846be1-02cb-41a5-811a-2b9f52cbbee0 tempest-InstanceActionsNegativeTestJSON-1517850368 tempest-InstanceActionsNegativeTestJSON-1517850368-project-member] [instance: 6663b166-0d24-45a7-8c2c-e4e68dbe0005] Instance cache missing network info. {{(pid=67169) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 1804.302108] env[67169]: DEBUG nova.network.neutron [None req-1e846be1-02cb-41a5-811a-2b9f52cbbee0 tempest-InstanceActionsNegativeTestJSON-1517850368 tempest-InstanceActionsNegativeTestJSON-1517850368-project-member] [instance: 6663b166-0d24-45a7-8c2c-e4e68dbe0005] Updating instance_info_cache with network_info: [{"id": "d87d9719-c5f6-4ac6-bd6a-31c3210d90f6", "address": "fa:16:3e:ce:5c:12", "network": {"id": "31b18cd6-ff93-43b5-9bb5-a32a622f1604", "bridge": "br-int", "label": "tempest-InstanceActionsNegativeTestJSON-2075352683-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "270103faa5c446cd8db3a5908a04a3eb", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "359c2c31-99c4-41d7-a513-3bc4825897a0", "external-id": "nsx-vlan-transportzone-173", "segmentation_id": 173, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd87d9719-c5", "ovs_interfaceid": "d87d9719-c5f6-4ac6-bd6a-31c3210d90f6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1804.318637] env[67169]: DEBUG oslo_concurrency.lockutils [None req-1e846be1-02cb-41a5-811a-2b9f52cbbee0 tempest-InstanceActionsNegativeTestJSON-1517850368 tempest-InstanceActionsNegativeTestJSON-1517850368-project-member] Releasing lock "refresh_cache-6663b166-0d24-45a7-8c2c-e4e68dbe0005" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1804.318954] env[67169]: DEBUG nova.compute.manager [None req-1e846be1-02cb-41a5-811a-2b9f52cbbee0 tempest-InstanceActionsNegativeTestJSON-1517850368 tempest-InstanceActionsNegativeTestJSON-1517850368-project-member] [instance: 6663b166-0d24-45a7-8c2c-e4e68dbe0005] Instance network_info: |[{"id": "d87d9719-c5f6-4ac6-bd6a-31c3210d90f6", "address": "fa:16:3e:ce:5c:12", "network": {"id": "31b18cd6-ff93-43b5-9bb5-a32a622f1604", "bridge": "br-int", "label": "tempest-InstanceActionsNegativeTestJSON-2075352683-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "270103faa5c446cd8db3a5908a04a3eb", "mtu": 8950, "physical_network":
"default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "359c2c31-99c4-41d7-a513-3bc4825897a0", "external-id": "nsx-vlan-transportzone-173", "segmentation_id": 173, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd87d9719-c5", "ovs_interfaceid": "d87d9719-c5f6-4ac6-bd6a-31c3210d90f6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67169) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1804.319361] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-1e846be1-02cb-41a5-811a-2b9f52cbbee0 tempest-InstanceActionsNegativeTestJSON-1517850368 tempest-InstanceActionsNegativeTestJSON-1517850368-project-member] [instance: 6663b166-0d24-45a7-8c2c-e4e68dbe0005] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:ce:5c:12', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '359c2c31-99c4-41d7-a513-3bc4825897a0', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'd87d9719-c5f6-4ac6-bd6a-31c3210d90f6', 'vif_model': 'vmxnet3'}] {{(pid=67169) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1804.327143] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [None req-1e846be1-02cb-41a5-811a-2b9f52cbbee0 tempest-InstanceActionsNegativeTestJSON-1517850368 tempest-InstanceActionsNegativeTestJSON-1517850368-project-member] Creating folder: Project (270103faa5c446cd8db3a5908a04a3eb). Parent ref: group-v566843. 
{{(pid=67169) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1804.327596] env[67169]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-70026b38-2d0d-480c-ad87-42ca03d54662 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1804.337960] env[67169]: INFO nova.virt.vmwareapi.vm_util [None req-1e846be1-02cb-41a5-811a-2b9f52cbbee0 tempest-InstanceActionsNegativeTestJSON-1517850368 tempest-InstanceActionsNegativeTestJSON-1517850368-project-member] Created folder: Project (270103faa5c446cd8db3a5908a04a3eb) in parent group-v566843. [ 1804.338147] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [None req-1e846be1-02cb-41a5-811a-2b9f52cbbee0 tempest-InstanceActionsNegativeTestJSON-1517850368 tempest-InstanceActionsNegativeTestJSON-1517850368-project-member] Creating folder: Instances. Parent ref: group-v566942. {{(pid=67169) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1804.338356] env[67169]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-612e147f-e08b-4c1b-8722-b51058a7d29f {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1804.345859] env[67169]: INFO nova.virt.vmwareapi.vm_util [None req-1e846be1-02cb-41a5-811a-2b9f52cbbee0 tempest-InstanceActionsNegativeTestJSON-1517850368 tempest-InstanceActionsNegativeTestJSON-1517850368-project-member] Created folder: Instances in parent group-v566942. [ 1804.346093] env[67169]: DEBUG oslo.service.loopingcall [None req-1e846be1-02cb-41a5-811a-2b9f52cbbee0 tempest-InstanceActionsNegativeTestJSON-1517850368 tempest-InstanceActionsNegativeTestJSON-1517850368-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=67169) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1804.346272] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 6663b166-0d24-45a7-8c2c-e4e68dbe0005] Creating VM on the ESX host {{(pid=67169) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1804.346458] env[67169]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-c4f43d37-4646-4f8c-abd9-14f746377758 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1804.364314] env[67169]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1804.364314] env[67169]: value = "task-2819239" [ 1804.364314] env[67169]: _type = "Task" [ 1804.364314] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1804.371291] env[67169]: DEBUG oslo_vmware.api [-] Task: {'id': task-2819239, 'name': CreateVM_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1804.658254] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1804.874183] env[67169]: DEBUG oslo_vmware.api [-] Task: {'id': task-2819239, 'name': CreateVM_Task, 'duration_secs': 0.287701} completed successfully. 
{{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1804.874366] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 6663b166-0d24-45a7-8c2c-e4e68dbe0005] Created VM on the ESX host {{(pid=67169) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1804.875022] env[67169]: DEBUG oslo_concurrency.lockutils [None req-1e846be1-02cb-41a5-811a-2b9f52cbbee0 tempest-InstanceActionsNegativeTestJSON-1517850368 tempest-InstanceActionsNegativeTestJSON-1517850368-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1804.875203] env[67169]: DEBUG oslo_concurrency.lockutils [None req-1e846be1-02cb-41a5-811a-2b9f52cbbee0 tempest-InstanceActionsNegativeTestJSON-1517850368 tempest-InstanceActionsNegativeTestJSON-1517850368-project-member] Acquired lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1804.875523] env[67169]: DEBUG oslo_concurrency.lockutils [None req-1e846be1-02cb-41a5-811a-2b9f52cbbee0 tempest-InstanceActionsNegativeTestJSON-1517850368 tempest-InstanceActionsNegativeTestJSON-1517850368-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1804.875771] env[67169]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-70c32041-4074-442e-afad-884ff10903ee {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1804.880083] env[67169]: DEBUG oslo_vmware.api [None req-1e846be1-02cb-41a5-811a-2b9f52cbbee0 
tempest-InstanceActionsNegativeTestJSON-1517850368 tempest-InstanceActionsNegativeTestJSON-1517850368-project-member] Waiting for the task: (returnval){ [ 1804.880083] env[67169]: value = "session[52518c09-e5f2-035d-4dc1-3e29ddf94015]523884f6-2d6b-30ff-627b-15bcfc68e7e2" [ 1804.880083] env[67169]: _type = "Task" [ 1804.880083] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1804.888652] env[67169]: DEBUG oslo_vmware.api [None req-1e846be1-02cb-41a5-811a-2b9f52cbbee0 tempest-InstanceActionsNegativeTestJSON-1517850368 tempest-InstanceActionsNegativeTestJSON-1517850368-project-member] Task: {'id': session[52518c09-e5f2-035d-4dc1-3e29ddf94015]523884f6-2d6b-30ff-627b-15bcfc68e7e2, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1805.390186] env[67169]: DEBUG oslo_concurrency.lockutils [None req-1e846be1-02cb-41a5-811a-2b9f52cbbee0 tempest-InstanceActionsNegativeTestJSON-1517850368 tempest-InstanceActionsNegativeTestJSON-1517850368-project-member] Releasing lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1805.390538] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-1e846be1-02cb-41a5-811a-2b9f52cbbee0 tempest-InstanceActionsNegativeTestJSON-1517850368 tempest-InstanceActionsNegativeTestJSON-1517850368-project-member] [instance: 6663b166-0d24-45a7-8c2c-e4e68dbe0005] Processing image 285931c9-8b83-4997-8c4d-6a79005e36ba {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1805.390629] env[67169]: DEBUG oslo_concurrency.lockutils [None req-1e846be1-02cb-41a5-811a-2b9f52cbbee0 tempest-InstanceActionsNegativeTestJSON-1517850368 tempest-InstanceActionsNegativeTestJSON-1517850368-project-member] Acquiring lock 
"[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1805.658791] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1805.670381] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1805.670606] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1805.670791] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1805.670927] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67169) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1805.672041] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-381ca508-d104-479c-9e51-5d6aacd7ca79 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1805.680859] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ceaec563-33b7-47f3-bc0f-f98094a1d2d8 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1805.694776] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-beb6692f-6b8e-4e74-bafa-10fecbf485c0 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1805.701129] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7dc40027-db28-4ba2-90a0-feb3fa77bf7e {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1805.730460] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181027MB free_disk=171GB free_vcpus=48 pci_devices=None {{(pid=67169) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1805.730622] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1805.730793] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67169) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1805.804159] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 48376572-9e3a-4579-b2d7-b8b63312fab1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1805.804326] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 7b7c8f84-c2d4-442e-93d3-60124767d096 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1805.804464] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 2e156908-c313-4229-840d-13ed8e6d4074 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1805.804587] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 2d7d3386-9854-4bf1-a680-5aed0a2329cb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1805.804704] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance fa24a4a8-895c-4ea6-8e0a-4ed1134beff0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1805.804820] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 04d3ae51-f3f1-427b-ae45-279b02e4b3e6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1805.804936] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance aedbfde6-26e1-410d-a311-e2c344f65062 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1805.805088] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance c05c3ec2-a68d-41b0-a199-fcfc84bb2deb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1805.805300] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 9435574d-2128-4b20-ba92-ee2aba37d33b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1805.805499] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 6663b166-0d24-45a7-8c2c-e4e68dbe0005 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1805.816931] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 115d6c00-4259-4e87-aa00-90b576a63535 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1805.827401] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 220daf5b-b4fd-49b0-9098-c1f846d6e552 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1805.827626] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67169) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1805.827772] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67169) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1805.970999] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7073a48c-1b8f-46b2-81b9-e7388e6e18b1 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1805.978589] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-40273e40-162a-4069-b5d9-34e2f8a31934 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1806.008389] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-039a131d-2a18-450e-b793-18eccf69aea9 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1806.015798] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1887cfe4-99a8-4f0c-af61-f4c9c54d72c4 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1806.029120] env[67169]: DEBUG nova.compute.provider_tree [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Inventory has 
not changed in ProviderTree for provider: 6570906a-ac37-4859-b1f2-4bbacc48d3f3 {{(pid=67169) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1806.035167] env[67169]: DEBUG nova.compute.manager [req-613f3c2c-ad84-4a4f-b9e2-c41c52178c80 req-e28576c8-cfd2-4105-a40a-77f23af489b3 service nova] [instance: 6663b166-0d24-45a7-8c2c-e4e68dbe0005] Received event network-changed-d87d9719-c5f6-4ac6-bd6a-31c3210d90f6 {{(pid=67169) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1806.035364] env[67169]: DEBUG nova.compute.manager [req-613f3c2c-ad84-4a4f-b9e2-c41c52178c80 req-e28576c8-cfd2-4105-a40a-77f23af489b3 service nova] [instance: 6663b166-0d24-45a7-8c2c-e4e68dbe0005] Refreshing instance network info cache due to event network-changed-d87d9719-c5f6-4ac6-bd6a-31c3210d90f6. {{(pid=67169) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1806.035576] env[67169]: DEBUG oslo_concurrency.lockutils [req-613f3c2c-ad84-4a4f-b9e2-c41c52178c80 req-e28576c8-cfd2-4105-a40a-77f23af489b3 service nova] Acquiring lock "refresh_cache-6663b166-0d24-45a7-8c2c-e4e68dbe0005" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1806.035716] env[67169]: DEBUG oslo_concurrency.lockutils [req-613f3c2c-ad84-4a4f-b9e2-c41c52178c80 req-e28576c8-cfd2-4105-a40a-77f23af489b3 service nova] Acquired lock "refresh_cache-6663b166-0d24-45a7-8c2c-e4e68dbe0005" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1806.035930] env[67169]: DEBUG nova.network.neutron [req-613f3c2c-ad84-4a4f-b9e2-c41c52178c80 req-e28576c8-cfd2-4105-a40a-77f23af489b3 service nova] [instance: 6663b166-0d24-45a7-8c2c-e4e68dbe0005] Refreshing network info cache for port d87d9719-c5f6-4ac6-bd6a-31c3210d90f6 {{(pid=67169) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1806.039105] env[67169]: DEBUG nova.scheduler.client.report [None 
req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Inventory has not changed for provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 171, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67169) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1806.052610] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67169) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1806.052848] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.322s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1806.331846] env[67169]: DEBUG nova.network.neutron [req-613f3c2c-ad84-4a4f-b9e2-c41c52178c80 req-e28576c8-cfd2-4105-a40a-77f23af489b3 service nova] [instance: 6663b166-0d24-45a7-8c2c-e4e68dbe0005] Updated VIF entry in instance network info cache for port d87d9719-c5f6-4ac6-bd6a-31c3210d90f6. 
{{(pid=67169) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1806.332269] env[67169]: DEBUG nova.network.neutron [req-613f3c2c-ad84-4a4f-b9e2-c41c52178c80 req-e28576c8-cfd2-4105-a40a-77f23af489b3 service nova] [instance: 6663b166-0d24-45a7-8c2c-e4e68dbe0005] Updating instance_info_cache with network_info: [{"id": "d87d9719-c5f6-4ac6-bd6a-31c3210d90f6", "address": "fa:16:3e:ce:5c:12", "network": {"id": "31b18cd6-ff93-43b5-9bb5-a32a622f1604", "bridge": "br-int", "label": "tempest-InstanceActionsNegativeTestJSON-2075352683-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "270103faa5c446cd8db3a5908a04a3eb", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "359c2c31-99c4-41d7-a513-3bc4825897a0", "external-id": "nsx-vlan-transportzone-173", "segmentation_id": 173, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd87d9719-c5", "ovs_interfaceid": "d87d9719-c5f6-4ac6-bd6a-31c3210d90f6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1806.341583] env[67169]: DEBUG oslo_concurrency.lockutils [req-613f3c2c-ad84-4a4f-b9e2-c41c52178c80 req-e28576c8-cfd2-4105-a40a-77f23af489b3 service nova] Releasing lock "refresh_cache-6663b166-0d24-45a7-8c2c-e4e68dbe0005" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1809.053906] env[67169]: DEBUG 
oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1812.382356] env[67169]: DEBUG oslo_concurrency.lockutils [None req-0dac4879-9808-4647-9123-8b378a5e4530 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Acquiring lock "68b94a43-eaa5-4023-8bf5-8cc647c2f098" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1812.382735] env[67169]: DEBUG oslo_concurrency.lockutils [None req-0dac4879-9808-4647-9123-8b378a5e4530 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Lock "68b94a43-eaa5-4023-8bf5-8cc647c2f098" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1846.494510] env[67169]: DEBUG oslo_concurrency.lockutils [None req-7caf859a-096e-4f39-81d0-68bcaa845ae8 tempest-InstanceActionsNegativeTestJSON-1517850368 tempest-InstanceActionsNegativeTestJSON-1517850368-project-member] Acquiring lock "6663b166-0d24-45a7-8c2c-e4e68dbe0005" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1850.775789] env[67169]: WARNING oslo_vmware.rw_handles [None req-5ef31fa3-d243-488f-9847-4509615e7a51 tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 
1850.775789] env[67169]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1850.775789] env[67169]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1850.775789] env[67169]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1850.775789] env[67169]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1850.775789] env[67169]: ERROR oslo_vmware.rw_handles response.begin() [ 1850.775789] env[67169]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1850.775789] env[67169]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1850.775789] env[67169]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1850.775789] env[67169]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1850.775789] env[67169]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1850.775789] env[67169]: ERROR oslo_vmware.rw_handles [ 1850.776578] env[67169]: DEBUG nova.virt.vmwareapi.images [None req-5ef31fa3-d243-488f-9847-4509615e7a51 tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: 48376572-9e3a-4579-b2d7-b8b63312fab1] Downloaded image file data 285931c9-8b83-4997-8c4d-6a79005e36ba to vmware_temp/a4a4ee95-b009-400c-8ee7-10784e088f5c/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk on the data store datastore2 {{(pid=67169) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1850.778203] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-5ef31fa3-d243-488f-9847-4509615e7a51 tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: 48376572-9e3a-4579-b2d7-b8b63312fab1] Caching image 
{{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1850.778450] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [None req-5ef31fa3-d243-488f-9847-4509615e7a51 tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Copying Virtual Disk [datastore2] vmware_temp/a4a4ee95-b009-400c-8ee7-10784e088f5c/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk to [datastore2] vmware_temp/a4a4ee95-b009-400c-8ee7-10784e088f5c/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk {{(pid=67169) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1850.778736] env[67169]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-ba7d10d4-a542-4d79-b531-064497234b5c {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1850.786708] env[67169]: DEBUG oslo_vmware.api [None req-5ef31fa3-d243-488f-9847-4509615e7a51 tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Waiting for the task: (returnval){ [ 1850.786708] env[67169]: value = "task-2819240" [ 1850.786708] env[67169]: _type = "Task" [ 1850.786708] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1850.794412] env[67169]: DEBUG oslo_vmware.api [None req-5ef31fa3-d243-488f-9847-4509615e7a51 tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Task: {'id': task-2819240, 'name': CopyVirtualDisk_Task} progress is 0%. 
{{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1851.297597] env[67169]: DEBUG oslo_vmware.exceptions [None req-5ef31fa3-d243-488f-9847-4509615e7a51 tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Fault InvalidArgument not matched. {{(pid=67169) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1851.297947] env[67169]: DEBUG oslo_concurrency.lockutils [None req-5ef31fa3-d243-488f-9847-4509615e7a51 tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Releasing lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1851.298526] env[67169]: ERROR nova.compute.manager [None req-5ef31fa3-d243-488f-9847-4509615e7a51 tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: 48376572-9e3a-4579-b2d7-b8b63312fab1] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1851.298526] env[67169]: Faults: ['InvalidArgument'] [ 1851.298526] env[67169]: ERROR nova.compute.manager [instance: 48376572-9e3a-4579-b2d7-b8b63312fab1] Traceback (most recent call last): [ 1851.298526] env[67169]: ERROR nova.compute.manager [instance: 48376572-9e3a-4579-b2d7-b8b63312fab1] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1851.298526] env[67169]: ERROR nova.compute.manager [instance: 48376572-9e3a-4579-b2d7-b8b63312fab1] yield resources [ 1851.298526] env[67169]: ERROR nova.compute.manager [instance: 48376572-9e3a-4579-b2d7-b8b63312fab1] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1851.298526] env[67169]: ERROR 
nova.compute.manager [instance: 48376572-9e3a-4579-b2d7-b8b63312fab1] self.driver.spawn(context, instance, image_meta, [ 1851.298526] env[67169]: ERROR nova.compute.manager [instance: 48376572-9e3a-4579-b2d7-b8b63312fab1] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1851.298526] env[67169]: ERROR nova.compute.manager [instance: 48376572-9e3a-4579-b2d7-b8b63312fab1] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1851.298526] env[67169]: ERROR nova.compute.manager [instance: 48376572-9e3a-4579-b2d7-b8b63312fab1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1851.298526] env[67169]: ERROR nova.compute.manager [instance: 48376572-9e3a-4579-b2d7-b8b63312fab1] self._fetch_image_if_missing(context, vi) [ 1851.298526] env[67169]: ERROR nova.compute.manager [instance: 48376572-9e3a-4579-b2d7-b8b63312fab1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1851.298526] env[67169]: ERROR nova.compute.manager [instance: 48376572-9e3a-4579-b2d7-b8b63312fab1] image_cache(vi, tmp_image_ds_loc) [ 1851.298526] env[67169]: ERROR nova.compute.manager [instance: 48376572-9e3a-4579-b2d7-b8b63312fab1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1851.298526] env[67169]: ERROR nova.compute.manager [instance: 48376572-9e3a-4579-b2d7-b8b63312fab1] vm_util.copy_virtual_disk( [ 1851.298526] env[67169]: ERROR nova.compute.manager [instance: 48376572-9e3a-4579-b2d7-b8b63312fab1] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1851.298526] env[67169]: ERROR nova.compute.manager [instance: 48376572-9e3a-4579-b2d7-b8b63312fab1] session._wait_for_task(vmdk_copy_task) [ 1851.298526] env[67169]: ERROR nova.compute.manager [instance: 48376572-9e3a-4579-b2d7-b8b63312fab1] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1851.298526] env[67169]: ERROR 
nova.compute.manager [instance: 48376572-9e3a-4579-b2d7-b8b63312fab1] return self.wait_for_task(task_ref) [ 1851.298526] env[67169]: ERROR nova.compute.manager [instance: 48376572-9e3a-4579-b2d7-b8b63312fab1] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1851.298526] env[67169]: ERROR nova.compute.manager [instance: 48376572-9e3a-4579-b2d7-b8b63312fab1] return evt.wait() [ 1851.298526] env[67169]: ERROR nova.compute.manager [instance: 48376572-9e3a-4579-b2d7-b8b63312fab1] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1851.298526] env[67169]: ERROR nova.compute.manager [instance: 48376572-9e3a-4579-b2d7-b8b63312fab1] result = hub.switch() [ 1851.298526] env[67169]: ERROR nova.compute.manager [instance: 48376572-9e3a-4579-b2d7-b8b63312fab1] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1851.298526] env[67169]: ERROR nova.compute.manager [instance: 48376572-9e3a-4579-b2d7-b8b63312fab1] return self.greenlet.switch() [ 1851.298526] env[67169]: ERROR nova.compute.manager [instance: 48376572-9e3a-4579-b2d7-b8b63312fab1] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1851.298526] env[67169]: ERROR nova.compute.manager [instance: 48376572-9e3a-4579-b2d7-b8b63312fab1] self.f(*self.args, **self.kw) [ 1851.298526] env[67169]: ERROR nova.compute.manager [instance: 48376572-9e3a-4579-b2d7-b8b63312fab1] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1851.298526] env[67169]: ERROR nova.compute.manager [instance: 48376572-9e3a-4579-b2d7-b8b63312fab1] raise exceptions.translate_fault(task_info.error) [ 1851.298526] env[67169]: ERROR nova.compute.manager [instance: 48376572-9e3a-4579-b2d7-b8b63312fab1] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 
1851.298526] env[67169]: ERROR nova.compute.manager [instance: 48376572-9e3a-4579-b2d7-b8b63312fab1] Faults: ['InvalidArgument'] [ 1851.298526] env[67169]: ERROR nova.compute.manager [instance: 48376572-9e3a-4579-b2d7-b8b63312fab1] [ 1851.299588] env[67169]: INFO nova.compute.manager [None req-5ef31fa3-d243-488f-9847-4509615e7a51 tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: 48376572-9e3a-4579-b2d7-b8b63312fab1] Terminating instance [ 1851.300766] env[67169]: DEBUG oslo_concurrency.lockutils [None req-2bf3268b-6d3a-4277-831a-07b32245375f tempest-ServersTestMultiNic-730437419 tempest-ServersTestMultiNic-730437419-project-member] Acquired lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1851.300766] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-2bf3268b-6d3a-4277-831a-07b32245375f tempest-ServersTestMultiNic-730437419 tempest-ServersTestMultiNic-730437419-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1851.300914] env[67169]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-2e83a8d1-859e-4058-bdfa-da25132e0ff7 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1851.303420] env[67169]: DEBUG nova.compute.manager [None req-5ef31fa3-d243-488f-9847-4509615e7a51 tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: 48376572-9e3a-4579-b2d7-b8b63312fab1] Start destroying the instance on the hypervisor. 
{{(pid=67169) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1851.303578] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-5ef31fa3-d243-488f-9847-4509615e7a51 tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: 48376572-9e3a-4579-b2d7-b8b63312fab1] Destroying instance {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1851.304198] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4057aaa7-8cb6-4355-95da-3804adca6926 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1851.310876] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-5ef31fa3-d243-488f-9847-4509615e7a51 tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: 48376572-9e3a-4579-b2d7-b8b63312fab1] Unregistering the VM {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1851.311093] env[67169]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-0ef61eef-4331-42d3-a6f8-67e0bfcf94d7 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1851.313224] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-2bf3268b-6d3a-4277-831a-07b32245375f tempest-ServersTestMultiNic-730437419 tempest-ServersTestMultiNic-730437419-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1851.313398] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-2bf3268b-6d3a-4277-831a-07b32245375f tempest-ServersTestMultiNic-730437419 tempest-ServersTestMultiNic-730437419-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=67169) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1851.314354] env[67169]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-1ab6e472-6730-45bf-b46a-85bf0586d2d4 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1851.319132] env[67169]: DEBUG oslo_vmware.api [None req-2bf3268b-6d3a-4277-831a-07b32245375f tempest-ServersTestMultiNic-730437419 tempest-ServersTestMultiNic-730437419-project-member] Waiting for the task: (returnval){ [ 1851.319132] env[67169]: value = "session[52518c09-e5f2-035d-4dc1-3e29ddf94015]52a58ffa-3df1-6939-245c-e8d860aa035d" [ 1851.319132] env[67169]: _type = "Task" [ 1851.319132] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1851.325922] env[67169]: DEBUG oslo_vmware.api [None req-2bf3268b-6d3a-4277-831a-07b32245375f tempest-ServersTestMultiNic-730437419 tempest-ServersTestMultiNic-730437419-project-member] Task: {'id': session[52518c09-e5f2-035d-4dc1-3e29ddf94015]52a58ffa-3df1-6939-245c-e8d860aa035d, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1851.378265] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-5ef31fa3-d243-488f-9847-4509615e7a51 tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: 48376572-9e3a-4579-b2d7-b8b63312fab1] Unregistered the VM {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1851.378570] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-5ef31fa3-d243-488f-9847-4509615e7a51 tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: 48376572-9e3a-4579-b2d7-b8b63312fab1] Deleting contents of the VM from datastore datastore2 {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1851.378676] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-5ef31fa3-d243-488f-9847-4509615e7a51 tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Deleting the datastore file [datastore2] 48376572-9e3a-4579-b2d7-b8b63312fab1 {{(pid=67169) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1851.378922] env[67169]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-42bf16f6-25c6-443d-811b-98fb897fcfa0 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1851.384586] env[67169]: DEBUG oslo_vmware.api [None req-5ef31fa3-d243-488f-9847-4509615e7a51 tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Waiting for the task: (returnval){ [ 1851.384586] env[67169]: value = "task-2819242" [ 1851.384586] env[67169]: _type = "Task" [ 1851.384586] env[67169]: } to complete. 
{{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1851.391777] env[67169]: DEBUG oslo_vmware.api [None req-5ef31fa3-d243-488f-9847-4509615e7a51 tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Task: {'id': task-2819242, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1851.829037] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-2bf3268b-6d3a-4277-831a-07b32245375f tempest-ServersTestMultiNic-730437419 tempest-ServersTestMultiNic-730437419-project-member] [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] Preparing fetch location {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1851.829379] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-2bf3268b-6d3a-4277-831a-07b32245375f tempest-ServersTestMultiNic-730437419 tempest-ServersTestMultiNic-730437419-project-member] Creating directory with path [datastore2] vmware_temp/70ceae6a-5cb7-437d-b254-710ce0e12331/285931c9-8b83-4997-8c4d-6a79005e36ba {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1851.829564] env[67169]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-f668f563-3fce-411d-9073-97abedb2671e {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1851.840957] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-2bf3268b-6d3a-4277-831a-07b32245375f tempest-ServersTestMultiNic-730437419 tempest-ServersTestMultiNic-730437419-project-member] Created directory with path [datastore2] vmware_temp/70ceae6a-5cb7-437d-b254-710ce0e12331/285931c9-8b83-4997-8c4d-6a79005e36ba {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1851.841155] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None 
req-2bf3268b-6d3a-4277-831a-07b32245375f tempest-ServersTestMultiNic-730437419 tempest-ServersTestMultiNic-730437419-project-member] [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] Fetch image to [datastore2] vmware_temp/70ceae6a-5cb7-437d-b254-710ce0e12331/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1851.841324] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-2bf3268b-6d3a-4277-831a-07b32245375f tempest-ServersTestMultiNic-730437419 tempest-ServersTestMultiNic-730437419-project-member] [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] Downloading image file data 285931c9-8b83-4997-8c4d-6a79005e36ba to [datastore2] vmware_temp/70ceae6a-5cb7-437d-b254-710ce0e12331/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk on the data store datastore2 {{(pid=67169) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1851.842228] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4ac57ea3-7dff-4c62-965f-a6b03e50c753 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1851.848288] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3996b3bf-a452-484e-b0e8-1ce9040f9289 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1851.857051] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0b3c90c7-ca3e-4ef7-ad51-5228519396ba {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1851.890861] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4ef11180-9ca2-44ab-a90e-6770ebb8f4e7 {{(pid=67169) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1851.897399] env[67169]: DEBUG oslo_vmware.api [None req-5ef31fa3-d243-488f-9847-4509615e7a51 tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Task: {'id': task-2819242, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.07875} completed successfully. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1851.898800] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-5ef31fa3-d243-488f-9847-4509615e7a51 tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Deleted the datastore file {{(pid=67169) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1851.898992] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-5ef31fa3-d243-488f-9847-4509615e7a51 tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: 48376572-9e3a-4579-b2d7-b8b63312fab1] Deleted contents of the VM from datastore datastore2 {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1851.899177] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-5ef31fa3-d243-488f-9847-4509615e7a51 tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: 48376572-9e3a-4579-b2d7-b8b63312fab1] Instance destroyed {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1851.899399] env[67169]: INFO nova.compute.manager [None req-5ef31fa3-d243-488f-9847-4509615e7a51 tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: 48376572-9e3a-4579-b2d7-b8b63312fab1] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 1851.901354] env[67169]: DEBUG nova.compute.claims [None req-5ef31fa3-d243-488f-9847-4509615e7a51 tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: 48376572-9e3a-4579-b2d7-b8b63312fab1] Aborting claim: {{(pid=67169) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1851.901523] env[67169]: DEBUG oslo_concurrency.lockutils [None req-5ef31fa3-d243-488f-9847-4509615e7a51 tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1851.901733] env[67169]: DEBUG oslo_concurrency.lockutils [None req-5ef31fa3-d243-488f-9847-4509615e7a51 tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1851.904200] env[67169]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-e3d957a3-fc02-4860-971b-6e29fa93c4f7 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1851.924331] env[67169]: DEBUG nova.virt.vmwareapi.images [None req-2bf3268b-6d3a-4277-831a-07b32245375f tempest-ServersTestMultiNic-730437419 tempest-ServersTestMultiNic-730437419-project-member] [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] Downloading image file data 285931c9-8b83-4997-8c4d-6a79005e36ba to the data store datastore2 {{(pid=67169) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1852.059298] env[67169]: DEBUG oslo_concurrency.lockutils [None 
req-2bf3268b-6d3a-4277-831a-07b32245375f tempest-ServersTestMultiNic-730437419 tempest-ServersTestMultiNic-730437419-project-member] Releasing lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1852.060101] env[67169]: ERROR nova.compute.manager [None req-2bf3268b-6d3a-4277-831a-07b32245375f tempest-ServersTestMultiNic-730437419 tempest-ServersTestMultiNic-730437419-project-member] [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image 285931c9-8b83-4997-8c4d-6a79005e36ba. [ 1852.060101] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] Traceback (most recent call last): [ 1852.060101] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1852.060101] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1852.060101] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1852.060101] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] result = getattr(controller, method)(*args, **kwargs) [ 1852.060101] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get [ 1852.060101] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] return self._get(image_id) [ 1852.060101] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner [ 1852.060101] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1852.060101] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get [ 1852.060101] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] resp, body = self.http_client.get(url, headers=header) [ 1852.060101] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] File "/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 395, in get [ 1852.060101] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] return self.request(url, 'GET', **kwargs) [ 1852.060101] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request [ 1852.060101] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] return self._handle_response(resp) [ 1852.060101] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1852.060101] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] raise exc.from_response(resp, resp.content) [ 1852.060101] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. 
Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. [ 1852.060101] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] [ 1852.060101] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] During handling of the above exception, another exception occurred: [ 1852.060101] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] [ 1852.060101] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] Traceback (most recent call last): [ 1852.060101] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1852.060101] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] yield resources [ 1852.060101] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1852.060101] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] self.driver.spawn(context, instance, image_meta, [ 1852.060101] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1852.060101] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1852.060101] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1852.060101] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] self._fetch_image_if_missing(context, vi) [ 1852.061634] env[67169]: ERROR 
nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing
[ 1852.061634] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]     image_fetch(context, vi, tmp_image_ds_loc)
[ 1852.061634] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file
[ 1852.061634] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]     images.fetch_image(
[ 1852.061634] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]   File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image
[ 1852.061634] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]     metadata = IMAGE_API.get(context, image_ref)
[ 1852.061634] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]   File "/opt/stack/nova/nova/image/glance.py", line 1205, in get
[ 1852.061634] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]     return session.show(context, image_id,
[ 1852.061634] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]   File "/opt/stack/nova/nova/image/glance.py", line 287, in show
[ 1852.061634] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]     _reraise_translated_image_exception(image_id)
[ 1852.061634] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]   File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception
[ 1852.061634] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]     raise new_exc.with_traceback(exc_trace)
[ 1852.061634] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]   File "/opt/stack/nova/nova/image/glance.py", line 285, in show
[ 1852.061634] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]     image = self._client.call(context, 2, 'get', args=(image_id,))
[ 1852.061634] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]   File "/opt/stack/nova/nova/image/glance.py", line 191, in call
[ 1852.061634] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]     result = getattr(controller, method)(*args, **kwargs)
[ 1852.061634] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]   File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get
[ 1852.061634] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]     return self._get(image_id)
[ 1852.061634] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]   File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner
[ 1852.061634] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]     return RequestIdProxy(wrapped(*args, **kwargs))
[ 1852.061634] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]   File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get
[ 1852.061634] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]     resp, body = self.http_client.get(url, headers=header)
[ 1852.061634] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]   File "/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 395, in get
[ 1852.061634] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]     return self.request(url, 'GET', **kwargs)
[ 1852.061634] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]   File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request
[ 1852.061634] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]     return self._handle_response(resp)
[ 1852.061634] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]   File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response
[ 1852.061634] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]     raise exc.from_response(resp, resp.content)
[ 1852.061634] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] nova.exception.ImageNotAuthorized: Not authorized for image 285931c9-8b83-4997-8c4d-6a79005e36ba.
[ 1852.061634] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]
[ 1852.061634] env[67169]: INFO nova.compute.manager [None req-2bf3268b-6d3a-4277-831a-07b32245375f tempest-ServersTestMultiNic-730437419 tempest-ServersTestMultiNic-730437419-project-member] [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] Terminating instance
[ 1852.063276] env[67169]: DEBUG oslo_concurrency.lockutils [None req-1f74b6c8-d5a6-4795-bf0a-f9fc6bd88668 tempest-ServersNegativeTestJSON-1601465401 tempest-ServersNegativeTestJSON-1601465401-project-member] Acquired lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1852.063276] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-1f74b6c8-d5a6-4795-bf0a-f9fc6bd88668 tempest-ServersNegativeTestJSON-1601465401 tempest-ServersNegativeTestJSON-1601465401-project-member] Creating directory with path [datastore2]
devstack-image-cache_base {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 1852.063276] env[67169]: DEBUG nova.compute.manager [None req-2bf3268b-6d3a-4277-831a-07b32245375f tempest-ServersTestMultiNic-730437419 tempest-ServersTestMultiNic-730437419-project-member] [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] Start destroying the instance on the hypervisor. {{(pid=67169) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}}
[ 1852.063276] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-2bf3268b-6d3a-4277-831a-07b32245375f tempest-ServersTestMultiNic-730437419 tempest-ServersTestMultiNic-730437419-project-member] [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] Destroying instance {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 1852.063276] env[67169]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-cea89660-49e4-4d48-be3d-9ea72bb01a19 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1852.066014] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fc734810-339c-42b0-a616-8655fac8b5e3 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1852.075230] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-2bf3268b-6d3a-4277-831a-07b32245375f tempest-ServersTestMultiNic-730437419 tempest-ServersTestMultiNic-730437419-project-member] [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] Unregistering the VM {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}}
[ 1852.075517] env[67169]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-be070457-343c-42dc-853d-b69b9536e69f {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1852.077815] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-1f74b6c8-d5a6-4795-bf0a-f9fc6bd88668 tempest-ServersNegativeTestJSON-1601465401 tempest-ServersNegativeTestJSON-1601465401-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 1852.077991] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-1f74b6c8-d5a6-4795-bf0a-f9fc6bd88668 tempest-ServersNegativeTestJSON-1601465401 tempest-ServersNegativeTestJSON-1601465401-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=67169) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}}
[ 1852.081202] env[67169]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-7d3f16a8-9423-4eaf-ace8-696adf894ea5 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1852.086386] env[67169]: DEBUG oslo_vmware.api [None req-1f74b6c8-d5a6-4795-bf0a-f9fc6bd88668 tempest-ServersNegativeTestJSON-1601465401 tempest-ServersNegativeTestJSON-1601465401-project-member] Waiting for the task: (returnval){
[ 1852.086386] env[67169]: value = "session[52518c09-e5f2-035d-4dc1-3e29ddf94015]520b7fdb-852d-4ee2-ae17-b4d6c41c2f60"
[ 1852.086386] env[67169]: _type = "Task"
[ 1852.086386] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1852.093638] env[67169]: DEBUG oslo_vmware.api [None req-1f74b6c8-d5a6-4795-bf0a-f9fc6bd88668 tempest-ServersNegativeTestJSON-1601465401 tempest-ServersNegativeTestJSON-1601465401-project-member] Task: {'id': session[52518c09-e5f2-035d-4dc1-3e29ddf94015]520b7fdb-852d-4ee2-ae17-b4d6c41c2f60, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1852.116274] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7ac3229c-1915-4c70-ae8b-2867e82e52de {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1852.123396] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-526dba14-39e4-4848-9366-0f2a85382ed9 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1852.154589] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0d0f116c-81ca-47d7-aaaf-f131a1298fec {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1852.162297] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f5dd13a7-25a1-444c-9dd8-c237ae65393a {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1852.166171] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-2bf3268b-6d3a-4277-831a-07b32245375f tempest-ServersTestMultiNic-730437419 tempest-ServersTestMultiNic-730437419-project-member] [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] Unregistered the VM {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}}
[ 1852.166375] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-2bf3268b-6d3a-4277-831a-07b32245375f tempest-ServersTestMultiNic-730437419 tempest-ServersTestMultiNic-730437419-project-member] [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] Deleting contents of the VM from datastore datastore2 {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}}
[ 1852.166553] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None
req-2bf3268b-6d3a-4277-831a-07b32245375f tempest-ServersTestMultiNic-730437419 tempest-ServersTestMultiNic-730437419-project-member] Deleting the datastore file [datastore2] 7b7c8f84-c2d4-442e-93d3-60124767d096 {{(pid=67169) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}}
[ 1852.166768] env[67169]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-5e3fb008-70db-434a-80cc-d86562aa8748 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1852.177638] env[67169]: DEBUG nova.compute.provider_tree [None req-5ef31fa3-d243-488f-9847-4509615e7a51 tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Inventory has not changed in ProviderTree for provider: 6570906a-ac37-4859-b1f2-4bbacc48d3f3 {{(pid=67169) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1852.179683] env[67169]: DEBUG oslo_vmware.api [None req-2bf3268b-6d3a-4277-831a-07b32245375f tempest-ServersTestMultiNic-730437419 tempest-ServersTestMultiNic-730437419-project-member] Waiting for the task: (returnval){
[ 1852.179683] env[67169]: value = "task-2819244"
[ 1852.179683] env[67169]: _type = "Task"
[ 1852.179683] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1852.186756] env[67169]: DEBUG oslo_vmware.api [None req-2bf3268b-6d3a-4277-831a-07b32245375f tempest-ServersTestMultiNic-730437419 tempest-ServersTestMultiNic-730437419-project-member] Task: {'id': task-2819244, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1852.188070] env[67169]: DEBUG nova.scheduler.client.report [None req-5ef31fa3-d243-488f-9847-4509615e7a51 tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Inventory has not changed for provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 171, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67169) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1852.201761] env[67169]: DEBUG oslo_concurrency.lockutils [None req-5ef31fa3-d243-488f-9847-4509615e7a51 tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.300s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1852.202294] env[67169]: ERROR nova.compute.manager [None req-5ef31fa3-d243-488f-9847-4509615e7a51 tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: 48376572-9e3a-4579-b2d7-b8b63312fab1] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1852.202294] env[67169]: Faults: ['InvalidArgument']
[ 1852.202294] env[67169]: ERROR nova.compute.manager [instance: 48376572-9e3a-4579-b2d7-b8b63312fab1] Traceback (most recent call last):
[ 1852.202294] env[67169]: ERROR nova.compute.manager [instance: 48376572-9e3a-4579-b2d7-b8b63312fab1]   File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 1852.202294] env[67169]: ERROR nova.compute.manager [instance: 48376572-9e3a-4579-b2d7-b8b63312fab1]     self.driver.spawn(context, instance, image_meta,
[ 1852.202294] env[67169]: ERROR nova.compute.manager [instance: 48376572-9e3a-4579-b2d7-b8b63312fab1]   File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 1852.202294] env[67169]: ERROR nova.compute.manager [instance: 48376572-9e3a-4579-b2d7-b8b63312fab1]     self._vmops.spawn(context, instance, image_meta, injected_files,
[ 1852.202294] env[67169]: ERROR nova.compute.manager [instance: 48376572-9e3a-4579-b2d7-b8b63312fab1]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 1852.202294] env[67169]: ERROR nova.compute.manager [instance: 48376572-9e3a-4579-b2d7-b8b63312fab1]     self._fetch_image_if_missing(context, vi)
[ 1852.202294] env[67169]: ERROR nova.compute.manager [instance: 48376572-9e3a-4579-b2d7-b8b63312fab1]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 1852.202294] env[67169]: ERROR nova.compute.manager [instance: 48376572-9e3a-4579-b2d7-b8b63312fab1]     image_cache(vi, tmp_image_ds_loc)
[ 1852.202294] env[67169]: ERROR nova.compute.manager [instance: 48376572-9e3a-4579-b2d7-b8b63312fab1]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 1852.202294] env[67169]: ERROR nova.compute.manager [instance: 48376572-9e3a-4579-b2d7-b8b63312fab1]     vm_util.copy_virtual_disk(
[ 1852.202294] env[67169]: ERROR nova.compute.manager [instance: 48376572-9e3a-4579-b2d7-b8b63312fab1]   File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 1852.202294] env[67169]: ERROR nova.compute.manager [instance: 48376572-9e3a-4579-b2d7-b8b63312fab1]     session._wait_for_task(vmdk_copy_task)
[ 1852.202294] env[67169]: ERROR nova.compute.manager [instance: 48376572-9e3a-4579-b2d7-b8b63312fab1]   File
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 1852.202294] env[67169]: ERROR nova.compute.manager [instance: 48376572-9e3a-4579-b2d7-b8b63312fab1]     return self.wait_for_task(task_ref)
[ 1852.202294] env[67169]: ERROR nova.compute.manager [instance: 48376572-9e3a-4579-b2d7-b8b63312fab1]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 1852.202294] env[67169]: ERROR nova.compute.manager [instance: 48376572-9e3a-4579-b2d7-b8b63312fab1]     return evt.wait()
[ 1852.202294] env[67169]: ERROR nova.compute.manager [instance: 48376572-9e3a-4579-b2d7-b8b63312fab1]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 1852.202294] env[67169]: ERROR nova.compute.manager [instance: 48376572-9e3a-4579-b2d7-b8b63312fab1]     result = hub.switch()
[ 1852.202294] env[67169]: ERROR nova.compute.manager [instance: 48376572-9e3a-4579-b2d7-b8b63312fab1]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 1852.202294] env[67169]: ERROR nova.compute.manager [instance: 48376572-9e3a-4579-b2d7-b8b63312fab1]     return self.greenlet.switch()
[ 1852.202294] env[67169]: ERROR nova.compute.manager [instance: 48376572-9e3a-4579-b2d7-b8b63312fab1]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 1852.202294] env[67169]: ERROR nova.compute.manager [instance: 48376572-9e3a-4579-b2d7-b8b63312fab1]     self.f(*self.args, **self.kw)
[ 1852.202294] env[67169]: ERROR nova.compute.manager [instance: 48376572-9e3a-4579-b2d7-b8b63312fab1]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 1852.202294] env[67169]: ERROR nova.compute.manager [instance: 48376572-9e3a-4579-b2d7-b8b63312fab1]     raise exceptions.translate_fault(task_info.error)
[ 1852.202294] env[67169]: ERROR nova.compute.manager [instance: 48376572-9e3a-4579-b2d7-b8b63312fab1] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1852.202294] env[67169]: ERROR nova.compute.manager [instance: 48376572-9e3a-4579-b2d7-b8b63312fab1] Faults: ['InvalidArgument']
[ 1852.202294] env[67169]: ERROR nova.compute.manager [instance: 48376572-9e3a-4579-b2d7-b8b63312fab1]
[ 1852.203562] env[67169]: DEBUG nova.compute.utils [None req-5ef31fa3-d243-488f-9847-4509615e7a51 tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: 48376572-9e3a-4579-b2d7-b8b63312fab1] VimFaultException {{(pid=67169) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 1852.204570] env[67169]: DEBUG nova.compute.manager [None req-5ef31fa3-d243-488f-9847-4509615e7a51 tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: 48376572-9e3a-4579-b2d7-b8b63312fab1] Build of instance 48376572-9e3a-4579-b2d7-b8b63312fab1 was re-scheduled: A specified parameter was not correct: fileType
[ 1852.204570] env[67169]: Faults: ['InvalidArgument'] {{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}}
[ 1852.204946] env[67169]: DEBUG nova.compute.manager [None req-5ef31fa3-d243-488f-9847-4509615e7a51 tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: 48376572-9e3a-4579-b2d7-b8b63312fab1] Unplugging VIFs for instance {{(pid=67169) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}}
[ 1852.205128] env[67169]: DEBUG nova.compute.manager [None req-5ef31fa3-d243-488f-9847-4509615e7a51 tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=67169) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}}
[ 1852.205297] env[67169]: DEBUG nova.compute.manager [None req-5ef31fa3-d243-488f-9847-4509615e7a51 tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: 48376572-9e3a-4579-b2d7-b8b63312fab1] Deallocating network for instance {{(pid=67169) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}}
[ 1852.205459] env[67169]: DEBUG nova.network.neutron [None req-5ef31fa3-d243-488f-9847-4509615e7a51 tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: 48376572-9e3a-4579-b2d7-b8b63312fab1] deallocate_for_instance() {{(pid=67169) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 1852.529330] env[67169]: DEBUG nova.network.neutron [None req-5ef31fa3-d243-488f-9847-4509615e7a51 tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: 48376572-9e3a-4579-b2d7-b8b63312fab1] Updating instance_info_cache with network_info: [] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1852.546775] env[67169]: INFO nova.compute.manager [None req-5ef31fa3-d243-488f-9847-4509615e7a51 tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: 48376572-9e3a-4579-b2d7-b8b63312fab1] Took 0.34 seconds to deallocate network for instance.
[ 1852.601558] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-1f74b6c8-d5a6-4795-bf0a-f9fc6bd88668 tempest-ServersNegativeTestJSON-1601465401 tempest-ServersNegativeTestJSON-1601465401-project-member] [instance: 2e156908-c313-4229-840d-13ed8e6d4074] Preparing fetch location {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}}
[ 1852.602311] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-1f74b6c8-d5a6-4795-bf0a-f9fc6bd88668 tempest-ServersNegativeTestJSON-1601465401 tempest-ServersNegativeTestJSON-1601465401-project-member] Creating directory with path [datastore2] vmware_temp/89170a22-a055-49f9-952e-85d7a22a5fb7/285931c9-8b83-4997-8c4d-6a79005e36ba {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 1852.602311] env[67169]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-369a2d07-9496-4149-8d01-86e1a48e00e1 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1852.613128] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-1f74b6c8-d5a6-4795-bf0a-f9fc6bd88668 tempest-ServersNegativeTestJSON-1601465401 tempest-ServersNegativeTestJSON-1601465401-project-member] Created directory with path [datastore2] vmware_temp/89170a22-a055-49f9-952e-85d7a22a5fb7/285931c9-8b83-4997-8c4d-6a79005e36ba {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 1852.613331] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-1f74b6c8-d5a6-4795-bf0a-f9fc6bd88668 tempest-ServersNegativeTestJSON-1601465401 tempest-ServersNegativeTestJSON-1601465401-project-member] [instance: 2e156908-c313-4229-840d-13ed8e6d4074] Fetch image to [datastore2] vmware_temp/89170a22-a055-49f9-952e-85d7a22a5fb7/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}}
[ 1852.613504] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-1f74b6c8-d5a6-4795-bf0a-f9fc6bd88668 tempest-ServersNegativeTestJSON-1601465401 tempest-ServersNegativeTestJSON-1601465401-project-member] [instance: 2e156908-c313-4229-840d-13ed8e6d4074] Downloading image file data 285931c9-8b83-4997-8c4d-6a79005e36ba to [datastore2] vmware_temp/89170a22-a055-49f9-952e-85d7a22a5fb7/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk on the data store datastore2 {{(pid=67169) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}}
[ 1852.614286] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aa89ff4e-8013-457f-ada3-849226e8e97a {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1852.621884] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-777e991e-9121-4d9c-9420-5a6b798c17db {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1852.634017] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e5d15b72-4a1a-4d6a-9594-5af2a6351177 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1852.666422] env[67169]: INFO nova.scheduler.client.report [None req-5ef31fa3-d243-488f-9847-4509615e7a51 tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Deleted allocations for instance 48376572-9e3a-4579-b2d7-b8b63312fab1
[ 1852.672370] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d13f9e74-b376-460b-bbf6-162026484564 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1852.679580] env[67169]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-d2ac91d6-bee6-4cb6-bb6a-af0b2498fffd {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1852.689537] env[67169]: DEBUG oslo_vmware.api [None req-2bf3268b-6d3a-4277-831a-07b32245375f tempest-ServersTestMultiNic-730437419 tempest-ServersTestMultiNic-730437419-project-member] Task: {'id': task-2819244, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.103521} completed successfully. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 1852.690292] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-2bf3268b-6d3a-4277-831a-07b32245375f tempest-ServersTestMultiNic-730437419 tempest-ServersTestMultiNic-730437419-project-member] Deleted the datastore file {{(pid=67169) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}}
[ 1852.690495] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-2bf3268b-6d3a-4277-831a-07b32245375f tempest-ServersTestMultiNic-730437419 tempest-ServersTestMultiNic-730437419-project-member] [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] Deleted contents of the VM from datastore datastore2 {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}}
[ 1852.690664] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-2bf3268b-6d3a-4277-831a-07b32245375f tempest-ServersTestMultiNic-730437419 tempest-ServersTestMultiNic-730437419-project-member] [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] Instance destroyed {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 1852.690832] env[67169]: INFO nova.compute.manager [None req-2bf3268b-6d3a-4277-831a-07b32245375f tempest-ServersTestMultiNic-730437419 tempest-ServersTestMultiNic-730437419-project-member] [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] Took 0.63 seconds to destroy the instance on the hypervisor.
[ 1852.692515] env[67169]: DEBUG oslo_concurrency.lockutils [None req-5ef31fa3-d243-488f-9847-4509615e7a51 tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Lock "48376572-9e3a-4579-b2d7-b8b63312fab1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 672.712s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1852.692925] env[67169]: DEBUG nova.compute.claims [None req-2bf3268b-6d3a-4277-831a-07b32245375f tempest-ServersTestMultiNic-730437419 tempest-ServersTestMultiNic-730437419-project-member] [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] Aborting claim: {{(pid=67169) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 1852.693096] env[67169]: DEBUG oslo_concurrency.lockutils [None req-2bf3268b-6d3a-4277-831a-07b32245375f tempest-ServersTestMultiNic-730437419 tempest-ServersTestMultiNic-730437419-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1852.693371] env[67169]: DEBUG oslo_concurrency.lockutils [None req-2bf3268b-6d3a-4277-831a-07b32245375f tempest-ServersTestMultiNic-730437419 tempest-ServersTestMultiNic-730437419-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1852.696934] env[67169]: DEBUG oslo_concurrency.lockutils [None req-faee3827-0c6f-4db2-a881-8bf0ff06a871 tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Lock "48376572-9e3a-4579-b2d7-b8b63312fab1" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 476.415s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1852.697274] env[67169]: DEBUG oslo_concurrency.lockutils [None req-faee3827-0c6f-4db2-a881-8bf0ff06a871 tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Acquiring lock "48376572-9e3a-4579-b2d7-b8b63312fab1-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1852.697593] env[67169]: DEBUG oslo_concurrency.lockutils [None req-faee3827-0c6f-4db2-a881-8bf0ff06a871 tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Lock "48376572-9e3a-4579-b2d7-b8b63312fab1-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1852.697800] env[67169]: DEBUG oslo_concurrency.lockutils [None req-faee3827-0c6f-4db2-a881-8bf0ff06a871 tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Lock "48376572-9e3a-4579-b2d7-b8b63312fab1-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1852.699892] env[67169]: INFO nova.compute.manager [None req-faee3827-0c6f-4db2-a881-8bf0ff06a871 tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: 48376572-9e3a-4579-b2d7-b8b63312fab1] Terminating instance
[ 1852.701528] env[67169]: DEBUG nova.compute.manager [None req-faee3827-0c6f-4db2-a881-8bf0ff06a871 tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: 48376572-9e3a-4579-b2d7-b8b63312fab1] Start destroying the instance on the hypervisor. {{(pid=67169) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}}
[ 1852.701718] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-faee3827-0c6f-4db2-a881-8bf0ff06a871 tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: 48376572-9e3a-4579-b2d7-b8b63312fab1] Destroying instance {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 1852.701966] env[67169]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-ec706b25-b9da-4415-8cac-ee512aa23b80 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1852.706459] env[67169]: DEBUG nova.virt.vmwareapi.images [None req-1f74b6c8-d5a6-4795-bf0a-f9fc6bd88668 tempest-ServersNegativeTestJSON-1601465401 tempest-ServersNegativeTestJSON-1601465401-project-member] [instance: 2e156908-c313-4229-840d-13ed8e6d4074] Downloading image file data 285931c9-8b83-4997-8c4d-6a79005e36ba to the data store datastore2 {{(pid=67169) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}}
[ 1852.708406] env[67169]: DEBUG nova.compute.manager [None req-ca15473a-8330-4e53-b6cb-311183da5250 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 115d6c00-4259-4e87-aa00-90b576a63535] Starting instance...
{{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}}
[ 1852.717305] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5d008732-0bf0-461a-83e4-7f54b9c5ed60 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1852.747075] env[67169]: WARNING nova.virt.vmwareapi.vmops [None req-faee3827-0c6f-4db2-a881-8bf0ff06a871 tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: 48376572-9e3a-4579-b2d7-b8b63312fab1] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 48376572-9e3a-4579-b2d7-b8b63312fab1 could not be found.
[ 1852.747346] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-faee3827-0c6f-4db2-a881-8bf0ff06a871 tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: 48376572-9e3a-4579-b2d7-b8b63312fab1] Instance destroyed {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 1852.747517] env[67169]: INFO nova.compute.manager [None req-faee3827-0c6f-4db2-a881-8bf0ff06a871 tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: 48376572-9e3a-4579-b2d7-b8b63312fab1] Took 0.05 seconds to destroy the instance on the hypervisor.
[ 1852.747800] env[67169]: DEBUG oslo.service.loopingcall [None req-faee3827-0c6f-4db2-a881-8bf0ff06a871 tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=67169) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 1852.750273] env[67169]: DEBUG nova.compute.manager [-] [instance: 48376572-9e3a-4579-b2d7-b8b63312fab1] Deallocating network for instance {{(pid=67169) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}}
[ 1852.750399] env[67169]: DEBUG nova.network.neutron [-] [instance: 48376572-9e3a-4579-b2d7-b8b63312fab1] deallocate_for_instance() {{(pid=67169) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 1852.768550] env[67169]: DEBUG oslo_vmware.rw_handles [None req-1f74b6c8-d5a6-4795-bf0a-f9fc6bd88668 tempest-ServersNegativeTestJSON-1601465401 tempest-ServersNegativeTestJSON-1601465401-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/89170a22-a055-49f9-952e-85d7a22a5fb7/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=67169) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}}
[ 1852.833229] env[67169]: DEBUG oslo_vmware.rw_handles [None req-1f74b6c8-d5a6-4795-bf0a-f9fc6bd88668 tempest-ServersNegativeTestJSON-1601465401 tempest-ServersNegativeTestJSON-1601465401-project-member] Completed reading data from the image iterator. {{(pid=67169) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}}
[ 1852.833490] env[67169]: DEBUG oslo_vmware.rw_handles [None req-1f74b6c8-d5a6-4795-bf0a-f9fc6bd88668 tempest-ServersNegativeTestJSON-1601465401 tempest-ServersNegativeTestJSON-1601465401-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/89170a22-a055-49f9-952e-85d7a22a5fb7/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=67169) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}}
[ 1852.835272] env[67169]: DEBUG nova.network.neutron [-] [instance: 48376572-9e3a-4579-b2d7-b8b63312fab1] Updating instance_info_cache with network_info: [] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1852.843850] env[67169]: INFO nova.compute.manager [-] [instance: 48376572-9e3a-4579-b2d7-b8b63312fab1] Took 0.09 seconds to deallocate network for instance.
[ 1852.850975] env[67169]: DEBUG oslo_concurrency.lockutils [None req-ca15473a-8330-4e53-b6cb-311183da5250 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1852.929932] env[67169]: DEBUG oslo_concurrency.lockutils [None req-faee3827-0c6f-4db2-a881-8bf0ff06a871 tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Lock "48376572-9e3a-4579-b2d7-b8b63312fab1" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.233s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1852.933961] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "48376572-9e3a-4579-b2d7-b8b63312fab1" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 49.246s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1852.933961] env[67169]: INFO nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 48376572-9e3a-4579-b2d7-b8b63312fab1] During sync_power_state the instance has
a pending task (deleting). Skip. [ 1852.933961] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "48376572-9e3a-4579-b2d7-b8b63312fab1" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1852.974188] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-24344d9c-98c4-4073-9dc0-976b61c90afe {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1852.983294] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d0bae022-e387-45d1-b3e5-3206f7828843 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1853.014785] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a83a1266-4204-4a5b-9390-da7b34989a4c {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1853.021792] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c5fe9436-c4cf-4a69-a444-df42f83cd291 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1853.034576] env[67169]: DEBUG nova.compute.provider_tree [None req-2bf3268b-6d3a-4277-831a-07b32245375f tempest-ServersTestMultiNic-730437419 tempest-ServersTestMultiNic-730437419-project-member] Inventory has not changed in ProviderTree for provider: 6570906a-ac37-4859-b1f2-4bbacc48d3f3 {{(pid=67169) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1853.042445] env[67169]: DEBUG nova.scheduler.client.report [None 
req-2bf3268b-6d3a-4277-831a-07b32245375f tempest-ServersTestMultiNic-730437419 tempest-ServersTestMultiNic-730437419-project-member] Inventory has not changed for provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 171, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67169) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1853.054323] env[67169]: DEBUG oslo_concurrency.lockutils [None req-2bf3268b-6d3a-4277-831a-07b32245375f tempest-ServersTestMultiNic-730437419 tempest-ServersTestMultiNic-730437419-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.361s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1853.055010] env[67169]: ERROR nova.compute.manager [None req-2bf3268b-6d3a-4277-831a-07b32245375f tempest-ServersTestMultiNic-730437419 tempest-ServersTestMultiNic-730437419-project-member] [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] Failed to build and run instance: nova.exception.ImageNotAuthorized: Not authorized for image 285931c9-8b83-4997-8c4d-6a79005e36ba. 
[ 1853.055010] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] Traceback (most recent call last):
[ 1853.055010] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]   File "/opt/stack/nova/nova/image/glance.py", line 285, in show
[ 1853.055010] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]     image = self._client.call(context, 2, 'get', args=(image_id,))
[ 1853.055010] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]   File "/opt/stack/nova/nova/image/glance.py", line 191, in call
[ 1853.055010] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]     result = getattr(controller, method)(*args, **kwargs)
[ 1853.055010] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]   File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get
[ 1853.055010] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]     return self._get(image_id)
[ 1853.055010] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]   File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner
[ 1853.055010] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]     return RequestIdProxy(wrapped(*args, **kwargs))
[ 1853.055010] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]   File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get
[ 1853.055010] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]     resp, body = self.http_client.get(url, headers=header)
[ 1853.055010] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]   File "/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 395, in get
[ 1853.055010] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]     return self.request(url, 'GET', **kwargs)
[ 1853.055010] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]   File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request
[ 1853.055010] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]     return self._handle_response(resp)
[ 1853.055010] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]   File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response
[ 1853.055010] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]     raise exc.from_response(resp, resp.content)
[ 1853.055010] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required.
[ 1853.055010] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]
[ 1853.055010] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] During handling of the above exception, another exception occurred:
[ 1853.055010] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]
[ 1853.055010] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] Traceback (most recent call last):
[ 1853.055010] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]   File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 1853.055010] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]     self.driver.spawn(context, instance, image_meta,
[ 1853.055010] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]   File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 1853.055010] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]     self._vmops.spawn(context, instance, image_meta, injected_files,
[ 1853.055010] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 1853.055010] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]     self._fetch_image_if_missing(context, vi)
[ 1853.055010] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing
[ 1853.055010] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]     image_fetch(context, vi, tmp_image_ds_loc)
[ 1853.056040] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file
[ 1853.056040] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]     images.fetch_image(
[ 1853.056040] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]   File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image
[ 1853.056040] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]     metadata = IMAGE_API.get(context, image_ref)
[ 1853.056040] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]   File "/opt/stack/nova/nova/image/glance.py", line 1205, in get
[ 1853.056040] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]     return session.show(context, image_id,
[ 1853.056040] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]   File "/opt/stack/nova/nova/image/glance.py", line 287, in show
[ 1853.056040] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]     _reraise_translated_image_exception(image_id)
[ 1853.056040] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]   File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception
[ 1853.056040] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]     raise new_exc.with_traceback(exc_trace)
[ 1853.056040] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]   File "/opt/stack/nova/nova/image/glance.py", line 285, in show
[ 1853.056040] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]     image = self._client.call(context, 2, 'get', args=(image_id,))
[ 1853.056040] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]   File "/opt/stack/nova/nova/image/glance.py", line 191, in call
[ 1853.056040] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]     result = getattr(controller, method)(*args, **kwargs)
[ 1853.056040] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]   File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get
[ 1853.056040] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]     return self._get(image_id)
[ 1853.056040] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]   File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner
[ 1853.056040] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]     return RequestIdProxy(wrapped(*args, **kwargs))
[ 1853.056040] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]   File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get
[ 1853.056040] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]     resp, body = self.http_client.get(url, headers=header)
[ 1853.056040] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]   File "/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 395, in get
[ 1853.056040] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]     return self.request(url, 'GET', **kwargs)
[ 1853.056040] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]   File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request
[ 1853.056040] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]     return self._handle_response(resp)
[ 1853.056040] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]   File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response
[ 1853.056040] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]     raise exc.from_response(resp, resp.content)
[ 1853.056040] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] nova.exception.ImageNotAuthorized: Not authorized for image 285931c9-8b83-4997-8c4d-6a79005e36ba.
[ 1853.056040] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]
[ 1853.056040] env[67169]: DEBUG nova.compute.utils [None req-2bf3268b-6d3a-4277-831a-07b32245375f tempest-ServersTestMultiNic-730437419 tempest-ServersTestMultiNic-730437419-project-member] [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] Not authorized for image 285931c9-8b83-4997-8c4d-6a79005e36ba. {{(pid=67169) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 1853.056894] env[67169]: DEBUG oslo_concurrency.lockutils [None req-ca15473a-8330-4e53-b6cb-311183da5250 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.206s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1853.058097] env[67169]: INFO nova.compute.claims [None req-ca15473a-8330-4e53-b6cb-311183da5250 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 115d6c00-4259-4e87-aa00-90b576a63535] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 1853.060485] env[67169]: DEBUG nova.compute.manager [None req-2bf3268b-6d3a-4277-831a-07b32245375f tempest-ServersTestMultiNic-730437419 tempest-ServersTestMultiNic-730437419-project-member] [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] Build of instance 7b7c8f84-c2d4-442e-93d3-60124767d096 was re-scheduled: Not authorized for image 285931c9-8b83-4997-8c4d-6a79005e36ba. {{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}}
[ 1853.060955] env[67169]: DEBUG nova.compute.manager [None req-2bf3268b-6d3a-4277-831a-07b32245375f tempest-ServersTestMultiNic-730437419 tempest-ServersTestMultiNic-730437419-project-member] [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] Unplugging VIFs for instance {{(pid=67169) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}}
[ 1853.061135] env[67169]: DEBUG nova.compute.manager [None req-2bf3268b-6d3a-4277-831a-07b32245375f tempest-ServersTestMultiNic-730437419 tempest-ServersTestMultiNic-730437419-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=67169) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}}
[ 1853.061302] env[67169]: DEBUG nova.compute.manager [None req-2bf3268b-6d3a-4277-831a-07b32245375f tempest-ServersTestMultiNic-730437419 tempest-ServersTestMultiNic-730437419-project-member] [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] Deallocating network for instance {{(pid=67169) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}}
[ 1853.061462] env[67169]: DEBUG nova.network.neutron [None req-2bf3268b-6d3a-4277-831a-07b32245375f tempest-ServersTestMultiNic-730437419 tempest-ServersTestMultiNic-730437419-project-member] [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] deallocate_for_instance() {{(pid=67169) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 1853.184335] env[67169]: DEBUG neutronclient.v2_0.client [None req-2bf3268b-6d3a-4277-831a-07b32245375f tempest-ServersTestMultiNic-730437419 tempest-ServersTestMultiNic-730437419-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=67169) _handle_fault_response /opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py:262}}
[ 1853.185601] env[67169]: ERROR nova.compute.manager [None req-2bf3268b-6d3a-4277-831a-07b32245375f tempest-ServersTestMultiNic-730437419 tempest-ServersTestMultiNic-730437419-project-member] [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized.
[ 1853.185601] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] Traceback (most recent call last):
[ 1853.185601] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]   File "/opt/stack/nova/nova/image/glance.py", line 285, in show
[ 1853.185601] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]     image = self._client.call(context, 2, 'get', args=(image_id,))
[ 1853.185601] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]   File "/opt/stack/nova/nova/image/glance.py", line 191, in call
[ 1853.185601] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]     result = getattr(controller, method)(*args, **kwargs)
[ 1853.185601] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]   File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get
[ 1853.185601] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]     return self._get(image_id)
[ 1853.185601] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]   File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner
[ 1853.185601] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]     return RequestIdProxy(wrapped(*args, **kwargs))
[ 1853.185601] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]   File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get
[ 1853.185601] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]     resp, body = self.http_client.get(url, headers=header)
[ 1853.185601] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]   File "/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 395, in get
[ 1853.185601] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]     return self.request(url, 'GET', **kwargs)
[ 1853.185601] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]   File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request
[ 1853.185601] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]     return self._handle_response(resp)
[ 1853.185601] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]   File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response
[ 1853.185601] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]     raise exc.from_response(resp, resp.content)
[ 1853.185601] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required.
[ 1853.185601] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]
[ 1853.185601] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] During handling of the above exception, another exception occurred:
[ 1853.185601] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]
[ 1853.185601] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] Traceback (most recent call last):
[ 1853.185601] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]   File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 1853.185601] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]     self.driver.spawn(context, instance, image_meta,
[ 1853.185601] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]   File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 1853.185601] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]     self._vmops.spawn(context, instance, image_meta, injected_files,
[ 1853.185601] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 1853.185601] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]     self._fetch_image_if_missing(context, vi)
[ 1853.185601] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing
[ 1853.185601] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]     image_fetch(context, vi, tmp_image_ds_loc)
[ 1853.186512] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file
[ 1853.186512] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]     images.fetch_image(
[ 1853.186512] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]   File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image
[ 1853.186512] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]     metadata = IMAGE_API.get(context, image_ref)
[ 1853.186512] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]   File "/opt/stack/nova/nova/image/glance.py", line 1205, in get
[ 1853.186512] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]     return session.show(context, image_id,
[ 1853.186512] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]   File "/opt/stack/nova/nova/image/glance.py", line 287, in show
[ 1853.186512] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]     _reraise_translated_image_exception(image_id)
[ 1853.186512] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]   File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception
[ 1853.186512] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]     raise new_exc.with_traceback(exc_trace)
[ 1853.186512] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]   File "/opt/stack/nova/nova/image/glance.py", line 285, in show
[ 1853.186512] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]     image = self._client.call(context, 2, 'get', args=(image_id,))
[ 1853.186512] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]   File "/opt/stack/nova/nova/image/glance.py", line 191, in call
[ 1853.186512] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]     result = getattr(controller, method)(*args, **kwargs)
[ 1853.186512] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]   File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get
[ 1853.186512] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]     return self._get(image_id)
[ 1853.186512] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]   File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner
[ 1853.186512] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]     return RequestIdProxy(wrapped(*args, **kwargs))
[ 1853.186512] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]   File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get
[ 1853.186512] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]     resp, body = self.http_client.get(url, headers=header)
[ 1853.186512] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]   File "/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 395, in get
[ 1853.186512] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]     return self.request(url, 'GET', **kwargs)
[ 1853.186512] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]   File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request
[ 1853.186512] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]     return self._handle_response(resp)
[ 1853.186512] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]   File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response
[ 1853.186512] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]     raise exc.from_response(resp, resp.content)
[ 1853.186512] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] nova.exception.ImageNotAuthorized: Not authorized for image 285931c9-8b83-4997-8c4d-6a79005e36ba.
[ 1853.186512] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]
[ 1853.186512] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] During handling of the above exception, another exception occurred:
[ 1853.186512] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]
[ 1853.186512] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] Traceback (most recent call last):
[ 1853.186512] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]   File "/opt/stack/nova/nova/compute/manager.py", line 2430, in _do_build_and_run_instance
[ 1853.186512] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]     self._build_and_run_instance(context, instance, image,
[ 1853.186512] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]   File "/opt/stack/nova/nova/compute/manager.py", line 2722, in _build_and_run_instance
[ 1853.187567] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]     raise exception.RescheduledException(
[ 1853.187567] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] nova.exception.RescheduledException: Build of instance 7b7c8f84-c2d4-442e-93d3-60124767d096 was re-scheduled: Not authorized for image 285931c9-8b83-4997-8c4d-6a79005e36ba.
[ 1853.187567] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]
[ 1853.187567] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] During handling of the above exception, another exception occurred:
[ 1853.187567] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]
[ 1853.187567] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] Traceback (most recent call last):
[ 1853.187567] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1853.187567] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]     ret = obj(*args, **kwargs)
[ 1853.187567] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]   File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response
[ 1853.187567] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]     exception_handler_v20(status_code, error_body)
[ 1853.187567] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]   File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20
[ 1853.187567] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]     raise client_exc(message=error_message,
[ 1853.187567] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}}
[ 1853.187567] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] Neutron server returns request_ids: ['req-ceeb44bb-3851-46d8-83cc-272a1b79b3b7']
[ 1853.187567] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]
[ 1853.187567] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] During handling of the above exception, another exception occurred:
[ 1853.187567] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]
[ 1853.187567] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] Traceback (most recent call last):
[ 1853.187567] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]   File "/opt/stack/nova/nova/compute/manager.py", line 3019, in _cleanup_allocated_networks
[ 1853.187567] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]     self._deallocate_network(context, instance, requested_networks)
[ 1853.187567] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]   File "/opt/stack/nova/nova/compute/manager.py", line 2265, in _deallocate_network
[ 1853.187567] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]     self.network_api.deallocate_for_instance(
[ 1853.187567] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]   File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance
[ 1853.187567] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]     data = neutron.list_ports(**search_opts)
[ 1853.187567] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1853.187567] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]     ret = obj(*args, **kwargs)
[ 1853.187567] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]   File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports
[ 1853.187567] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]     return self.list('ports', self.ports_path, retrieve_all,
[ 1853.187567] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1853.187567] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]     ret = obj(*args, **kwargs)
[ 1853.187567] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]   File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list
[ 1853.187567] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]     for r in self._pagination(collection, path, **params):
[ 1853.187567] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]   File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination
[ 1853.187567] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]     res = self.get(path, params=params)
[ 1853.188601] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1853.188601] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]     ret = obj(*args, **kwargs)
[ 1853.188601] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]   File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get
[ 1853.188601] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096]     return self.retry_request("GET", action, body=body,
[ 1853.188601] env[67169]: ERROR
nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1853.188601] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] ret = obj(*args, **kwargs) [ 1853.188601] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1853.188601] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] return self.do_request(method, action, body=body, [ 1853.188601] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1853.188601] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] ret = obj(*args, **kwargs) [ 1853.188601] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1853.188601] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] self._handle_fault_response(status_code, replybody, resp) [ 1853.188601] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper [ 1853.188601] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] raise exception.Unauthorized() [ 1853.188601] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] nova.exception.Unauthorized: Not authorized. 
[ 1853.188601] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] [ 1853.241799] env[67169]: INFO nova.scheduler.client.report [None req-2bf3268b-6d3a-4277-831a-07b32245375f tempest-ServersTestMultiNic-730437419 tempest-ServersTestMultiNic-730437419-project-member] Deleted allocations for instance 7b7c8f84-c2d4-442e-93d3-60124767d096 [ 1853.260305] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-53b6f6e8-71b3-4038-8c4d-d2d43860a9fc {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1853.264183] env[67169]: DEBUG oslo_concurrency.lockutils [None req-2bf3268b-6d3a-4277-831a-07b32245375f tempest-ServersTestMultiNic-730437419 tempest-ServersTestMultiNic-730437419-project-member] Lock "7b7c8f84-c2d4-442e-93d3-60124767d096" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 600.573s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1853.265291] env[67169]: DEBUG oslo_concurrency.lockutils [None req-c0cc3160-c2ad-4006-b47c-6eb1dd79d4a5 tempest-ServersTestMultiNic-730437419 tempest-ServersTestMultiNic-730437419-project-member] Lock "7b7c8f84-c2d4-442e-93d3-60124767d096" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 404.743s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1853.265472] env[67169]: DEBUG oslo_concurrency.lockutils [None req-c0cc3160-c2ad-4006-b47c-6eb1dd79d4a5 tempest-ServersTestMultiNic-730437419 tempest-ServersTestMultiNic-730437419-project-member] Acquiring lock "7b7c8f84-c2d4-442e-93d3-60124767d096-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67169) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1853.265669] env[67169]: DEBUG oslo_concurrency.lockutils [None req-c0cc3160-c2ad-4006-b47c-6eb1dd79d4a5 tempest-ServersTestMultiNic-730437419 tempest-ServersTestMultiNic-730437419-project-member] Lock "7b7c8f84-c2d4-442e-93d3-60124767d096-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1853.265837] env[67169]: DEBUG oslo_concurrency.lockutils [None req-c0cc3160-c2ad-4006-b47c-6eb1dd79d4a5 tempest-ServersTestMultiNic-730437419 tempest-ServersTestMultiNic-730437419-project-member] Lock "7b7c8f84-c2d4-442e-93d3-60124767d096-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1853.270559] env[67169]: INFO nova.compute.manager [None req-c0cc3160-c2ad-4006-b47c-6eb1dd79d4a5 tempest-ServersTestMultiNic-730437419 tempest-ServersTestMultiNic-730437419-project-member] [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] Terminating instance [ 1853.272529] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e1d84ab9-9268-4c44-9fd9-107bba5d637e {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1853.276273] env[67169]: DEBUG nova.compute.manager [None req-c0cc3160-c2ad-4006-b47c-6eb1dd79d4a5 tempest-ServersTestMultiNic-730437419 tempest-ServersTestMultiNic-730437419-project-member] [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] Start destroying the instance on the hypervisor. 
{{(pid=67169) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1853.276508] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-c0cc3160-c2ad-4006-b47c-6eb1dd79d4a5 tempest-ServersTestMultiNic-730437419 tempest-ServersTestMultiNic-730437419-project-member] [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] Destroying instance {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1853.276980] env[67169]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-daf25238-fbc4-40ab-9557-8c77611e9421 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1853.282083] env[67169]: DEBUG nova.compute.manager [None req-1e4472df-f6c8-4cd8-9c36-fe0c36e8321c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: 220daf5b-b4fd-49b0-9098-c1f846d6e552] Starting instance... {{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1853.311177] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cfdee2b8-20ed-464f-b8cd-971b6d4466ac {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1853.316476] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-002e3aef-1cfa-46e2-81b9-678db771ffec {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1853.334838] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c672dd90-d492-4d71-aabd-1ed60a427551 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1853.347231] env[67169]: WARNING nova.virt.vmwareapi.vmops [None req-c0cc3160-c2ad-4006-b47c-6eb1dd79d4a5 
tempest-ServersTestMultiNic-730437419 tempest-ServersTestMultiNic-730437419-project-member] [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 7b7c8f84-c2d4-442e-93d3-60124767d096 could not be found. [ 1853.347465] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-c0cc3160-c2ad-4006-b47c-6eb1dd79d4a5 tempest-ServersTestMultiNic-730437419 tempest-ServersTestMultiNic-730437419-project-member] [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] Instance destroyed {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1853.347653] env[67169]: INFO nova.compute.manager [None req-c0cc3160-c2ad-4006-b47c-6eb1dd79d4a5 tempest-ServersTestMultiNic-730437419 tempest-ServersTestMultiNic-730437419-project-member] [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] Took 0.07 seconds to destroy the instance on the hypervisor. [ 1853.347885] env[67169]: DEBUG oslo.service.loopingcall [None req-c0cc3160-c2ad-4006-b47c-6eb1dd79d4a5 tempest-ServersTestMultiNic-730437419 tempest-ServersTestMultiNic-730437419-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=67169) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1853.348770] env[67169]: DEBUG oslo_concurrency.lockutils [None req-1e4472df-f6c8-4cd8-9c36-fe0c36e8321c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1853.349342] env[67169]: DEBUG nova.compute.manager [-] [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] Deallocating network for instance {{(pid=67169) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1853.349478] env[67169]: DEBUG nova.network.neutron [-] [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] deallocate_for_instance() {{(pid=67169) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1853.358990] env[67169]: DEBUG nova.compute.provider_tree [None req-ca15473a-8330-4e53-b6cb-311183da5250 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Inventory has not changed in ProviderTree for provider: 6570906a-ac37-4859-b1f2-4bbacc48d3f3 {{(pid=67169) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1853.367958] env[67169]: DEBUG nova.scheduler.client.report [None req-ca15473a-8330-4e53-b6cb-311183da5250 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Inventory has not changed for provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 171, 'step_size': 1, 'allocation_ratio': 1.0}} 
{{(pid=67169) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1853.387865] env[67169]: DEBUG oslo_concurrency.lockutils [None req-ca15473a-8330-4e53-b6cb-311183da5250 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.330s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1853.387865] env[67169]: DEBUG nova.compute.manager [None req-ca15473a-8330-4e53-b6cb-311183da5250 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 115d6c00-4259-4e87-aa00-90b576a63535] Start building networks asynchronously for instance. {{(pid=67169) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1853.391230] env[67169]: DEBUG oslo_concurrency.lockutils [None req-1e4472df-f6c8-4cd8-9c36-fe0c36e8321c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.042s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1853.392732] env[67169]: INFO nova.compute.claims [None req-1e4472df-f6c8-4cd8-9c36-fe0c36e8321c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: 220daf5b-b4fd-49b0-9098-c1f846d6e552] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1853.420920] env[67169]: DEBUG nova.compute.utils [None req-ca15473a-8330-4e53-b6cb-311183da5250 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Using /dev/sd instead of None {{(pid=67169) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1853.422516] env[67169]: DEBUG 
nova.compute.manager [None req-ca15473a-8330-4e53-b6cb-311183da5250 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 115d6c00-4259-4e87-aa00-90b576a63535] Allocating IP information in the background. {{(pid=67169) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1853.422707] env[67169]: DEBUG nova.network.neutron [None req-ca15473a-8330-4e53-b6cb-311183da5250 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 115d6c00-4259-4e87-aa00-90b576a63535] allocate_for_instance() {{(pid=67169) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1853.433881] env[67169]: DEBUG nova.compute.manager [None req-ca15473a-8330-4e53-b6cb-311183da5250 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 115d6c00-4259-4e87-aa00-90b576a63535] Start building block device mappings for instance. 
{{(pid=67169) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1853.480650] env[67169]: DEBUG neutronclient.v2_0.client [-] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=67169) _handle_fault_response /opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py:262}} [ 1853.480650] env[67169]: ERROR nova.network.neutron [-] Neutron client was not able to generate a valid admin token, please verify Neutron admin credential located in nova.conf: neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1853.480650] env[67169]: ERROR oslo.service.loopingcall [-] Dynamic interval looping call 'oslo_service.loopingcall.RetryDecorator.__call__.._func' failed: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
[ 1853.480650] env[67169]: ERROR oslo.service.loopingcall Traceback (most recent call last): [ 1853.480650] env[67169]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1853.480650] env[67169]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1853.480650] env[67169]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1853.480650] env[67169]: ERROR oslo.service.loopingcall exception_handler_v20(status_code, error_body) [ 1853.480650] env[67169]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1853.480650] env[67169]: ERROR oslo.service.loopingcall raise client_exc(message=error_message, [ 1853.480650] env[67169]: ERROR oslo.service.loopingcall neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1853.480650] env[67169]: ERROR oslo.service.loopingcall Neutron server returns request_ids: ['req-b60b0705-1f96-4274-bab6-2b76d4719989'] [ 1853.480650] env[67169]: ERROR oslo.service.loopingcall [ 1853.480650] env[67169]: ERROR oslo.service.loopingcall During handling of the above exception, another exception occurred: [ 1853.480650] env[67169]: ERROR oslo.service.loopingcall [ 1853.480650] env[67169]: ERROR oslo.service.loopingcall Traceback (most recent call last): [ 1853.480650] env[67169]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 1853.480650] env[67169]: ERROR oslo.service.loopingcall result = func(*self.args, **self.kw) [ 1853.480650] env[67169]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 
1853.480650] env[67169]: ERROR oslo.service.loopingcall result = f(*args, **kwargs) [ 1853.480650] env[67169]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/compute/manager.py", line 3045, in _deallocate_network_with_retries [ 1853.480650] env[67169]: ERROR oslo.service.loopingcall self._deallocate_network( [ 1853.480650] env[67169]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/compute/manager.py", line 2265, in _deallocate_network [ 1853.480650] env[67169]: ERROR oslo.service.loopingcall self.network_api.deallocate_for_instance( [ 1853.480650] env[67169]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1853.480650] env[67169]: ERROR oslo.service.loopingcall data = neutron.list_ports(**search_opts) [ 1853.480650] env[67169]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1853.480650] env[67169]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1853.480650] env[67169]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1853.480650] env[67169]: ERROR oslo.service.loopingcall return self.list('ports', self.ports_path, retrieve_all, [ 1853.480650] env[67169]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1853.480650] env[67169]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1853.480650] env[67169]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1853.480650] env[67169]: ERROR oslo.service.loopingcall for r in self._pagination(collection, path, **params): [ 1853.480650] env[67169]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1853.480650] env[67169]: ERROR 
oslo.service.loopingcall res = self.get(path, params=params) [ 1853.480650] env[67169]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1853.480650] env[67169]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1853.480650] env[67169]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1853.480650] env[67169]: ERROR oslo.service.loopingcall return self.retry_request("GET", action, body=body, [ 1853.480650] env[67169]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1853.482238] env[67169]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1853.482238] env[67169]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1853.482238] env[67169]: ERROR oslo.service.loopingcall return self.do_request(method, action, body=body, [ 1853.482238] env[67169]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1853.482238] env[67169]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1853.482238] env[67169]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1853.482238] env[67169]: ERROR oslo.service.loopingcall self._handle_fault_response(status_code, replybody, resp) [ 1853.482238] env[67169]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 1853.482238] env[67169]: ERROR oslo.service.loopingcall raise exception.NeutronAdminCredentialConfigurationInvalid() [ 1853.482238] env[67169]: ERROR oslo.service.loopingcall nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
[ 1853.482238] env[67169]: ERROR oslo.service.loopingcall [ 1853.482238] env[67169]: ERROR nova.compute.manager [None req-c0cc3160-c2ad-4006-b47c-6eb1dd79d4a5 tempest-ServersTestMultiNic-730437419 tempest-ServersTestMultiNic-730437419-project-member] [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] Failed to deallocate network for instance. Error: Networking client is experiencing an unauthorized exception.: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1853.505724] env[67169]: DEBUG nova.compute.manager [None req-ca15473a-8330-4e53-b6cb-311183da5250 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 115d6c00-4259-4e87-aa00-90b576a63535] Start spawning the instance on the hypervisor. {{(pid=67169) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1853.509646] env[67169]: ERROR nova.compute.manager [None req-c0cc3160-c2ad-4006-b47c-6eb1dd79d4a5 tempest-ServersTestMultiNic-730437419 tempest-ServersTestMultiNic-730437419-project-member] [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] Setting instance vm_state to ERROR: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
[ 1853.509646] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] Traceback (most recent call last): [ 1853.509646] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1853.509646] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] ret = obj(*args, **kwargs) [ 1853.509646] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1853.509646] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] exception_handler_v20(status_code, error_body) [ 1853.509646] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1853.509646] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] raise client_exc(message=error_message, [ 1853.509646] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1853.509646] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] Neutron server returns request_ids: ['req-b60b0705-1f96-4274-bab6-2b76d4719989'] [ 1853.509646] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] [ 1853.509646] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] During handling of the above exception, another exception occurred: [ 1853.509646] env[67169]: ERROR nova.compute.manager [instance: 
7b7c8f84-c2d4-442e-93d3-60124767d096] [ 1853.509646] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] Traceback (most recent call last): [ 1853.509646] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] File "/opt/stack/nova/nova/compute/manager.py", line 3315, in do_terminate_instance [ 1853.509646] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] self._delete_instance(context, instance, bdms) [ 1853.509646] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] File "/opt/stack/nova/nova/compute/manager.py", line 3250, in _delete_instance [ 1853.509646] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] self._shutdown_instance(context, instance, bdms) [ 1853.509646] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] File "/opt/stack/nova/nova/compute/manager.py", line 3144, in _shutdown_instance [ 1853.509646] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] self._try_deallocate_network(context, instance, requested_networks) [ 1853.509646] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] File "/opt/stack/nova/nova/compute/manager.py", line 3058, in _try_deallocate_network [ 1853.509646] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] with excutils.save_and_reraise_exception(): [ 1853.509646] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1853.509646] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] self.force_reraise() [ 1853.509646] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1853.509646] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] raise self.value [ 1853.509646] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] File "/opt/stack/nova/nova/compute/manager.py", line 3056, in _try_deallocate_network [ 1853.509646] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] _deallocate_network_with_retries() [ 1853.509646] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 436, in func [ 1853.509646] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] return evt.wait() [ 1853.509646] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1853.509646] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] result = hub.switch() [ 1853.510659] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1853.510659] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] return self.greenlet.switch() [ 1853.510659] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 1853.510659] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] result = func(*self.args, **self.kw) [ 1853.510659] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 1853.510659] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] result = f(*args, **kwargs) [ 1853.510659] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] File "/opt/stack/nova/nova/compute/manager.py", line 3045, in _deallocate_network_with_retries [ 1853.510659] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] self._deallocate_network( [ 1853.510659] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] File "/opt/stack/nova/nova/compute/manager.py", line 2265, in _deallocate_network [ 1853.510659] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] self.network_api.deallocate_for_instance( [ 1853.510659] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1853.510659] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] data = neutron.list_ports(**search_opts) [ 1853.510659] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1853.510659] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] ret = obj(*args, **kwargs) [ 1853.510659] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1853.510659] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] return self.list('ports', self.ports_path, retrieve_all, [ 1853.510659] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] 
File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1853.510659] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] ret = obj(*args, **kwargs) [ 1853.510659] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1853.510659] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] for r in self._pagination(collection, path, **params): [ 1853.510659] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1853.510659] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] res = self.get(path, params=params) [ 1853.510659] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1853.510659] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] ret = obj(*args, **kwargs) [ 1853.510659] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1853.510659] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] return self.retry_request("GET", action, body=body, [ 1853.510659] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1853.510659] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] ret = obj(*args, **kwargs) [ 1853.510659] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1853.510659] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] return self.do_request(method, action, body=body, [ 1853.510659] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1853.510659] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] ret = obj(*args, **kwargs) [ 1853.510659] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1853.511596] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] self._handle_fault_response(status_code, replybody, resp) [ 1853.511596] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 1853.511596] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] raise exception.NeutronAdminCredentialConfigurationInvalid() [ 1853.511596] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
[ 1853.511596] env[67169]: ERROR nova.compute.manager [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] [ 1853.527006] env[67169]: DEBUG nova.policy [None req-ca15473a-8330-4e53-b6cb-311183da5250 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'dc8f12a2682c4b79aabc2f87ed8678e6', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'a5d2ec974f664a3a9407f7f3e86b4982', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67169) authorize /opt/stack/nova/nova/policy.py:203}} [ 1853.538225] env[67169]: DEBUG nova.virt.hardware [None req-ca15473a-8330-4e53-b6cb-311183da5250 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-03-06T19:55:10Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-03-06T19:54:55Z,direct_url=,disk_format='vmdk',id=285931c9-8b83-4997-8c4d-6a79005e36ba,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c31f6504bb73492890b262ff43fdf9bc',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-03-06T19:54:55Z,virtual_size=,visibility=), allow threads: False {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1853.538490] env[67169]: DEBUG nova.virt.hardware [None 
req-ca15473a-8330-4e53-b6cb-311183da5250 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Flavor limits 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1853.538652] env[67169]: DEBUG nova.virt.hardware [None req-ca15473a-8330-4e53-b6cb-311183da5250 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Image limits 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1853.538835] env[67169]: DEBUG nova.virt.hardware [None req-ca15473a-8330-4e53-b6cb-311183da5250 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Flavor pref 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1853.538983] env[67169]: DEBUG nova.virt.hardware [None req-ca15473a-8330-4e53-b6cb-311183da5250 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Image pref 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1853.539155] env[67169]: DEBUG nova.virt.hardware [None req-ca15473a-8330-4e53-b6cb-311183da5250 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1853.539361] env[67169]: DEBUG nova.virt.hardware [None req-ca15473a-8330-4e53-b6cb-311183da5250 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1853.539550] env[67169]: DEBUG 
nova.virt.hardware [None req-ca15473a-8330-4e53-b6cb-311183da5250 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67169) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1853.539718] env[67169]: DEBUG nova.virt.hardware [None req-ca15473a-8330-4e53-b6cb-311183da5250 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Got 1 possible topologies {{(pid=67169) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1853.539878] env[67169]: DEBUG nova.virt.hardware [None req-ca15473a-8330-4e53-b6cb-311183da5250 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1853.540059] env[67169]: DEBUG nova.virt.hardware [None req-ca15473a-8330-4e53-b6cb-311183da5250 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1853.540924] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3bdd9416-c246-4eff-a521-afc78ef01aff {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1853.546609] env[67169]: DEBUG oslo_concurrency.lockutils [None req-c0cc3160-c2ad-4006-b47c-6eb1dd79d4a5 tempest-ServersTestMultiNic-730437419 tempest-ServersTestMultiNic-730437419-project-member] Lock "7b7c8f84-c2d4-442e-93d3-60124767d096" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.281s {{(pid=67169) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1853.549051] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "7b7c8f84-c2d4-442e-93d3-60124767d096" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 49.863s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1853.549173] env[67169]: INFO nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] During sync_power_state the instance has a pending task (deleting). Skip. [ 1853.549353] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "7b7c8f84-c2d4-442e-93d3-60124767d096" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1853.554438] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0cd1e6e5-b750-4f41-b974-485b8ab7a80b {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1853.607494] env[67169]: INFO nova.compute.manager [None req-c0cc3160-c2ad-4006-b47c-6eb1dd79d4a5 tempest-ServersTestMultiNic-730437419 tempest-ServersTestMultiNic-730437419-project-member] [instance: 7b7c8f84-c2d4-442e-93d3-60124767d096] Successfully reverted task state from None on failure for instance. 
[ 1853.612009] env[67169]: ERROR oslo_messaging.rpc.server [None req-c0cc3160-c2ad-4006-b47c-6eb1dd79d4a5 tempest-ServersTestMultiNic-730437419 tempest-ServersTestMultiNic-730437419-project-member] Exception during message handling: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1853.612009] env[67169]: ERROR oslo_messaging.rpc.server Traceback (most recent call last): [ 1853.612009] env[67169]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1853.612009] env[67169]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1853.612009] env[67169]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1853.612009] env[67169]: ERROR oslo_messaging.rpc.server exception_handler_v20(status_code, error_body) [ 1853.612009] env[67169]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1853.612009] env[67169]: ERROR oslo_messaging.rpc.server raise client_exc(message=error_message, [ 1853.612009] env[67169]: ERROR oslo_messaging.rpc.server neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1853.612009] env[67169]: ERROR oslo_messaging.rpc.server Neutron server returns request_ids: ['req-b60b0705-1f96-4274-bab6-2b76d4719989'] [ 1853.612009] env[67169]: ERROR oslo_messaging.rpc.server [ 1853.612009] env[67169]: ERROR oslo_messaging.rpc.server During handling of the above exception, another exception occurred: [ 1853.612009] env[67169]: ERROR oslo_messaging.rpc.server [ 1853.612009] env[67169]: ERROR oslo_messaging.rpc.server Traceback (most recent call last): [ 1853.612009] env[67169]: ERROR oslo_messaging.rpc.server File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/server.py", line 165, in _process_incoming [ 1853.612009] env[67169]: ERROR oslo_messaging.rpc.server res = self.dispatcher.dispatch(message) [ 1853.612009] env[67169]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/dispatcher.py", line 309, in dispatch [ 1853.612009] env[67169]: ERROR oslo_messaging.rpc.server return self._do_dispatch(endpoint, method, ctxt, args) [ 1853.612009] env[67169]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/dispatcher.py", line 229, in _do_dispatch [ 1853.612009] env[67169]: ERROR oslo_messaging.rpc.server result = func(ctxt, **new_args) [ 1853.612009] env[67169]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/exception_wrapper.py", line 65, in wrapped [ 1853.612009] env[67169]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1853.612009] env[67169]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1853.612009] env[67169]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1853.612009] env[67169]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1853.612009] env[67169]: ERROR oslo_messaging.rpc.server raise self.value [ 1853.612009] env[67169]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/exception_wrapper.py", line 63, in wrapped [ 1853.612009] env[67169]: ERROR oslo_messaging.rpc.server return f(self, context, *args, **kw) [ 1853.612009] env[67169]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 166, in decorated_function [ 1853.612009] env[67169]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1853.612009] env[67169]: ERROR 
oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1853.612009] env[67169]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1853.612009] env[67169]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1853.612009] env[67169]: ERROR oslo_messaging.rpc.server raise self.value [ 1853.612009] env[67169]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 157, in decorated_function [ 1853.612009] env[67169]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) [ 1853.612009] env[67169]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/utils.py", line 1439, in decorated_function [ 1853.612009] env[67169]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) [ 1853.612009] env[67169]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 213, in decorated_function [ 1853.612009] env[67169]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1853.612009] env[67169]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1853.612009] env[67169]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1853.612009] env[67169]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1853.612009] env[67169]: ERROR oslo_messaging.rpc.server raise self.value [ 1853.613629] env[67169]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 203, in decorated_function [ 1853.613629] env[67169]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) [ 1853.613629] env[67169]: ERROR oslo_messaging.rpc.server File 
"/opt/stack/nova/nova/compute/manager.py", line 3327, in terminate_instance [ 1853.613629] env[67169]: ERROR oslo_messaging.rpc.server do_terminate_instance(instance, bdms) [ 1853.613629] env[67169]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 1853.613629] env[67169]: ERROR oslo_messaging.rpc.server return f(*args, **kwargs) [ 1853.613629] env[67169]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3322, in do_terminate_instance [ 1853.613629] env[67169]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1853.613629] env[67169]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1853.613629] env[67169]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1853.613629] env[67169]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1853.613629] env[67169]: ERROR oslo_messaging.rpc.server raise self.value [ 1853.613629] env[67169]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3315, in do_terminate_instance [ 1853.613629] env[67169]: ERROR oslo_messaging.rpc.server self._delete_instance(context, instance, bdms) [ 1853.613629] env[67169]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3250, in _delete_instance [ 1853.613629] env[67169]: ERROR oslo_messaging.rpc.server self._shutdown_instance(context, instance, bdms) [ 1853.613629] env[67169]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3144, in _shutdown_instance [ 1853.613629] env[67169]: ERROR oslo_messaging.rpc.server self._try_deallocate_network(context, instance, requested_networks) [ 1853.613629] env[67169]: ERROR oslo_messaging.rpc.server File 
"/opt/stack/nova/nova/compute/manager.py", line 3058, in _try_deallocate_network [ 1853.613629] env[67169]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1853.613629] env[67169]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1853.613629] env[67169]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1853.613629] env[67169]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1853.613629] env[67169]: ERROR oslo_messaging.rpc.server raise self.value [ 1853.613629] env[67169]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3056, in _try_deallocate_network [ 1853.613629] env[67169]: ERROR oslo_messaging.rpc.server _deallocate_network_with_retries() [ 1853.613629] env[67169]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 436, in func [ 1853.613629] env[67169]: ERROR oslo_messaging.rpc.server return evt.wait() [ 1853.613629] env[67169]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1853.613629] env[67169]: ERROR oslo_messaging.rpc.server result = hub.switch() [ 1853.613629] env[67169]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1853.613629] env[67169]: ERROR oslo_messaging.rpc.server return self.greenlet.switch() [ 1853.613629] env[67169]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 1853.613629] env[67169]: ERROR oslo_messaging.rpc.server result = func(*self.args, **self.kw) [ 1853.613629] env[67169]: ERROR oslo_messaging.rpc.server File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 1853.613629] env[67169]: ERROR oslo_messaging.rpc.server result = f(*args, **kwargs) [ 1853.613629] env[67169]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3045, in _deallocate_network_with_retries [ 1853.613629] env[67169]: ERROR oslo_messaging.rpc.server self._deallocate_network( [ 1853.613629] env[67169]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 2265, in _deallocate_network [ 1853.613629] env[67169]: ERROR oslo_messaging.rpc.server self.network_api.deallocate_for_instance( [ 1853.613629] env[67169]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1853.613629] env[67169]: ERROR oslo_messaging.rpc.server data = neutron.list_ports(**search_opts) [ 1853.613629] env[67169]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1853.613629] env[67169]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1853.613629] env[67169]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1853.613629] env[67169]: ERROR oslo_messaging.rpc.server return self.list('ports', self.ports_path, retrieve_all, [ 1853.613629] env[67169]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1853.615256] env[67169]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1853.615256] env[67169]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1853.615256] env[67169]: ERROR oslo_messaging.rpc.server for r in self._pagination(collection, path, **params): [ 1853.615256] env[67169]: ERROR oslo_messaging.rpc.server File 
"/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1853.615256] env[67169]: ERROR oslo_messaging.rpc.server res = self.get(path, params=params) [ 1853.615256] env[67169]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1853.615256] env[67169]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1853.615256] env[67169]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1853.615256] env[67169]: ERROR oslo_messaging.rpc.server return self.retry_request("GET", action, body=body, [ 1853.615256] env[67169]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1853.615256] env[67169]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1853.615256] env[67169]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1853.615256] env[67169]: ERROR oslo_messaging.rpc.server return self.do_request(method, action, body=body, [ 1853.615256] env[67169]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1853.615256] env[67169]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1853.615256] env[67169]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1853.615256] env[67169]: ERROR oslo_messaging.rpc.server self._handle_fault_response(status_code, replybody, resp) [ 1853.615256] env[67169]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 1853.615256] env[67169]: ERROR oslo_messaging.rpc.server raise exception.NeutronAdminCredentialConfigurationInvalid() [ 1853.615256] env[67169]: ERROR oslo_messaging.rpc.server 
nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1853.615256] env[67169]: ERROR oslo_messaging.rpc.server [ 1853.615256] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f6d68257-339e-49fd-a88f-b73bde87fa17 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1853.622905] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-27c6d6c7-084a-46c9-8deb-15686974af57 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1853.653009] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-132ea2da-0ce7-481a-b5f3-d4bd9e90af3e {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1853.660062] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8cb79265-23e9-47aa-b3ff-58d894aab282 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1853.674484] env[67169]: DEBUG nova.compute.provider_tree [None req-1e4472df-f6c8-4cd8-9c36-fe0c36e8321c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Inventory has not changed in ProviderTree for provider: 6570906a-ac37-4859-b1f2-4bbacc48d3f3 {{(pid=67169) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1853.682586] env[67169]: DEBUG nova.scheduler.client.report [None req-1e4472df-f6c8-4cd8-9c36-fe0c36e8321c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Inventory has not changed for provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 
'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 171, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67169) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1853.700223] env[67169]: DEBUG oslo_concurrency.lockutils [None req-1e4472df-f6c8-4cd8-9c36-fe0c36e8321c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.309s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1853.700798] env[67169]: DEBUG nova.compute.manager [None req-1e4472df-f6c8-4cd8-9c36-fe0c36e8321c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: 220daf5b-b4fd-49b0-9098-c1f846d6e552] Start building networks asynchronously for instance. {{(pid=67169) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1853.732872] env[67169]: DEBUG nova.compute.utils [None req-1e4472df-f6c8-4cd8-9c36-fe0c36e8321c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Using /dev/sd instead of None {{(pid=67169) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1853.733858] env[67169]: DEBUG nova.compute.manager [None req-1e4472df-f6c8-4cd8-9c36-fe0c36e8321c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: 220daf5b-b4fd-49b0-9098-c1f846d6e552] Allocating IP information in the background. 
{{(pid=67169) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1853.734029] env[67169]: DEBUG nova.network.neutron [None req-1e4472df-f6c8-4cd8-9c36-fe0c36e8321c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: 220daf5b-b4fd-49b0-9098-c1f846d6e552] allocate_for_instance() {{(pid=67169) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1853.743088] env[67169]: DEBUG nova.compute.manager [None req-1e4472df-f6c8-4cd8-9c36-fe0c36e8321c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: 220daf5b-b4fd-49b0-9098-c1f846d6e552] Start building block device mappings for instance. {{(pid=67169) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1853.789969] env[67169]: DEBUG nova.policy [None req-1e4472df-f6c8-4cd8-9c36-fe0c36e8321c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '615c1061ae884c3b91ce1b072249717c', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b1162bad4f2e4722aed4ff2c657e9dc9', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67169) authorize /opt/stack/nova/nova/policy.py:203}} [ 1853.824603] env[67169]: DEBUG nova.compute.manager [None req-1e4472df-f6c8-4cd8-9c36-fe0c36e8321c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: 220daf5b-b4fd-49b0-9098-c1f846d6e552] Start spawning the instance on the hypervisor. 
{{(pid=67169) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1853.849559] env[67169]: DEBUG nova.virt.hardware [None req-1e4472df-f6c8-4cd8-9c36-fe0c36e8321c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-03-06T19:55:10Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-03-06T19:54:55Z,direct_url=,disk_format='vmdk',id=285931c9-8b83-4997-8c4d-6a79005e36ba,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c31f6504bb73492890b262ff43fdf9bc',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-03-06T19:54:55Z,virtual_size=,visibility=), allow threads: False {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1853.849854] env[67169]: DEBUG nova.virt.hardware [None req-1e4472df-f6c8-4cd8-9c36-fe0c36e8321c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Flavor limits 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1853.850015] env[67169]: DEBUG nova.virt.hardware [None req-1e4472df-f6c8-4cd8-9c36-fe0c36e8321c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Image limits 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1853.850154] env[67169]: DEBUG nova.virt.hardware [None req-1e4472df-f6c8-4cd8-9c36-fe0c36e8321c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Flavor pref 0:0:0 {{(pid=67169) 
get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1853.850304] env[67169]: DEBUG nova.virt.hardware [None req-1e4472df-f6c8-4cd8-9c36-fe0c36e8321c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Image pref 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1853.850478] env[67169]: DEBUG nova.virt.hardware [None req-1e4472df-f6c8-4cd8-9c36-fe0c36e8321c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1853.850719] env[67169]: DEBUG nova.virt.hardware [None req-1e4472df-f6c8-4cd8-9c36-fe0c36e8321c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1853.850880] env[67169]: DEBUG nova.virt.hardware [None req-1e4472df-f6c8-4cd8-9c36-fe0c36e8321c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67169) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1853.851065] env[67169]: DEBUG nova.virt.hardware [None req-1e4472df-f6c8-4cd8-9c36-fe0c36e8321c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Got 1 possible topologies {{(pid=67169) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1853.851232] env[67169]: DEBUG nova.virt.hardware [None req-1e4472df-f6c8-4cd8-9c36-fe0c36e8321c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Possible topologies 
[VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1853.851402] env[67169]: DEBUG nova.virt.hardware [None req-1e4472df-f6c8-4cd8-9c36-fe0c36e8321c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1853.852294] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ed439d98-f089-41d9-a5a6-29eef5de2eff {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1853.855666] env[67169]: DEBUG nova.network.neutron [None req-ca15473a-8330-4e53-b6cb-311183da5250 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 115d6c00-4259-4e87-aa00-90b576a63535] Successfully created port: b2a986bc-e8e6-429b-b50f-13ead41b643a {{(pid=67169) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1853.863088] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e5a54a12-d1a3-43f2-b883-be80d1b3aac4 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1854.183548] env[67169]: DEBUG nova.network.neutron [None req-1e4472df-f6c8-4cd8-9c36-fe0c36e8321c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: 220daf5b-b4fd-49b0-9098-c1f846d6e552] Successfully created port: 1ffc3460-69bd-4609-9536-70941460a462 {{(pid=67169) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1854.602543] env[67169]: DEBUG nova.network.neutron [None req-ca15473a-8330-4e53-b6cb-311183da5250 tempest-ServerDiskConfigTestJSON-907081631 
tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 115d6c00-4259-4e87-aa00-90b576a63535] Successfully updated port: b2a986bc-e8e6-429b-b50f-13ead41b643a {{(pid=67169) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1854.615366] env[67169]: DEBUG oslo_concurrency.lockutils [None req-ca15473a-8330-4e53-b6cb-311183da5250 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Acquiring lock "refresh_cache-115d6c00-4259-4e87-aa00-90b576a63535" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1854.615514] env[67169]: DEBUG oslo_concurrency.lockutils [None req-ca15473a-8330-4e53-b6cb-311183da5250 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Acquired lock "refresh_cache-115d6c00-4259-4e87-aa00-90b576a63535" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1854.615665] env[67169]: DEBUG nova.network.neutron [None req-ca15473a-8330-4e53-b6cb-311183da5250 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 115d6c00-4259-4e87-aa00-90b576a63535] Building network info cache for instance {{(pid=67169) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1854.651947] env[67169]: DEBUG nova.network.neutron [None req-ca15473a-8330-4e53-b6cb-311183da5250 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 115d6c00-4259-4e87-aa00-90b576a63535] Instance cache missing network info. 
{{(pid=67169) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1854.767855] env[67169]: DEBUG nova.compute.manager [req-c3a554f8-0a49-4309-bd5f-fdba3ef397f5 req-00b90efd-ec12-43c3-8d55-1f9d2bd5d896 service nova] [instance: 115d6c00-4259-4e87-aa00-90b576a63535] Received event network-vif-plugged-b2a986bc-e8e6-429b-b50f-13ead41b643a {{(pid=67169) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1854.767855] env[67169]: DEBUG oslo_concurrency.lockutils [req-c3a554f8-0a49-4309-bd5f-fdba3ef397f5 req-00b90efd-ec12-43c3-8d55-1f9d2bd5d896 service nova] Acquiring lock "115d6c00-4259-4e87-aa00-90b576a63535-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1854.767855] env[67169]: DEBUG oslo_concurrency.lockutils [req-c3a554f8-0a49-4309-bd5f-fdba3ef397f5 req-00b90efd-ec12-43c3-8d55-1f9d2bd5d896 service nova] Lock "115d6c00-4259-4e87-aa00-90b576a63535-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1854.767855] env[67169]: DEBUG oslo_concurrency.lockutils [req-c3a554f8-0a49-4309-bd5f-fdba3ef397f5 req-00b90efd-ec12-43c3-8d55-1f9d2bd5d896 service nova] Lock "115d6c00-4259-4e87-aa00-90b576a63535-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1854.767855] env[67169]: DEBUG nova.compute.manager [req-c3a554f8-0a49-4309-bd5f-fdba3ef397f5 req-00b90efd-ec12-43c3-8d55-1f9d2bd5d896 service nova] [instance: 115d6c00-4259-4e87-aa00-90b576a63535] No waiting events found dispatching network-vif-plugged-b2a986bc-e8e6-429b-b50f-13ead41b643a {{(pid=67169) pop_instance_event 
/opt/stack/nova/nova/compute/manager.py:320}} [ 1854.768050] env[67169]: WARNING nova.compute.manager [req-c3a554f8-0a49-4309-bd5f-fdba3ef397f5 req-00b90efd-ec12-43c3-8d55-1f9d2bd5d896 service nova] [instance: 115d6c00-4259-4e87-aa00-90b576a63535] Received unexpected event network-vif-plugged-b2a986bc-e8e6-429b-b50f-13ead41b643a for instance with vm_state building and task_state spawning. [ 1854.826847] env[67169]: DEBUG nova.network.neutron [None req-ca15473a-8330-4e53-b6cb-311183da5250 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 115d6c00-4259-4e87-aa00-90b576a63535] Updating instance_info_cache with network_info: [{"id": "b2a986bc-e8e6-429b-b50f-13ead41b643a", "address": "fa:16:3e:bc:b8:55", "network": {"id": "e1c693aa-d783-44b4-bbb3-c6efc6ccfa95", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1841152718-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "a5d2ec974f664a3a9407f7f3e86b4982", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "56398cc0-e39f-410f-8036-8c2a6870e26f", "external-id": "nsx-vlan-transportzone-612", "segmentation_id": 612, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb2a986bc-e8", "ovs_interfaceid": "b2a986bc-e8e6-429b-b50f-13ead41b643a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1854.841017] env[67169]: 
DEBUG oslo_concurrency.lockutils [None req-ca15473a-8330-4e53-b6cb-311183da5250 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Releasing lock "refresh_cache-115d6c00-4259-4e87-aa00-90b576a63535" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1854.841372] env[67169]: DEBUG nova.compute.manager [None req-ca15473a-8330-4e53-b6cb-311183da5250 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 115d6c00-4259-4e87-aa00-90b576a63535] Instance network_info: |[{"id": "b2a986bc-e8e6-429b-b50f-13ead41b643a", "address": "fa:16:3e:bc:b8:55", "network": {"id": "e1c693aa-d783-44b4-bbb3-c6efc6ccfa95", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1841152718-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "a5d2ec974f664a3a9407f7f3e86b4982", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "56398cc0-e39f-410f-8036-8c2a6870e26f", "external-id": "nsx-vlan-transportzone-612", "segmentation_id": 612, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb2a986bc-e8", "ovs_interfaceid": "b2a986bc-e8e6-429b-b50f-13ead41b643a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67169) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1854.841826] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None 
req-ca15473a-8330-4e53-b6cb-311183da5250 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 115d6c00-4259-4e87-aa00-90b576a63535] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:bc:b8:55', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '56398cc0-e39f-410f-8036-8c2a6870e26f', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'b2a986bc-e8e6-429b-b50f-13ead41b643a', 'vif_model': 'vmxnet3'}] {{(pid=67169) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1854.849582] env[67169]: DEBUG oslo.service.loopingcall [None req-ca15473a-8330-4e53-b6cb-311183da5250 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67169) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1854.850109] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 115d6c00-4259-4e87-aa00-90b576a63535] Creating VM on the ESX host {{(pid=67169) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1854.850400] env[67169]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-72766978-0aa3-495b-82b9-5181bfc7c182 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1854.870977] env[67169]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1854.870977] env[67169]: value = "task-2819245" [ 1854.870977] env[67169]: _type = "Task" [ 1854.870977] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1854.878623] env[67169]: DEBUG oslo_vmware.api [-] Task: {'id': task-2819245, 'name': CreateVM_Task} progress is 0%. 
{{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1855.210865] env[67169]: DEBUG nova.compute.manager [req-cd42e8eb-81d9-43b2-b504-1c793114f5b9 req-28b6656f-5230-4079-9b5e-031808616bb4 service nova] [instance: 220daf5b-b4fd-49b0-9098-c1f846d6e552] Received event network-vif-plugged-1ffc3460-69bd-4609-9536-70941460a462 {{(pid=67169) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1855.211104] env[67169]: DEBUG oslo_concurrency.lockutils [req-cd42e8eb-81d9-43b2-b504-1c793114f5b9 req-28b6656f-5230-4079-9b5e-031808616bb4 service nova] Acquiring lock "220daf5b-b4fd-49b0-9098-c1f846d6e552-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1855.211319] env[67169]: DEBUG oslo_concurrency.lockutils [req-cd42e8eb-81d9-43b2-b504-1c793114f5b9 req-28b6656f-5230-4079-9b5e-031808616bb4 service nova] Lock "220daf5b-b4fd-49b0-9098-c1f846d6e552-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1855.211490] env[67169]: DEBUG oslo_concurrency.lockutils [req-cd42e8eb-81d9-43b2-b504-1c793114f5b9 req-28b6656f-5230-4079-9b5e-031808616bb4 service nova] Lock "220daf5b-b4fd-49b0-9098-c1f846d6e552-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1855.211661] env[67169]: DEBUG nova.compute.manager [req-cd42e8eb-81d9-43b2-b504-1c793114f5b9 req-28b6656f-5230-4079-9b5e-031808616bb4 service nova] [instance: 220daf5b-b4fd-49b0-9098-c1f846d6e552] No waiting events found dispatching network-vif-plugged-1ffc3460-69bd-4609-9536-70941460a462 {{(pid=67169) 
pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1855.211825] env[67169]: WARNING nova.compute.manager [req-cd42e8eb-81d9-43b2-b504-1c793114f5b9 req-28b6656f-5230-4079-9b5e-031808616bb4 service nova] [instance: 220daf5b-b4fd-49b0-9098-c1f846d6e552] Received unexpected event network-vif-plugged-1ffc3460-69bd-4609-9536-70941460a462 for instance with vm_state building and task_state spawning. [ 1855.288112] env[67169]: DEBUG nova.network.neutron [None req-1e4472df-f6c8-4cd8-9c36-fe0c36e8321c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: 220daf5b-b4fd-49b0-9098-c1f846d6e552] Successfully updated port: 1ffc3460-69bd-4609-9536-70941460a462 {{(pid=67169) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1855.297555] env[67169]: DEBUG oslo_concurrency.lockutils [None req-1e4472df-f6c8-4cd8-9c36-fe0c36e8321c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Acquiring lock "refresh_cache-220daf5b-b4fd-49b0-9098-c1f846d6e552" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1855.297750] env[67169]: DEBUG oslo_concurrency.lockutils [None req-1e4472df-f6c8-4cd8-9c36-fe0c36e8321c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Acquired lock "refresh_cache-220daf5b-b4fd-49b0-9098-c1f846d6e552" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1855.297844] env[67169]: DEBUG nova.network.neutron [None req-1e4472df-f6c8-4cd8-9c36-fe0c36e8321c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: 220daf5b-b4fd-49b0-9098-c1f846d6e552] Building network info cache for instance {{(pid=67169) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1855.333478] env[67169]: DEBUG nova.network.neutron [None 
req-1e4472df-f6c8-4cd8-9c36-fe0c36e8321c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: 220daf5b-b4fd-49b0-9098-c1f846d6e552] Instance cache missing network info. {{(pid=67169) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1855.382802] env[67169]: DEBUG oslo_vmware.api [-] Task: {'id': task-2819245, 'name': CreateVM_Task, 'duration_secs': 0.27835} completed successfully. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1855.382960] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 115d6c00-4259-4e87-aa00-90b576a63535] Created VM on the ESX host {{(pid=67169) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1855.389470] env[67169]: DEBUG oslo_concurrency.lockutils [None req-ca15473a-8330-4e53-b6cb-311183da5250 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1855.389653] env[67169]: DEBUG oslo_concurrency.lockutils [None req-ca15473a-8330-4e53-b6cb-311183da5250 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Acquired lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1855.389969] env[67169]: DEBUG oslo_concurrency.lockutils [None req-ca15473a-8330-4e53-b6cb-311183da5250 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1855.390223] env[67169]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-aa8a9bdb-495e-4822-bcae-e7e91d281714 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1855.394626] env[67169]: DEBUG oslo_vmware.api [None req-ca15473a-8330-4e53-b6cb-311183da5250 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Waiting for the task: (returnval){ [ 1855.394626] env[67169]: value = "session[52518c09-e5f2-035d-4dc1-3e29ddf94015]52578095-c873-5671-63ab-f2472c81b676" [ 1855.394626] env[67169]: _type = "Task" [ 1855.394626] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1855.404036] env[67169]: DEBUG oslo_vmware.api [None req-ca15473a-8330-4e53-b6cb-311183da5250 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Task: {'id': session[52518c09-e5f2-035d-4dc1-3e29ddf94015]52578095-c873-5671-63ab-f2472c81b676, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1855.485712] env[67169]: DEBUG nova.network.neutron [None req-1e4472df-f6c8-4cd8-9c36-fe0c36e8321c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: 220daf5b-b4fd-49b0-9098-c1f846d6e552] Updating instance_info_cache with network_info: [{"id": "1ffc3460-69bd-4609-9536-70941460a462", "address": "fa:16:3e:59:d4:56", "network": {"id": "05c41aa5-dcb7-46fa-ba23-2f4b7685b6a9", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1740060268-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "b1162bad4f2e4722aed4ff2c657e9dc9", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "24210a23-d8ac-4f4f-84ac-dc0636de9a72", "external-id": "nsx-vlan-transportzone-257", "segmentation_id": 257, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1ffc3460-69", "ovs_interfaceid": "1ffc3460-69bd-4609-9536-70941460a462", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1855.499028] env[67169]: DEBUG oslo_concurrency.lockutils [None req-1e4472df-f6c8-4cd8-9c36-fe0c36e8321c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Releasing lock "refresh_cache-220daf5b-b4fd-49b0-9098-c1f846d6e552" {{(pid=67169) lock 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1855.499028] env[67169]: DEBUG nova.compute.manager [None req-1e4472df-f6c8-4cd8-9c36-fe0c36e8321c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: 220daf5b-b4fd-49b0-9098-c1f846d6e552] Instance network_info: |[{"id": "1ffc3460-69bd-4609-9536-70941460a462", "address": "fa:16:3e:59:d4:56", "network": {"id": "05c41aa5-dcb7-46fa-ba23-2f4b7685b6a9", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1740060268-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "b1162bad4f2e4722aed4ff2c657e9dc9", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "24210a23-d8ac-4f4f-84ac-dc0636de9a72", "external-id": "nsx-vlan-transportzone-257", "segmentation_id": 257, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1ffc3460-69", "ovs_interfaceid": "1ffc3460-69bd-4609-9536-70941460a462", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67169) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1855.499216] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-1e4472df-f6c8-4cd8-9c36-fe0c36e8321c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: 220daf5b-b4fd-49b0-9098-c1f846d6e552] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:59:d4:56', 'network_ref': {'type': 'OpaqueNetwork', 
'network-id': '24210a23-d8ac-4f4f-84ac-dc0636de9a72', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '1ffc3460-69bd-4609-9536-70941460a462', 'vif_model': 'vmxnet3'}] {{(pid=67169) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1855.506918] env[67169]: DEBUG oslo.service.loopingcall [None req-1e4472df-f6c8-4cd8-9c36-fe0c36e8321c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67169) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1855.507271] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 220daf5b-b4fd-49b0-9098-c1f846d6e552] Creating VM on the ESX host {{(pid=67169) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1855.507526] env[67169]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-f508f2b0-36ee-4e80-8629-683226b1d82c {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1855.527752] env[67169]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1855.527752] env[67169]: value = "task-2819246" [ 1855.527752] env[67169]: _type = "Task" [ 1855.527752] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1855.535455] env[67169]: DEBUG oslo_vmware.api [-] Task: {'id': task-2819246, 'name': CreateVM_Task} progress is 0%. 
{{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1855.904599] env[67169]: DEBUG oslo_concurrency.lockutils [None req-ca15473a-8330-4e53-b6cb-311183da5250 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Releasing lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1855.904889] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-ca15473a-8330-4e53-b6cb-311183da5250 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 115d6c00-4259-4e87-aa00-90b576a63535] Processing image 285931c9-8b83-4997-8c4d-6a79005e36ba {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1855.905078] env[67169]: DEBUG oslo_concurrency.lockutils [None req-ca15473a-8330-4e53-b6cb-311183da5250 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1856.037958] env[67169]: DEBUG oslo_vmware.api [-] Task: {'id': task-2819246, 'name': CreateVM_Task, 'duration_secs': 0.2793} completed successfully. 
{{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1856.038158] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 220daf5b-b4fd-49b0-9098-c1f846d6e552] Created VM on the ESX host {{(pid=67169) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1856.038819] env[67169]: DEBUG oslo_concurrency.lockutils [None req-1e4472df-f6c8-4cd8-9c36-fe0c36e8321c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1856.038988] env[67169]: DEBUG oslo_concurrency.lockutils [None req-1e4472df-f6c8-4cd8-9c36-fe0c36e8321c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Acquired lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1856.039310] env[67169]: DEBUG oslo_concurrency.lockutils [None req-1e4472df-f6c8-4cd8-9c36-fe0c36e8321c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1856.039611] env[67169]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-fe8adca1-c5df-4599-b5da-7bc2377bf4f6 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1856.043926] env[67169]: DEBUG oslo_vmware.api [None req-1e4472df-f6c8-4cd8-9c36-fe0c36e8321c tempest-DeleteServersTestJSON-867121436 
tempest-DeleteServersTestJSON-867121436-project-member] Waiting for the task: (returnval){ [ 1856.043926] env[67169]: value = "session[52518c09-e5f2-035d-4dc1-3e29ddf94015]5284126f-e759-051b-9f26-43fc3a4d9838" [ 1856.043926] env[67169]: _type = "Task" [ 1856.043926] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1856.051362] env[67169]: DEBUG oslo_vmware.api [None req-1e4472df-f6c8-4cd8-9c36-fe0c36e8321c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Task: {'id': session[52518c09-e5f2-035d-4dc1-3e29ddf94015]5284126f-e759-051b-9f26-43fc3a4d9838, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1856.554895] env[67169]: DEBUG oslo_concurrency.lockutils [None req-1e4472df-f6c8-4cd8-9c36-fe0c36e8321c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Releasing lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1856.555104] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-1e4472df-f6c8-4cd8-9c36-fe0c36e8321c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: 220daf5b-b4fd-49b0-9098-c1f846d6e552] Processing image 285931c9-8b83-4997-8c4d-6a79005e36ba {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1856.555370] env[67169]: DEBUG oslo_concurrency.lockutils [None req-1e4472df-f6c8-4cd8-9c36-fe0c36e8321c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk" {{(pid=67169) lock 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1856.793197] env[67169]: DEBUG nova.compute.manager [req-fcd7ed4c-8f37-4d2b-b7f9-af2eb982052f req-0942bace-dfa1-43a8-841f-8b34a70db856 service nova] [instance: 115d6c00-4259-4e87-aa00-90b576a63535] Received event network-changed-b2a986bc-e8e6-429b-b50f-13ead41b643a {{(pid=67169) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1856.793326] env[67169]: DEBUG nova.compute.manager [req-fcd7ed4c-8f37-4d2b-b7f9-af2eb982052f req-0942bace-dfa1-43a8-841f-8b34a70db856 service nova] [instance: 115d6c00-4259-4e87-aa00-90b576a63535] Refreshing instance network info cache due to event network-changed-b2a986bc-e8e6-429b-b50f-13ead41b643a. {{(pid=67169) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1856.793537] env[67169]: DEBUG oslo_concurrency.lockutils [req-fcd7ed4c-8f37-4d2b-b7f9-af2eb982052f req-0942bace-dfa1-43a8-841f-8b34a70db856 service nova] Acquiring lock "refresh_cache-115d6c00-4259-4e87-aa00-90b576a63535" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1856.793682] env[67169]: DEBUG oslo_concurrency.lockutils [req-fcd7ed4c-8f37-4d2b-b7f9-af2eb982052f req-0942bace-dfa1-43a8-841f-8b34a70db856 service nova] Acquired lock "refresh_cache-115d6c00-4259-4e87-aa00-90b576a63535" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1856.793841] env[67169]: DEBUG nova.network.neutron [req-fcd7ed4c-8f37-4d2b-b7f9-af2eb982052f req-0942bace-dfa1-43a8-841f-8b34a70db856 service nova] [instance: 115d6c00-4259-4e87-aa00-90b576a63535] Refreshing network info cache for port b2a986bc-e8e6-429b-b50f-13ead41b643a {{(pid=67169) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1857.118506] env[67169]: DEBUG nova.network.neutron [req-fcd7ed4c-8f37-4d2b-b7f9-af2eb982052f req-0942bace-dfa1-43a8-841f-8b34a70db856 service 
nova] [instance: 115d6c00-4259-4e87-aa00-90b576a63535] Updated VIF entry in instance network info cache for port b2a986bc-e8e6-429b-b50f-13ead41b643a. {{(pid=67169) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1857.118932] env[67169]: DEBUG nova.network.neutron [req-fcd7ed4c-8f37-4d2b-b7f9-af2eb982052f req-0942bace-dfa1-43a8-841f-8b34a70db856 service nova] [instance: 115d6c00-4259-4e87-aa00-90b576a63535] Updating instance_info_cache with network_info: [{"id": "b2a986bc-e8e6-429b-b50f-13ead41b643a", "address": "fa:16:3e:bc:b8:55", "network": {"id": "e1c693aa-d783-44b4-bbb3-c6efc6ccfa95", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1841152718-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "a5d2ec974f664a3a9407f7f3e86b4982", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "56398cc0-e39f-410f-8036-8c2a6870e26f", "external-id": "nsx-vlan-transportzone-612", "segmentation_id": 612, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb2a986bc-e8", "ovs_interfaceid": "b2a986bc-e8e6-429b-b50f-13ead41b643a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1857.128159] env[67169]: DEBUG oslo_concurrency.lockutils [req-fcd7ed4c-8f37-4d2b-b7f9-af2eb982052f req-0942bace-dfa1-43a8-841f-8b34a70db856 service nova] Releasing lock "refresh_cache-115d6c00-4259-4e87-aa00-90b576a63535" 
{{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1857.249658] env[67169]: DEBUG nova.compute.manager [req-355a4dee-355f-4e5b-9459-12fca20586ec req-31ad74b4-3713-40fb-8ceb-ab30e5ea163c service nova] [instance: 220daf5b-b4fd-49b0-9098-c1f846d6e552] Received event network-changed-1ffc3460-69bd-4609-9536-70941460a462 {{(pid=67169) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1857.249786] env[67169]: DEBUG nova.compute.manager [req-355a4dee-355f-4e5b-9459-12fca20586ec req-31ad74b4-3713-40fb-8ceb-ab30e5ea163c service nova] [instance: 220daf5b-b4fd-49b0-9098-c1f846d6e552] Refreshing instance network info cache due to event network-changed-1ffc3460-69bd-4609-9536-70941460a462. {{(pid=67169) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1857.250057] env[67169]: DEBUG oslo_concurrency.lockutils [req-355a4dee-355f-4e5b-9459-12fca20586ec req-31ad74b4-3713-40fb-8ceb-ab30e5ea163c service nova] Acquiring lock "refresh_cache-220daf5b-b4fd-49b0-9098-c1f846d6e552" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1857.250151] env[67169]: DEBUG oslo_concurrency.lockutils [req-355a4dee-355f-4e5b-9459-12fca20586ec req-31ad74b4-3713-40fb-8ceb-ab30e5ea163c service nova] Acquired lock "refresh_cache-220daf5b-b4fd-49b0-9098-c1f846d6e552" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1857.250312] env[67169]: DEBUG nova.network.neutron [req-355a4dee-355f-4e5b-9459-12fca20586ec req-31ad74b4-3713-40fb-8ceb-ab30e5ea163c service nova] [instance: 220daf5b-b4fd-49b0-9098-c1f846d6e552] Refreshing network info cache for port 1ffc3460-69bd-4609-9536-70941460a462 {{(pid=67169) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1857.523247] env[67169]: DEBUG nova.network.neutron [req-355a4dee-355f-4e5b-9459-12fca20586ec 
req-31ad74b4-3713-40fb-8ceb-ab30e5ea163c service nova] [instance: 220daf5b-b4fd-49b0-9098-c1f846d6e552] Updated VIF entry in instance network info cache for port 1ffc3460-69bd-4609-9536-70941460a462. {{(pid=67169) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1857.523612] env[67169]: DEBUG nova.network.neutron [req-355a4dee-355f-4e5b-9459-12fca20586ec req-31ad74b4-3713-40fb-8ceb-ab30e5ea163c service nova] [instance: 220daf5b-b4fd-49b0-9098-c1f846d6e552] Updating instance_info_cache with network_info: [{"id": "1ffc3460-69bd-4609-9536-70941460a462", "address": "fa:16:3e:59:d4:56", "network": {"id": "05c41aa5-dcb7-46fa-ba23-2f4b7685b6a9", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1740060268-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "b1162bad4f2e4722aed4ff2c657e9dc9", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "24210a23-d8ac-4f4f-84ac-dc0636de9a72", "external-id": "nsx-vlan-transportzone-257", "segmentation_id": 257, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1ffc3460-69", "ovs_interfaceid": "1ffc3460-69bd-4609-9536-70941460a462", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1857.532496] env[67169]: DEBUG oslo_concurrency.lockutils [req-355a4dee-355f-4e5b-9459-12fca20586ec req-31ad74b4-3713-40fb-8ceb-ab30e5ea163c service nova] Releasing lock 
"refresh_cache-220daf5b-b4fd-49b0-9098-c1f846d6e552" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1857.658423] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1859.658496] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1859.658820] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Starting heal instance info cache {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1859.658820] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Rebuilding the list of instances to heal {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1859.680826] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 2e156908-c313-4229-840d-13ed8e6d4074] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1859.680991] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 2d7d3386-9854-4bf1-a680-5aed0a2329cb] Skipping network cache update for instance because it is Building. 
{{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1859.681140] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: fa24a4a8-895c-4ea6-8e0a-4ed1134beff0] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1859.681272] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 04d3ae51-f3f1-427b-ae45-279b02e4b3e6] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1859.681397] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: aedbfde6-26e1-410d-a311-e2c344f65062] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1859.681520] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: c05c3ec2-a68d-41b0-a199-fcfc84bb2deb] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1859.681644] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 9435574d-2128-4b20-ba92-ee2aba37d33b] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1859.681844] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 6663b166-0d24-45a7-8c2c-e4e68dbe0005] Skipping network cache update for instance because it is Building. 
{{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1859.681991] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 115d6c00-4259-4e87-aa00-90b576a63535] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1859.682131] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 220daf5b-b4fd-49b0-9098-c1f846d6e552] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1859.682256] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Didn't find any instances for network info cache update. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1862.658711] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1863.658579] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1864.662163] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1864.662163] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] 
CONF.reclaim_instance_interval <= 0, skipping... {{(pid=67169) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1865.654092] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1865.658856] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1867.654625] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1867.677917] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1867.688710] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1867.688929] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67169) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1867.689109] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1867.689269] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67169) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1867.690754] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c6019a57-5158-467e-828d-fe667af8bdb9 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1867.699465] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-daf6a596-5ee2-4f26-9790-25108fe8a0a9 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1867.714383] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-968181dc-a31d-455b-8429-32cdd40624f4 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1867.720746] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2ad4586a-80fa-4e1a-a494-441ea75ca16f {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1867.749863] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Hypervisor/Node 
resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181013MB free_disk=171GB free_vcpus=48 pci_devices=None {{(pid=67169) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1867.750010] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1867.750213] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1867.819152] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 2e156908-c313-4229-840d-13ed8e6d4074 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1867.819320] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 2d7d3386-9854-4bf1-a680-5aed0a2329cb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1867.819449] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance fa24a4a8-895c-4ea6-8e0a-4ed1134beff0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1867.819600] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 04d3ae51-f3f1-427b-ae45-279b02e4b3e6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1867.819731] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance aedbfde6-26e1-410d-a311-e2c344f65062 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1867.819849] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance c05c3ec2-a68d-41b0-a199-fcfc84bb2deb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1867.819965] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 9435574d-2128-4b20-ba92-ee2aba37d33b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1867.820100] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 6663b166-0d24-45a7-8c2c-e4e68dbe0005 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1867.820220] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 115d6c00-4259-4e87-aa00-90b576a63535 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1867.820334] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 220daf5b-b4fd-49b0-9098-c1f846d6e552 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1867.832094] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 68b94a43-eaa5-4023-8bf5-8cc647c2f098 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1867.832333] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67169) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1867.832479] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67169) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1867.971703] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-654a2376-f885-495f-837e-096b455d8d10 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1867.979460] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f7ab430d-6f61-4643-821a-b8ce7463e41a {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1868.008707] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-787a9ea3-c8ac-48fc-a554-f9590e6d9b0a 
{{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1868.015589] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-00b70697-cb4f-4a52-bd1b-400acb2511e4 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1868.028494] env[67169]: DEBUG nova.compute.provider_tree [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Inventory has not changed in ProviderTree for provider: 6570906a-ac37-4859-b1f2-4bbacc48d3f3 {{(pid=67169) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1868.037807] env[67169]: DEBUG nova.scheduler.client.report [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Inventory has not changed for provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 171, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67169) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1868.052989] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67169) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1868.053128] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.303s {{(pid=67169) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1869.034502] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1900.748996] env[67169]: WARNING oslo_vmware.rw_handles [None req-1f74b6c8-d5a6-4795-bf0a-f9fc6bd88668 tempest-ServersNegativeTestJSON-1601465401 tempest-ServersNegativeTestJSON-1601465401-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1900.748996] env[67169]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1900.748996] env[67169]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1900.748996] env[67169]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1900.748996] env[67169]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1900.748996] env[67169]: ERROR oslo_vmware.rw_handles response.begin() [ 1900.748996] env[67169]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1900.748996] env[67169]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1900.748996] env[67169]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1900.748996] env[67169]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1900.748996] env[67169]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1900.748996] env[67169]: ERROR oslo_vmware.rw_handles [ 1900.750026] env[67169]: DEBUG nova.virt.vmwareapi.images [None 
req-1f74b6c8-d5a6-4795-bf0a-f9fc6bd88668 tempest-ServersNegativeTestJSON-1601465401 tempest-ServersNegativeTestJSON-1601465401-project-member] [instance: 2e156908-c313-4229-840d-13ed8e6d4074] Downloaded image file data 285931c9-8b83-4997-8c4d-6a79005e36ba to vmware_temp/89170a22-a055-49f9-952e-85d7a22a5fb7/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk on the data store datastore2 {{(pid=67169) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1900.751608] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-1f74b6c8-d5a6-4795-bf0a-f9fc6bd88668 tempest-ServersNegativeTestJSON-1601465401 tempest-ServersNegativeTestJSON-1601465401-project-member] [instance: 2e156908-c313-4229-840d-13ed8e6d4074] Caching image {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1900.751848] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [None req-1f74b6c8-d5a6-4795-bf0a-f9fc6bd88668 tempest-ServersNegativeTestJSON-1601465401 tempest-ServersNegativeTestJSON-1601465401-project-member] Copying Virtual Disk [datastore2] vmware_temp/89170a22-a055-49f9-952e-85d7a22a5fb7/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk to [datastore2] vmware_temp/89170a22-a055-49f9-952e-85d7a22a5fb7/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk {{(pid=67169) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1900.752150] env[67169]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-045d362a-ebfe-4870-af83-00db7aa844d5 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1900.760746] env[67169]: DEBUG oslo_vmware.api [None req-1f74b6c8-d5a6-4795-bf0a-f9fc6bd88668 tempest-ServersNegativeTestJSON-1601465401 tempest-ServersNegativeTestJSON-1601465401-project-member] Waiting for the task: (returnval){ [ 1900.760746] env[67169]: value = "task-2819247" [ 1900.760746] env[67169]: _type = 
"Task" [ 1900.760746] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1900.770996] env[67169]: DEBUG oslo_vmware.api [None req-1f74b6c8-d5a6-4795-bf0a-f9fc6bd88668 tempest-ServersNegativeTestJSON-1601465401 tempest-ServersNegativeTestJSON-1601465401-project-member] Task: {'id': task-2819247, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1901.271171] env[67169]: DEBUG oslo_vmware.exceptions [None req-1f74b6c8-d5a6-4795-bf0a-f9fc6bd88668 tempest-ServersNegativeTestJSON-1601465401 tempest-ServersNegativeTestJSON-1601465401-project-member] Fault InvalidArgument not matched. {{(pid=67169) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1901.271451] env[67169]: DEBUG oslo_concurrency.lockutils [None req-1f74b6c8-d5a6-4795-bf0a-f9fc6bd88668 tempest-ServersNegativeTestJSON-1601465401 tempest-ServersNegativeTestJSON-1601465401-project-member] Releasing lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1901.272017] env[67169]: ERROR nova.compute.manager [None req-1f74b6c8-d5a6-4795-bf0a-f9fc6bd88668 tempest-ServersNegativeTestJSON-1601465401 tempest-ServersNegativeTestJSON-1601465401-project-member] [instance: 2e156908-c313-4229-840d-13ed8e6d4074] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1901.272017] env[67169]: Faults: ['InvalidArgument'] [ 1901.272017] env[67169]: ERROR nova.compute.manager [instance: 2e156908-c313-4229-840d-13ed8e6d4074] Traceback (most recent call last): [ 1901.272017] env[67169]: ERROR nova.compute.manager [instance: 2e156908-c313-4229-840d-13ed8e6d4074] File 
"/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1901.272017] env[67169]: ERROR nova.compute.manager [instance: 2e156908-c313-4229-840d-13ed8e6d4074] yield resources [ 1901.272017] env[67169]: ERROR nova.compute.manager [instance: 2e156908-c313-4229-840d-13ed8e6d4074] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1901.272017] env[67169]: ERROR nova.compute.manager [instance: 2e156908-c313-4229-840d-13ed8e6d4074] self.driver.spawn(context, instance, image_meta, [ 1901.272017] env[67169]: ERROR nova.compute.manager [instance: 2e156908-c313-4229-840d-13ed8e6d4074] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1901.272017] env[67169]: ERROR nova.compute.manager [instance: 2e156908-c313-4229-840d-13ed8e6d4074] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1901.272017] env[67169]: ERROR nova.compute.manager [instance: 2e156908-c313-4229-840d-13ed8e6d4074] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1901.272017] env[67169]: ERROR nova.compute.manager [instance: 2e156908-c313-4229-840d-13ed8e6d4074] self._fetch_image_if_missing(context, vi) [ 1901.272017] env[67169]: ERROR nova.compute.manager [instance: 2e156908-c313-4229-840d-13ed8e6d4074] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1901.272017] env[67169]: ERROR nova.compute.manager [instance: 2e156908-c313-4229-840d-13ed8e6d4074] image_cache(vi, tmp_image_ds_loc) [ 1901.272017] env[67169]: ERROR nova.compute.manager [instance: 2e156908-c313-4229-840d-13ed8e6d4074] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1901.272017] env[67169]: ERROR nova.compute.manager [instance: 2e156908-c313-4229-840d-13ed8e6d4074] vm_util.copy_virtual_disk( [ 1901.272017] env[67169]: ERROR nova.compute.manager [instance: 2e156908-c313-4229-840d-13ed8e6d4074] File 
"/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1901.272017] env[67169]: ERROR nova.compute.manager [instance: 2e156908-c313-4229-840d-13ed8e6d4074] session._wait_for_task(vmdk_copy_task) [ 1901.272017] env[67169]: ERROR nova.compute.manager [instance: 2e156908-c313-4229-840d-13ed8e6d4074] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1901.272017] env[67169]: ERROR nova.compute.manager [instance: 2e156908-c313-4229-840d-13ed8e6d4074] return self.wait_for_task(task_ref) [ 1901.272017] env[67169]: ERROR nova.compute.manager [instance: 2e156908-c313-4229-840d-13ed8e6d4074] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1901.272017] env[67169]: ERROR nova.compute.manager [instance: 2e156908-c313-4229-840d-13ed8e6d4074] return evt.wait() [ 1901.272017] env[67169]: ERROR nova.compute.manager [instance: 2e156908-c313-4229-840d-13ed8e6d4074] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1901.272017] env[67169]: ERROR nova.compute.manager [instance: 2e156908-c313-4229-840d-13ed8e6d4074] result = hub.switch() [ 1901.272017] env[67169]: ERROR nova.compute.manager [instance: 2e156908-c313-4229-840d-13ed8e6d4074] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1901.272017] env[67169]: ERROR nova.compute.manager [instance: 2e156908-c313-4229-840d-13ed8e6d4074] return self.greenlet.switch() [ 1901.272017] env[67169]: ERROR nova.compute.manager [instance: 2e156908-c313-4229-840d-13ed8e6d4074] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1901.272017] env[67169]: ERROR nova.compute.manager [instance: 2e156908-c313-4229-840d-13ed8e6d4074] self.f(*self.args, **self.kw) [ 1901.272017] env[67169]: ERROR nova.compute.manager [instance: 2e156908-c313-4229-840d-13ed8e6d4074] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1901.272017] env[67169]: ERROR nova.compute.manager [instance: 2e156908-c313-4229-840d-13ed8e6d4074] raise exceptions.translate_fault(task_info.error) [ 1901.272017] env[67169]: ERROR nova.compute.manager [instance: 2e156908-c313-4229-840d-13ed8e6d4074] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1901.272017] env[67169]: ERROR nova.compute.manager [instance: 2e156908-c313-4229-840d-13ed8e6d4074] Faults: ['InvalidArgument'] [ 1901.272017] env[67169]: ERROR nova.compute.manager [instance: 2e156908-c313-4229-840d-13ed8e6d4074] [ 1901.272977] env[67169]: INFO nova.compute.manager [None req-1f74b6c8-d5a6-4795-bf0a-f9fc6bd88668 tempest-ServersNegativeTestJSON-1601465401 tempest-ServersNegativeTestJSON-1601465401-project-member] [instance: 2e156908-c313-4229-840d-13ed8e6d4074] Terminating instance [ 1901.273899] env[67169]: DEBUG oslo_concurrency.lockutils [None req-ae9f4c45-189f-4d85-9ca5-c9ae8ec39dac tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Acquired lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1901.274123] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-ae9f4c45-189f-4d85-9ca5-c9ae8ec39dac tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1901.274357] env[67169]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-639b1ada-8fbe-4b5d-9c78-dda94db7a8b4 {{(pid=67169) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1901.276642] env[67169]: DEBUG nova.compute.manager [None req-1f74b6c8-d5a6-4795-bf0a-f9fc6bd88668 tempest-ServersNegativeTestJSON-1601465401 tempest-ServersNegativeTestJSON-1601465401-project-member] [instance: 2e156908-c313-4229-840d-13ed8e6d4074] Start destroying the instance on the hypervisor. {{(pid=67169) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1901.276844] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-1f74b6c8-d5a6-4795-bf0a-f9fc6bd88668 tempest-ServersNegativeTestJSON-1601465401 tempest-ServersNegativeTestJSON-1601465401-project-member] [instance: 2e156908-c313-4229-840d-13ed8e6d4074] Destroying instance {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1901.277556] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-009aec87-06a9-463a-b1e8-5a9c2bea2e53 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1901.284319] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-1f74b6c8-d5a6-4795-bf0a-f9fc6bd88668 tempest-ServersNegativeTestJSON-1601465401 tempest-ServersNegativeTestJSON-1601465401-project-member] [instance: 2e156908-c313-4229-840d-13ed8e6d4074] Unregistering the VM {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1901.285284] env[67169]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-92752054-5774-43a3-b989-bad073499849 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1901.286609] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-ae9f4c45-189f-4d85-9ca5-c9ae8ec39dac tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=67169) 
mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1901.286780] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-ae9f4c45-189f-4d85-9ca5-c9ae8ec39dac tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=67169) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1901.287439] env[67169]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-6b15a399-62d7-4e9f-91f6-87f45637ccec {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1901.292173] env[67169]: DEBUG oslo_vmware.api [None req-ae9f4c45-189f-4d85-9ca5-c9ae8ec39dac tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Waiting for the task: (returnval){ [ 1901.292173] env[67169]: value = "session[52518c09-e5f2-035d-4dc1-3e29ddf94015]5225ff48-4878-1863-3715-7240d8a8cf75" [ 1901.292173] env[67169]: _type = "Task" [ 1901.292173] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1901.299195] env[67169]: DEBUG oslo_vmware.api [None req-ae9f4c45-189f-4d85-9ca5-c9ae8ec39dac tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Task: {'id': session[52518c09-e5f2-035d-4dc1-3e29ddf94015]5225ff48-4878-1863-3715-7240d8a8cf75, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1901.356664] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-1f74b6c8-d5a6-4795-bf0a-f9fc6bd88668 tempest-ServersNegativeTestJSON-1601465401 tempest-ServersNegativeTestJSON-1601465401-project-member] [instance: 2e156908-c313-4229-840d-13ed8e6d4074] Unregistered the VM {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1901.356825] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-1f74b6c8-d5a6-4795-bf0a-f9fc6bd88668 tempest-ServersNegativeTestJSON-1601465401 tempest-ServersNegativeTestJSON-1601465401-project-member] [instance: 2e156908-c313-4229-840d-13ed8e6d4074] Deleting contents of the VM from datastore datastore2 {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1901.357019] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-1f74b6c8-d5a6-4795-bf0a-f9fc6bd88668 tempest-ServersNegativeTestJSON-1601465401 tempest-ServersNegativeTestJSON-1601465401-project-member] Deleting the datastore file [datastore2] 2e156908-c313-4229-840d-13ed8e6d4074 {{(pid=67169) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1901.357297] env[67169]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-3e2d556d-b61e-4ade-b281-8eff16175d59 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1901.363125] env[67169]: DEBUG oslo_vmware.api [None req-1f74b6c8-d5a6-4795-bf0a-f9fc6bd88668 tempest-ServersNegativeTestJSON-1601465401 tempest-ServersNegativeTestJSON-1601465401-project-member] Waiting for the task: (returnval){ [ 1901.363125] env[67169]: value = "task-2819249" [ 1901.363125] env[67169]: _type = "Task" [ 1901.363125] env[67169]: } to complete. 
{{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1901.370503] env[67169]: DEBUG oslo_vmware.api [None req-1f74b6c8-d5a6-4795-bf0a-f9fc6bd88668 tempest-ServersNegativeTestJSON-1601465401 tempest-ServersNegativeTestJSON-1601465401-project-member] Task: {'id': task-2819249, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1901.802303] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-ae9f4c45-189f-4d85-9ca5-c9ae8ec39dac tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 2d7d3386-9854-4bf1-a680-5aed0a2329cb] Preparing fetch location {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1901.802625] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-ae9f4c45-189f-4d85-9ca5-c9ae8ec39dac tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Creating directory with path [datastore2] vmware_temp/92d20412-ceb8-4607-b4ab-ecc57137ec20/285931c9-8b83-4997-8c4d-6a79005e36ba {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1901.802771] env[67169]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-e7241136-8ee0-4d84-85fc-54962bd03496 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1901.815102] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-ae9f4c45-189f-4d85-9ca5-c9ae8ec39dac tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Created directory with path [datastore2] vmware_temp/92d20412-ceb8-4607-b4ab-ecc57137ec20/285931c9-8b83-4997-8c4d-6a79005e36ba {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1901.815283] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None 
req-ae9f4c45-189f-4d85-9ca5-c9ae8ec39dac tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 2d7d3386-9854-4bf1-a680-5aed0a2329cb] Fetch image to [datastore2] vmware_temp/92d20412-ceb8-4607-b4ab-ecc57137ec20/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1901.815446] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-ae9f4c45-189f-4d85-9ca5-c9ae8ec39dac tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 2d7d3386-9854-4bf1-a680-5aed0a2329cb] Downloading image file data 285931c9-8b83-4997-8c4d-6a79005e36ba to [datastore2] vmware_temp/92d20412-ceb8-4607-b4ab-ecc57137ec20/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk on the data store datastore2 {{(pid=67169) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1901.816152] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6bfbb4ae-e66e-4445-baa7-0e0c5c1d92f6 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1901.822575] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-95b01c28-c82a-4baf-bc0f-18dccb4f743c {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1901.831389] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2a601dc2-c3f0-4e5a-b610-bc92e8e67088 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1901.861172] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-70ae6548-4dfe-4524-b530-694795ebe684 {{(pid=67169) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1901.873191] env[67169]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-609642bc-f61b-4012-81c7-cfe68e9339d1 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1901.874923] env[67169]: DEBUG oslo_vmware.api [None req-1f74b6c8-d5a6-4795-bf0a-f9fc6bd88668 tempest-ServersNegativeTestJSON-1601465401 tempest-ServersNegativeTestJSON-1601465401-project-member] Task: {'id': task-2819249, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.068171} completed successfully. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1901.875204] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-1f74b6c8-d5a6-4795-bf0a-f9fc6bd88668 tempest-ServersNegativeTestJSON-1601465401 tempest-ServersNegativeTestJSON-1601465401-project-member] Deleted the datastore file {{(pid=67169) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1901.875394] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-1f74b6c8-d5a6-4795-bf0a-f9fc6bd88668 tempest-ServersNegativeTestJSON-1601465401 tempest-ServersNegativeTestJSON-1601465401-project-member] [instance: 2e156908-c313-4229-840d-13ed8e6d4074] Deleted contents of the VM from datastore datastore2 {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1901.875561] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-1f74b6c8-d5a6-4795-bf0a-f9fc6bd88668 tempest-ServersNegativeTestJSON-1601465401 tempest-ServersNegativeTestJSON-1601465401-project-member] [instance: 2e156908-c313-4229-840d-13ed8e6d4074] Instance destroyed {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1901.875905] env[67169]: INFO nova.compute.manager [None req-1f74b6c8-d5a6-4795-bf0a-f9fc6bd88668 tempest-ServersNegativeTestJSON-1601465401 
tempest-ServersNegativeTestJSON-1601465401-project-member] [instance: 2e156908-c313-4229-840d-13ed8e6d4074] Took 0.60 seconds to destroy the instance on the hypervisor. [ 1901.877807] env[67169]: DEBUG nova.compute.claims [None req-1f74b6c8-d5a6-4795-bf0a-f9fc6bd88668 tempest-ServersNegativeTestJSON-1601465401 tempest-ServersNegativeTestJSON-1601465401-project-member] [instance: 2e156908-c313-4229-840d-13ed8e6d4074] Aborting claim: {{(pid=67169) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1901.878033] env[67169]: DEBUG oslo_concurrency.lockutils [None req-1f74b6c8-d5a6-4795-bf0a-f9fc6bd88668 tempest-ServersNegativeTestJSON-1601465401 tempest-ServersNegativeTestJSON-1601465401-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1901.878263] env[67169]: DEBUG oslo_concurrency.lockutils [None req-1f74b6c8-d5a6-4795-bf0a-f9fc6bd88668 tempest-ServersNegativeTestJSON-1601465401 tempest-ServersNegativeTestJSON-1601465401-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1901.896843] env[67169]: DEBUG nova.virt.vmwareapi.images [None req-ae9f4c45-189f-4d85-9ca5-c9ae8ec39dac tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 2d7d3386-9854-4bf1-a680-5aed0a2329cb] Downloading image file data 285931c9-8b83-4997-8c4d-6a79005e36ba to the data store datastore2 {{(pid=67169) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1901.952319] env[67169]: DEBUG oslo_vmware.rw_handles [None req-ae9f4c45-189f-4d85-9ca5-c9ae8ec39dac tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] 
Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/92d20412-ceb8-4607-b4ab-ecc57137ec20/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=67169) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1902.014040] env[67169]: DEBUG oslo_vmware.rw_handles [None req-ae9f4c45-189f-4d85-9ca5-c9ae8ec39dac tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Completed reading data from the image iterator. {{(pid=67169) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1902.014238] env[67169]: DEBUG oslo_vmware.rw_handles [None req-ae9f4c45-189f-4d85-9ca5-c9ae8ec39dac tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/92d20412-ceb8-4607-b4ab-ecc57137ec20/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=67169) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1902.100674] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7bf266e0-7e67-48f0-9394-30a3ca6b7017 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1902.108515] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-be3f8bc1-b7df-4cb3-82d3-7955bf4c1446 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1902.138456] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-449477cb-40fe-4ce2-9c24-30c16fdfd41c {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1902.145196] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d7b067c6-04bc-4ba9-b64a-cbe365cabcc8 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1902.157673] env[67169]: DEBUG nova.compute.provider_tree [None req-1f74b6c8-d5a6-4795-bf0a-f9fc6bd88668 tempest-ServersNegativeTestJSON-1601465401 tempest-ServersNegativeTestJSON-1601465401-project-member] Inventory has not changed in ProviderTree for provider: 6570906a-ac37-4859-b1f2-4bbacc48d3f3 {{(pid=67169) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1902.167126] env[67169]: DEBUG nova.scheduler.client.report [None req-1f74b6c8-d5a6-4795-bf0a-f9fc6bd88668 tempest-ServersNegativeTestJSON-1601465401 tempest-ServersNegativeTestJSON-1601465401-project-member] Inventory has not changed for provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 
4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 171, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67169) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1902.180149] env[67169]: DEBUG oslo_concurrency.lockutils [None req-1f74b6c8-d5a6-4795-bf0a-f9fc6bd88668 tempest-ServersNegativeTestJSON-1601465401 tempest-ServersNegativeTestJSON-1601465401-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.302s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1902.180704] env[67169]: ERROR nova.compute.manager [None req-1f74b6c8-d5a6-4795-bf0a-f9fc6bd88668 tempest-ServersNegativeTestJSON-1601465401 tempest-ServersNegativeTestJSON-1601465401-project-member] [instance: 2e156908-c313-4229-840d-13ed8e6d4074] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1902.180704] env[67169]: Faults: ['InvalidArgument'] [ 1902.180704] env[67169]: ERROR nova.compute.manager [instance: 2e156908-c313-4229-840d-13ed8e6d4074] Traceback (most recent call last): [ 1902.180704] env[67169]: ERROR nova.compute.manager [instance: 2e156908-c313-4229-840d-13ed8e6d4074] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1902.180704] env[67169]: ERROR nova.compute.manager [instance: 2e156908-c313-4229-840d-13ed8e6d4074] self.driver.spawn(context, instance, image_meta, [ 1902.180704] env[67169]: ERROR nova.compute.manager [instance: 2e156908-c313-4229-840d-13ed8e6d4074] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1902.180704] env[67169]: ERROR nova.compute.manager [instance: 2e156908-c313-4229-840d-13ed8e6d4074] self._vmops.spawn(context, 
instance, image_meta, injected_files, [ 1902.180704] env[67169]: ERROR nova.compute.manager [instance: 2e156908-c313-4229-840d-13ed8e6d4074] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1902.180704] env[67169]: ERROR nova.compute.manager [instance: 2e156908-c313-4229-840d-13ed8e6d4074] self._fetch_image_if_missing(context, vi) [ 1902.180704] env[67169]: ERROR nova.compute.manager [instance: 2e156908-c313-4229-840d-13ed8e6d4074] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1902.180704] env[67169]: ERROR nova.compute.manager [instance: 2e156908-c313-4229-840d-13ed8e6d4074] image_cache(vi, tmp_image_ds_loc) [ 1902.180704] env[67169]: ERROR nova.compute.manager [instance: 2e156908-c313-4229-840d-13ed8e6d4074] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1902.180704] env[67169]: ERROR nova.compute.manager [instance: 2e156908-c313-4229-840d-13ed8e6d4074] vm_util.copy_virtual_disk( [ 1902.180704] env[67169]: ERROR nova.compute.manager [instance: 2e156908-c313-4229-840d-13ed8e6d4074] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1902.180704] env[67169]: ERROR nova.compute.manager [instance: 2e156908-c313-4229-840d-13ed8e6d4074] session._wait_for_task(vmdk_copy_task) [ 1902.180704] env[67169]: ERROR nova.compute.manager [instance: 2e156908-c313-4229-840d-13ed8e6d4074] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1902.180704] env[67169]: ERROR nova.compute.manager [instance: 2e156908-c313-4229-840d-13ed8e6d4074] return self.wait_for_task(task_ref) [ 1902.180704] env[67169]: ERROR nova.compute.manager [instance: 2e156908-c313-4229-840d-13ed8e6d4074] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1902.180704] env[67169]: ERROR nova.compute.manager [instance: 2e156908-c313-4229-840d-13ed8e6d4074] return evt.wait() [ 
1902.180704] env[67169]: ERROR nova.compute.manager [instance: 2e156908-c313-4229-840d-13ed8e6d4074] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1902.180704] env[67169]: ERROR nova.compute.manager [instance: 2e156908-c313-4229-840d-13ed8e6d4074] result = hub.switch() [ 1902.180704] env[67169]: ERROR nova.compute.manager [instance: 2e156908-c313-4229-840d-13ed8e6d4074] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1902.180704] env[67169]: ERROR nova.compute.manager [instance: 2e156908-c313-4229-840d-13ed8e6d4074] return self.greenlet.switch() [ 1902.180704] env[67169]: ERROR nova.compute.manager [instance: 2e156908-c313-4229-840d-13ed8e6d4074] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1902.180704] env[67169]: ERROR nova.compute.manager [instance: 2e156908-c313-4229-840d-13ed8e6d4074] self.f(*self.args, **self.kw) [ 1902.180704] env[67169]: ERROR nova.compute.manager [instance: 2e156908-c313-4229-840d-13ed8e6d4074] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1902.180704] env[67169]: ERROR nova.compute.manager [instance: 2e156908-c313-4229-840d-13ed8e6d4074] raise exceptions.translate_fault(task_info.error) [ 1902.180704] env[67169]: ERROR nova.compute.manager [instance: 2e156908-c313-4229-840d-13ed8e6d4074] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1902.180704] env[67169]: ERROR nova.compute.manager [instance: 2e156908-c313-4229-840d-13ed8e6d4074] Faults: ['InvalidArgument'] [ 1902.180704] env[67169]: ERROR nova.compute.manager [instance: 2e156908-c313-4229-840d-13ed8e6d4074] [ 1902.181830] env[67169]: DEBUG nova.compute.utils [None req-1f74b6c8-d5a6-4795-bf0a-f9fc6bd88668 tempest-ServersNegativeTestJSON-1601465401 tempest-ServersNegativeTestJSON-1601465401-project-member] [instance: 
2e156908-c313-4229-840d-13ed8e6d4074] VimFaultException {{(pid=67169) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1902.183066] env[67169]: DEBUG nova.compute.manager [None req-1f74b6c8-d5a6-4795-bf0a-f9fc6bd88668 tempest-ServersNegativeTestJSON-1601465401 tempest-ServersNegativeTestJSON-1601465401-project-member] [instance: 2e156908-c313-4229-840d-13ed8e6d4074] Build of instance 2e156908-c313-4229-840d-13ed8e6d4074 was re-scheduled: A specified parameter was not correct: fileType [ 1902.183066] env[67169]: Faults: ['InvalidArgument'] {{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1902.183456] env[67169]: DEBUG nova.compute.manager [None req-1f74b6c8-d5a6-4795-bf0a-f9fc6bd88668 tempest-ServersNegativeTestJSON-1601465401 tempest-ServersNegativeTestJSON-1601465401-project-member] [instance: 2e156908-c313-4229-840d-13ed8e6d4074] Unplugging VIFs for instance {{(pid=67169) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1902.183633] env[67169]: DEBUG nova.compute.manager [None req-1f74b6c8-d5a6-4795-bf0a-f9fc6bd88668 tempest-ServersNegativeTestJSON-1601465401 tempest-ServersNegativeTestJSON-1601465401-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67169) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1902.183799] env[67169]: DEBUG nova.compute.manager [None req-1f74b6c8-d5a6-4795-bf0a-f9fc6bd88668 tempest-ServersNegativeTestJSON-1601465401 tempest-ServersNegativeTestJSON-1601465401-project-member] [instance: 2e156908-c313-4229-840d-13ed8e6d4074] Deallocating network for instance {{(pid=67169) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1902.183960] env[67169]: DEBUG nova.network.neutron [None req-1f74b6c8-d5a6-4795-bf0a-f9fc6bd88668 tempest-ServersNegativeTestJSON-1601465401 tempest-ServersNegativeTestJSON-1601465401-project-member] [instance: 2e156908-c313-4229-840d-13ed8e6d4074] deallocate_for_instance() {{(pid=67169) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1902.544276] env[67169]: DEBUG nova.network.neutron [None req-1f74b6c8-d5a6-4795-bf0a-f9fc6bd88668 tempest-ServersNegativeTestJSON-1601465401 tempest-ServersNegativeTestJSON-1601465401-project-member] [instance: 2e156908-c313-4229-840d-13ed8e6d4074] Updating instance_info_cache with network_info: [] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1902.555165] env[67169]: INFO nova.compute.manager [None req-1f74b6c8-d5a6-4795-bf0a-f9fc6bd88668 tempest-ServersNegativeTestJSON-1601465401 tempest-ServersNegativeTestJSON-1601465401-project-member] [instance: 2e156908-c313-4229-840d-13ed8e6d4074] Took 0.37 seconds to deallocate network for instance. 
[ 1902.657664] env[67169]: INFO nova.scheduler.client.report [None req-1f74b6c8-d5a6-4795-bf0a-f9fc6bd88668 tempest-ServersNegativeTestJSON-1601465401 tempest-ServersNegativeTestJSON-1601465401-project-member] Deleted allocations for instance 2e156908-c313-4229-840d-13ed8e6d4074 [ 1902.679345] env[67169]: DEBUG oslo_concurrency.lockutils [None req-1f74b6c8-d5a6-4795-bf0a-f9fc6bd88668 tempest-ServersNegativeTestJSON-1601465401 tempest-ServersNegativeTestJSON-1601465401-project-member] Lock "2e156908-c313-4229-840d-13ed8e6d4074" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 630.865s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1902.680602] env[67169]: DEBUG oslo_concurrency.lockutils [None req-2517fb03-dad8-414b-bb38-46ee103f363e tempest-ServersNegativeTestJSON-1601465401 tempest-ServersNegativeTestJSON-1601465401-project-member] Lock "2e156908-c313-4229-840d-13ed8e6d4074" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 434.490s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1902.680836] env[67169]: DEBUG oslo_concurrency.lockutils [None req-2517fb03-dad8-414b-bb38-46ee103f363e tempest-ServersNegativeTestJSON-1601465401 tempest-ServersNegativeTestJSON-1601465401-project-member] Acquiring lock "2e156908-c313-4229-840d-13ed8e6d4074-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1902.681060] env[67169]: DEBUG oslo_concurrency.lockutils [None req-2517fb03-dad8-414b-bb38-46ee103f363e tempest-ServersNegativeTestJSON-1601465401 tempest-ServersNegativeTestJSON-1601465401-project-member] Lock "2e156908-c313-4229-840d-13ed8e6d4074-events" acquired by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1902.681235] env[67169]: DEBUG oslo_concurrency.lockutils [None req-2517fb03-dad8-414b-bb38-46ee103f363e tempest-ServersNegativeTestJSON-1601465401 tempest-ServersNegativeTestJSON-1601465401-project-member] Lock "2e156908-c313-4229-840d-13ed8e6d4074-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1902.683875] env[67169]: INFO nova.compute.manager [None req-2517fb03-dad8-414b-bb38-46ee103f363e tempest-ServersNegativeTestJSON-1601465401 tempest-ServersNegativeTestJSON-1601465401-project-member] [instance: 2e156908-c313-4229-840d-13ed8e6d4074] Terminating instance [ 1902.686314] env[67169]: DEBUG nova.compute.manager [None req-2517fb03-dad8-414b-bb38-46ee103f363e tempest-ServersNegativeTestJSON-1601465401 tempest-ServersNegativeTestJSON-1601465401-project-member] [instance: 2e156908-c313-4229-840d-13ed8e6d4074] Start destroying the instance on the hypervisor. 
{{(pid=67169) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1902.686314] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-2517fb03-dad8-414b-bb38-46ee103f363e tempest-ServersNegativeTestJSON-1601465401 tempest-ServersNegativeTestJSON-1601465401-project-member] [instance: 2e156908-c313-4229-840d-13ed8e6d4074] Destroying instance {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1902.686314] env[67169]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-355df757-dc07-4aef-a95d-a03d6b2a7053 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1902.696641] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-339005ae-1688-43e9-878b-695cb52884cd {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1902.707179] env[67169]: DEBUG nova.compute.manager [None req-0dac4879-9808-4647-9123-8b378a5e4530 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 68b94a43-eaa5-4023-8bf5-8cc647c2f098] Starting instance... {{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1902.728365] env[67169]: WARNING nova.virt.vmwareapi.vmops [None req-2517fb03-dad8-414b-bb38-46ee103f363e tempest-ServersNegativeTestJSON-1601465401 tempest-ServersNegativeTestJSON-1601465401-project-member] [instance: 2e156908-c313-4229-840d-13ed8e6d4074] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 2e156908-c313-4229-840d-13ed8e6d4074 could not be found. 
[ 1902.728605] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-2517fb03-dad8-414b-bb38-46ee103f363e tempest-ServersNegativeTestJSON-1601465401 tempest-ServersNegativeTestJSON-1601465401-project-member] [instance: 2e156908-c313-4229-840d-13ed8e6d4074] Instance destroyed {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1902.729159] env[67169]: INFO nova.compute.manager [None req-2517fb03-dad8-414b-bb38-46ee103f363e tempest-ServersNegativeTestJSON-1601465401 tempest-ServersNegativeTestJSON-1601465401-project-member] [instance: 2e156908-c313-4229-840d-13ed8e6d4074] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1902.729159] env[67169]: DEBUG oslo.service.loopingcall [None req-2517fb03-dad8-414b-bb38-46ee103f363e tempest-ServersNegativeTestJSON-1601465401 tempest-ServersNegativeTestJSON-1601465401-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=67169) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1902.729319] env[67169]: DEBUG nova.compute.manager [-] [instance: 2e156908-c313-4229-840d-13ed8e6d4074] Deallocating network for instance {{(pid=67169) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1902.729423] env[67169]: DEBUG nova.network.neutron [-] [instance: 2e156908-c313-4229-840d-13ed8e6d4074] deallocate_for_instance() {{(pid=67169) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1902.763425] env[67169]: DEBUG oslo_concurrency.lockutils [None req-0dac4879-9808-4647-9123-8b378a5e4530 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1902.763742] env[67169]: DEBUG oslo_concurrency.lockutils 
[None req-0dac4879-9808-4647-9123-8b378a5e4530 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1902.765276] env[67169]: INFO nova.compute.claims [None req-0dac4879-9808-4647-9123-8b378a5e4530 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 68b94a43-eaa5-4023-8bf5-8cc647c2f098] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1902.774759] env[67169]: DEBUG nova.network.neutron [-] [instance: 2e156908-c313-4229-840d-13ed8e6d4074] Updating instance_info_cache with network_info: [] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1902.791330] env[67169]: INFO nova.compute.manager [-] [instance: 2e156908-c313-4229-840d-13ed8e6d4074] Took 0.06 seconds to deallocate network for instance. 
[ 1902.885610] env[67169]: DEBUG oslo_concurrency.lockutils [None req-2517fb03-dad8-414b-bb38-46ee103f363e tempest-ServersNegativeTestJSON-1601465401 tempest-ServersNegativeTestJSON-1601465401-project-member] Lock "2e156908-c313-4229-840d-13ed8e6d4074" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.205s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1902.886452] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "2e156908-c313-4229-840d-13ed8e6d4074" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 99.201s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1902.886637] env[67169]: INFO nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 2e156908-c313-4229-840d-13ed8e6d4074] During sync_power_state the instance has a pending task (deleting). Skip. 
[ 1902.886808] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "2e156908-c313-4229-840d-13ed8e6d4074" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1902.933435] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-64979593-a9d0-4f64-8009-32c45385956f {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1902.941106] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-91365905-e1ef-482a-8fa1-7fec714a53d6 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1902.971680] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5f50327d-29fd-4854-a5df-c2ac50a976bd {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1902.978549] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-55bc2413-cef5-46d0-9c81-b559dd823b7e {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1902.991340] env[67169]: DEBUG nova.compute.provider_tree [None req-0dac4879-9808-4647-9123-8b378a5e4530 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Inventory has not changed in ProviderTree for provider: 6570906a-ac37-4859-b1f2-4bbacc48d3f3 {{(pid=67169) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1903.000593] env[67169]: DEBUG nova.scheduler.client.report [None req-0dac4879-9808-4647-9123-8b378a5e4530 tempest-ImagesTestJSON-1590952821 
tempest-ImagesTestJSON-1590952821-project-member] Inventory has not changed for provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 171, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67169) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1903.014241] env[67169]: DEBUG oslo_concurrency.lockutils [None req-0dac4879-9808-4647-9123-8b378a5e4530 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.250s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1903.014788] env[67169]: DEBUG nova.compute.manager [None req-0dac4879-9808-4647-9123-8b378a5e4530 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 68b94a43-eaa5-4023-8bf5-8cc647c2f098] Start building networks asynchronously for instance. {{(pid=67169) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1903.052121] env[67169]: DEBUG nova.compute.utils [None req-0dac4879-9808-4647-9123-8b378a5e4530 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Using /dev/sd instead of None {{(pid=67169) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1903.053694] env[67169]: DEBUG nova.compute.manager [None req-0dac4879-9808-4647-9123-8b378a5e4530 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 68b94a43-eaa5-4023-8bf5-8cc647c2f098] Allocating IP information in the background. 
{{(pid=67169) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1903.053694] env[67169]: DEBUG nova.network.neutron [None req-0dac4879-9808-4647-9123-8b378a5e4530 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 68b94a43-eaa5-4023-8bf5-8cc647c2f098] allocate_for_instance() {{(pid=67169) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1903.063375] env[67169]: DEBUG nova.compute.manager [None req-0dac4879-9808-4647-9123-8b378a5e4530 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 68b94a43-eaa5-4023-8bf5-8cc647c2f098] Start building block device mappings for instance. {{(pid=67169) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1903.123981] env[67169]: DEBUG nova.compute.manager [None req-0dac4879-9808-4647-9123-8b378a5e4530 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 68b94a43-eaa5-4023-8bf5-8cc647c2f098] Start spawning the instance on the hypervisor. 
{{(pid=67169) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1903.149049] env[67169]: DEBUG nova.virt.hardware [None req-0dac4879-9808-4647-9123-8b378a5e4530 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-03-06T19:55:10Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-03-06T19:54:55Z,direct_url=,disk_format='vmdk',id=285931c9-8b83-4997-8c4d-6a79005e36ba,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c31f6504bb73492890b262ff43fdf9bc',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-03-06T19:54:55Z,virtual_size=,visibility=), allow threads: False {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1903.149296] env[67169]: DEBUG nova.virt.hardware [None req-0dac4879-9808-4647-9123-8b378a5e4530 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Flavor limits 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1903.149457] env[67169]: DEBUG nova.virt.hardware [None req-0dac4879-9808-4647-9123-8b378a5e4530 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Image limits 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1903.149644] env[67169]: DEBUG nova.virt.hardware [None req-0dac4879-9808-4647-9123-8b378a5e4530 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Flavor pref 0:0:0 {{(pid=67169) get_cpu_topology_constraints 
/opt/stack/nova/nova/virt/hardware.py:388}} [ 1903.149791] env[67169]: DEBUG nova.virt.hardware [None req-0dac4879-9808-4647-9123-8b378a5e4530 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Image pref 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1903.149974] env[67169]: DEBUG nova.virt.hardware [None req-0dac4879-9808-4647-9123-8b378a5e4530 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1903.150295] env[67169]: DEBUG nova.virt.hardware [None req-0dac4879-9808-4647-9123-8b378a5e4530 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1903.150461] env[67169]: DEBUG nova.virt.hardware [None req-0dac4879-9808-4647-9123-8b378a5e4530 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67169) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1903.150687] env[67169]: DEBUG nova.virt.hardware [None req-0dac4879-9808-4647-9123-8b378a5e4530 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Got 1 possible topologies {{(pid=67169) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1903.150790] env[67169]: DEBUG nova.virt.hardware [None req-0dac4879-9808-4647-9123-8b378a5e4530 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67169) _get_desirable_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:575}} [ 1903.150979] env[67169]: DEBUG nova.virt.hardware [None req-0dac4879-9808-4647-9123-8b378a5e4530 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1903.151839] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f9316440-af84-488a-b841-c6f73aa82e7a {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1903.160415] env[67169]: DEBUG nova.policy [None req-0dac4879-9808-4647-9123-8b378a5e4530 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4d789ec14c2b4d62be952753fb47f0f7', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '00d358bc61014b5cb3ddcdab7785e7e8', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67169) authorize /opt/stack/nova/nova/policy.py:203}} [ 1903.162777] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f551d758-2612-44fb-9901-ddc4dcb19c13 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1903.443948] env[67169]: DEBUG nova.network.neutron [None req-0dac4879-9808-4647-9123-8b378a5e4530 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 68b94a43-eaa5-4023-8bf5-8cc647c2f098] Successfully created port: 08ff7842-7887-466d-a734-616b3ea8bfcd {{(pid=67169) _create_port_minimal 
/opt/stack/nova/nova/network/neutron.py:548}} [ 1904.458201] env[67169]: DEBUG nova.network.neutron [None req-0dac4879-9808-4647-9123-8b378a5e4530 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 68b94a43-eaa5-4023-8bf5-8cc647c2f098] Successfully updated port: 08ff7842-7887-466d-a734-616b3ea8bfcd {{(pid=67169) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1904.469385] env[67169]: DEBUG oslo_concurrency.lockutils [None req-0dac4879-9808-4647-9123-8b378a5e4530 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Acquiring lock "refresh_cache-68b94a43-eaa5-4023-8bf5-8cc647c2f098" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1904.469539] env[67169]: DEBUG oslo_concurrency.lockutils [None req-0dac4879-9808-4647-9123-8b378a5e4530 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Acquired lock "refresh_cache-68b94a43-eaa5-4023-8bf5-8cc647c2f098" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1904.469692] env[67169]: DEBUG nova.network.neutron [None req-0dac4879-9808-4647-9123-8b378a5e4530 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 68b94a43-eaa5-4023-8bf5-8cc647c2f098] Building network info cache for instance {{(pid=67169) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1904.512799] env[67169]: DEBUG nova.network.neutron [None req-0dac4879-9808-4647-9123-8b378a5e4530 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 68b94a43-eaa5-4023-8bf5-8cc647c2f098] Instance cache missing network info. 
{{(pid=67169) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1904.607665] env[67169]: DEBUG nova.compute.manager [req-45d3766b-3a0c-4658-b158-7da7d1272a84 req-6a17aada-a38a-45d9-8ef9-b8eb36dc2f14 service nova] [instance: 68b94a43-eaa5-4023-8bf5-8cc647c2f098] Received event network-vif-plugged-08ff7842-7887-466d-a734-616b3ea8bfcd {{(pid=67169) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1904.607936] env[67169]: DEBUG oslo_concurrency.lockutils [req-45d3766b-3a0c-4658-b158-7da7d1272a84 req-6a17aada-a38a-45d9-8ef9-b8eb36dc2f14 service nova] Acquiring lock "68b94a43-eaa5-4023-8bf5-8cc647c2f098-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1904.608107] env[67169]: DEBUG oslo_concurrency.lockutils [req-45d3766b-3a0c-4658-b158-7da7d1272a84 req-6a17aada-a38a-45d9-8ef9-b8eb36dc2f14 service nova] Lock "68b94a43-eaa5-4023-8bf5-8cc647c2f098-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1904.608296] env[67169]: DEBUG oslo_concurrency.lockutils [req-45d3766b-3a0c-4658-b158-7da7d1272a84 req-6a17aada-a38a-45d9-8ef9-b8eb36dc2f14 service nova] Lock "68b94a43-eaa5-4023-8bf5-8cc647c2f098-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1904.608438] env[67169]: DEBUG nova.compute.manager [req-45d3766b-3a0c-4658-b158-7da7d1272a84 req-6a17aada-a38a-45d9-8ef9-b8eb36dc2f14 service nova] [instance: 68b94a43-eaa5-4023-8bf5-8cc647c2f098] No waiting events found dispatching network-vif-plugged-08ff7842-7887-466d-a734-616b3ea8bfcd {{(pid=67169) pop_instance_event 
/opt/stack/nova/nova/compute/manager.py:320}} [ 1904.608604] env[67169]: WARNING nova.compute.manager [req-45d3766b-3a0c-4658-b158-7da7d1272a84 req-6a17aada-a38a-45d9-8ef9-b8eb36dc2f14 service nova] [instance: 68b94a43-eaa5-4023-8bf5-8cc647c2f098] Received unexpected event network-vif-plugged-08ff7842-7887-466d-a734-616b3ea8bfcd for instance with vm_state building and task_state spawning. [ 1904.608762] env[67169]: DEBUG nova.compute.manager [req-45d3766b-3a0c-4658-b158-7da7d1272a84 req-6a17aada-a38a-45d9-8ef9-b8eb36dc2f14 service nova] [instance: 68b94a43-eaa5-4023-8bf5-8cc647c2f098] Received event network-changed-08ff7842-7887-466d-a734-616b3ea8bfcd {{(pid=67169) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1904.608916] env[67169]: DEBUG nova.compute.manager [req-45d3766b-3a0c-4658-b158-7da7d1272a84 req-6a17aada-a38a-45d9-8ef9-b8eb36dc2f14 service nova] [instance: 68b94a43-eaa5-4023-8bf5-8cc647c2f098] Refreshing instance network info cache due to event network-changed-08ff7842-7887-466d-a734-616b3ea8bfcd. 
{{(pid=67169) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1904.609284] env[67169]: DEBUG oslo_concurrency.lockutils [req-45d3766b-3a0c-4658-b158-7da7d1272a84 req-6a17aada-a38a-45d9-8ef9-b8eb36dc2f14 service nova] Acquiring lock "refresh_cache-68b94a43-eaa5-4023-8bf5-8cc647c2f098" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1904.665215] env[67169]: DEBUG nova.network.neutron [None req-0dac4879-9808-4647-9123-8b378a5e4530 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 68b94a43-eaa5-4023-8bf5-8cc647c2f098] Updating instance_info_cache with network_info: [{"id": "08ff7842-7887-466d-a734-616b3ea8bfcd", "address": "fa:16:3e:a3:e5:84", "network": {"id": "ee7bdc29-2aab-4fc5-9b52-cee22ee0f249", "bridge": "br-int", "label": "tempest-ImagesTestJSON-634733000-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "00d358bc61014b5cb3ddcdab7785e7e8", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "20e3f794-c7a3-4696-9488-ecf34c570ef9", "external-id": "nsx-vlan-transportzone-509", "segmentation_id": 509, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap08ff7842-78", "ovs_interfaceid": "08ff7842-7887-466d-a734-616b3ea8bfcd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1904.678347] env[67169]: DEBUG 
oslo_concurrency.lockutils [None req-0dac4879-9808-4647-9123-8b378a5e4530 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Releasing lock "refresh_cache-68b94a43-eaa5-4023-8bf5-8cc647c2f098" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1904.678628] env[67169]: DEBUG nova.compute.manager [None req-0dac4879-9808-4647-9123-8b378a5e4530 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 68b94a43-eaa5-4023-8bf5-8cc647c2f098] Instance network_info: |[{"id": "08ff7842-7887-466d-a734-616b3ea8bfcd", "address": "fa:16:3e:a3:e5:84", "network": {"id": "ee7bdc29-2aab-4fc5-9b52-cee22ee0f249", "bridge": "br-int", "label": "tempest-ImagesTestJSON-634733000-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "00d358bc61014b5cb3ddcdab7785e7e8", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "20e3f794-c7a3-4696-9488-ecf34c570ef9", "external-id": "nsx-vlan-transportzone-509", "segmentation_id": 509, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap08ff7842-78", "ovs_interfaceid": "08ff7842-7887-466d-a734-616b3ea8bfcd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67169) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1904.678916] env[67169]: DEBUG oslo_concurrency.lockutils [req-45d3766b-3a0c-4658-b158-7da7d1272a84 req-6a17aada-a38a-45d9-8ef9-b8eb36dc2f14 service 
nova] Acquired lock "refresh_cache-68b94a43-eaa5-4023-8bf5-8cc647c2f098" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1904.679103] env[67169]: DEBUG nova.network.neutron [req-45d3766b-3a0c-4658-b158-7da7d1272a84 req-6a17aada-a38a-45d9-8ef9-b8eb36dc2f14 service nova] [instance: 68b94a43-eaa5-4023-8bf5-8cc647c2f098] Refreshing network info cache for port 08ff7842-7887-466d-a734-616b3ea8bfcd {{(pid=67169) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1904.680114] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-0dac4879-9808-4647-9123-8b378a5e4530 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 68b94a43-eaa5-4023-8bf5-8cc647c2f098] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:a3:e5:84', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '20e3f794-c7a3-4696-9488-ecf34c570ef9', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '08ff7842-7887-466d-a734-616b3ea8bfcd', 'vif_model': 'vmxnet3'}] {{(pid=67169) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1904.687581] env[67169]: DEBUG oslo.service.loopingcall [None req-0dac4879-9808-4647-9123-8b378a5e4530 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=67169) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1904.690243] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 68b94a43-eaa5-4023-8bf5-8cc647c2f098] Creating VM on the ESX host {{(pid=67169) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1904.690679] env[67169]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-51aa4468-801d-4d39-b5d4-6c54044d2915 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1904.714646] env[67169]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1904.714646] env[67169]: value = "task-2819250" [ 1904.714646] env[67169]: _type = "Task" [ 1904.714646] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1904.723348] env[67169]: DEBUG oslo_vmware.api [-] Task: {'id': task-2819250, 'name': CreateVM_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1904.934592] env[67169]: DEBUG nova.network.neutron [req-45d3766b-3a0c-4658-b158-7da7d1272a84 req-6a17aada-a38a-45d9-8ef9-b8eb36dc2f14 service nova] [instance: 68b94a43-eaa5-4023-8bf5-8cc647c2f098] Updated VIF entry in instance network info cache for port 08ff7842-7887-466d-a734-616b3ea8bfcd. 
{{(pid=67169) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1904.934970] env[67169]: DEBUG nova.network.neutron [req-45d3766b-3a0c-4658-b158-7da7d1272a84 req-6a17aada-a38a-45d9-8ef9-b8eb36dc2f14 service nova] [instance: 68b94a43-eaa5-4023-8bf5-8cc647c2f098] Updating instance_info_cache with network_info: [{"id": "08ff7842-7887-466d-a734-616b3ea8bfcd", "address": "fa:16:3e:a3:e5:84", "network": {"id": "ee7bdc29-2aab-4fc5-9b52-cee22ee0f249", "bridge": "br-int", "label": "tempest-ImagesTestJSON-634733000-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "00d358bc61014b5cb3ddcdab7785e7e8", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "20e3f794-c7a3-4696-9488-ecf34c570ef9", "external-id": "nsx-vlan-transportzone-509", "segmentation_id": 509, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap08ff7842-78", "ovs_interfaceid": "08ff7842-7887-466d-a734-616b3ea8bfcd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1904.947512] env[67169]: DEBUG oslo_concurrency.lockutils [req-45d3766b-3a0c-4658-b158-7da7d1272a84 req-6a17aada-a38a-45d9-8ef9-b8eb36dc2f14 service nova] Releasing lock "refresh_cache-68b94a43-eaa5-4023-8bf5-8cc647c2f098" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1905.224703] env[67169]: DEBUG oslo_vmware.api [-] Task: {'id': 
task-2819250, 'name': CreateVM_Task, 'duration_secs': 0.287897} completed successfully. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1905.224903] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 68b94a43-eaa5-4023-8bf5-8cc647c2f098] Created VM on the ESX host {{(pid=67169) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1905.225575] env[67169]: DEBUG oslo_concurrency.lockutils [None req-0dac4879-9808-4647-9123-8b378a5e4530 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1905.225747] env[67169]: DEBUG oslo_concurrency.lockutils [None req-0dac4879-9808-4647-9123-8b378a5e4530 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Acquired lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1905.226135] env[67169]: DEBUG oslo_concurrency.lockutils [None req-0dac4879-9808-4647-9123-8b378a5e4530 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1905.226434] env[67169]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-06ba05a9-abbe-4dbf-b3d7-16d188ff2dea {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1905.230995] env[67169]: DEBUG oslo_vmware.api [None req-0dac4879-9808-4647-9123-8b378a5e4530 tempest-ImagesTestJSON-1590952821 
tempest-ImagesTestJSON-1590952821-project-member] Waiting for the task: (returnval){ [ 1905.230995] env[67169]: value = "session[52518c09-e5f2-035d-4dc1-3e29ddf94015]52f5cf0d-7cb6-03fd-657b-90559dad30f1" [ 1905.230995] env[67169]: _type = "Task" [ 1905.230995] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1905.238424] env[67169]: DEBUG oslo_vmware.api [None req-0dac4879-9808-4647-9123-8b378a5e4530 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Task: {'id': session[52518c09-e5f2-035d-4dc1-3e29ddf94015]52f5cf0d-7cb6-03fd-657b-90559dad30f1, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1905.741962] env[67169]: DEBUG oslo_concurrency.lockutils [None req-0dac4879-9808-4647-9123-8b378a5e4530 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Releasing lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1905.742276] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-0dac4879-9808-4647-9123-8b378a5e4530 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 68b94a43-eaa5-4023-8bf5-8cc647c2f098] Processing image 285931c9-8b83-4997-8c4d-6a79005e36ba {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1905.742458] env[67169]: DEBUG oslo_concurrency.lockutils [None req-0dac4879-9808-4647-9123-8b378a5e4530 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk" {{(pid=67169) lock 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1909.915109] env[67169]: DEBUG oslo_concurrency.lockutils [None req-570e5988-7411-48a8-8041-cfa1d5b638f2 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Acquiring lock "115d6c00-4259-4e87-aa00-90b576a63535" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1918.659260] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1921.659778] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1921.660307] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Starting heal instance info cache {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1921.660307] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Rebuilding the list of instances to heal {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1921.682565] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 2d7d3386-9854-4bf1-a680-5aed0a2329cb] Skipping network cache update for instance because it is Building. 
{{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1921.682755] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: fa24a4a8-895c-4ea6-8e0a-4ed1134beff0] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1921.682842] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 04d3ae51-f3f1-427b-ae45-279b02e4b3e6] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1921.682969] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: aedbfde6-26e1-410d-a311-e2c344f65062] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1921.683110] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: c05c3ec2-a68d-41b0-a199-fcfc84bb2deb] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1921.683241] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 9435574d-2128-4b20-ba92-ee2aba37d33b] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1921.683357] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 6663b166-0d24-45a7-8c2c-e4e68dbe0005] Skipping network cache update for instance because it is Building. 
{{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1921.683476] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 115d6c00-4259-4e87-aa00-90b576a63535] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1921.683593] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 220daf5b-b4fd-49b0-9098-c1f846d6e552] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1921.683708] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 68b94a43-eaa5-4023-8bf5-8cc647c2f098] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1921.683830] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Didn't find any instances for network info cache update. 
{{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1923.660425] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1924.659633] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1926.659236] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1926.659647] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=67169) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1927.654164] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1927.658809] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1928.659070] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1928.659463] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1928.671232] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1928.671447] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1928.671611] env[67169]: 
DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1928.671766] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67169) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1928.672904] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-44bab885-26eb-410b-b0bd-c117ac83a998 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1928.681698] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4045cae1-0b7d-49dd-a15b-5832bb8eafdc {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1928.696248] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4a864266-451a-4cc7-82d8-dd176a3aeb4e {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1928.702177] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-99509306-a21a-43de-b13b-8eac5e66ed82 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1928.730264] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181039MB free_disk=171GB free_vcpus=48 
pci_devices=None {{(pid=67169) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1928.730421] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1928.730601] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1928.801043] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 2d7d3386-9854-4bf1-a680-5aed0a2329cb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1928.801198] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance fa24a4a8-895c-4ea6-8e0a-4ed1134beff0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1928.801321] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 04d3ae51-f3f1-427b-ae45-279b02e4b3e6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1928.801441] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance aedbfde6-26e1-410d-a311-e2c344f65062 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1928.801557] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance c05c3ec2-a68d-41b0-a199-fcfc84bb2deb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1928.801671] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 9435574d-2128-4b20-ba92-ee2aba37d33b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1928.801784] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 6663b166-0d24-45a7-8c2c-e4e68dbe0005 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1928.801896] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 115d6c00-4259-4e87-aa00-90b576a63535 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1928.802011] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 220daf5b-b4fd-49b0-9098-c1f846d6e552 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1928.802132] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 68b94a43-eaa5-4023-8bf5-8cc647c2f098 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1928.802308] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67169) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1928.802444] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67169) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1928.909009] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-05036b43-5f1b-486e-843f-43a4e7db943a {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1928.916820] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9ae718ce-6b7f-4fa7-a3fa-93d417d59df8 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1928.947713] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5b5c6dd0-5748-4a28-8411-addf0352674b {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1928.954524] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-714b56e9-57a1-4f7d-a2d1-048376a206d9 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1928.966986] env[67169]: DEBUG nova.compute.provider_tree [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Inventory has 
not changed in ProviderTree for provider: 6570906a-ac37-4859-b1f2-4bbacc48d3f3 {{(pid=67169) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1928.975090] env[67169]: DEBUG nova.scheduler.client.report [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Inventory has not changed for provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 171, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67169) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1928.988626] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67169) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1928.988813] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.258s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1950.810677] env[67169]: WARNING oslo_vmware.rw_handles [None req-ae9f4c45-189f-4d85-9ca5-c9ae8ec39dac tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1950.810677] env[67169]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1950.810677] env[67169]: ERROR oslo_vmware.rw_handles File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1950.810677] env[67169]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1950.810677] env[67169]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1950.810677] env[67169]: ERROR oslo_vmware.rw_handles response.begin() [ 1950.810677] env[67169]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1950.810677] env[67169]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1950.810677] env[67169]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1950.810677] env[67169]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1950.810677] env[67169]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1950.810677] env[67169]: ERROR oslo_vmware.rw_handles [ 1950.811522] env[67169]: DEBUG nova.virt.vmwareapi.images [None req-ae9f4c45-189f-4d85-9ca5-c9ae8ec39dac tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 2d7d3386-9854-4bf1-a680-5aed0a2329cb] Downloaded image file data 285931c9-8b83-4997-8c4d-6a79005e36ba to vmware_temp/92d20412-ceb8-4607-b4ab-ecc57137ec20/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk on the data store datastore2 {{(pid=67169) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1950.813138] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-ae9f4c45-189f-4d85-9ca5-c9ae8ec39dac tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 2d7d3386-9854-4bf1-a680-5aed0a2329cb] Caching image {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1950.813408] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [None 
req-ae9f4c45-189f-4d85-9ca5-c9ae8ec39dac tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Copying Virtual Disk [datastore2] vmware_temp/92d20412-ceb8-4607-b4ab-ecc57137ec20/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk to [datastore2] vmware_temp/92d20412-ceb8-4607-b4ab-ecc57137ec20/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk {{(pid=67169) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1950.813700] env[67169]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-3f22a4b3-ecd4-45e4-b7d7-b2bb0dce7ea2 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1950.822339] env[67169]: DEBUG oslo_vmware.api [None req-ae9f4c45-189f-4d85-9ca5-c9ae8ec39dac tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Waiting for the task: (returnval){ [ 1950.822339] env[67169]: value = "task-2819251" [ 1950.822339] env[67169]: _type = "Task" [ 1950.822339] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1950.830248] env[67169]: DEBUG oslo_vmware.api [None req-ae9f4c45-189f-4d85-9ca5-c9ae8ec39dac tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Task: {'id': task-2819251, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1951.332456] env[67169]: DEBUG oslo_vmware.exceptions [None req-ae9f4c45-189f-4d85-9ca5-c9ae8ec39dac tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Fault InvalidArgument not matched. 
{{(pid=67169) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1951.332771] env[67169]: DEBUG oslo_concurrency.lockutils [None req-ae9f4c45-189f-4d85-9ca5-c9ae8ec39dac tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Releasing lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1951.333321] env[67169]: ERROR nova.compute.manager [None req-ae9f4c45-189f-4d85-9ca5-c9ae8ec39dac tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 2d7d3386-9854-4bf1-a680-5aed0a2329cb] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1951.333321] env[67169]: Faults: ['InvalidArgument'] [ 1951.333321] env[67169]: ERROR nova.compute.manager [instance: 2d7d3386-9854-4bf1-a680-5aed0a2329cb] Traceback (most recent call last): [ 1951.333321] env[67169]: ERROR nova.compute.manager [instance: 2d7d3386-9854-4bf1-a680-5aed0a2329cb] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1951.333321] env[67169]: ERROR nova.compute.manager [instance: 2d7d3386-9854-4bf1-a680-5aed0a2329cb] yield resources [ 1951.333321] env[67169]: ERROR nova.compute.manager [instance: 2d7d3386-9854-4bf1-a680-5aed0a2329cb] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1951.333321] env[67169]: ERROR nova.compute.manager [instance: 2d7d3386-9854-4bf1-a680-5aed0a2329cb] self.driver.spawn(context, instance, image_meta, [ 1951.333321] env[67169]: ERROR nova.compute.manager [instance: 2d7d3386-9854-4bf1-a680-5aed0a2329cb] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1951.333321] env[67169]: ERROR nova.compute.manager 
[instance: 2d7d3386-9854-4bf1-a680-5aed0a2329cb] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1951.333321] env[67169]: ERROR nova.compute.manager [instance: 2d7d3386-9854-4bf1-a680-5aed0a2329cb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1951.333321] env[67169]: ERROR nova.compute.manager [instance: 2d7d3386-9854-4bf1-a680-5aed0a2329cb] self._fetch_image_if_missing(context, vi) [ 1951.333321] env[67169]: ERROR nova.compute.manager [instance: 2d7d3386-9854-4bf1-a680-5aed0a2329cb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1951.333321] env[67169]: ERROR nova.compute.manager [instance: 2d7d3386-9854-4bf1-a680-5aed0a2329cb] image_cache(vi, tmp_image_ds_loc) [ 1951.333321] env[67169]: ERROR nova.compute.manager [instance: 2d7d3386-9854-4bf1-a680-5aed0a2329cb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1951.333321] env[67169]: ERROR nova.compute.manager [instance: 2d7d3386-9854-4bf1-a680-5aed0a2329cb] vm_util.copy_virtual_disk( [ 1951.333321] env[67169]: ERROR nova.compute.manager [instance: 2d7d3386-9854-4bf1-a680-5aed0a2329cb] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1951.333321] env[67169]: ERROR nova.compute.manager [instance: 2d7d3386-9854-4bf1-a680-5aed0a2329cb] session._wait_for_task(vmdk_copy_task) [ 1951.333321] env[67169]: ERROR nova.compute.manager [instance: 2d7d3386-9854-4bf1-a680-5aed0a2329cb] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1951.333321] env[67169]: ERROR nova.compute.manager [instance: 2d7d3386-9854-4bf1-a680-5aed0a2329cb] return self.wait_for_task(task_ref) [ 1951.333321] env[67169]: ERROR nova.compute.manager [instance: 2d7d3386-9854-4bf1-a680-5aed0a2329cb] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1951.333321] env[67169]: ERROR 
nova.compute.manager [instance: 2d7d3386-9854-4bf1-a680-5aed0a2329cb] return evt.wait()
[ 1951.333321] env[67169]: ERROR nova.compute.manager [instance: 2d7d3386-9854-4bf1-a680-5aed0a2329cb] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 1951.333321] env[67169]: ERROR nova.compute.manager [instance: 2d7d3386-9854-4bf1-a680-5aed0a2329cb] result = hub.switch()
[ 1951.333321] env[67169]: ERROR nova.compute.manager [instance: 2d7d3386-9854-4bf1-a680-5aed0a2329cb] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 1951.333321] env[67169]: ERROR nova.compute.manager [instance: 2d7d3386-9854-4bf1-a680-5aed0a2329cb] return self.greenlet.switch()
[ 1951.333321] env[67169]: ERROR nova.compute.manager [instance: 2d7d3386-9854-4bf1-a680-5aed0a2329cb] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 1951.333321] env[67169]: ERROR nova.compute.manager [instance: 2d7d3386-9854-4bf1-a680-5aed0a2329cb] self.f(*self.args, **self.kw)
[ 1951.333321] env[67169]: ERROR nova.compute.manager [instance: 2d7d3386-9854-4bf1-a680-5aed0a2329cb] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 1951.333321] env[67169]: ERROR nova.compute.manager [instance: 2d7d3386-9854-4bf1-a680-5aed0a2329cb] raise exceptions.translate_fault(task_info.error)
[ 1951.333321] env[67169]: ERROR nova.compute.manager [instance: 2d7d3386-9854-4bf1-a680-5aed0a2329cb] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1951.333321] env[67169]: ERROR nova.compute.manager [instance: 2d7d3386-9854-4bf1-a680-5aed0a2329cb] Faults: ['InvalidArgument']
[ 1951.333321] env[67169]: ERROR nova.compute.manager [instance: 2d7d3386-9854-4bf1-a680-5aed0a2329cb]
[ 1951.334627] env[67169]: INFO nova.compute.manager [None req-ae9f4c45-189f-4d85-9ca5-c9ae8ec39dac tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 2d7d3386-9854-4bf1-a680-5aed0a2329cb] Terminating instance
[ 1951.335201] env[67169]: DEBUG oslo_concurrency.lockutils [None req-8eb9a17a-f61b-4025-b929-3564b452eb9c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Acquired lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1951.335409] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-8eb9a17a-f61b-4025-b929-3564b452eb9c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 1951.335646] env[67169]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-296ea83a-6b64-439a-9bbd-ba2a0043412c {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1951.337818] env[67169]: DEBUG nova.compute.manager [None req-ae9f4c45-189f-4d85-9ca5-c9ae8ec39dac tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 2d7d3386-9854-4bf1-a680-5aed0a2329cb] Start destroying the instance on the hypervisor. {{(pid=67169) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}}
[ 1951.338028] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-ae9f4c45-189f-4d85-9ca5-c9ae8ec39dac tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 2d7d3386-9854-4bf1-a680-5aed0a2329cb] Destroying instance {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 1951.338781] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b9aa6104-b9fa-4dce-9d0f-68118f9b3787 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1951.345742] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-ae9f4c45-189f-4d85-9ca5-c9ae8ec39dac tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 2d7d3386-9854-4bf1-a680-5aed0a2329cb] Unregistering the VM {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}}
[ 1951.345949] env[67169]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-5f30a404-bb9e-4ddc-b4d9-7f551db14893 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1951.348167] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-8eb9a17a-f61b-4025-b929-3564b452eb9c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 1951.348365] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-8eb9a17a-f61b-4025-b929-3564b452eb9c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=67169) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}}
[ 1951.349300] env[67169]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-7a9b1cea-cefe-41a5-9dd7-5aee0cc484af {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1951.353930] env[67169]: DEBUG oslo_vmware.api [None req-8eb9a17a-f61b-4025-b929-3564b452eb9c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Waiting for the task: (returnval){
[ 1951.353930] env[67169]: value = "session[52518c09-e5f2-035d-4dc1-3e29ddf94015]525a211d-dc21-2e39-5b0e-98e7ea760ee9"
[ 1951.353930] env[67169]: _type = "Task"
[ 1951.353930] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1951.361238] env[67169]: DEBUG oslo_vmware.api [None req-8eb9a17a-f61b-4025-b929-3564b452eb9c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Task: {'id': session[52518c09-e5f2-035d-4dc1-3e29ddf94015]525a211d-dc21-2e39-5b0e-98e7ea760ee9, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1951.410298] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-ae9f4c45-189f-4d85-9ca5-c9ae8ec39dac tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 2d7d3386-9854-4bf1-a680-5aed0a2329cb] Unregistered the VM {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}}
[ 1951.410580] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-ae9f4c45-189f-4d85-9ca5-c9ae8ec39dac tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 2d7d3386-9854-4bf1-a680-5aed0a2329cb] Deleting contents of the VM from datastore datastore2 {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}}
[ 1951.410764] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-ae9f4c45-189f-4d85-9ca5-c9ae8ec39dac tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Deleting the datastore file [datastore2] 2d7d3386-9854-4bf1-a680-5aed0a2329cb {{(pid=67169) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}}
[ 1951.411049] env[67169]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-6b47cf92-a798-4272-a652-5fc841a3dcf5 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1951.417508] env[67169]: DEBUG oslo_vmware.api [None req-ae9f4c45-189f-4d85-9ca5-c9ae8ec39dac tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Waiting for the task: (returnval){
[ 1951.417508] env[67169]: value = "task-2819253"
[ 1951.417508] env[67169]: _type = "Task"
[ 1951.417508] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1951.425018] env[67169]: DEBUG oslo_vmware.api [None req-ae9f4c45-189f-4d85-9ca5-c9ae8ec39dac tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Task: {'id': task-2819253, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1951.863999] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-8eb9a17a-f61b-4025-b929-3564b452eb9c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: fa24a4a8-895c-4ea6-8e0a-4ed1134beff0] Preparing fetch location {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}}
[ 1951.864339] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-8eb9a17a-f61b-4025-b929-3564b452eb9c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Creating directory with path [datastore2] vmware_temp/c66dd816-4856-4c4a-b01f-92b1187fde34/285931c9-8b83-4997-8c4d-6a79005e36ba {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 1951.864501] env[67169]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-c8a5a050-5530-40ce-a9c1-a8ca3a2b033a {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1951.876023] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-8eb9a17a-f61b-4025-b929-3564b452eb9c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Created directory with path [datastore2] vmware_temp/c66dd816-4856-4c4a-b01f-92b1187fde34/285931c9-8b83-4997-8c4d-6a79005e36ba {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 1951.876023] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-8eb9a17a-f61b-4025-b929-3564b452eb9c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: fa24a4a8-895c-4ea6-8e0a-4ed1134beff0] Fetch image to [datastore2] vmware_temp/c66dd816-4856-4c4a-b01f-92b1187fde34/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}}
[ 1951.876023] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-8eb9a17a-f61b-4025-b929-3564b452eb9c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: fa24a4a8-895c-4ea6-8e0a-4ed1134beff0] Downloading image file data 285931c9-8b83-4997-8c4d-6a79005e36ba to [datastore2] vmware_temp/c66dd816-4856-4c4a-b01f-92b1187fde34/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk on the data store datastore2 {{(pid=67169) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}}
[ 1951.876756] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-705668d0-60d8-461a-bf93-150729c8e838 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1951.883077] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0455fe22-23c5-4dba-adcc-dbafa040e81b {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1951.892853] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a26473f7-9ba0-47b2-9012-c3cecf1659cb {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1951.924422] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e98c446d-2497-4889-b34e-7dd1b62bdaaf {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1951.930832] env[67169]: DEBUG oslo_vmware.api [None req-ae9f4c45-189f-4d85-9ca5-c9ae8ec39dac tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Task: {'id': task-2819253, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.069078} completed successfully. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 1951.932221] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-ae9f4c45-189f-4d85-9ca5-c9ae8ec39dac tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Deleted the datastore file {{(pid=67169) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}}
[ 1951.932407] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-ae9f4c45-189f-4d85-9ca5-c9ae8ec39dac tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 2d7d3386-9854-4bf1-a680-5aed0a2329cb] Deleted contents of the VM from datastore datastore2 {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}}
[ 1951.932581] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-ae9f4c45-189f-4d85-9ca5-c9ae8ec39dac tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 2d7d3386-9854-4bf1-a680-5aed0a2329cb] Instance destroyed {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 1951.932750] env[67169]: INFO nova.compute.manager [None req-ae9f4c45-189f-4d85-9ca5-c9ae8ec39dac tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 2d7d3386-9854-4bf1-a680-5aed0a2329cb] Took 0.59 seconds to destroy the instance on the hypervisor.
[ 1951.934473] env[67169]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-5328e830-907c-447e-b128-4d9a5c8205e7 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1951.936289] env[67169]: DEBUG nova.compute.claims [None req-ae9f4c45-189f-4d85-9ca5-c9ae8ec39dac tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 2d7d3386-9854-4bf1-a680-5aed0a2329cb] Aborting claim: {{(pid=67169) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 1951.936460] env[67169]: DEBUG oslo_concurrency.lockutils [None req-ae9f4c45-189f-4d85-9ca5-c9ae8ec39dac tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1951.936672] env[67169]: DEBUG oslo_concurrency.lockutils [None req-ae9f4c45-189f-4d85-9ca5-c9ae8ec39dac tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1951.957349] env[67169]: DEBUG nova.virt.vmwareapi.images [None req-8eb9a17a-f61b-4025-b929-3564b452eb9c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: fa24a4a8-895c-4ea6-8e0a-4ed1134beff0] Downloading image file data 285931c9-8b83-4997-8c4d-6a79005e36ba to the data store datastore2 {{(pid=67169) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}}
[ 1952.008218] env[67169]: DEBUG oslo_vmware.rw_handles [None req-8eb9a17a-f61b-4025-b929-3564b452eb9c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/c66dd816-4856-4c4a-b01f-92b1187fde34/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=67169) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}}
[ 1952.068669] env[67169]: DEBUG oslo_vmware.rw_handles [None req-8eb9a17a-f61b-4025-b929-3564b452eb9c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Completed reading data from the image iterator. {{(pid=67169) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}}
[ 1952.068856] env[67169]: DEBUG oslo_vmware.rw_handles [None req-8eb9a17a-f61b-4025-b929-3564b452eb9c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/c66dd816-4856-4c4a-b01f-92b1187fde34/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=67169) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}}
[ 1952.143856] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4f72cb03-2a00-4eee-a529-03d69b3bae19 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1952.151420] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6e367b81-607d-4e4d-872b-33bcc80f7667 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1952.182063] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-66b48183-3986-48fd-96f1-7ca82f99d79a {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1952.188860] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-783b4b54-0d29-444b-b76b-06aaa7fbaf0d {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1952.201568] env[67169]: DEBUG nova.compute.provider_tree [None req-ae9f4c45-189f-4d85-9ca5-c9ae8ec39dac tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Inventory has not changed in ProviderTree for provider: 6570906a-ac37-4859-b1f2-4bbacc48d3f3 {{(pid=67169) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1952.209303] env[67169]: DEBUG nova.scheduler.client.report [None req-ae9f4c45-189f-4d85-9ca5-c9ae8ec39dac tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Inventory has not changed for provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 171, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67169) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1952.223158] env[67169]: DEBUG oslo_concurrency.lockutils [None req-ae9f4c45-189f-4d85-9ca5-c9ae8ec39dac tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.286s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1952.223662] env[67169]: ERROR nova.compute.manager [None req-ae9f4c45-189f-4d85-9ca5-c9ae8ec39dac tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 2d7d3386-9854-4bf1-a680-5aed0a2329cb] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1952.223662] env[67169]: Faults: ['InvalidArgument']
[ 1952.223662] env[67169]: ERROR nova.compute.manager [instance: 2d7d3386-9854-4bf1-a680-5aed0a2329cb] Traceback (most recent call last):
[ 1952.223662] env[67169]: ERROR nova.compute.manager [instance: 2d7d3386-9854-4bf1-a680-5aed0a2329cb] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 1952.223662] env[67169]: ERROR nova.compute.manager [instance: 2d7d3386-9854-4bf1-a680-5aed0a2329cb] self.driver.spawn(context, instance, image_meta,
[ 1952.223662] env[67169]: ERROR nova.compute.manager [instance: 2d7d3386-9854-4bf1-a680-5aed0a2329cb] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 1952.223662] env[67169]: ERROR nova.compute.manager [instance: 2d7d3386-9854-4bf1-a680-5aed0a2329cb] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 1952.223662] env[67169]: ERROR nova.compute.manager [instance: 2d7d3386-9854-4bf1-a680-5aed0a2329cb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 1952.223662] env[67169]: ERROR nova.compute.manager [instance: 2d7d3386-9854-4bf1-a680-5aed0a2329cb] self._fetch_image_if_missing(context, vi)
[ 1952.223662] env[67169]: ERROR nova.compute.manager [instance: 2d7d3386-9854-4bf1-a680-5aed0a2329cb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 1952.223662] env[67169]: ERROR nova.compute.manager [instance: 2d7d3386-9854-4bf1-a680-5aed0a2329cb] image_cache(vi, tmp_image_ds_loc)
[ 1952.223662] env[67169]: ERROR nova.compute.manager [instance: 2d7d3386-9854-4bf1-a680-5aed0a2329cb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 1952.223662] env[67169]: ERROR nova.compute.manager [instance: 2d7d3386-9854-4bf1-a680-5aed0a2329cb] vm_util.copy_virtual_disk(
[ 1952.223662] env[67169]: ERROR nova.compute.manager [instance: 2d7d3386-9854-4bf1-a680-5aed0a2329cb] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 1952.223662] env[67169]: ERROR nova.compute.manager [instance: 2d7d3386-9854-4bf1-a680-5aed0a2329cb] session._wait_for_task(vmdk_copy_task)
[ 1952.223662] env[67169]: ERROR nova.compute.manager [instance: 2d7d3386-9854-4bf1-a680-5aed0a2329cb] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 1952.223662] env[67169]: ERROR nova.compute.manager [instance: 2d7d3386-9854-4bf1-a680-5aed0a2329cb] return self.wait_for_task(task_ref)
[ 1952.223662] env[67169]: ERROR nova.compute.manager [instance: 2d7d3386-9854-4bf1-a680-5aed0a2329cb] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 1952.223662] env[67169]: ERROR nova.compute.manager [instance: 2d7d3386-9854-4bf1-a680-5aed0a2329cb] return evt.wait()
[ 1952.223662] env[67169]: ERROR nova.compute.manager [instance: 2d7d3386-9854-4bf1-a680-5aed0a2329cb] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 1952.223662] env[67169]: ERROR nova.compute.manager [instance: 2d7d3386-9854-4bf1-a680-5aed0a2329cb] result = hub.switch()
[ 1952.223662] env[67169]: ERROR nova.compute.manager [instance: 2d7d3386-9854-4bf1-a680-5aed0a2329cb] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 1952.223662] env[67169]: ERROR nova.compute.manager [instance: 2d7d3386-9854-4bf1-a680-5aed0a2329cb] return self.greenlet.switch()
[ 1952.223662] env[67169]: ERROR nova.compute.manager [instance: 2d7d3386-9854-4bf1-a680-5aed0a2329cb] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 1952.223662] env[67169]: ERROR nova.compute.manager [instance: 2d7d3386-9854-4bf1-a680-5aed0a2329cb] self.f(*self.args, **self.kw)
[ 1952.223662] env[67169]: ERROR nova.compute.manager [instance: 2d7d3386-9854-4bf1-a680-5aed0a2329cb] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 1952.223662] env[67169]: ERROR nova.compute.manager [instance: 2d7d3386-9854-4bf1-a680-5aed0a2329cb] raise exceptions.translate_fault(task_info.error)
[ 1952.223662] env[67169]: ERROR nova.compute.manager [instance: 2d7d3386-9854-4bf1-a680-5aed0a2329cb] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1952.223662] env[67169]: ERROR nova.compute.manager [instance: 2d7d3386-9854-4bf1-a680-5aed0a2329cb] Faults: ['InvalidArgument']
[ 1952.223662] env[67169]: ERROR nova.compute.manager [instance: 2d7d3386-9854-4bf1-a680-5aed0a2329cb]
[ 1952.224724] env[67169]: DEBUG nova.compute.utils [None req-ae9f4c45-189f-4d85-9ca5-c9ae8ec39dac tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 2d7d3386-9854-4bf1-a680-5aed0a2329cb] VimFaultException {{(pid=67169) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 1952.225637] env[67169]: DEBUG nova.compute.manager [None req-ae9f4c45-189f-4d85-9ca5-c9ae8ec39dac tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 2d7d3386-9854-4bf1-a680-5aed0a2329cb] Build of instance 2d7d3386-9854-4bf1-a680-5aed0a2329cb was re-scheduled: A specified parameter was not correct: fileType
[ 1952.225637] env[67169]: Faults: ['InvalidArgument'] {{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}}
[ 1952.226017] env[67169]: DEBUG nova.compute.manager [None req-ae9f4c45-189f-4d85-9ca5-c9ae8ec39dac tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 2d7d3386-9854-4bf1-a680-5aed0a2329cb] Unplugging VIFs for instance {{(pid=67169) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}}
[ 1952.226196] env[67169]: DEBUG nova.compute.manager [None req-ae9f4c45-189f-4d85-9ca5-c9ae8ec39dac tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=67169) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}}
[ 1952.226377] env[67169]: DEBUG nova.compute.manager [None req-ae9f4c45-189f-4d85-9ca5-c9ae8ec39dac tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 2d7d3386-9854-4bf1-a680-5aed0a2329cb] Deallocating network for instance {{(pid=67169) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}}
[ 1952.226540] env[67169]: DEBUG nova.network.neutron [None req-ae9f4c45-189f-4d85-9ca5-c9ae8ec39dac tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 2d7d3386-9854-4bf1-a680-5aed0a2329cb] deallocate_for_instance() {{(pid=67169) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 1952.589603] env[67169]: DEBUG nova.network.neutron [None req-ae9f4c45-189f-4d85-9ca5-c9ae8ec39dac tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 2d7d3386-9854-4bf1-a680-5aed0a2329cb] Updating instance_info_cache with network_info: [] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1952.607427] env[67169]: INFO nova.compute.manager [None req-ae9f4c45-189f-4d85-9ca5-c9ae8ec39dac tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 2d7d3386-9854-4bf1-a680-5aed0a2329cb] Took 0.38 seconds to deallocate network for instance.
[ 1952.713855] env[67169]: INFO nova.scheduler.client.report [None req-ae9f4c45-189f-4d85-9ca5-c9ae8ec39dac tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Deleted allocations for instance 2d7d3386-9854-4bf1-a680-5aed0a2329cb
[ 1952.732790] env[67169]: DEBUG oslo_concurrency.lockutils [None req-ae9f4c45-189f-4d85-9ca5-c9ae8ec39dac tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Lock "2d7d3386-9854-4bf1-a680-5aed0a2329cb" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 631.144s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1952.733071] env[67169]: DEBUG oslo_concurrency.lockutils [None req-a69770ea-adae-497b-8b48-a68ce9122239 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Lock "2d7d3386-9854-4bf1-a680-5aed0a2329cb" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 435.302s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1952.733298] env[67169]: DEBUG oslo_concurrency.lockutils [None req-a69770ea-adae-497b-8b48-a68ce9122239 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Acquiring lock "2d7d3386-9854-4bf1-a680-5aed0a2329cb-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1952.733504] env[67169]: DEBUG oslo_concurrency.lockutils [None req-a69770ea-adae-497b-8b48-a68ce9122239 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Lock "2d7d3386-9854-4bf1-a680-5aed0a2329cb-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1952.733674] env[67169]: DEBUG oslo_concurrency.lockutils [None req-a69770ea-adae-497b-8b48-a68ce9122239 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Lock "2d7d3386-9854-4bf1-a680-5aed0a2329cb-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1952.735958] env[67169]: INFO nova.compute.manager [None req-a69770ea-adae-497b-8b48-a68ce9122239 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 2d7d3386-9854-4bf1-a680-5aed0a2329cb] Terminating instance
[ 1952.737702] env[67169]: DEBUG nova.compute.manager [None req-a69770ea-adae-497b-8b48-a68ce9122239 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 2d7d3386-9854-4bf1-a680-5aed0a2329cb] Start destroying the instance on the hypervisor. {{(pid=67169) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}}
[ 1952.737940] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-a69770ea-adae-497b-8b48-a68ce9122239 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 2d7d3386-9854-4bf1-a680-5aed0a2329cb] Destroying instance {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 1952.738492] env[67169]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-efb4bdb8-ff57-4ff7-9778-ce9bcf8176cb {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1952.748427] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eef25a08-530f-48eb-974a-f679c4cfd244 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1952.778270] env[67169]: WARNING nova.virt.vmwareapi.vmops [None req-a69770ea-adae-497b-8b48-a68ce9122239 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 2d7d3386-9854-4bf1-a680-5aed0a2329cb] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 2d7d3386-9854-4bf1-a680-5aed0a2329cb could not be found.
[ 1952.778506] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-a69770ea-adae-497b-8b48-a68ce9122239 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 2d7d3386-9854-4bf1-a680-5aed0a2329cb] Instance destroyed {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 1952.778688] env[67169]: INFO nova.compute.manager [None req-a69770ea-adae-497b-8b48-a68ce9122239 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 2d7d3386-9854-4bf1-a680-5aed0a2329cb] Took 0.04 seconds to destroy the instance on the hypervisor.
[ 1952.778947] env[67169]: DEBUG oslo.service.loopingcall [None req-a69770ea-adae-497b-8b48-a68ce9122239 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=67169) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 1952.779202] env[67169]: DEBUG nova.compute.manager [-] [instance: 2d7d3386-9854-4bf1-a680-5aed0a2329cb] Deallocating network for instance {{(pid=67169) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}}
[ 1952.779294] env[67169]: DEBUG nova.network.neutron [-] [instance: 2d7d3386-9854-4bf1-a680-5aed0a2329cb] deallocate_for_instance() {{(pid=67169) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 1952.812584] env[67169]: DEBUG nova.network.neutron [-] [instance: 2d7d3386-9854-4bf1-a680-5aed0a2329cb] Updating instance_info_cache with network_info: [] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1952.824140] env[67169]: INFO nova.compute.manager [-] [instance: 2d7d3386-9854-4bf1-a680-5aed0a2329cb] Took 0.04 seconds to deallocate network for instance.
[ 1952.909059] env[67169]: DEBUG oslo_concurrency.lockutils [None req-a69770ea-adae-497b-8b48-a68ce9122239 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Lock "2d7d3386-9854-4bf1-a680-5aed0a2329cb" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.176s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1952.910030] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "2d7d3386-9854-4bf1-a680-5aed0a2329cb" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 149.224s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1952.910030] env[67169]: INFO nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 2d7d3386-9854-4bf1-a680-5aed0a2329cb] During sync_power_state the instance has a pending task (deleting). Skip.
[ 1952.910229] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "2d7d3386-9854-4bf1-a680-5aed0a2329cb" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1965.209160] env[67169]: DEBUG oslo_concurrency.lockutils [None req-26bdb564-efe7-44b4-9a28-af7e1fb00cfa tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Acquiring lock "220daf5b-b4fd-49b0-9098-c1f846d6e552" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1972.403952] env[67169]: DEBUG oslo_concurrency.lockutils [None req-a6acb6ba-6d57-4668-899b-80380a9809d2 tempest-ServerActionsTestOtherB-1774783308 tempest-ServerActionsTestOtherB-1774783308-project-member] Acquiring lock "1ece950d-8b7f-4462-8138-10cbf43149ee" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1972.403952] env[67169]: DEBUG oslo_concurrency.lockutils [None req-a6acb6ba-6d57-4668-899b-80380a9809d2 tempest-ServerActionsTestOtherB-1774783308 tempest-ServerActionsTestOtherB-1774783308-project-member] Lock "1ece950d-8b7f-4462-8138-10cbf43149ee" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1972.417835] env[67169]: DEBUG nova.compute.manager [None req-a6acb6ba-6d57-4668-899b-80380a9809d2 tempest-ServerActionsTestOtherB-1774783308 tempest-ServerActionsTestOtherB-1774783308-project-member] [instance: 1ece950d-8b7f-4462-8138-10cbf43149ee] Starting instance... {{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}}
[ 1972.473979] env[67169]: DEBUG oslo_concurrency.lockutils [None req-a6acb6ba-6d57-4668-899b-80380a9809d2 tempest-ServerActionsTestOtherB-1774783308 tempest-ServerActionsTestOtherB-1774783308-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1972.474290] env[67169]: DEBUG oslo_concurrency.lockutils [None req-a6acb6ba-6d57-4668-899b-80380a9809d2 tempest-ServerActionsTestOtherB-1774783308 tempest-ServerActionsTestOtherB-1774783308-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1972.475818] env[67169]: INFO nova.compute.claims [None req-a6acb6ba-6d57-4668-899b-80380a9809d2 tempest-ServerActionsTestOtherB-1774783308 tempest-ServerActionsTestOtherB-1774783308-project-member] [instance: 1ece950d-8b7f-4462-8138-10cbf43149ee] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 1972.669644] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8d30d2e5-5e32-4af1-ae31-efc8c4f181c5 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1972.677770] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-20271a4a-052f-41e0-a370-575b3ca535ef {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1972.708346] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with
opID=oslo.vmware-60db4a67-6d5f-4b1e-9c3f-693f8e612db5 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1972.715009] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-701a5d5f-3e3c-4c4e-91cd-4b1072bb85db {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1972.727711] env[67169]: DEBUG nova.compute.provider_tree [None req-a6acb6ba-6d57-4668-899b-80380a9809d2 tempest-ServerActionsTestOtherB-1774783308 tempest-ServerActionsTestOtherB-1774783308-project-member] Inventory has not changed in ProviderTree for provider: 6570906a-ac37-4859-b1f2-4bbacc48d3f3 {{(pid=67169) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1972.737068] env[67169]: DEBUG nova.scheduler.client.report [None req-a6acb6ba-6d57-4668-899b-80380a9809d2 tempest-ServerActionsTestOtherB-1774783308 tempest-ServerActionsTestOtherB-1774783308-project-member] Inventory has not changed for provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 171, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67169) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1972.750330] env[67169]: DEBUG oslo_concurrency.lockutils [None req-a6acb6ba-6d57-4668-899b-80380a9809d2 tempest-ServerActionsTestOtherB-1774783308 tempest-ServerActionsTestOtherB-1774783308-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.276s {{(pid=67169) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1972.750817] env[67169]: DEBUG nova.compute.manager [None req-a6acb6ba-6d57-4668-899b-80380a9809d2 tempest-ServerActionsTestOtherB-1774783308 tempest-ServerActionsTestOtherB-1774783308-project-member] [instance: 1ece950d-8b7f-4462-8138-10cbf43149ee] Start building networks asynchronously for instance. {{(pid=67169) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1972.781840] env[67169]: DEBUG nova.compute.utils [None req-a6acb6ba-6d57-4668-899b-80380a9809d2 tempest-ServerActionsTestOtherB-1774783308 tempest-ServerActionsTestOtherB-1774783308-project-member] Using /dev/sd instead of None {{(pid=67169) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1972.783175] env[67169]: DEBUG nova.compute.manager [None req-a6acb6ba-6d57-4668-899b-80380a9809d2 tempest-ServerActionsTestOtherB-1774783308 tempest-ServerActionsTestOtherB-1774783308-project-member] [instance: 1ece950d-8b7f-4462-8138-10cbf43149ee] Allocating IP information in the background. {{(pid=67169) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1972.783343] env[67169]: DEBUG nova.network.neutron [None req-a6acb6ba-6d57-4668-899b-80380a9809d2 tempest-ServerActionsTestOtherB-1774783308 tempest-ServerActionsTestOtherB-1774783308-project-member] [instance: 1ece950d-8b7f-4462-8138-10cbf43149ee] allocate_for_instance() {{(pid=67169) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1972.791432] env[67169]: DEBUG nova.compute.manager [None req-a6acb6ba-6d57-4668-899b-80380a9809d2 tempest-ServerActionsTestOtherB-1774783308 tempest-ServerActionsTestOtherB-1774783308-project-member] [instance: 1ece950d-8b7f-4462-8138-10cbf43149ee] Start building block device mappings for instance. 
{{(pid=67169) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1972.836412] env[67169]: DEBUG nova.policy [None req-a6acb6ba-6d57-4668-899b-80380a9809d2 tempest-ServerActionsTestOtherB-1774783308 tempest-ServerActionsTestOtherB-1774783308-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'fba15eb01b7040568859ef174e55cecd', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'f9bee6443cfb4d0da430cf995b9d9968', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67169) authorize /opt/stack/nova/nova/policy.py:203}} [ 1972.852436] env[67169]: DEBUG nova.compute.manager [None req-a6acb6ba-6d57-4668-899b-80380a9809d2 tempest-ServerActionsTestOtherB-1774783308 tempest-ServerActionsTestOtherB-1774783308-project-member] [instance: 1ece950d-8b7f-4462-8138-10cbf43149ee] Start spawning the instance on the hypervisor. 
{{(pid=67169) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1972.878715] env[67169]: DEBUG nova.virt.hardware [None req-a6acb6ba-6d57-4668-899b-80380a9809d2 tempest-ServerActionsTestOtherB-1774783308 tempest-ServerActionsTestOtherB-1774783308-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-03-06T19:55:10Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-03-06T19:54:55Z,direct_url=,disk_format='vmdk',id=285931c9-8b83-4997-8c4d-6a79005e36ba,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c31f6504bb73492890b262ff43fdf9bc',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-03-06T19:54:55Z,virtual_size=,visibility=), allow threads: False {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1972.878965] env[67169]: DEBUG nova.virt.hardware [None req-a6acb6ba-6d57-4668-899b-80380a9809d2 tempest-ServerActionsTestOtherB-1774783308 tempest-ServerActionsTestOtherB-1774783308-project-member] Flavor limits 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1972.879139] env[67169]: DEBUG nova.virt.hardware [None req-a6acb6ba-6d57-4668-899b-80380a9809d2 tempest-ServerActionsTestOtherB-1774783308 tempest-ServerActionsTestOtherB-1774783308-project-member] Image limits 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1972.879323] env[67169]: DEBUG nova.virt.hardware [None req-a6acb6ba-6d57-4668-899b-80380a9809d2 tempest-ServerActionsTestOtherB-1774783308 tempest-ServerActionsTestOtherB-1774783308-project-member] Flavor 
pref 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1972.879473] env[67169]: DEBUG nova.virt.hardware [None req-a6acb6ba-6d57-4668-899b-80380a9809d2 tempest-ServerActionsTestOtherB-1774783308 tempest-ServerActionsTestOtherB-1774783308-project-member] Image pref 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1972.879656] env[67169]: DEBUG nova.virt.hardware [None req-a6acb6ba-6d57-4668-899b-80380a9809d2 tempest-ServerActionsTestOtherB-1774783308 tempest-ServerActionsTestOtherB-1774783308-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1972.879878] env[67169]: DEBUG nova.virt.hardware [None req-a6acb6ba-6d57-4668-899b-80380a9809d2 tempest-ServerActionsTestOtherB-1774783308 tempest-ServerActionsTestOtherB-1774783308-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1972.880048] env[67169]: DEBUG nova.virt.hardware [None req-a6acb6ba-6d57-4668-899b-80380a9809d2 tempest-ServerActionsTestOtherB-1774783308 tempest-ServerActionsTestOtherB-1774783308-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67169) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1972.880218] env[67169]: DEBUG nova.virt.hardware [None req-a6acb6ba-6d57-4668-899b-80380a9809d2 tempest-ServerActionsTestOtherB-1774783308 tempest-ServerActionsTestOtherB-1774783308-project-member] Got 1 possible topologies {{(pid=67169) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1972.880379] env[67169]: DEBUG nova.virt.hardware [None req-a6acb6ba-6d57-4668-899b-80380a9809d2 tempest-ServerActionsTestOtherB-1774783308 
tempest-ServerActionsTestOtherB-1774783308-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1972.880595] env[67169]: DEBUG nova.virt.hardware [None req-a6acb6ba-6d57-4668-899b-80380a9809d2 tempest-ServerActionsTestOtherB-1774783308 tempest-ServerActionsTestOtherB-1774783308-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1972.881486] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-79c2c0de-05d0-49c9-bca3-c62e3b00edc3 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1972.889463] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a1949647-4fe0-45d5-98e5-766d439207d5 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1973.156176] env[67169]: DEBUG nova.network.neutron [None req-a6acb6ba-6d57-4668-899b-80380a9809d2 tempest-ServerActionsTestOtherB-1774783308 tempest-ServerActionsTestOtherB-1774783308-project-member] [instance: 1ece950d-8b7f-4462-8138-10cbf43149ee] Successfully created port: a5800835-6a64-4d11-8c46-bc1f47539310 {{(pid=67169) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1973.840139] env[67169]: DEBUG nova.compute.manager [req-aa57a529-7e91-4a3b-b35d-0d3b677b4ff9 req-de13dffb-d757-4cad-b13c-4d605253ab29 service nova] [instance: 1ece950d-8b7f-4462-8138-10cbf43149ee] Received event network-vif-plugged-a5800835-6a64-4d11-8c46-bc1f47539310 {{(pid=67169) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1973.840397] env[67169]: DEBUG oslo_concurrency.lockutils [req-aa57a529-7e91-4a3b-b35d-0d3b677b4ff9 
req-de13dffb-d757-4cad-b13c-4d605253ab29 service nova] Acquiring lock "1ece950d-8b7f-4462-8138-10cbf43149ee-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1973.840696] env[67169]: DEBUG oslo_concurrency.lockutils [req-aa57a529-7e91-4a3b-b35d-0d3b677b4ff9 req-de13dffb-d757-4cad-b13c-4d605253ab29 service nova] Lock "1ece950d-8b7f-4462-8138-10cbf43149ee-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1973.840898] env[67169]: DEBUG oslo_concurrency.lockutils [req-aa57a529-7e91-4a3b-b35d-0d3b677b4ff9 req-de13dffb-d757-4cad-b13c-4d605253ab29 service nova] Lock "1ece950d-8b7f-4462-8138-10cbf43149ee-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1973.841304] env[67169]: DEBUG nova.compute.manager [req-aa57a529-7e91-4a3b-b35d-0d3b677b4ff9 req-de13dffb-d757-4cad-b13c-4d605253ab29 service nova] [instance: 1ece950d-8b7f-4462-8138-10cbf43149ee] No waiting events found dispatching network-vif-plugged-a5800835-6a64-4d11-8c46-bc1f47539310 {{(pid=67169) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1973.841558] env[67169]: WARNING nova.compute.manager [req-aa57a529-7e91-4a3b-b35d-0d3b677b4ff9 req-de13dffb-d757-4cad-b13c-4d605253ab29 service nova] [instance: 1ece950d-8b7f-4462-8138-10cbf43149ee] Received unexpected event network-vif-plugged-a5800835-6a64-4d11-8c46-bc1f47539310 for instance with vm_state building and task_state spawning. 
[ 1973.926713] env[67169]: DEBUG nova.network.neutron [None req-a6acb6ba-6d57-4668-899b-80380a9809d2 tempest-ServerActionsTestOtherB-1774783308 tempest-ServerActionsTestOtherB-1774783308-project-member] [instance: 1ece950d-8b7f-4462-8138-10cbf43149ee] Successfully updated port: a5800835-6a64-4d11-8c46-bc1f47539310 {{(pid=67169) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1973.939334] env[67169]: DEBUG oslo_concurrency.lockutils [None req-a6acb6ba-6d57-4668-899b-80380a9809d2 tempest-ServerActionsTestOtherB-1774783308 tempest-ServerActionsTestOtherB-1774783308-project-member] Acquiring lock "refresh_cache-1ece950d-8b7f-4462-8138-10cbf43149ee" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1973.939510] env[67169]: DEBUG oslo_concurrency.lockutils [None req-a6acb6ba-6d57-4668-899b-80380a9809d2 tempest-ServerActionsTestOtherB-1774783308 tempest-ServerActionsTestOtherB-1774783308-project-member] Acquired lock "refresh_cache-1ece950d-8b7f-4462-8138-10cbf43149ee" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1973.939692] env[67169]: DEBUG nova.network.neutron [None req-a6acb6ba-6d57-4668-899b-80380a9809d2 tempest-ServerActionsTestOtherB-1774783308 tempest-ServerActionsTestOtherB-1774783308-project-member] [instance: 1ece950d-8b7f-4462-8138-10cbf43149ee] Building network info cache for instance {{(pid=67169) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1973.980269] env[67169]: DEBUG nova.network.neutron [None req-a6acb6ba-6d57-4668-899b-80380a9809d2 tempest-ServerActionsTestOtherB-1774783308 tempest-ServerActionsTestOtherB-1774783308-project-member] [instance: 1ece950d-8b7f-4462-8138-10cbf43149ee] Instance cache missing network info. 
{{(pid=67169) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1974.320917] env[67169]: DEBUG nova.network.neutron [None req-a6acb6ba-6d57-4668-899b-80380a9809d2 tempest-ServerActionsTestOtherB-1774783308 tempest-ServerActionsTestOtherB-1774783308-project-member] [instance: 1ece950d-8b7f-4462-8138-10cbf43149ee] Updating instance_info_cache with network_info: [{"id": "a5800835-6a64-4d11-8c46-bc1f47539310", "address": "fa:16:3e:bf:fe:74", "network": {"id": "170d734e-6b36-4d38-b22b-3ba1f598f8e2", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1449120054-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "f9bee6443cfb4d0da430cf995b9d9968", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "e59b364d-b7f6-499d-b7dc-82b8a819aa12", "external-id": "nsx-vlan-transportzone-731", "segmentation_id": 731, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa5800835-6a", "ovs_interfaceid": "a5800835-6a64-4d11-8c46-bc1f47539310", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1974.334133] env[67169]: DEBUG oslo_concurrency.lockutils [None req-a6acb6ba-6d57-4668-899b-80380a9809d2 tempest-ServerActionsTestOtherB-1774783308 tempest-ServerActionsTestOtherB-1774783308-project-member] Releasing lock "refresh_cache-1ece950d-8b7f-4462-8138-10cbf43149ee" {{(pid=67169) lock 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1974.334438] env[67169]: DEBUG nova.compute.manager [None req-a6acb6ba-6d57-4668-899b-80380a9809d2 tempest-ServerActionsTestOtherB-1774783308 tempest-ServerActionsTestOtherB-1774783308-project-member] [instance: 1ece950d-8b7f-4462-8138-10cbf43149ee] Instance network_info: |[{"id": "a5800835-6a64-4d11-8c46-bc1f47539310", "address": "fa:16:3e:bf:fe:74", "network": {"id": "170d734e-6b36-4d38-b22b-3ba1f598f8e2", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1449120054-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "f9bee6443cfb4d0da430cf995b9d9968", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "e59b364d-b7f6-499d-b7dc-82b8a819aa12", "external-id": "nsx-vlan-transportzone-731", "segmentation_id": 731, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa5800835-6a", "ovs_interfaceid": "a5800835-6a64-4d11-8c46-bc1f47539310", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67169) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1974.334850] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-a6acb6ba-6d57-4668-899b-80380a9809d2 tempest-ServerActionsTestOtherB-1774783308 tempest-ServerActionsTestOtherB-1774783308-project-member] [instance: 1ece950d-8b7f-4462-8138-10cbf43149ee] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:bf:fe:74', 'network_ref': {'type': 
'OpaqueNetwork', 'network-id': 'e59b364d-b7f6-499d-b7dc-82b8a819aa12', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'a5800835-6a64-4d11-8c46-bc1f47539310', 'vif_model': 'vmxnet3'}] {{(pid=67169) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1974.342485] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [None req-a6acb6ba-6d57-4668-899b-80380a9809d2 tempest-ServerActionsTestOtherB-1774783308 tempest-ServerActionsTestOtherB-1774783308-project-member] Creating folder: Project (f9bee6443cfb4d0da430cf995b9d9968). Parent ref: group-v566843. {{(pid=67169) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1974.343020] env[67169]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-ee95479c-7521-4b24-b8d4-fe4bfb64c511 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1974.354411] env[67169]: INFO nova.virt.vmwareapi.vm_util [None req-a6acb6ba-6d57-4668-899b-80380a9809d2 tempest-ServerActionsTestOtherB-1774783308 tempest-ServerActionsTestOtherB-1774783308-project-member] Created folder: Project (f9bee6443cfb4d0da430cf995b9d9968) in parent group-v566843. [ 1974.354589] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [None req-a6acb6ba-6d57-4668-899b-80380a9809d2 tempest-ServerActionsTestOtherB-1774783308 tempest-ServerActionsTestOtherB-1774783308-project-member] Creating folder: Instances. Parent ref: group-v566948. 
{{(pid=67169) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1974.354876] env[67169]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-1f8434d9-d4e1-4139-8781-4f8402ee5b04 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1974.363947] env[67169]: INFO nova.virt.vmwareapi.vm_util [None req-a6acb6ba-6d57-4668-899b-80380a9809d2 tempest-ServerActionsTestOtherB-1774783308 tempest-ServerActionsTestOtherB-1774783308-project-member] Created folder: Instances in parent group-v566948. [ 1974.364167] env[67169]: DEBUG oslo.service.loopingcall [None req-a6acb6ba-6d57-4668-899b-80380a9809d2 tempest-ServerActionsTestOtherB-1774783308 tempest-ServerActionsTestOtherB-1774783308-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67169) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1974.364346] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 1ece950d-8b7f-4462-8138-10cbf43149ee] Creating VM on the ESX host {{(pid=67169) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1974.364534] env[67169]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-8b1237cf-a03a-41d2-84a5-8b2558fa3b8f {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1974.384752] env[67169]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1974.384752] env[67169]: value = "task-2819256" [ 1974.384752] env[67169]: _type = "Task" [ 1974.384752] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1974.392246] env[67169]: DEBUG oslo_vmware.api [-] Task: {'id': task-2819256, 'name': CreateVM_Task} progress is 0%. 
{{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1974.894525] env[67169]: DEBUG oslo_vmware.api [-] Task: {'id': task-2819256, 'name': CreateVM_Task, 'duration_secs': 0.271229} completed successfully. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1974.894817] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 1ece950d-8b7f-4462-8138-10cbf43149ee] Created VM on the ESX host {{(pid=67169) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1974.895365] env[67169]: DEBUG oslo_concurrency.lockutils [None req-a6acb6ba-6d57-4668-899b-80380a9809d2 tempest-ServerActionsTestOtherB-1774783308 tempest-ServerActionsTestOtherB-1774783308-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1974.895538] env[67169]: DEBUG oslo_concurrency.lockutils [None req-a6acb6ba-6d57-4668-899b-80380a9809d2 tempest-ServerActionsTestOtherB-1774783308 tempest-ServerActionsTestOtherB-1774783308-project-member] Acquired lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1974.895869] env[67169]: DEBUG oslo_concurrency.lockutils [None req-a6acb6ba-6d57-4668-899b-80380a9809d2 tempest-ServerActionsTestOtherB-1774783308 tempest-ServerActionsTestOtherB-1774783308-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1974.896136] env[67169]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-b4ef7862-20ac-4228-bbac-d48c3e07ffff {{(pid=67169) 
request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1974.900488] env[67169]: DEBUG oslo_vmware.api [None req-a6acb6ba-6d57-4668-899b-80380a9809d2 tempest-ServerActionsTestOtherB-1774783308 tempest-ServerActionsTestOtherB-1774783308-project-member] Waiting for the task: (returnval){ [ 1974.900488] env[67169]: value = "session[52518c09-e5f2-035d-4dc1-3e29ddf94015]5276ac65-7538-3556-0fbe-0820cc3f71dd" [ 1974.900488] env[67169]: _type = "Task" [ 1974.900488] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1974.907960] env[67169]: DEBUG oslo_vmware.api [None req-a6acb6ba-6d57-4668-899b-80380a9809d2 tempest-ServerActionsTestOtherB-1774783308 tempest-ServerActionsTestOtherB-1774783308-project-member] Task: {'id': session[52518c09-e5f2-035d-4dc1-3e29ddf94015]5276ac65-7538-3556-0fbe-0820cc3f71dd, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1975.411871] env[67169]: DEBUG oslo_concurrency.lockutils [None req-a6acb6ba-6d57-4668-899b-80380a9809d2 tempest-ServerActionsTestOtherB-1774783308 tempest-ServerActionsTestOtherB-1774783308-project-member] Releasing lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1975.412181] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-a6acb6ba-6d57-4668-899b-80380a9809d2 tempest-ServerActionsTestOtherB-1774783308 tempest-ServerActionsTestOtherB-1774783308-project-member] [instance: 1ece950d-8b7f-4462-8138-10cbf43149ee] Processing image 285931c9-8b83-4997-8c4d-6a79005e36ba {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1975.412416] env[67169]: DEBUG oslo_concurrency.lockutils [None req-a6acb6ba-6d57-4668-899b-80380a9809d2 
tempest-ServerActionsTestOtherB-1774783308 tempest-ServerActionsTestOtherB-1774783308-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1975.949153] env[67169]: DEBUG nova.compute.manager [req-f668bded-90db-4bd4-b296-67d2971b1623 req-2bc434c1-f9b1-4495-a334-62f51c149f07 service nova] [instance: 1ece950d-8b7f-4462-8138-10cbf43149ee] Received event network-changed-a5800835-6a64-4d11-8c46-bc1f47539310 {{(pid=67169) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1975.949400] env[67169]: DEBUG nova.compute.manager [req-f668bded-90db-4bd4-b296-67d2971b1623 req-2bc434c1-f9b1-4495-a334-62f51c149f07 service nova] [instance: 1ece950d-8b7f-4462-8138-10cbf43149ee] Refreshing instance network info cache due to event network-changed-a5800835-6a64-4d11-8c46-bc1f47539310. 
{{(pid=67169) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1975.949516] env[67169]: DEBUG oslo_concurrency.lockutils [req-f668bded-90db-4bd4-b296-67d2971b1623 req-2bc434c1-f9b1-4495-a334-62f51c149f07 service nova] Acquiring lock "refresh_cache-1ece950d-8b7f-4462-8138-10cbf43149ee" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1975.949648] env[67169]: DEBUG oslo_concurrency.lockutils [req-f668bded-90db-4bd4-b296-67d2971b1623 req-2bc434c1-f9b1-4495-a334-62f51c149f07 service nova] Acquired lock "refresh_cache-1ece950d-8b7f-4462-8138-10cbf43149ee" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1975.949811] env[67169]: DEBUG nova.network.neutron [req-f668bded-90db-4bd4-b296-67d2971b1623 req-2bc434c1-f9b1-4495-a334-62f51c149f07 service nova] [instance: 1ece950d-8b7f-4462-8138-10cbf43149ee] Refreshing network info cache for port a5800835-6a64-4d11-8c46-bc1f47539310 {{(pid=67169) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1976.261118] env[67169]: DEBUG nova.network.neutron [req-f668bded-90db-4bd4-b296-67d2971b1623 req-2bc434c1-f9b1-4495-a334-62f51c149f07 service nova] [instance: 1ece950d-8b7f-4462-8138-10cbf43149ee] Updated VIF entry in instance network info cache for port a5800835-6a64-4d11-8c46-bc1f47539310. 
{{(pid=67169) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1976.261476] env[67169]: DEBUG nova.network.neutron [req-f668bded-90db-4bd4-b296-67d2971b1623 req-2bc434c1-f9b1-4495-a334-62f51c149f07 service nova] [instance: 1ece950d-8b7f-4462-8138-10cbf43149ee] Updating instance_info_cache with network_info: [{"id": "a5800835-6a64-4d11-8c46-bc1f47539310", "address": "fa:16:3e:bf:fe:74", "network": {"id": "170d734e-6b36-4d38-b22b-3ba1f598f8e2", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1449120054-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "f9bee6443cfb4d0da430cf995b9d9968", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "e59b364d-b7f6-499d-b7dc-82b8a819aa12", "external-id": "nsx-vlan-transportzone-731", "segmentation_id": 731, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa5800835-6a", "ovs_interfaceid": "a5800835-6a64-4d11-8c46-bc1f47539310", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1976.270689] env[67169]: DEBUG oslo_concurrency.lockutils [req-f668bded-90db-4bd4-b296-67d2971b1623 req-2bc434c1-f9b1-4495-a334-62f51c149f07 service nova] Releasing lock "refresh_cache-1ece950d-8b7f-4462-8138-10cbf43149ee" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1979.989652] env[67169]: DEBUG 
oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1982.659594] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1982.659891] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Starting heal instance info cache {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1982.659932] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Rebuilding the list of instances to heal {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1982.681947] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: fa24a4a8-895c-4ea6-8e0a-4ed1134beff0] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1982.682294] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 04d3ae51-f3f1-427b-ae45-279b02e4b3e6] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1982.682458] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: aedbfde6-26e1-410d-a311-e2c344f65062] Skipping network cache update for instance because it is Building. 
{{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1982.682592] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: c05c3ec2-a68d-41b0-a199-fcfc84bb2deb] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1982.682724] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 9435574d-2128-4b20-ba92-ee2aba37d33b] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1982.682848] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 6663b166-0d24-45a7-8c2c-e4e68dbe0005] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1982.683017] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 115d6c00-4259-4e87-aa00-90b576a63535] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1982.683161] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 220daf5b-b4fd-49b0-9098-c1f846d6e552] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1982.683285] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 68b94a43-eaa5-4023-8bf5-8cc647c2f098] Skipping network cache update for instance because it is Building. 
{{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1982.683406] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 1ece950d-8b7f-4462-8138-10cbf43149ee] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1982.683526] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Didn't find any instances for network info cache update. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1985.659156] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1985.659600] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1986.659321] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1986.659593] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=67169) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1987.654565] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1987.658156] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1988.654602] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1989.658837] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1989.659338] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1989.670722] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1989.670938] env[67169]: DEBUG oslo_concurrency.lockutils [None 
req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1989.671126] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1989.671291] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67169) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1989.672416] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3b96e8e0-ac45-4148-8787-2b9426ee2b95 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1989.681300] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0a0dafc0-854e-42d7-906d-bf3d11bb6a69 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1989.695753] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c58753b1-5aae-4c3a-9af0-aea0bd1c4b6f {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1989.701994] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5ce96154-5aaf-4446-b0ca-a65c5a15f5a0 {{(pid=67169) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1989.730302] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181024MB free_disk=171GB free_vcpus=48 pci_devices=None {{(pid=67169) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1989.730406] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1989.730588] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1989.803353] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance fa24a4a8-895c-4ea6-8e0a-4ed1134beff0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1989.803353] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 04d3ae51-f3f1-427b-ae45-279b02e4b3e6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1989.803353] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance aedbfde6-26e1-410d-a311-e2c344f65062 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1989.803565] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance c05c3ec2-a68d-41b0-a199-fcfc84bb2deb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1989.803565] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 9435574d-2128-4b20-ba92-ee2aba37d33b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1989.803651] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 6663b166-0d24-45a7-8c2c-e4e68dbe0005 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1989.803741] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 115d6c00-4259-4e87-aa00-90b576a63535 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1989.803887] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 220daf5b-b4fd-49b0-9098-c1f846d6e552 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1989.804028] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 68b94a43-eaa5-4023-8bf5-8cc647c2f098 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1989.804152] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 1ece950d-8b7f-4462-8138-10cbf43149ee actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1989.804339] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67169) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1989.804475] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67169) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1989.925967] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c07926b4-3aef-40ee-9bf7-736df4becabd {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1989.933552] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-491bb8cb-b619-472d-bbe2-159ab2613529 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1989.963811] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-01f80403-b6a8-4750-9e53-400aa0078b05 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1989.970728] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-104715f9-4c6b-4d1b-9f8e-1705378718b8 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1989.983510] env[67169]: DEBUG nova.compute.provider_tree [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Inventory has 
not changed in ProviderTree for provider: 6570906a-ac37-4859-b1f2-4bbacc48d3f3 {{(pid=67169) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1989.991321] env[67169]: DEBUG nova.scheduler.client.report [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Inventory has not changed for provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 171, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67169) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1990.010494] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67169) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1990.010722] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.280s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1999.520665] env[67169]: WARNING oslo_vmware.rw_handles [None req-8eb9a17a-f61b-4025-b929-3564b452eb9c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1999.520665] env[67169]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1999.520665] env[67169]: ERROR oslo_vmware.rw_handles File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1999.520665] env[67169]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1999.520665] env[67169]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1999.520665] env[67169]: ERROR oslo_vmware.rw_handles response.begin() [ 1999.520665] env[67169]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1999.520665] env[67169]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1999.520665] env[67169]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1999.520665] env[67169]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1999.520665] env[67169]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1999.520665] env[67169]: ERROR oslo_vmware.rw_handles [ 1999.521600] env[67169]: DEBUG nova.virt.vmwareapi.images [None req-8eb9a17a-f61b-4025-b929-3564b452eb9c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: fa24a4a8-895c-4ea6-8e0a-4ed1134beff0] Downloaded image file data 285931c9-8b83-4997-8c4d-6a79005e36ba to vmware_temp/c66dd816-4856-4c4a-b01f-92b1187fde34/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk on the data store datastore2 {{(pid=67169) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1999.523061] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-8eb9a17a-f61b-4025-b929-3564b452eb9c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: fa24a4a8-895c-4ea6-8e0a-4ed1134beff0] Caching image {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1999.523304] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [None 
req-8eb9a17a-f61b-4025-b929-3564b452eb9c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Copying Virtual Disk [datastore2] vmware_temp/c66dd816-4856-4c4a-b01f-92b1187fde34/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk to [datastore2] vmware_temp/c66dd816-4856-4c4a-b01f-92b1187fde34/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk {{(pid=67169) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1999.523577] env[67169]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-4731185d-3623-4aa9-8b77-091d037cf521 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1999.531530] env[67169]: DEBUG oslo_vmware.api [None req-8eb9a17a-f61b-4025-b929-3564b452eb9c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Waiting for the task: (returnval){ [ 1999.531530] env[67169]: value = "task-2819257" [ 1999.531530] env[67169]: _type = "Task" [ 1999.531530] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1999.539020] env[67169]: DEBUG oslo_vmware.api [None req-8eb9a17a-f61b-4025-b929-3564b452eb9c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Task: {'id': task-2819257, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2000.041824] env[67169]: DEBUG oslo_vmware.exceptions [None req-8eb9a17a-f61b-4025-b929-3564b452eb9c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Fault InvalidArgument not matched. 
{{(pid=67169) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2000.042133] env[67169]: DEBUG oslo_concurrency.lockutils [None req-8eb9a17a-f61b-4025-b929-3564b452eb9c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Releasing lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2000.042706] env[67169]: ERROR nova.compute.manager [None req-8eb9a17a-f61b-4025-b929-3564b452eb9c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: fa24a4a8-895c-4ea6-8e0a-4ed1134beff0] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2000.042706] env[67169]: Faults: ['InvalidArgument'] [ 2000.042706] env[67169]: ERROR nova.compute.manager [instance: fa24a4a8-895c-4ea6-8e0a-4ed1134beff0] Traceback (most recent call last): [ 2000.042706] env[67169]: ERROR nova.compute.manager [instance: fa24a4a8-895c-4ea6-8e0a-4ed1134beff0] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 2000.042706] env[67169]: ERROR nova.compute.manager [instance: fa24a4a8-895c-4ea6-8e0a-4ed1134beff0] yield resources [ 2000.042706] env[67169]: ERROR nova.compute.manager [instance: fa24a4a8-895c-4ea6-8e0a-4ed1134beff0] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2000.042706] env[67169]: ERROR nova.compute.manager [instance: fa24a4a8-895c-4ea6-8e0a-4ed1134beff0] self.driver.spawn(context, instance, image_meta, [ 2000.042706] env[67169]: ERROR nova.compute.manager [instance: fa24a4a8-895c-4ea6-8e0a-4ed1134beff0] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2000.042706] env[67169]: ERROR nova.compute.manager [instance: 
fa24a4a8-895c-4ea6-8e0a-4ed1134beff0] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2000.042706] env[67169]: ERROR nova.compute.manager [instance: fa24a4a8-895c-4ea6-8e0a-4ed1134beff0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2000.042706] env[67169]: ERROR nova.compute.manager [instance: fa24a4a8-895c-4ea6-8e0a-4ed1134beff0] self._fetch_image_if_missing(context, vi) [ 2000.042706] env[67169]: ERROR nova.compute.manager [instance: fa24a4a8-895c-4ea6-8e0a-4ed1134beff0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2000.042706] env[67169]: ERROR nova.compute.manager [instance: fa24a4a8-895c-4ea6-8e0a-4ed1134beff0] image_cache(vi, tmp_image_ds_loc) [ 2000.042706] env[67169]: ERROR nova.compute.manager [instance: fa24a4a8-895c-4ea6-8e0a-4ed1134beff0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2000.042706] env[67169]: ERROR nova.compute.manager [instance: fa24a4a8-895c-4ea6-8e0a-4ed1134beff0] vm_util.copy_virtual_disk( [ 2000.042706] env[67169]: ERROR nova.compute.manager [instance: fa24a4a8-895c-4ea6-8e0a-4ed1134beff0] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2000.042706] env[67169]: ERROR nova.compute.manager [instance: fa24a4a8-895c-4ea6-8e0a-4ed1134beff0] session._wait_for_task(vmdk_copy_task) [ 2000.042706] env[67169]: ERROR nova.compute.manager [instance: fa24a4a8-895c-4ea6-8e0a-4ed1134beff0] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2000.042706] env[67169]: ERROR nova.compute.manager [instance: fa24a4a8-895c-4ea6-8e0a-4ed1134beff0] return self.wait_for_task(task_ref) [ 2000.042706] env[67169]: ERROR nova.compute.manager [instance: fa24a4a8-895c-4ea6-8e0a-4ed1134beff0] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2000.042706] env[67169]: ERROR nova.compute.manager 
[instance: fa24a4a8-895c-4ea6-8e0a-4ed1134beff0] return evt.wait() [ 2000.042706] env[67169]: ERROR nova.compute.manager [instance: fa24a4a8-895c-4ea6-8e0a-4ed1134beff0] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2000.042706] env[67169]: ERROR nova.compute.manager [instance: fa24a4a8-895c-4ea6-8e0a-4ed1134beff0] result = hub.switch() [ 2000.042706] env[67169]: ERROR nova.compute.manager [instance: fa24a4a8-895c-4ea6-8e0a-4ed1134beff0] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2000.042706] env[67169]: ERROR nova.compute.manager [instance: fa24a4a8-895c-4ea6-8e0a-4ed1134beff0] return self.greenlet.switch() [ 2000.042706] env[67169]: ERROR nova.compute.manager [instance: fa24a4a8-895c-4ea6-8e0a-4ed1134beff0] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2000.042706] env[67169]: ERROR nova.compute.manager [instance: fa24a4a8-895c-4ea6-8e0a-4ed1134beff0] self.f(*self.args, **self.kw) [ 2000.042706] env[67169]: ERROR nova.compute.manager [instance: fa24a4a8-895c-4ea6-8e0a-4ed1134beff0] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2000.042706] env[67169]: ERROR nova.compute.manager [instance: fa24a4a8-895c-4ea6-8e0a-4ed1134beff0] raise exceptions.translate_fault(task_info.error) [ 2000.042706] env[67169]: ERROR nova.compute.manager [instance: fa24a4a8-895c-4ea6-8e0a-4ed1134beff0] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2000.042706] env[67169]: ERROR nova.compute.manager [instance: fa24a4a8-895c-4ea6-8e0a-4ed1134beff0] Faults: ['InvalidArgument'] [ 2000.042706] env[67169]: ERROR nova.compute.manager [instance: fa24a4a8-895c-4ea6-8e0a-4ed1134beff0] [ 2000.044236] env[67169]: INFO nova.compute.manager [None req-8eb9a17a-f61b-4025-b929-3564b452eb9c tempest-DeleteServersTestJSON-867121436 
tempest-DeleteServersTestJSON-867121436-project-member] [instance: fa24a4a8-895c-4ea6-8e0a-4ed1134beff0] Terminating instance [ 2000.044656] env[67169]: DEBUG oslo_concurrency.lockutils [None req-dad53a36-9ec2-447b-b9da-6ae8c0b39c52 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Acquired lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2000.044861] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-dad53a36-9ec2-447b-b9da-6ae8c0b39c52 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2000.045118] env[67169]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-c7037297-5724-43e9-a3b1-df3e6684e34f {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2000.047482] env[67169]: DEBUG nova.compute.manager [None req-8eb9a17a-f61b-4025-b929-3564b452eb9c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: fa24a4a8-895c-4ea6-8e0a-4ed1134beff0] Start destroying the instance on the hypervisor. 
{{(pid=67169) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2000.047677] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-8eb9a17a-f61b-4025-b929-3564b452eb9c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: fa24a4a8-895c-4ea6-8e0a-4ed1134beff0] Destroying instance {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2000.048404] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3283b5b3-c737-4443-b4aa-170b7fb80bf3 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2000.055278] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-8eb9a17a-f61b-4025-b929-3564b452eb9c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: fa24a4a8-895c-4ea6-8e0a-4ed1134beff0] Unregistering the VM {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2000.056309] env[67169]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-4357591e-71f3-46d1-9a29-558bd6291c8a {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2000.057643] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-dad53a36-9ec2-447b-b9da-6ae8c0b39c52 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2000.057811] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-dad53a36-9ec2-447b-b9da-6ae8c0b39c52 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=67169) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2000.058470] env[67169]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-cfe54ac3-04b3-4e1a-a96d-5254f56cc50e {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2000.063260] env[67169]: DEBUG oslo_vmware.api [None req-dad53a36-9ec2-447b-b9da-6ae8c0b39c52 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Waiting for the task: (returnval){ [ 2000.063260] env[67169]: value = "session[52518c09-e5f2-035d-4dc1-3e29ddf94015]52d6a212-d5f9-63fd-cd69-ed865fb3d4ac" [ 2000.063260] env[67169]: _type = "Task" [ 2000.063260] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2000.070177] env[67169]: DEBUG oslo_vmware.api [None req-dad53a36-9ec2-447b-b9da-6ae8c0b39c52 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Task: {'id': session[52518c09-e5f2-035d-4dc1-3e29ddf94015]52d6a212-d5f9-63fd-cd69-ed865fb3d4ac, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2000.118213] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-8eb9a17a-f61b-4025-b929-3564b452eb9c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: fa24a4a8-895c-4ea6-8e0a-4ed1134beff0] Unregistered the VM {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2000.118447] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-8eb9a17a-f61b-4025-b929-3564b452eb9c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: fa24a4a8-895c-4ea6-8e0a-4ed1134beff0] Deleting contents of the VM from datastore datastore2 {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2000.118626] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-8eb9a17a-f61b-4025-b929-3564b452eb9c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Deleting the datastore file [datastore2] fa24a4a8-895c-4ea6-8e0a-4ed1134beff0 {{(pid=67169) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2000.118953] env[67169]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-694ceb97-9e70-4049-9b5b-071f69b9171b {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2000.128024] env[67169]: DEBUG oslo_vmware.api [None req-8eb9a17a-f61b-4025-b929-3564b452eb9c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Waiting for the task: (returnval){ [ 2000.128024] env[67169]: value = "task-2819259" [ 2000.128024] env[67169]: _type = "Task" [ 2000.128024] env[67169]: } to complete. 
{{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2000.134888] env[67169]: DEBUG oslo_vmware.api [None req-8eb9a17a-f61b-4025-b929-3564b452eb9c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Task: {'id': task-2819259, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2000.573960] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-dad53a36-9ec2-447b-b9da-6ae8c0b39c52 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 04d3ae51-f3f1-427b-ae45-279b02e4b3e6] Preparing fetch location {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2000.574366] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-dad53a36-9ec2-447b-b9da-6ae8c0b39c52 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Creating directory with path [datastore2] vmware_temp/b0110b27-d42e-4d27-8a6d-bc88891ec07d/285931c9-8b83-4997-8c4d-6a79005e36ba {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2000.574491] env[67169]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-2c648347-8177-45af-a889-293f64131e7b {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2000.586892] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-dad53a36-9ec2-447b-b9da-6ae8c0b39c52 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Created directory with path [datastore2] vmware_temp/b0110b27-d42e-4d27-8a6d-bc88891ec07d/285931c9-8b83-4997-8c4d-6a79005e36ba {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2000.587114] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-dad53a36-9ec2-447b-b9da-6ae8c0b39c52 
tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 04d3ae51-f3f1-427b-ae45-279b02e4b3e6] Fetch image to [datastore2] vmware_temp/b0110b27-d42e-4d27-8a6d-bc88891ec07d/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2000.587256] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-dad53a36-9ec2-447b-b9da-6ae8c0b39c52 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 04d3ae51-f3f1-427b-ae45-279b02e4b3e6] Downloading image file data 285931c9-8b83-4997-8c4d-6a79005e36ba to [datastore2] vmware_temp/b0110b27-d42e-4d27-8a6d-bc88891ec07d/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk on the data store datastore2 {{(pid=67169) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2000.587961] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ebd00500-6d49-4f52-8146-698f4bde1127 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2000.594456] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b27fd4e8-df4e-4cc3-aceb-69c67fd32a75 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2000.603155] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c926ccbc-a452-4b61-9021-90ea597a9d78 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2000.636357] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-905856be-51b1-4947-b4d9-c22b3c91f0ae {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2000.642964] 
env[67169]: DEBUG oslo_vmware.api [None req-8eb9a17a-f61b-4025-b929-3564b452eb9c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Task: {'id': task-2819259, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.069078} completed successfully. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2000.644364] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-8eb9a17a-f61b-4025-b929-3564b452eb9c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Deleted the datastore file {{(pid=67169) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2000.644554] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-8eb9a17a-f61b-4025-b929-3564b452eb9c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: fa24a4a8-895c-4ea6-8e0a-4ed1134beff0] Deleted contents of the VM from datastore datastore2 {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2000.644722] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-8eb9a17a-f61b-4025-b929-3564b452eb9c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: fa24a4a8-895c-4ea6-8e0a-4ed1134beff0] Instance destroyed {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2000.644893] env[67169]: INFO nova.compute.manager [None req-8eb9a17a-f61b-4025-b929-3564b452eb9c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: fa24a4a8-895c-4ea6-8e0a-4ed1134beff0] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 2000.646594] env[67169]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-ff57358b-0b65-4329-8b70-08645ff9528d {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2000.648410] env[67169]: DEBUG nova.compute.claims [None req-8eb9a17a-f61b-4025-b929-3564b452eb9c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: fa24a4a8-895c-4ea6-8e0a-4ed1134beff0] Aborting claim: {{(pid=67169) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 2000.648575] env[67169]: DEBUG oslo_concurrency.lockutils [None req-8eb9a17a-f61b-4025-b929-3564b452eb9c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2000.648823] env[67169]: DEBUG oslo_concurrency.lockutils [None req-8eb9a17a-f61b-4025-b929-3564b452eb9c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2000.671086] env[67169]: DEBUG nova.virt.vmwareapi.images [None req-dad53a36-9ec2-447b-b9da-6ae8c0b39c52 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 04d3ae51-f3f1-427b-ae45-279b02e4b3e6] Downloading image file data 285931c9-8b83-4997-8c4d-6a79005e36ba to the data store datastore2 {{(pid=67169) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2000.721643] env[67169]: DEBUG oslo_vmware.rw_handles [None req-dad53a36-9ec2-447b-b9da-6ae8c0b39c52 tempest-ImagesTestJSON-1590952821 
tempest-ImagesTestJSON-1590952821-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/b0110b27-d42e-4d27-8a6d-bc88891ec07d/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=67169) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2000.781868] env[67169]: DEBUG oslo_vmware.rw_handles [None req-dad53a36-9ec2-447b-b9da-6ae8c0b39c52 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Completed reading data from the image iterator. {{(pid=67169) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2000.782134] env[67169]: DEBUG oslo_vmware.rw_handles [None req-dad53a36-9ec2-447b-b9da-6ae8c0b39c52 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/b0110b27-d42e-4d27-8a6d-bc88891ec07d/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=67169) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2000.857367] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b4777243-7ebc-494b-bf40-9f9e29d25eac {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2000.864915] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7b669067-16b4-4827-9c59-73c21f4461ce {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2000.895164] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bd7ea91a-6cc7-40aa-a947-9f802f38df3e {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2000.901622] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4d48a47b-1bbe-4e40-8008-4b265ead2cc8 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2000.914170] env[67169]: DEBUG nova.compute.provider_tree [None req-8eb9a17a-f61b-4025-b929-3564b452eb9c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Inventory has not changed in ProviderTree for provider: 6570906a-ac37-4859-b1f2-4bbacc48d3f3 {{(pid=67169) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2000.922924] env[67169]: DEBUG nova.scheduler.client.report [None req-8eb9a17a-f61b-4025-b929-3564b452eb9c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Inventory has not changed for provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 
'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 171, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67169) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2000.937218] env[67169]: DEBUG oslo_concurrency.lockutils [None req-8eb9a17a-f61b-4025-b929-3564b452eb9c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.288s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2000.937723] env[67169]: ERROR nova.compute.manager [None req-8eb9a17a-f61b-4025-b929-3564b452eb9c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: fa24a4a8-895c-4ea6-8e0a-4ed1134beff0] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2000.937723] env[67169]: Faults: ['InvalidArgument'] [ 2000.937723] env[67169]: ERROR nova.compute.manager [instance: fa24a4a8-895c-4ea6-8e0a-4ed1134beff0] Traceback (most recent call last): [ 2000.937723] env[67169]: ERROR nova.compute.manager [instance: fa24a4a8-895c-4ea6-8e0a-4ed1134beff0] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2000.937723] env[67169]: ERROR nova.compute.manager [instance: fa24a4a8-895c-4ea6-8e0a-4ed1134beff0] self.driver.spawn(context, instance, image_meta, [ 2000.937723] env[67169]: ERROR nova.compute.manager [instance: fa24a4a8-895c-4ea6-8e0a-4ed1134beff0] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2000.937723] env[67169]: ERROR nova.compute.manager [instance: fa24a4a8-895c-4ea6-8e0a-4ed1134beff0] self._vmops.spawn(context, instance, image_meta, 
injected_files, [ 2000.937723] env[67169]: ERROR nova.compute.manager [instance: fa24a4a8-895c-4ea6-8e0a-4ed1134beff0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2000.937723] env[67169]: ERROR nova.compute.manager [instance: fa24a4a8-895c-4ea6-8e0a-4ed1134beff0] self._fetch_image_if_missing(context, vi) [ 2000.937723] env[67169]: ERROR nova.compute.manager [instance: fa24a4a8-895c-4ea6-8e0a-4ed1134beff0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2000.937723] env[67169]: ERROR nova.compute.manager [instance: fa24a4a8-895c-4ea6-8e0a-4ed1134beff0] image_cache(vi, tmp_image_ds_loc) [ 2000.937723] env[67169]: ERROR nova.compute.manager [instance: fa24a4a8-895c-4ea6-8e0a-4ed1134beff0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2000.937723] env[67169]: ERROR nova.compute.manager [instance: fa24a4a8-895c-4ea6-8e0a-4ed1134beff0] vm_util.copy_virtual_disk( [ 2000.937723] env[67169]: ERROR nova.compute.manager [instance: fa24a4a8-895c-4ea6-8e0a-4ed1134beff0] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2000.937723] env[67169]: ERROR nova.compute.manager [instance: fa24a4a8-895c-4ea6-8e0a-4ed1134beff0] session._wait_for_task(vmdk_copy_task) [ 2000.937723] env[67169]: ERROR nova.compute.manager [instance: fa24a4a8-895c-4ea6-8e0a-4ed1134beff0] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2000.937723] env[67169]: ERROR nova.compute.manager [instance: fa24a4a8-895c-4ea6-8e0a-4ed1134beff0] return self.wait_for_task(task_ref) [ 2000.937723] env[67169]: ERROR nova.compute.manager [instance: fa24a4a8-895c-4ea6-8e0a-4ed1134beff0] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2000.937723] env[67169]: ERROR nova.compute.manager [instance: fa24a4a8-895c-4ea6-8e0a-4ed1134beff0] return evt.wait() [ 2000.937723] env[67169]: 
ERROR nova.compute.manager [instance: fa24a4a8-895c-4ea6-8e0a-4ed1134beff0] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2000.937723] env[67169]: ERROR nova.compute.manager [instance: fa24a4a8-895c-4ea6-8e0a-4ed1134beff0] result = hub.switch() [ 2000.937723] env[67169]: ERROR nova.compute.manager [instance: fa24a4a8-895c-4ea6-8e0a-4ed1134beff0] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2000.937723] env[67169]: ERROR nova.compute.manager [instance: fa24a4a8-895c-4ea6-8e0a-4ed1134beff0] return self.greenlet.switch() [ 2000.937723] env[67169]: ERROR nova.compute.manager [instance: fa24a4a8-895c-4ea6-8e0a-4ed1134beff0] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2000.937723] env[67169]: ERROR nova.compute.manager [instance: fa24a4a8-895c-4ea6-8e0a-4ed1134beff0] self.f(*self.args, **self.kw) [ 2000.937723] env[67169]: ERROR nova.compute.manager [instance: fa24a4a8-895c-4ea6-8e0a-4ed1134beff0] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2000.937723] env[67169]: ERROR nova.compute.manager [instance: fa24a4a8-895c-4ea6-8e0a-4ed1134beff0] raise exceptions.translate_fault(task_info.error) [ 2000.937723] env[67169]: ERROR nova.compute.manager [instance: fa24a4a8-895c-4ea6-8e0a-4ed1134beff0] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2000.937723] env[67169]: ERROR nova.compute.manager [instance: fa24a4a8-895c-4ea6-8e0a-4ed1134beff0] Faults: ['InvalidArgument'] [ 2000.937723] env[67169]: ERROR nova.compute.manager [instance: fa24a4a8-895c-4ea6-8e0a-4ed1134beff0] [ 2000.938892] env[67169]: DEBUG nova.compute.utils [None req-8eb9a17a-f61b-4025-b929-3564b452eb9c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: 
fa24a4a8-895c-4ea6-8e0a-4ed1134beff0] VimFaultException {{(pid=67169) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2000.939989] env[67169]: DEBUG nova.compute.manager [None req-8eb9a17a-f61b-4025-b929-3564b452eb9c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: fa24a4a8-895c-4ea6-8e0a-4ed1134beff0] Build of instance fa24a4a8-895c-4ea6-8e0a-4ed1134beff0 was re-scheduled: A specified parameter was not correct: fileType [ 2000.939989] env[67169]: Faults: ['InvalidArgument'] {{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 2000.940453] env[67169]: DEBUG nova.compute.manager [None req-8eb9a17a-f61b-4025-b929-3564b452eb9c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: fa24a4a8-895c-4ea6-8e0a-4ed1134beff0] Unplugging VIFs for instance {{(pid=67169) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 2000.940629] env[67169]: DEBUG nova.compute.manager [None req-8eb9a17a-f61b-4025-b929-3564b452eb9c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67169) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 2000.940822] env[67169]: DEBUG nova.compute.manager [None req-8eb9a17a-f61b-4025-b929-3564b452eb9c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: fa24a4a8-895c-4ea6-8e0a-4ed1134beff0] Deallocating network for instance {{(pid=67169) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2000.940998] env[67169]: DEBUG nova.network.neutron [None req-8eb9a17a-f61b-4025-b929-3564b452eb9c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: fa24a4a8-895c-4ea6-8e0a-4ed1134beff0] deallocate_for_instance() {{(pid=67169) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2001.282964] env[67169]: DEBUG nova.network.neutron [None req-8eb9a17a-f61b-4025-b929-3564b452eb9c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: fa24a4a8-895c-4ea6-8e0a-4ed1134beff0] Updating instance_info_cache with network_info: [] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2001.295329] env[67169]: INFO nova.compute.manager [None req-8eb9a17a-f61b-4025-b929-3564b452eb9c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: fa24a4a8-895c-4ea6-8e0a-4ed1134beff0] Took 0.35 seconds to deallocate network for instance. 
[ 2001.406193] env[67169]: INFO nova.scheduler.client.report [None req-8eb9a17a-f61b-4025-b929-3564b452eb9c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Deleted allocations for instance fa24a4a8-895c-4ea6-8e0a-4ed1134beff0 [ 2001.434242] env[67169]: DEBUG oslo_concurrency.lockutils [None req-8eb9a17a-f61b-4025-b929-3564b452eb9c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Lock "fa24a4a8-895c-4ea6-8e0a-4ed1134beff0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 625.415s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2001.434600] env[67169]: DEBUG oslo_concurrency.lockutils [None req-23f66c57-c93d-4fe5-9ac1-37aeee8b87ae tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Lock "fa24a4a8-895c-4ea6-8e0a-4ed1134beff0" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 429.179s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2001.434745] env[67169]: DEBUG oslo_concurrency.lockutils [None req-23f66c57-c93d-4fe5-9ac1-37aeee8b87ae tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Acquiring lock "fa24a4a8-895c-4ea6-8e0a-4ed1134beff0-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2001.434949] env[67169]: DEBUG oslo_concurrency.lockutils [None req-23f66c57-c93d-4fe5-9ac1-37aeee8b87ae tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Lock "fa24a4a8-895c-4ea6-8e0a-4ed1134beff0-events" acquired by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2001.435134] env[67169]: DEBUG oslo_concurrency.lockutils [None req-23f66c57-c93d-4fe5-9ac1-37aeee8b87ae tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Lock "fa24a4a8-895c-4ea6-8e0a-4ed1134beff0-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2001.440723] env[67169]: INFO nova.compute.manager [None req-23f66c57-c93d-4fe5-9ac1-37aeee8b87ae tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: fa24a4a8-895c-4ea6-8e0a-4ed1134beff0] Terminating instance [ 2001.443354] env[67169]: DEBUG nova.compute.manager [None req-23f66c57-c93d-4fe5-9ac1-37aeee8b87ae tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: fa24a4a8-895c-4ea6-8e0a-4ed1134beff0] Start destroying the instance on the hypervisor. 
{{(pid=67169) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2001.443547] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-23f66c57-c93d-4fe5-9ac1-37aeee8b87ae tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: fa24a4a8-895c-4ea6-8e0a-4ed1134beff0] Destroying instance {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2001.444032] env[67169]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-0ed0bc8f-4bd6-4a75-8a0d-9921050c9b55 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2001.453753] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-264b493d-f27a-4b25-857b-b6d614669a6d {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2001.482195] env[67169]: WARNING nova.virt.vmwareapi.vmops [None req-23f66c57-c93d-4fe5-9ac1-37aeee8b87ae tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: fa24a4a8-895c-4ea6-8e0a-4ed1134beff0] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance fa24a4a8-895c-4ea6-8e0a-4ed1134beff0 could not be found. [ 2001.482396] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-23f66c57-c93d-4fe5-9ac1-37aeee8b87ae tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: fa24a4a8-895c-4ea6-8e0a-4ed1134beff0] Instance destroyed {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2001.482573] env[67169]: INFO nova.compute.manager [None req-23f66c57-c93d-4fe5-9ac1-37aeee8b87ae tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: fa24a4a8-895c-4ea6-8e0a-4ed1134beff0] Took 0.04 seconds to destroy the instance on the hypervisor. 
[ 2001.482823] env[67169]: DEBUG oslo.service.loopingcall [None req-23f66c57-c93d-4fe5-9ac1-37aeee8b87ae tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=67169) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2001.483059] env[67169]: DEBUG nova.compute.manager [-] [instance: fa24a4a8-895c-4ea6-8e0a-4ed1134beff0] Deallocating network for instance {{(pid=67169) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2001.483163] env[67169]: DEBUG nova.network.neutron [-] [instance: fa24a4a8-895c-4ea6-8e0a-4ed1134beff0] deallocate_for_instance() {{(pid=67169) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2001.507935] env[67169]: DEBUG nova.network.neutron [-] [instance: fa24a4a8-895c-4ea6-8e0a-4ed1134beff0] Updating instance_info_cache with network_info: [] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2001.516289] env[67169]: INFO nova.compute.manager [-] [instance: fa24a4a8-895c-4ea6-8e0a-4ed1134beff0] Took 0.03 seconds to deallocate network for instance. 
[ 2001.601626] env[67169]: DEBUG oslo_concurrency.lockutils [None req-23f66c57-c93d-4fe5-9ac1-37aeee8b87ae tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Lock "fa24a4a8-895c-4ea6-8e0a-4ed1134beff0" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.167s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2001.602474] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "fa24a4a8-895c-4ea6-8e0a-4ed1134beff0" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 197.916s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2001.602664] env[67169]: INFO nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: fa24a4a8-895c-4ea6-8e0a-4ed1134beff0] During sync_power_state the instance has a pending task (deleting). Skip. 
[ 2001.602847] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "fa24a4a8-895c-4ea6-8e0a-4ed1134beff0" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 2009.386305] env[67169]: DEBUG oslo_concurrency.lockutils [None req-a8498e40-29cf-4df3-adff-05455c8fbe22 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Acquiring lock "68b94a43-eaa5-4023-8bf5-8cc647c2f098" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 2040.012279] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 2044.661060] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 2044.661420] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Starting heal instance info cache {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}}
[ 2044.661469] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Rebuilding the list of instances to heal {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}}
[ 2044.682200] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 04d3ae51-f3f1-427b-ae45-279b02e4b3e6] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 2044.682355] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: aedbfde6-26e1-410d-a311-e2c344f65062] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 2044.682496] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: c05c3ec2-a68d-41b0-a199-fcfc84bb2deb] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 2044.682658] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 9435574d-2128-4b20-ba92-ee2aba37d33b] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 2044.682792] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 6663b166-0d24-45a7-8c2c-e4e68dbe0005] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 2044.682918] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 115d6c00-4259-4e87-aa00-90b576a63535] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 2044.683050] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 220daf5b-b4fd-49b0-9098-c1f846d6e552] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 2044.683174] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 68b94a43-eaa5-4023-8bf5-8cc647c2f098] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 2044.683292] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 1ece950d-8b7f-4462-8138-10cbf43149ee] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 2044.683409] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Didn't find any instances for network info cache update. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}}
[ 2045.658998] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 2046.666413] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 2046.666818] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=67169) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}}
[ 2047.220108] env[67169]: WARNING oslo_vmware.rw_handles [None req-dad53a36-9ec2-447b-b9da-6ae8c0b39c52 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response
[ 2047.220108] env[67169]: ERROR oslo_vmware.rw_handles Traceback (most recent call last):
[ 2047.220108] env[67169]: ERROR oslo_vmware.rw_handles   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close
[ 2047.220108] env[67169]: ERROR oslo_vmware.rw_handles     self._conn.getresponse()
[ 2047.220108] env[67169]: ERROR oslo_vmware.rw_handles   File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse
[ 2047.220108] env[67169]: ERROR oslo_vmware.rw_handles     response.begin()
[ 2047.220108] env[67169]: ERROR oslo_vmware.rw_handles   File "/usr/lib/python3.10/http/client.py", line 318, in begin
[ 2047.220108] env[67169]: ERROR oslo_vmware.rw_handles     version, status, reason = self._read_status()
[ 2047.220108] env[67169]: ERROR oslo_vmware.rw_handles   File "/usr/lib/python3.10/http/client.py", line 287, in _read_status
[ 2047.220108] env[67169]: ERROR oslo_vmware.rw_handles     raise RemoteDisconnected("Remote end closed connection without"
[ 2047.220108] env[67169]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response
[ 2047.220108] env[67169]: ERROR oslo_vmware.rw_handles
[ 2047.220644] env[67169]: DEBUG nova.virt.vmwareapi.images [None req-dad53a36-9ec2-447b-b9da-6ae8c0b39c52 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 04d3ae51-f3f1-427b-ae45-279b02e4b3e6] Downloaded image file data 285931c9-8b83-4997-8c4d-6a79005e36ba to vmware_temp/b0110b27-d42e-4d27-8a6d-bc88891ec07d/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk on the data store datastore2 {{(pid=67169) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}}
[ 2047.223628] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-dad53a36-9ec2-447b-b9da-6ae8c0b39c52 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 04d3ae51-f3f1-427b-ae45-279b02e4b3e6] Caching image {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}}
[ 2047.223896] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [None req-dad53a36-9ec2-447b-b9da-6ae8c0b39c52 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Copying Virtual Disk [datastore2] vmware_temp/b0110b27-d42e-4d27-8a6d-bc88891ec07d/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk to [datastore2] vmware_temp/b0110b27-d42e-4d27-8a6d-bc88891ec07d/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk {{(pid=67169) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}}
[ 2047.224180] env[67169]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-4f30729b-4825-4b37-abbf-9ccc18e8d848 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2047.232973] env[67169]: DEBUG oslo_vmware.api [None req-dad53a36-9ec2-447b-b9da-6ae8c0b39c52 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Waiting for the task: (returnval){
[ 2047.232973] env[67169]: value = "task-2819260"
[ 2047.232973] env[67169]: _type = "Task"
[ 2047.232973] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 2047.240893] env[67169]: DEBUG oslo_vmware.api [None req-dad53a36-9ec2-447b-b9da-6ae8c0b39c52 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Task: {'id': task-2819260, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 2047.659523] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 2047.659864] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 2047.743019] env[67169]: DEBUG oslo_vmware.exceptions [None req-dad53a36-9ec2-447b-b9da-6ae8c0b39c52 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Fault InvalidArgument not matched. {{(pid=67169) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}}
[ 2047.743375] env[67169]: DEBUG oslo_concurrency.lockutils [None req-dad53a36-9ec2-447b-b9da-6ae8c0b39c52 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Releasing lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 2047.743809] env[67169]: ERROR nova.compute.manager [None req-dad53a36-9ec2-447b-b9da-6ae8c0b39c52 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 04d3ae51-f3f1-427b-ae45-279b02e4b3e6] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 2047.743809] env[67169]: Faults: ['InvalidArgument']
[ 2047.743809] env[67169]: ERROR nova.compute.manager [instance: 04d3ae51-f3f1-427b-ae45-279b02e4b3e6] Traceback (most recent call last):
[ 2047.743809] env[67169]: ERROR nova.compute.manager [instance: 04d3ae51-f3f1-427b-ae45-279b02e4b3e6]   File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources
[ 2047.743809] env[67169]: ERROR nova.compute.manager [instance: 04d3ae51-f3f1-427b-ae45-279b02e4b3e6]     yield resources
[ 2047.743809] env[67169]: ERROR nova.compute.manager [instance: 04d3ae51-f3f1-427b-ae45-279b02e4b3e6]   File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 2047.743809] env[67169]: ERROR nova.compute.manager [instance: 04d3ae51-f3f1-427b-ae45-279b02e4b3e6]     self.driver.spawn(context, instance, image_meta,
[ 2047.743809] env[67169]: ERROR nova.compute.manager [instance: 04d3ae51-f3f1-427b-ae45-279b02e4b3e6]   File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 2047.743809] env[67169]: ERROR nova.compute.manager [instance: 04d3ae51-f3f1-427b-ae45-279b02e4b3e6]     self._vmops.spawn(context, instance, image_meta, injected_files,
[ 2047.743809] env[67169]: ERROR nova.compute.manager [instance: 04d3ae51-f3f1-427b-ae45-279b02e4b3e6]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 2047.743809] env[67169]: ERROR nova.compute.manager [instance: 04d3ae51-f3f1-427b-ae45-279b02e4b3e6]     self._fetch_image_if_missing(context, vi)
[ 2047.743809] env[67169]: ERROR nova.compute.manager [instance: 04d3ae51-f3f1-427b-ae45-279b02e4b3e6]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 2047.743809] env[67169]: ERROR nova.compute.manager [instance: 04d3ae51-f3f1-427b-ae45-279b02e4b3e6]     image_cache(vi, tmp_image_ds_loc)
[ 2047.743809] env[67169]: ERROR nova.compute.manager [instance: 04d3ae51-f3f1-427b-ae45-279b02e4b3e6]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 2047.743809] env[67169]: ERROR nova.compute.manager [instance: 04d3ae51-f3f1-427b-ae45-279b02e4b3e6]     vm_util.copy_virtual_disk(
[ 2047.743809] env[67169]: ERROR nova.compute.manager [instance: 04d3ae51-f3f1-427b-ae45-279b02e4b3e6]   File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 2047.743809] env[67169]: ERROR nova.compute.manager [instance: 04d3ae51-f3f1-427b-ae45-279b02e4b3e6]     session._wait_for_task(vmdk_copy_task)
[ 2047.743809] env[67169]: ERROR nova.compute.manager [instance: 04d3ae51-f3f1-427b-ae45-279b02e4b3e6]   File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 2047.743809] env[67169]: ERROR nova.compute.manager [instance: 04d3ae51-f3f1-427b-ae45-279b02e4b3e6]     return self.wait_for_task(task_ref)
[ 2047.743809] env[67169]: ERROR nova.compute.manager [instance: 04d3ae51-f3f1-427b-ae45-279b02e4b3e6]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 2047.743809] env[67169]: ERROR nova.compute.manager [instance: 04d3ae51-f3f1-427b-ae45-279b02e4b3e6]     return evt.wait()
[ 2047.743809] env[67169]: ERROR nova.compute.manager [instance: 04d3ae51-f3f1-427b-ae45-279b02e4b3e6]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 2047.743809] env[67169]: ERROR nova.compute.manager [instance: 04d3ae51-f3f1-427b-ae45-279b02e4b3e6]     result = hub.switch()
[ 2047.743809] env[67169]: ERROR nova.compute.manager [instance: 04d3ae51-f3f1-427b-ae45-279b02e4b3e6]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 2047.743809] env[67169]: ERROR nova.compute.manager [instance: 04d3ae51-f3f1-427b-ae45-279b02e4b3e6]     return self.greenlet.switch()
[ 2047.743809] env[67169]: ERROR nova.compute.manager [instance: 04d3ae51-f3f1-427b-ae45-279b02e4b3e6]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 2047.743809] env[67169]: ERROR nova.compute.manager [instance: 04d3ae51-f3f1-427b-ae45-279b02e4b3e6]     self.f(*self.args, **self.kw)
[ 2047.743809] env[67169]: ERROR nova.compute.manager [instance: 04d3ae51-f3f1-427b-ae45-279b02e4b3e6]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 2047.743809] env[67169]: ERROR nova.compute.manager [instance: 04d3ae51-f3f1-427b-ae45-279b02e4b3e6]     raise exceptions.translate_fault(task_info.error)
[ 2047.743809] env[67169]: ERROR nova.compute.manager [instance: 04d3ae51-f3f1-427b-ae45-279b02e4b3e6] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 2047.743809] env[67169]: ERROR nova.compute.manager [instance: 04d3ae51-f3f1-427b-ae45-279b02e4b3e6] Faults: ['InvalidArgument']
[ 2047.743809] env[67169]: ERROR nova.compute.manager [instance: 04d3ae51-f3f1-427b-ae45-279b02e4b3e6]
[ 2047.744931] env[67169]: INFO nova.compute.manager [None req-dad53a36-9ec2-447b-b9da-6ae8c0b39c52 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 04d3ae51-f3f1-427b-ae45-279b02e4b3e6] Terminating instance
[ 2047.745680] env[67169]: DEBUG oslo_concurrency.lockutils [None req-1249c8a0-343c-4729-8066-ecc802e7f711 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Acquired lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 2047.745887] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-1249c8a0-343c-4729-8066-ecc802e7f711 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 2047.747027] env[67169]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-90412660-edb2-46d5-8f62-53d571622243 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2047.748292] env[67169]: DEBUG nova.compute.manager [None req-dad53a36-9ec2-447b-b9da-6ae8c0b39c52 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 04d3ae51-f3f1-427b-ae45-279b02e4b3e6] Start destroying the instance on the hypervisor. {{(pid=67169) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}}
[ 2047.748483] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-dad53a36-9ec2-447b-b9da-6ae8c0b39c52 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 04d3ae51-f3f1-427b-ae45-279b02e4b3e6] Destroying instance {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 2047.749210] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e820e936-1d8d-45fe-83f5-315aa1803b5c {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2047.755685] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-dad53a36-9ec2-447b-b9da-6ae8c0b39c52 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 04d3ae51-f3f1-427b-ae45-279b02e4b3e6] Unregistering the VM {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}}
[ 2047.755891] env[67169]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-3f57b1c7-790a-4e3e-be99-b947d2540f43 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2047.758269] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-1249c8a0-343c-4729-8066-ecc802e7f711 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 2047.758445] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-1249c8a0-343c-4729-8066-ecc802e7f711 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=67169) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}}
[ 2047.759105] env[67169]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-859b1e41-4f3e-4888-9f71-79c78e3e3248 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2047.763517] env[67169]: DEBUG oslo_vmware.api [None req-1249c8a0-343c-4729-8066-ecc802e7f711 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Waiting for the task: (returnval){
[ 2047.763517] env[67169]: value = "session[52518c09-e5f2-035d-4dc1-3e29ddf94015]522ba27b-c796-2f6f-eaf3-92c6f152d0f8"
[ 2047.763517] env[67169]: _type = "Task"
[ 2047.763517] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 2047.775614] env[67169]: DEBUG oslo_vmware.api [None req-1249c8a0-343c-4729-8066-ecc802e7f711 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Task: {'id': session[52518c09-e5f2-035d-4dc1-3e29ddf94015]522ba27b-c796-2f6f-eaf3-92c6f152d0f8, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 2047.817847] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-dad53a36-9ec2-447b-b9da-6ae8c0b39c52 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 04d3ae51-f3f1-427b-ae45-279b02e4b3e6] Unregistered the VM {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}}
[ 2047.818193] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-dad53a36-9ec2-447b-b9da-6ae8c0b39c52 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 04d3ae51-f3f1-427b-ae45-279b02e4b3e6] Deleting contents of the VM from datastore datastore2 {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}}
[ 2047.818510] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-dad53a36-9ec2-447b-b9da-6ae8c0b39c52 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Deleting the datastore file [datastore2] 04d3ae51-f3f1-427b-ae45-279b02e4b3e6 {{(pid=67169) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}}
[ 2047.818896] env[67169]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-bf4befab-dedc-46b8-b4c2-a6af4473ecbf {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2047.835137] env[67169]: DEBUG oslo_vmware.api [None req-dad53a36-9ec2-447b-b9da-6ae8c0b39c52 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Waiting for the task: (returnval){
[ 2047.835137] env[67169]: value = "task-2819262"
[ 2047.835137] env[67169]: _type = "Task"
[ 2047.835137] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 2047.844803] env[67169]: DEBUG oslo_vmware.api [None req-dad53a36-9ec2-447b-b9da-6ae8c0b39c52 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Task: {'id': task-2819262, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 2048.273390] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-1249c8a0-343c-4729-8066-ecc802e7f711 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] [instance: aedbfde6-26e1-410d-a311-e2c344f65062] Preparing fetch location {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}}
[ 2048.273657] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-1249c8a0-343c-4729-8066-ecc802e7f711 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Creating directory with path [datastore2] vmware_temp/5868fdc6-4b57-4e0a-ad81-e1b721262127/285931c9-8b83-4997-8c4d-6a79005e36ba {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 2048.273892] env[67169]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-f07bf287-597f-4143-a551-a948a7fca6b0 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2048.286980] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-1249c8a0-343c-4729-8066-ecc802e7f711 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Created directory with path [datastore2] vmware_temp/5868fdc6-4b57-4e0a-ad81-e1b721262127/285931c9-8b83-4997-8c4d-6a79005e36ba {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 2048.287176] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-1249c8a0-343c-4729-8066-ecc802e7f711 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] [instance: aedbfde6-26e1-410d-a311-e2c344f65062] Fetch image to [datastore2] vmware_temp/5868fdc6-4b57-4e0a-ad81-e1b721262127/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}}
[ 2048.287348] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-1249c8a0-343c-4729-8066-ecc802e7f711 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] [instance: aedbfde6-26e1-410d-a311-e2c344f65062] Downloading image file data 285931c9-8b83-4997-8c4d-6a79005e36ba to [datastore2] vmware_temp/5868fdc6-4b57-4e0a-ad81-e1b721262127/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk on the data store datastore2 {{(pid=67169) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}}
[ 2048.288076] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4cd63946-40f7-418a-8259-51abba867ee3 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2048.294520] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bf2bfb25-8023-44aa-813a-16dc9d75cd0d {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2048.303371] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-97e554ac-08e2-4977-be98-e00ffe3b07a4 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2048.332966] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8f095f2a-a514-4f6d-8b8e-ca7fd587291b {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2048.339693] env[67169]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-40174d6c-32cd-462b-893d-cf0b47aedcc0 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2048.343866] env[67169]: DEBUG oslo_vmware.api [None req-dad53a36-9ec2-447b-b9da-6ae8c0b39c52 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Task: {'id': task-2819262, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.091859} completed successfully. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 2048.344410] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-dad53a36-9ec2-447b-b9da-6ae8c0b39c52 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Deleted the datastore file {{(pid=67169) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}}
[ 2048.344588] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-dad53a36-9ec2-447b-b9da-6ae8c0b39c52 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 04d3ae51-f3f1-427b-ae45-279b02e4b3e6] Deleted contents of the VM from datastore datastore2 {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}}
[ 2048.344758] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-dad53a36-9ec2-447b-b9da-6ae8c0b39c52 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 04d3ae51-f3f1-427b-ae45-279b02e4b3e6] Instance destroyed {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 2048.344926] env[67169]: INFO nova.compute.manager [None req-dad53a36-9ec2-447b-b9da-6ae8c0b39c52 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 04d3ae51-f3f1-427b-ae45-279b02e4b3e6] Took 0.60 seconds to destroy the instance on the hypervisor.
[ 2048.346949] env[67169]: DEBUG nova.compute.claims [None req-dad53a36-9ec2-447b-b9da-6ae8c0b39c52 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 04d3ae51-f3f1-427b-ae45-279b02e4b3e6] Aborting claim: {{(pid=67169) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 2048.347128] env[67169]: DEBUG oslo_concurrency.lockutils [None req-dad53a36-9ec2-447b-b9da-6ae8c0b39c52 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 2048.347340] env[67169]: DEBUG oslo_concurrency.lockutils [None req-dad53a36-9ec2-447b-b9da-6ae8c0b39c52 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 2048.363038] env[67169]: DEBUG nova.virt.vmwareapi.images [None req-1249c8a0-343c-4729-8066-ecc802e7f711 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] [instance: aedbfde6-26e1-410d-a311-e2c344f65062] Downloading image file data 285931c9-8b83-4997-8c4d-6a79005e36ba to the data store datastore2 {{(pid=67169) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}}
[ 2048.499719] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d4d6c01d-d9b9-45be-91ab-2bffd9ff090f {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2048.503600] env[67169]: DEBUG oslo_vmware.rw_handles [None req-1249c8a0-343c-4729-8066-ecc802e7f711 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/5868fdc6-4b57-4e0a-ad81-e1b721262127/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=67169) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}}
[ 2048.560495] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-881e2c94-f5cc-4776-953a-bc6235ef894a {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2048.566299] env[67169]: DEBUG oslo_vmware.rw_handles [None req-1249c8a0-343c-4729-8066-ecc802e7f711 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Completed reading data from the image iterator. {{(pid=67169) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}}
[ 2048.566492] env[67169]: DEBUG oslo_vmware.rw_handles [None req-1249c8a0-343c-4729-8066-ecc802e7f711 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/5868fdc6-4b57-4e0a-ad81-e1b721262127/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=67169) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}}
[ 2048.593526] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-94c8658d-56b4-4a41-9132-fdfaab61633f {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2048.601050] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2041fd2d-3347-48c2-a179-dca7cd4ad231 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2048.614018] env[67169]: DEBUG nova.compute.provider_tree [None req-dad53a36-9ec2-447b-b9da-6ae8c0b39c52 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Inventory has not changed in ProviderTree for provider: 6570906a-ac37-4859-b1f2-4bbacc48d3f3 {{(pid=67169) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 2048.622338] env[67169]: DEBUG nova.scheduler.client.report [None req-dad53a36-9ec2-447b-b9da-6ae8c0b39c52 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Inventory has not changed for provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 171, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67169) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 2048.636060] env[67169]: DEBUG oslo_concurrency.lockutils [None req-dad53a36-9ec2-447b-b9da-6ae8c0b39c52 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.289s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 2048.636591] env[67169]: ERROR nova.compute.manager [None req-dad53a36-9ec2-447b-b9da-6ae8c0b39c52 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 04d3ae51-f3f1-427b-ae45-279b02e4b3e6] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 2048.636591] env[67169]: Faults: ['InvalidArgument']
[ 2048.636591] env[67169]: ERROR nova.compute.manager [instance: 04d3ae51-f3f1-427b-ae45-279b02e4b3e6] Traceback (most recent call last):
[ 2048.636591] env[67169]: ERROR nova.compute.manager [instance: 04d3ae51-f3f1-427b-ae45-279b02e4b3e6]   File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 2048.636591] env[67169]: ERROR nova.compute.manager [instance: 04d3ae51-f3f1-427b-ae45-279b02e4b3e6]     self.driver.spawn(context, instance, image_meta,
[ 2048.636591] env[67169]: ERROR nova.compute.manager [instance: 04d3ae51-f3f1-427b-ae45-279b02e4b3e6]   File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 2048.636591] env[67169]: ERROR nova.compute.manager [instance: 04d3ae51-f3f1-427b-ae45-279b02e4b3e6]     self._vmops.spawn(context, instance, image_meta, injected_files,
[ 2048.636591] env[67169]: ERROR nova.compute.manager [instance: 04d3ae51-f3f1-427b-ae45-279b02e4b3e6]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 2048.636591] env[67169]: ERROR nova.compute.manager [instance: 04d3ae51-f3f1-427b-ae45-279b02e4b3e6]     self._fetch_image_if_missing(context, vi)
[ 2048.636591] env[67169]: ERROR nova.compute.manager [instance: 04d3ae51-f3f1-427b-ae45-279b02e4b3e6]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 2048.636591] env[67169]: ERROR nova.compute.manager [instance: 04d3ae51-f3f1-427b-ae45-279b02e4b3e6]     image_cache(vi, tmp_image_ds_loc)
[ 2048.636591] env[67169]: ERROR nova.compute.manager [instance: 04d3ae51-f3f1-427b-ae45-279b02e4b3e6]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 2048.636591] env[67169]: ERROR nova.compute.manager [instance: 04d3ae51-f3f1-427b-ae45-279b02e4b3e6]     vm_util.copy_virtual_disk(
[ 2048.636591] env[67169]: ERROR nova.compute.manager [instance: 04d3ae51-f3f1-427b-ae45-279b02e4b3e6]   File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 2048.636591] env[67169]: ERROR nova.compute.manager [instance: 04d3ae51-f3f1-427b-ae45-279b02e4b3e6]     session._wait_for_task(vmdk_copy_task)
[ 2048.636591] env[67169]: ERROR nova.compute.manager [instance: 04d3ae51-f3f1-427b-ae45-279b02e4b3e6]   File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 2048.636591] env[67169]: ERROR nova.compute.manager [instance: 04d3ae51-f3f1-427b-ae45-279b02e4b3e6]     return self.wait_for_task(task_ref)
[ 2048.636591] env[67169]: ERROR nova.compute.manager [instance: 04d3ae51-f3f1-427b-ae45-279b02e4b3e6]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 2048.636591] env[67169]: ERROR nova.compute.manager [instance: 04d3ae51-f3f1-427b-ae45-279b02e4b3e6]     return evt.wait()
[ 2048.636591] env[67169]: ERROR nova.compute.manager [instance: 04d3ae51-f3f1-427b-ae45-279b02e4b3e6]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 2048.636591] env[67169]: ERROR nova.compute.manager [instance: 04d3ae51-f3f1-427b-ae45-279b02e4b3e6]     result = hub.switch()
[ 2048.636591] env[67169]: ERROR nova.compute.manager [instance: 04d3ae51-f3f1-427b-ae45-279b02e4b3e6]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 2048.636591] env[67169]: ERROR nova.compute.manager [instance:
04d3ae51-f3f1-427b-ae45-279b02e4b3e6] return self.greenlet.switch() [ 2048.636591] env[67169]: ERROR nova.compute.manager [instance: 04d3ae51-f3f1-427b-ae45-279b02e4b3e6] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2048.636591] env[67169]: ERROR nova.compute.manager [instance: 04d3ae51-f3f1-427b-ae45-279b02e4b3e6] self.f(*self.args, **self.kw) [ 2048.636591] env[67169]: ERROR nova.compute.manager [instance: 04d3ae51-f3f1-427b-ae45-279b02e4b3e6] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2048.636591] env[67169]: ERROR nova.compute.manager [instance: 04d3ae51-f3f1-427b-ae45-279b02e4b3e6] raise exceptions.translate_fault(task_info.error) [ 2048.636591] env[67169]: ERROR nova.compute.manager [instance: 04d3ae51-f3f1-427b-ae45-279b02e4b3e6] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2048.636591] env[67169]: ERROR nova.compute.manager [instance: 04d3ae51-f3f1-427b-ae45-279b02e4b3e6] Faults: ['InvalidArgument'] [ 2048.636591] env[67169]: ERROR nova.compute.manager [instance: 04d3ae51-f3f1-427b-ae45-279b02e4b3e6] [ 2048.637491] env[67169]: DEBUG nova.compute.utils [None req-dad53a36-9ec2-447b-b9da-6ae8c0b39c52 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 04d3ae51-f3f1-427b-ae45-279b02e4b3e6] VimFaultException {{(pid=67169) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2048.638633] env[67169]: DEBUG nova.compute.manager [None req-dad53a36-9ec2-447b-b9da-6ae8c0b39c52 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 04d3ae51-f3f1-427b-ae45-279b02e4b3e6] Build of instance 04d3ae51-f3f1-427b-ae45-279b02e4b3e6 was re-scheduled: A specified parameter was not correct: fileType [ 2048.638633] env[67169]: Faults: ['InvalidArgument'] {{(pid=67169) _do_build_and_run_instance 
/opt/stack/nova/nova/compute/manager.py:2454}} [ 2048.639018] env[67169]: DEBUG nova.compute.manager [None req-dad53a36-9ec2-447b-b9da-6ae8c0b39c52 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 04d3ae51-f3f1-427b-ae45-279b02e4b3e6] Unplugging VIFs for instance {{(pid=67169) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 2048.639211] env[67169]: DEBUG nova.compute.manager [None req-dad53a36-9ec2-447b-b9da-6ae8c0b39c52 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=67169) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 2048.639390] env[67169]: DEBUG nova.compute.manager [None req-dad53a36-9ec2-447b-b9da-6ae8c0b39c52 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 04d3ae51-f3f1-427b-ae45-279b02e4b3e6] Deallocating network for instance {{(pid=67169) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2048.639554] env[67169]: DEBUG nova.network.neutron [None req-dad53a36-9ec2-447b-b9da-6ae8c0b39c52 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 04d3ae51-f3f1-427b-ae45-279b02e4b3e6] deallocate_for_instance() {{(pid=67169) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2048.653159] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2048.936046] env[67169]: DEBUG nova.network.neutron [None req-dad53a36-9ec2-447b-b9da-6ae8c0b39c52 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 
04d3ae51-f3f1-427b-ae45-279b02e4b3e6] Updating instance_info_cache with network_info: [] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2048.947032] env[67169]: INFO nova.compute.manager [None req-dad53a36-9ec2-447b-b9da-6ae8c0b39c52 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 04d3ae51-f3f1-427b-ae45-279b02e4b3e6] Took 0.31 seconds to deallocate network for instance. [ 2049.035720] env[67169]: INFO nova.scheduler.client.report [None req-dad53a36-9ec2-447b-b9da-6ae8c0b39c52 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Deleted allocations for instance 04d3ae51-f3f1-427b-ae45-279b02e4b3e6 [ 2049.064197] env[67169]: DEBUG oslo_concurrency.lockutils [None req-dad53a36-9ec2-447b-b9da-6ae8c0b39c52 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Lock "04d3ae51-f3f1-427b-ae45-279b02e4b3e6" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 630.436s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2049.064528] env[67169]: DEBUG oslo_concurrency.lockutils [None req-00955648-a283-4a29-b129-a3849ea10a3d tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Lock "04d3ae51-f3f1-427b-ae45-279b02e4b3e6" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 433.616s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2049.064687] env[67169]: DEBUG oslo_concurrency.lockutils [None req-00955648-a283-4a29-b129-a3849ea10a3d tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Acquiring lock "04d3ae51-f3f1-427b-ae45-279b02e4b3e6-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" 
{{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2049.064913] env[67169]: DEBUG oslo_concurrency.lockutils [None req-00955648-a283-4a29-b129-a3849ea10a3d tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Lock "04d3ae51-f3f1-427b-ae45-279b02e4b3e6-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2049.065076] env[67169]: DEBUG oslo_concurrency.lockutils [None req-00955648-a283-4a29-b129-a3849ea10a3d tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Lock "04d3ae51-f3f1-427b-ae45-279b02e4b3e6-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2049.068755] env[67169]: INFO nova.compute.manager [None req-00955648-a283-4a29-b129-a3849ea10a3d tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 04d3ae51-f3f1-427b-ae45-279b02e4b3e6] Terminating instance [ 2049.071430] env[67169]: DEBUG nova.compute.manager [None req-00955648-a283-4a29-b129-a3849ea10a3d tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 04d3ae51-f3f1-427b-ae45-279b02e4b3e6] Start destroying the instance on the hypervisor. 
{{(pid=67169) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2049.071650] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-00955648-a283-4a29-b129-a3849ea10a3d tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 04d3ae51-f3f1-427b-ae45-279b02e4b3e6] Destroying instance {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2049.072187] env[67169]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-ae9ac052-802a-458f-8973-1e45ef1d82f9 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2049.082228] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-17e749dd-8dab-48fc-81f2-88262b5ea957 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2049.113058] env[67169]: WARNING nova.virt.vmwareapi.vmops [None req-00955648-a283-4a29-b129-a3849ea10a3d tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 04d3ae51-f3f1-427b-ae45-279b02e4b3e6] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 04d3ae51-f3f1-427b-ae45-279b02e4b3e6 could not be found. [ 2049.113313] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-00955648-a283-4a29-b129-a3849ea10a3d tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 04d3ae51-f3f1-427b-ae45-279b02e4b3e6] Instance destroyed {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2049.113520] env[67169]: INFO nova.compute.manager [None req-00955648-a283-4a29-b129-a3849ea10a3d tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 04d3ae51-f3f1-427b-ae45-279b02e4b3e6] Took 0.04 seconds to destroy the instance on the hypervisor. 
[ 2049.113771] env[67169]: DEBUG oslo.service.loopingcall [None req-00955648-a283-4a29-b129-a3849ea10a3d tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=67169) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2049.114028] env[67169]: DEBUG nova.compute.manager [-] [instance: 04d3ae51-f3f1-427b-ae45-279b02e4b3e6] Deallocating network for instance {{(pid=67169) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2049.114329] env[67169]: DEBUG nova.network.neutron [-] [instance: 04d3ae51-f3f1-427b-ae45-279b02e4b3e6] deallocate_for_instance() {{(pid=67169) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2049.139101] env[67169]: DEBUG nova.network.neutron [-] [instance: 04d3ae51-f3f1-427b-ae45-279b02e4b3e6] Updating instance_info_cache with network_info: [] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2049.147439] env[67169]: INFO nova.compute.manager [-] [instance: 04d3ae51-f3f1-427b-ae45-279b02e4b3e6] Took 0.03 seconds to deallocate network for instance. 
[ 2049.251826] env[67169]: DEBUG oslo_concurrency.lockutils [None req-00955648-a283-4a29-b129-a3849ea10a3d tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Lock "04d3ae51-f3f1-427b-ae45-279b02e4b3e6" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.187s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2049.252658] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "04d3ae51-f3f1-427b-ae45-279b02e4b3e6" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 245.566s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2049.252844] env[67169]: INFO nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 04d3ae51-f3f1-427b-ae45-279b02e4b3e6] During sync_power_state the instance has a pending task (deleting). Skip. 
[ 2049.253030] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "04d3ae51-f3f1-427b-ae45-279b02e4b3e6" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2049.658976] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2050.659449] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2051.659453] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2051.671396] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2051.671686] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2051.671901] 
env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2051.672103] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67169) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 2051.673251] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fb6d5b91-9d93-4527-94e4-00345e3f9be1 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2051.682118] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5b672438-9cc0-4b72-9efd-84e347ab19a8 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2051.696775] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-574adb51-aa7f-4845-be4b-c68f3020c296 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2051.703468] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c9e540e0-0240-450c-a80a-f57ed11d3f09 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2051.733223] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181039MB free_disk=171GB 
free_vcpus=48 pci_devices=None {{(pid=67169) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 2051.733439] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2051.733585] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2051.799640] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance aedbfde6-26e1-410d-a311-e2c344f65062 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2051.799798] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance c05c3ec2-a68d-41b0-a199-fcfc84bb2deb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2051.799930] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 9435574d-2128-4b20-ba92-ee2aba37d33b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2051.800077] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 6663b166-0d24-45a7-8c2c-e4e68dbe0005 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2051.800200] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 115d6c00-4259-4e87-aa00-90b576a63535 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2051.800319] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 220daf5b-b4fd-49b0-9098-c1f846d6e552 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2051.800437] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 68b94a43-eaa5-4023-8bf5-8cc647c2f098 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2051.800552] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 1ece950d-8b7f-4462-8138-10cbf43149ee actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2051.800746] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Total usable vcpus: 48, total allocated vcpus: 8 {{(pid=67169) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 2051.800882] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1536MB phys_disk=200GB used_disk=8GB total_vcpus=48 used_vcpus=8 pci_stats=[] {{(pid=67169) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 2051.920058] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dcd82c8e-e372-46d6-855f-a75a8b03a9f5 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2051.928342] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0bb71e2e-998b-4d22-9f7f-289a7184100c {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2051.960770] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c6ff8c87-b2bb-4bb8-a958-1fd71530022f {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} 
[ 2051.968589] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9f2e9b78-0627-47ec-a089-2c64a797effb {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2051.982369] env[67169]: DEBUG nova.compute.provider_tree [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Inventory has not changed in ProviderTree for provider: 6570906a-ac37-4859-b1f2-4bbacc48d3f3 {{(pid=67169) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2051.991713] env[67169]: DEBUG nova.scheduler.client.report [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Inventory has not changed for provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 171, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67169) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2052.005897] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67169) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 2052.006136] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.272s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2054.659068] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 
None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2054.659518] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Cleaning up deleted instances {{(pid=67169) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11198}} [ 2054.669088] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] There are 0 instances to clean {{(pid=67169) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11207}} [ 2057.659491] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2057.659890] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Cleaning up deleted instances with incomplete migration {{(pid=67169) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11236}} [ 2097.237640] env[67169]: WARNING oslo_vmware.rw_handles [None req-1249c8a0-343c-4729-8066-ecc802e7f711 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2097.237640] env[67169]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2097.237640] env[67169]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2097.237640] env[67169]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2097.237640] env[67169]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 
2097.237640] env[67169]: ERROR oslo_vmware.rw_handles response.begin() [ 2097.237640] env[67169]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2097.237640] env[67169]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2097.237640] env[67169]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2097.237640] env[67169]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2097.237640] env[67169]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2097.237640] env[67169]: ERROR oslo_vmware.rw_handles [ 2097.238473] env[67169]: DEBUG nova.virt.vmwareapi.images [None req-1249c8a0-343c-4729-8066-ecc802e7f711 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] [instance: aedbfde6-26e1-410d-a311-e2c344f65062] Downloaded image file data 285931c9-8b83-4997-8c4d-6a79005e36ba to vmware_temp/5868fdc6-4b57-4e0a-ad81-e1b721262127/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk on the data store datastore2 {{(pid=67169) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2097.240261] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-1249c8a0-343c-4729-8066-ecc802e7f711 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] [instance: aedbfde6-26e1-410d-a311-e2c344f65062] Caching image {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2097.240539] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [None req-1249c8a0-343c-4729-8066-ecc802e7f711 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Copying Virtual Disk [datastore2] vmware_temp/5868fdc6-4b57-4e0a-ad81-e1b721262127/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk to [datastore2] 
vmware_temp/5868fdc6-4b57-4e0a-ad81-e1b721262127/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk {{(pid=67169) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2097.240840] env[67169]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-f2adc0cf-5e14-410d-a9b0-6b4f22fb98ba {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2097.249181] env[67169]: DEBUG oslo_vmware.api [None req-1249c8a0-343c-4729-8066-ecc802e7f711 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Waiting for the task: (returnval){ [ 2097.249181] env[67169]: value = "task-2819263" [ 2097.249181] env[67169]: _type = "Task" [ 2097.249181] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2097.258600] env[67169]: DEBUG oslo_vmware.api [None req-1249c8a0-343c-4729-8066-ecc802e7f711 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Task: {'id': task-2819263, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2097.759908] env[67169]: DEBUG oslo_vmware.exceptions [None req-1249c8a0-343c-4729-8066-ecc802e7f711 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Fault InvalidArgument not matched. 
{{(pid=67169) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}}
[ 2097.760221] env[67169]: DEBUG oslo_concurrency.lockutils [None req-1249c8a0-343c-4729-8066-ecc802e7f711 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Releasing lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 2097.760778] env[67169]: ERROR nova.compute.manager [None req-1249c8a0-343c-4729-8066-ecc802e7f711 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] [instance: aedbfde6-26e1-410d-a311-e2c344f65062] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 2097.760778] env[67169]: Faults: ['InvalidArgument']
[ 2097.760778] env[67169]: ERROR nova.compute.manager [instance: aedbfde6-26e1-410d-a311-e2c344f65062] Traceback (most recent call last):
[ 2097.760778] env[67169]: ERROR nova.compute.manager [instance: aedbfde6-26e1-410d-a311-e2c344f65062] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources
[ 2097.760778] env[67169]: ERROR nova.compute.manager [instance: aedbfde6-26e1-410d-a311-e2c344f65062] yield resources
[ 2097.760778] env[67169]: ERROR nova.compute.manager [instance: aedbfde6-26e1-410d-a311-e2c344f65062] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 2097.760778] env[67169]: ERROR nova.compute.manager [instance: aedbfde6-26e1-410d-a311-e2c344f65062] self.driver.spawn(context, instance, image_meta,
[ 2097.760778] env[67169]: ERROR nova.compute.manager [instance: aedbfde6-26e1-410d-a311-e2c344f65062] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 2097.760778] env[67169]: ERROR nova.compute.manager [instance: aedbfde6-26e1-410d-a311-e2c344f65062] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 2097.760778] env[67169]: ERROR nova.compute.manager [instance: aedbfde6-26e1-410d-a311-e2c344f65062] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 2097.760778] env[67169]: ERROR nova.compute.manager [instance: aedbfde6-26e1-410d-a311-e2c344f65062] self._fetch_image_if_missing(context, vi)
[ 2097.760778] env[67169]: ERROR nova.compute.manager [instance: aedbfde6-26e1-410d-a311-e2c344f65062] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 2097.760778] env[67169]: ERROR nova.compute.manager [instance: aedbfde6-26e1-410d-a311-e2c344f65062] image_cache(vi, tmp_image_ds_loc)
[ 2097.760778] env[67169]: ERROR nova.compute.manager [instance: aedbfde6-26e1-410d-a311-e2c344f65062] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 2097.760778] env[67169]: ERROR nova.compute.manager [instance: aedbfde6-26e1-410d-a311-e2c344f65062] vm_util.copy_virtual_disk(
[ 2097.760778] env[67169]: ERROR nova.compute.manager [instance: aedbfde6-26e1-410d-a311-e2c344f65062] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 2097.760778] env[67169]: ERROR nova.compute.manager [instance: aedbfde6-26e1-410d-a311-e2c344f65062] session._wait_for_task(vmdk_copy_task)
[ 2097.760778] env[67169]: ERROR nova.compute.manager [instance: aedbfde6-26e1-410d-a311-e2c344f65062] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 2097.760778] env[67169]: ERROR nova.compute.manager [instance: aedbfde6-26e1-410d-a311-e2c344f65062] return self.wait_for_task(task_ref)
[ 2097.760778] env[67169]: ERROR nova.compute.manager [instance: aedbfde6-26e1-410d-a311-e2c344f65062] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 2097.760778] env[67169]: ERROR nova.compute.manager [instance: aedbfde6-26e1-410d-a311-e2c344f65062] return evt.wait()
[ 2097.760778] env[67169]: ERROR nova.compute.manager [instance: aedbfde6-26e1-410d-a311-e2c344f65062] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 2097.760778] env[67169]: ERROR nova.compute.manager [instance: aedbfde6-26e1-410d-a311-e2c344f65062] result = hub.switch()
[ 2097.760778] env[67169]: ERROR nova.compute.manager [instance: aedbfde6-26e1-410d-a311-e2c344f65062] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 2097.760778] env[67169]: ERROR nova.compute.manager [instance: aedbfde6-26e1-410d-a311-e2c344f65062] return self.greenlet.switch()
[ 2097.760778] env[67169]: ERROR nova.compute.manager [instance: aedbfde6-26e1-410d-a311-e2c344f65062] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 2097.760778] env[67169]: ERROR nova.compute.manager [instance: aedbfde6-26e1-410d-a311-e2c344f65062] self.f(*self.args, **self.kw)
[ 2097.760778] env[67169]: ERROR nova.compute.manager [instance: aedbfde6-26e1-410d-a311-e2c344f65062] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 2097.760778] env[67169]: ERROR nova.compute.manager [instance: aedbfde6-26e1-410d-a311-e2c344f65062] raise exceptions.translate_fault(task_info.error)
[ 2097.760778] env[67169]: ERROR nova.compute.manager [instance: aedbfde6-26e1-410d-a311-e2c344f65062] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 2097.760778] env[67169]: ERROR nova.compute.manager [instance: aedbfde6-26e1-410d-a311-e2c344f65062] Faults: ['InvalidArgument']
[ 2097.760778] env[67169]: ERROR nova.compute.manager [instance: aedbfde6-26e1-410d-a311-e2c344f65062]
[ 2097.762028] env[67169]: INFO nova.compute.manager [None req-1249c8a0-343c-4729-8066-ecc802e7f711 tempest-MultipleCreateTestJSON-80480106
tempest-MultipleCreateTestJSON-80480106-project-member] [instance: aedbfde6-26e1-410d-a311-e2c344f65062] Terminating instance [ 2097.762623] env[67169]: DEBUG oslo_concurrency.lockutils [None req-4a94fa74-f862-4bac-971a-1a822064e0a7 tempest-ServersNegativeTestMultiTenantJSON-34809039 tempest-ServersNegativeTestMultiTenantJSON-34809039-project-member] Acquired lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2097.762834] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-4a94fa74-f862-4bac-971a-1a822064e0a7 tempest-ServersNegativeTestMultiTenantJSON-34809039 tempest-ServersNegativeTestMultiTenantJSON-34809039-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2097.763091] env[67169]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-a04b8add-1654-489a-8f95-8f4a378f2fbb {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2097.765315] env[67169]: DEBUG nova.compute.manager [None req-1249c8a0-343c-4729-8066-ecc802e7f711 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] [instance: aedbfde6-26e1-410d-a311-e2c344f65062] Start destroying the instance on the hypervisor. 
{{(pid=67169) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2097.765509] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-1249c8a0-343c-4729-8066-ecc802e7f711 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] [instance: aedbfde6-26e1-410d-a311-e2c344f65062] Destroying instance {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2097.766231] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6413f8e9-7c4e-4164-91a9-8b930b6cb6e8 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2097.773277] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-1249c8a0-343c-4729-8066-ecc802e7f711 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] [instance: aedbfde6-26e1-410d-a311-e2c344f65062] Unregistering the VM {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2097.773499] env[67169]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-9cf7b5ce-c1e3-4405-a6bf-323553bb5def {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2097.775545] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-4a94fa74-f862-4bac-971a-1a822064e0a7 tempest-ServersNegativeTestMultiTenantJSON-34809039 tempest-ServersNegativeTestMultiTenantJSON-34809039-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2097.775720] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-4a94fa74-f862-4bac-971a-1a822064e0a7 tempest-ServersNegativeTestMultiTenantJSON-34809039 tempest-ServersNegativeTestMultiTenantJSON-34809039-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=67169) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2097.776666] env[67169]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-5b65b7fe-0493-4ae7-97b7-c89546aac140 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2097.781083] env[67169]: DEBUG oslo_vmware.api [None req-4a94fa74-f862-4bac-971a-1a822064e0a7 tempest-ServersNegativeTestMultiTenantJSON-34809039 tempest-ServersNegativeTestMultiTenantJSON-34809039-project-member] Waiting for the task: (returnval){ [ 2097.781083] env[67169]: value = "session[52518c09-e5f2-035d-4dc1-3e29ddf94015]5216b085-2f30-7065-6bb7-d92f1fd2d4ab" [ 2097.781083] env[67169]: _type = "Task" [ 2097.781083] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2097.794708] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-4a94fa74-f862-4bac-971a-1a822064e0a7 tempest-ServersNegativeTestMultiTenantJSON-34809039 tempest-ServersNegativeTestMultiTenantJSON-34809039-project-member] [instance: c05c3ec2-a68d-41b0-a199-fcfc84bb2deb] Preparing fetch location {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2097.794934] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-4a94fa74-f862-4bac-971a-1a822064e0a7 tempest-ServersNegativeTestMultiTenantJSON-34809039 tempest-ServersNegativeTestMultiTenantJSON-34809039-project-member] Creating directory with path [datastore2] vmware_temp/d1cb0960-bfd3-4e9a-ab0e-c5e8cb669037/285931c9-8b83-4997-8c4d-6a79005e36ba {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2097.795185] env[67169]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-17e3a99b-3976-4ed5-8383-aaa30b017032 {{(pid=67169) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2097.814266] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-4a94fa74-f862-4bac-971a-1a822064e0a7 tempest-ServersNegativeTestMultiTenantJSON-34809039 tempest-ServersNegativeTestMultiTenantJSON-34809039-project-member] Created directory with path [datastore2] vmware_temp/d1cb0960-bfd3-4e9a-ab0e-c5e8cb669037/285931c9-8b83-4997-8c4d-6a79005e36ba {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2097.814467] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-4a94fa74-f862-4bac-971a-1a822064e0a7 tempest-ServersNegativeTestMultiTenantJSON-34809039 tempest-ServersNegativeTestMultiTenantJSON-34809039-project-member] [instance: c05c3ec2-a68d-41b0-a199-fcfc84bb2deb] Fetch image to [datastore2] vmware_temp/d1cb0960-bfd3-4e9a-ab0e-c5e8cb669037/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2097.814641] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-4a94fa74-f862-4bac-971a-1a822064e0a7 tempest-ServersNegativeTestMultiTenantJSON-34809039 tempest-ServersNegativeTestMultiTenantJSON-34809039-project-member] [instance: c05c3ec2-a68d-41b0-a199-fcfc84bb2deb] Downloading image file data 285931c9-8b83-4997-8c4d-6a79005e36ba to [datastore2] vmware_temp/d1cb0960-bfd3-4e9a-ab0e-c5e8cb669037/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk on the data store datastore2 {{(pid=67169) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2097.815447] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-239a67b5-8332-4c12-b8e0-66149f4ad351 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2097.822304] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-9cf1c595-81dc-437a-a9c9-0ee90ba34d68 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2097.831332] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-49a4e48d-3714-4a23-8127-ae83fb2f03de {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2097.862457] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-34496cc8-b5fe-42de-80b8-78b69f4cbea2 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2097.865008] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-1249c8a0-343c-4729-8066-ecc802e7f711 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] [instance: aedbfde6-26e1-410d-a311-e2c344f65062] Unregistered the VM {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2097.865217] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-1249c8a0-343c-4729-8066-ecc802e7f711 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] [instance: aedbfde6-26e1-410d-a311-e2c344f65062] Deleting contents of the VM from datastore datastore2 {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2097.865405] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-1249c8a0-343c-4729-8066-ecc802e7f711 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Deleting the datastore file [datastore2] aedbfde6-26e1-410d-a311-e2c344f65062 {{(pid=67169) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2097.865630] env[67169]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-3f9ec69c-6ee3-4d9e-b6d7-c504bcc078f8 {{(pid=67169) 
request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2097.871796] env[67169]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-422004b2-0ba1-4da2-8779-8772901b0d40 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2097.873496] env[67169]: DEBUG oslo_vmware.api [None req-1249c8a0-343c-4729-8066-ecc802e7f711 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Waiting for the task: (returnval){ [ 2097.873496] env[67169]: value = "task-2819265" [ 2097.873496] env[67169]: _type = "Task" [ 2097.873496] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2097.881282] env[67169]: DEBUG oslo_vmware.api [None req-1249c8a0-343c-4729-8066-ecc802e7f711 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Task: {'id': task-2819265, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2097.897463] env[67169]: DEBUG nova.virt.vmwareapi.images [None req-4a94fa74-f862-4bac-971a-1a822064e0a7 tempest-ServersNegativeTestMultiTenantJSON-34809039 tempest-ServersNegativeTestMultiTenantJSON-34809039-project-member] [instance: c05c3ec2-a68d-41b0-a199-fcfc84bb2deb] Downloading image file data 285931c9-8b83-4997-8c4d-6a79005e36ba to the data store datastore2 {{(pid=67169) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2097.949322] env[67169]: DEBUG oslo_vmware.rw_handles [None req-4a94fa74-f862-4bac-971a-1a822064e0a7 tempest-ServersNegativeTestMultiTenantJSON-34809039 tempest-ServersNegativeTestMultiTenantJSON-34809039-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/d1cb0960-bfd3-4e9a-ab0e-c5e8cb669037/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=67169) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2098.007489] env[67169]: DEBUG oslo_vmware.rw_handles [None req-4a94fa74-f862-4bac-971a-1a822064e0a7 tempest-ServersNegativeTestMultiTenantJSON-34809039 tempest-ServersNegativeTestMultiTenantJSON-34809039-project-member] Completed reading data from the image iterator. {{(pid=67169) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2098.007659] env[67169]: DEBUG oslo_vmware.rw_handles [None req-4a94fa74-f862-4bac-971a-1a822064e0a7 tempest-ServersNegativeTestMultiTenantJSON-34809039 tempest-ServersNegativeTestMultiTenantJSON-34809039-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/d1cb0960-bfd3-4e9a-ab0e-c5e8cb669037/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=67169) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2098.384344] env[67169]: DEBUG oslo_vmware.api [None req-1249c8a0-343c-4729-8066-ecc802e7f711 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Task: {'id': task-2819265, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.066566} completed successfully. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2098.384631] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-1249c8a0-343c-4729-8066-ecc802e7f711 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Deleted the datastore file {{(pid=67169) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2098.384786] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-1249c8a0-343c-4729-8066-ecc802e7f711 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] [instance: aedbfde6-26e1-410d-a311-e2c344f65062] Deleted contents of the VM from datastore datastore2 {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2098.384963] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-1249c8a0-343c-4729-8066-ecc802e7f711 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] [instance: aedbfde6-26e1-410d-a311-e2c344f65062] Instance destroyed {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2098.385151] env[67169]: INFO nova.compute.manager [None req-1249c8a0-343c-4729-8066-ecc802e7f711 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] [instance: aedbfde6-26e1-410d-a311-e2c344f65062] Took 0.62 seconds to destroy the instance on the hypervisor. 
[ 2098.387245] env[67169]: DEBUG nova.compute.claims [None req-1249c8a0-343c-4729-8066-ecc802e7f711 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] [instance: aedbfde6-26e1-410d-a311-e2c344f65062] Aborting claim: {{(pid=67169) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 2098.387419] env[67169]: DEBUG oslo_concurrency.lockutils [None req-1249c8a0-343c-4729-8066-ecc802e7f711 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2098.387635] env[67169]: DEBUG oslo_concurrency.lockutils [None req-1249c8a0-343c-4729-8066-ecc802e7f711 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2098.533766] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ebda9ca9-613c-4d9c-ae35-b2567006ad98 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2098.540861] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-28ba6bd0-d9f2-4b16-9c3b-aace6d9a1133 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2098.570135] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a0733e11-b763-4c13-9b34-939a27796710 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 
2098.576611] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8d309815-0bc5-4fdb-a354-92ca803067c3 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2098.588927] env[67169]: DEBUG nova.compute.provider_tree [None req-1249c8a0-343c-4729-8066-ecc802e7f711 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Inventory has not changed in ProviderTree for provider: 6570906a-ac37-4859-b1f2-4bbacc48d3f3 {{(pid=67169) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2098.597386] env[67169]: DEBUG nova.scheduler.client.report [None req-1249c8a0-343c-4729-8066-ecc802e7f711 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Inventory has not changed for provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 171, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67169) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2098.610492] env[67169]: DEBUG oslo_concurrency.lockutils [None req-1249c8a0-343c-4729-8066-ecc802e7f711 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.223s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2098.611037] env[67169]: ERROR nova.compute.manager [None req-1249c8a0-343c-4729-8066-ecc802e7f711 tempest-MultipleCreateTestJSON-80480106 
tempest-MultipleCreateTestJSON-80480106-project-member] [instance: aedbfde6-26e1-410d-a311-e2c344f65062] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 2098.611037] env[67169]: Faults: ['InvalidArgument']
[ 2098.611037] env[67169]: ERROR nova.compute.manager [instance: aedbfde6-26e1-410d-a311-e2c344f65062] Traceback (most recent call last):
[ 2098.611037] env[67169]: ERROR nova.compute.manager [instance: aedbfde6-26e1-410d-a311-e2c344f65062] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 2098.611037] env[67169]: ERROR nova.compute.manager [instance: aedbfde6-26e1-410d-a311-e2c344f65062] self.driver.spawn(context, instance, image_meta,
[ 2098.611037] env[67169]: ERROR nova.compute.manager [instance: aedbfde6-26e1-410d-a311-e2c344f65062] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 2098.611037] env[67169]: ERROR nova.compute.manager [instance: aedbfde6-26e1-410d-a311-e2c344f65062] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 2098.611037] env[67169]: ERROR nova.compute.manager [instance: aedbfde6-26e1-410d-a311-e2c344f65062] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 2098.611037] env[67169]: ERROR nova.compute.manager [instance: aedbfde6-26e1-410d-a311-e2c344f65062] self._fetch_image_if_missing(context, vi)
[ 2098.611037] env[67169]: ERROR nova.compute.manager [instance: aedbfde6-26e1-410d-a311-e2c344f65062] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 2098.611037] env[67169]: ERROR nova.compute.manager [instance: aedbfde6-26e1-410d-a311-e2c344f65062] image_cache(vi, tmp_image_ds_loc)
[ 2098.611037] env[67169]: ERROR nova.compute.manager [instance: aedbfde6-26e1-410d-a311-e2c344f65062] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 2098.611037] env[67169]: ERROR nova.compute.manager [instance: aedbfde6-26e1-410d-a311-e2c344f65062] vm_util.copy_virtual_disk(
[ 2098.611037] env[67169]: ERROR nova.compute.manager [instance: aedbfde6-26e1-410d-a311-e2c344f65062] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 2098.611037] env[67169]: ERROR nova.compute.manager [instance: aedbfde6-26e1-410d-a311-e2c344f65062] session._wait_for_task(vmdk_copy_task)
[ 2098.611037] env[67169]: ERROR nova.compute.manager [instance: aedbfde6-26e1-410d-a311-e2c344f65062] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 2098.611037] env[67169]: ERROR nova.compute.manager [instance: aedbfde6-26e1-410d-a311-e2c344f65062] return self.wait_for_task(task_ref)
[ 2098.611037] env[67169]: ERROR nova.compute.manager [instance: aedbfde6-26e1-410d-a311-e2c344f65062] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 2098.611037] env[67169]: ERROR nova.compute.manager [instance: aedbfde6-26e1-410d-a311-e2c344f65062] return evt.wait()
[ 2098.611037] env[67169]: ERROR nova.compute.manager [instance: aedbfde6-26e1-410d-a311-e2c344f65062] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 2098.611037] env[67169]: ERROR nova.compute.manager [instance: aedbfde6-26e1-410d-a311-e2c344f65062] result = hub.switch()
[ 2098.611037] env[67169]: ERROR nova.compute.manager [instance: aedbfde6-26e1-410d-a311-e2c344f65062] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 2098.611037] env[67169]: ERROR nova.compute.manager [instance: aedbfde6-26e1-410d-a311-e2c344f65062] return self.greenlet.switch()
[ 2098.611037] env[67169]: ERROR nova.compute.manager [instance: aedbfde6-26e1-410d-a311-e2c344f65062] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 2098.611037] env[67169]: ERROR nova.compute.manager [instance: aedbfde6-26e1-410d-a311-e2c344f65062] self.f(*self.args, **self.kw)
[ 2098.611037] env[67169]: ERROR nova.compute.manager [instance: aedbfde6-26e1-410d-a311-e2c344f65062] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 2098.611037] env[67169]: ERROR nova.compute.manager [instance: aedbfde6-26e1-410d-a311-e2c344f65062] raise exceptions.translate_fault(task_info.error)
[ 2098.611037] env[67169]: ERROR nova.compute.manager [instance: aedbfde6-26e1-410d-a311-e2c344f65062] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 2098.611037] env[67169]: ERROR nova.compute.manager [instance: aedbfde6-26e1-410d-a311-e2c344f65062] Faults: ['InvalidArgument']
[ 2098.611037] env[67169]: ERROR nova.compute.manager [instance: aedbfde6-26e1-410d-a311-e2c344f65062]
[ 2098.612139] env[67169]: DEBUG nova.compute.utils [None req-1249c8a0-343c-4729-8066-ecc802e7f711 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] [instance: aedbfde6-26e1-410d-a311-e2c344f65062] VimFaultException {{(pid=67169) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 2098.613465] env[67169]: DEBUG nova.compute.manager [None req-1249c8a0-343c-4729-8066-ecc802e7f711 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] [instance: aedbfde6-26e1-410d-a311-e2c344f65062] Build of instance aedbfde6-26e1-410d-a311-e2c344f65062 was re-scheduled: A specified parameter was not correct: fileType
[ 2098.613465] env[67169]: Faults: ['InvalidArgument'] {{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}}
[ 2098.613867] env[67169]: DEBUG nova.compute.manager [None req-1249c8a0-343c-4729-8066-ecc802e7f711 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] [instance: aedbfde6-26e1-410d-a311-e2c344f65062] Unplugging VIFs for instance {{(pid=67169)
_cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 2098.614056] env[67169]: DEBUG nova.compute.manager [None req-1249c8a0-343c-4729-8066-ecc802e7f711 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=67169) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 2098.614245] env[67169]: DEBUG nova.compute.manager [None req-1249c8a0-343c-4729-8066-ecc802e7f711 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] [instance: aedbfde6-26e1-410d-a311-e2c344f65062] Deallocating network for instance {{(pid=67169) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2098.614411] env[67169]: DEBUG nova.network.neutron [None req-1249c8a0-343c-4729-8066-ecc802e7f711 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] [instance: aedbfde6-26e1-410d-a311-e2c344f65062] deallocate_for_instance() {{(pid=67169) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2098.903419] env[67169]: DEBUG nova.network.neutron [None req-1249c8a0-343c-4729-8066-ecc802e7f711 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] [instance: aedbfde6-26e1-410d-a311-e2c344f65062] Updating instance_info_cache with network_info: [] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2098.917243] env[67169]: INFO nova.compute.manager [None req-1249c8a0-343c-4729-8066-ecc802e7f711 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] [instance: aedbfde6-26e1-410d-a311-e2c344f65062] Took 0.30 seconds to deallocate network for instance. 
[ 2099.006138] env[67169]: INFO nova.scheduler.client.report [None req-1249c8a0-343c-4729-8066-ecc802e7f711 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Deleted allocations for instance aedbfde6-26e1-410d-a311-e2c344f65062 [ 2099.028258] env[67169]: DEBUG oslo_concurrency.lockutils [None req-1249c8a0-343c-4729-8066-ecc802e7f711 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Lock "aedbfde6-26e1-410d-a311-e2c344f65062" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 627.870s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2099.028522] env[67169]: DEBUG oslo_concurrency.lockutils [None req-0ae0cbde-3013-42f9-bb62-e35148824698 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Lock "aedbfde6-26e1-410d-a311-e2c344f65062" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 431.870s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2099.028757] env[67169]: DEBUG oslo_concurrency.lockutils [None req-0ae0cbde-3013-42f9-bb62-e35148824698 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Acquiring lock "aedbfde6-26e1-410d-a311-e2c344f65062-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2099.029030] env[67169]: DEBUG oslo_concurrency.lockutils [None req-0ae0cbde-3013-42f9-bb62-e35148824698 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Lock "aedbfde6-26e1-410d-a311-e2c344f65062-events" acquired by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2099.029230] env[67169]: DEBUG oslo_concurrency.lockutils [None req-0ae0cbde-3013-42f9-bb62-e35148824698 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Lock "aedbfde6-26e1-410d-a311-e2c344f65062-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2099.031170] env[67169]: INFO nova.compute.manager [None req-0ae0cbde-3013-42f9-bb62-e35148824698 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] [instance: aedbfde6-26e1-410d-a311-e2c344f65062] Terminating instance [ 2099.034121] env[67169]: DEBUG nova.compute.manager [None req-0ae0cbde-3013-42f9-bb62-e35148824698 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] [instance: aedbfde6-26e1-410d-a311-e2c344f65062] Start destroying the instance on the hypervisor. 
{{(pid=67169) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2099.034121] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-0ae0cbde-3013-42f9-bb62-e35148824698 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] [instance: aedbfde6-26e1-410d-a311-e2c344f65062] Destroying instance {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2099.034317] env[67169]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-4b5a42dd-791d-4d19-8dec-6c8cf159202a {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2099.044656] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b31477f8-c364-437c-99e3-5da02f0eb53d {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2099.072182] env[67169]: WARNING nova.virt.vmwareapi.vmops [None req-0ae0cbde-3013-42f9-bb62-e35148824698 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] [instance: aedbfde6-26e1-410d-a311-e2c344f65062] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance aedbfde6-26e1-410d-a311-e2c344f65062 could not be found. [ 2099.072441] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-0ae0cbde-3013-42f9-bb62-e35148824698 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] [instance: aedbfde6-26e1-410d-a311-e2c344f65062] Instance destroyed {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2099.072631] env[67169]: INFO nova.compute.manager [None req-0ae0cbde-3013-42f9-bb62-e35148824698 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] [instance: aedbfde6-26e1-410d-a311-e2c344f65062] Took 0.04 seconds to destroy the instance on the hypervisor. 
[ 2099.072884] env[67169]: DEBUG oslo.service.loopingcall [None req-0ae0cbde-3013-42f9-bb62-e35148824698 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=67169) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2099.073143] env[67169]: DEBUG nova.compute.manager [-] [instance: aedbfde6-26e1-410d-a311-e2c344f65062] Deallocating network for instance {{(pid=67169) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2099.073243] env[67169]: DEBUG nova.network.neutron [-] [instance: aedbfde6-26e1-410d-a311-e2c344f65062] deallocate_for_instance() {{(pid=67169) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2099.100535] env[67169]: DEBUG nova.network.neutron [-] [instance: aedbfde6-26e1-410d-a311-e2c344f65062] Updating instance_info_cache with network_info: [] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2099.108581] env[67169]: INFO nova.compute.manager [-] [instance: aedbfde6-26e1-410d-a311-e2c344f65062] Took 0.04 seconds to deallocate network for instance. 
[ 2099.198015] env[67169]: DEBUG oslo_concurrency.lockutils [None req-0ae0cbde-3013-42f9-bb62-e35148824698 tempest-MultipleCreateTestJSON-80480106 tempest-MultipleCreateTestJSON-80480106-project-member] Lock "aedbfde6-26e1-410d-a311-e2c344f65062" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.169s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2099.199172] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "aedbfde6-26e1-410d-a311-e2c344f65062" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 295.512s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2099.199502] env[67169]: INFO nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: aedbfde6-26e1-410d-a311-e2c344f65062] During sync_power_state the instance has a pending task (deleting). Skip. 
[ 2099.199729] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "aedbfde6-26e1-410d-a311-e2c344f65062" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.001s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2099.666628] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2105.660285] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2105.660651] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Starting heal instance info cache {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 2105.660651] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Rebuilding the list of instances to heal {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 2105.681642] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: c05c3ec2-a68d-41b0-a199-fcfc84bb2deb] Skipping network cache update for instance because it is Building. 
{{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2105.681809] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 9435574d-2128-4b20-ba92-ee2aba37d33b] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2105.681941] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 6663b166-0d24-45a7-8c2c-e4e68dbe0005] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2105.682078] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 115d6c00-4259-4e87-aa00-90b576a63535] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2105.682206] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 220daf5b-b4fd-49b0-9098-c1f846d6e552] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2105.682328] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 68b94a43-eaa5-4023-8bf5-8cc647c2f098] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2105.682448] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 1ece950d-8b7f-4462-8138-10cbf43149ee] Skipping network cache update for instance because it is Building. 
{{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2105.682574] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Didn't find any instances for network info cache update. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 2107.347042] env[67169]: DEBUG oslo_concurrency.lockutils [None req-264b471c-0703-4a9d-a389-6d414da98ea7 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Acquiring lock "b554602b-2aae-4c1b-9385-4bef16a1dc5a" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2107.347042] env[67169]: DEBUG oslo_concurrency.lockutils [None req-264b471c-0703-4a9d-a389-6d414da98ea7 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Lock "b554602b-2aae-4c1b-9385-4bef16a1dc5a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2107.356678] env[67169]: DEBUG nova.compute.manager [None req-264b471c-0703-4a9d-a389-6d414da98ea7 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: b554602b-2aae-4c1b-9385-4bef16a1dc5a] Starting instance... 
{{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 2107.435951] env[67169]: DEBUG oslo_concurrency.lockutils [None req-264b471c-0703-4a9d-a389-6d414da98ea7 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2107.436237] env[67169]: DEBUG oslo_concurrency.lockutils [None req-264b471c-0703-4a9d-a389-6d414da98ea7 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2107.437666] env[67169]: INFO nova.compute.claims [None req-264b471c-0703-4a9d-a389-6d414da98ea7 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: b554602b-2aae-4c1b-9385-4bef16a1dc5a] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 2107.494188] env[67169]: DEBUG nova.scheduler.client.report [None req-264b471c-0703-4a9d-a389-6d414da98ea7 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Refreshing inventories for resource provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 {{(pid=67169) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} [ 2107.508115] env[67169]: DEBUG nova.scheduler.client.report [None req-264b471c-0703-4a9d-a389-6d414da98ea7 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Updating ProviderTree inventory for provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 from _refresh_and_get_inventory using data: {'VCPU': {'total': 
48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 171, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67169) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} [ 2107.508343] env[67169]: DEBUG nova.compute.provider_tree [None req-264b471c-0703-4a9d-a389-6d414da98ea7 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Updating inventory in ProviderTree for provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 171, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67169) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 2107.518687] env[67169]: DEBUG nova.scheduler.client.report [None req-264b471c-0703-4a9d-a389-6d414da98ea7 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Refreshing aggregate associations for resource provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3, aggregates: None {{(pid=67169) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} [ 2107.534391] env[67169]: DEBUG nova.scheduler.client.report [None req-264b471c-0703-4a9d-a389-6d414da98ea7 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Refreshing trait associations for resource provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3, traits: 
COMPUTE_NODE,COMPUTE_SAME_HOST_COLD_MIGRATE,COMPUTE_IMAGE_TYPE_VMDK,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_ISO {{(pid=67169) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} [ 2107.628652] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-43e97878-67a5-482f-86da-cce12c740332 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2107.637698] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-98ae1c2f-a079-4786-9ae0-3c507741d731 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2107.666307] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2107.666743] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2107.666941] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=67169) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 2107.667633] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-01769bda-b31a-4f59-bbbf-92eed3f9dd7f {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2107.674426] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2d2abbbe-388e-44d1-b3b3-2db55dcf715b {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2107.687022] env[67169]: DEBUG nova.compute.provider_tree [None req-264b471c-0703-4a9d-a389-6d414da98ea7 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Inventory has not changed in ProviderTree for provider: 6570906a-ac37-4859-b1f2-4bbacc48d3f3 {{(pid=67169) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2107.696014] env[67169]: DEBUG nova.scheduler.client.report [None req-264b471c-0703-4a9d-a389-6d414da98ea7 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Inventory has not changed for provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 171, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67169) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2107.709330] env[67169]: DEBUG oslo_concurrency.lockutils [None req-264b471c-0703-4a9d-a389-6d414da98ea7 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Lock 
"compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.273s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2107.709862] env[67169]: DEBUG nova.compute.manager [None req-264b471c-0703-4a9d-a389-6d414da98ea7 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: b554602b-2aae-4c1b-9385-4bef16a1dc5a] Start building networks asynchronously for instance. {{(pid=67169) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 2107.741766] env[67169]: DEBUG nova.compute.utils [None req-264b471c-0703-4a9d-a389-6d414da98ea7 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Using /dev/sd instead of None {{(pid=67169) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 2107.743602] env[67169]: DEBUG nova.compute.manager [None req-264b471c-0703-4a9d-a389-6d414da98ea7 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: b554602b-2aae-4c1b-9385-4bef16a1dc5a] Allocating IP information in the background. {{(pid=67169) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 2107.743602] env[67169]: DEBUG nova.network.neutron [None req-264b471c-0703-4a9d-a389-6d414da98ea7 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: b554602b-2aae-4c1b-9385-4bef16a1dc5a] allocate_for_instance() {{(pid=67169) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 2107.752088] env[67169]: DEBUG nova.compute.manager [None req-264b471c-0703-4a9d-a389-6d414da98ea7 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: b554602b-2aae-4c1b-9385-4bef16a1dc5a] Start building block device mappings for instance. 
{{(pid=67169) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 2107.814010] env[67169]: DEBUG nova.compute.manager [None req-264b471c-0703-4a9d-a389-6d414da98ea7 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: b554602b-2aae-4c1b-9385-4bef16a1dc5a] Start spawning the instance on the hypervisor. {{(pid=67169) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 2107.826945] env[67169]: DEBUG nova.policy [None req-264b471c-0703-4a9d-a389-6d414da98ea7 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'dc8f12a2682c4b79aabc2f87ed8678e6', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'a5d2ec974f664a3a9407f7f3e86b4982', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67169) authorize /opt/stack/nova/nova/policy.py:203}} [ 2107.839039] env[67169]: DEBUG nova.virt.hardware [None req-264b471c-0703-4a9d-a389-6d414da98ea7 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-03-06T19:55:10Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta 
ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-03-06T19:54:55Z,direct_url=,disk_format='vmdk',id=285931c9-8b83-4997-8c4d-6a79005e36ba,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c31f6504bb73492890b262ff43fdf9bc',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-03-06T19:54:55Z,virtual_size=,visibility=), allow threads: False {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 2107.839268] env[67169]: DEBUG nova.virt.hardware [None req-264b471c-0703-4a9d-a389-6d414da98ea7 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Flavor limits 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 2107.839431] env[67169]: DEBUG nova.virt.hardware [None req-264b471c-0703-4a9d-a389-6d414da98ea7 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Image limits 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 2107.839615] env[67169]: DEBUG nova.virt.hardware [None req-264b471c-0703-4a9d-a389-6d414da98ea7 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Flavor pref 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 2107.839796] env[67169]: DEBUG nova.virt.hardware [None req-264b471c-0703-4a9d-a389-6d414da98ea7 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Image pref 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 2107.839909] env[67169]: DEBUG nova.virt.hardware [None req-264b471c-0703-4a9d-a389-6d414da98ea7 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Chose sockets=0, cores=0, threads=0; 
limits were sockets=65536, cores=65536, threads=65536 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 2107.840130] env[67169]: DEBUG nova.virt.hardware [None req-264b471c-0703-4a9d-a389-6d414da98ea7 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 2107.840293] env[67169]: DEBUG nova.virt.hardware [None req-264b471c-0703-4a9d-a389-6d414da98ea7 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67169) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 2107.840460] env[67169]: DEBUG nova.virt.hardware [None req-264b471c-0703-4a9d-a389-6d414da98ea7 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Got 1 possible topologies {{(pid=67169) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 2107.840620] env[67169]: DEBUG nova.virt.hardware [None req-264b471c-0703-4a9d-a389-6d414da98ea7 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 2107.840789] env[67169]: DEBUG nova.virt.hardware [None req-264b471c-0703-4a9d-a389-6d414da98ea7 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 2107.841647] env[67169]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-efd61629-b2c0-4c3c-a6b6-fd63dc9fa706 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2107.849125] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-95473ac9-9df1-4063-9cc8-7df4e634f8d9 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2108.175862] env[67169]: DEBUG nova.network.neutron [None req-264b471c-0703-4a9d-a389-6d414da98ea7 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: b554602b-2aae-4c1b-9385-4bef16a1dc5a] Successfully created port: 73111163-1b97-4349-8278-c8362ad3be01 {{(pid=67169) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 2108.662362] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2108.706840] env[67169]: DEBUG nova.compute.manager [req-d4d1d09b-77e7-4b96-a49d-ff69a5503883 req-eb8fea94-0c29-4aa9-a7ed-4b8fcd6a5434 service nova] [instance: b554602b-2aae-4c1b-9385-4bef16a1dc5a] Received event network-vif-plugged-73111163-1b97-4349-8278-c8362ad3be01 {{(pid=67169) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 2108.707082] env[67169]: DEBUG oslo_concurrency.lockutils [req-d4d1d09b-77e7-4b96-a49d-ff69a5503883 req-eb8fea94-0c29-4aa9-a7ed-4b8fcd6a5434 service nova] Acquiring lock "b554602b-2aae-4c1b-9385-4bef16a1dc5a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2108.707290] env[67169]: DEBUG oslo_concurrency.lockutils 
[req-d4d1d09b-77e7-4b96-a49d-ff69a5503883 req-eb8fea94-0c29-4aa9-a7ed-4b8fcd6a5434 service nova] Lock "b554602b-2aae-4c1b-9385-4bef16a1dc5a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2108.707460] env[67169]: DEBUG oslo_concurrency.lockutils [req-d4d1d09b-77e7-4b96-a49d-ff69a5503883 req-eb8fea94-0c29-4aa9-a7ed-4b8fcd6a5434 service nova] Lock "b554602b-2aae-4c1b-9385-4bef16a1dc5a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2108.707687] env[67169]: DEBUG nova.compute.manager [req-d4d1d09b-77e7-4b96-a49d-ff69a5503883 req-eb8fea94-0c29-4aa9-a7ed-4b8fcd6a5434 service nova] [instance: b554602b-2aae-4c1b-9385-4bef16a1dc5a] No waiting events found dispatching network-vif-plugged-73111163-1b97-4349-8278-c8362ad3be01 {{(pid=67169) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 2108.707788] env[67169]: WARNING nova.compute.manager [req-d4d1d09b-77e7-4b96-a49d-ff69a5503883 req-eb8fea94-0c29-4aa9-a7ed-4b8fcd6a5434 service nova] [instance: b554602b-2aae-4c1b-9385-4bef16a1dc5a] Received unexpected event network-vif-plugged-73111163-1b97-4349-8278-c8362ad3be01 for instance with vm_state building and task_state spawning. 
[ 2108.839979] env[67169]: DEBUG nova.network.neutron [None req-264b471c-0703-4a9d-a389-6d414da98ea7 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: b554602b-2aae-4c1b-9385-4bef16a1dc5a] Successfully updated port: 73111163-1b97-4349-8278-c8362ad3be01 {{(pid=67169) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 2108.854960] env[67169]: DEBUG oslo_concurrency.lockutils [None req-264b471c-0703-4a9d-a389-6d414da98ea7 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Acquiring lock "refresh_cache-b554602b-2aae-4c1b-9385-4bef16a1dc5a" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2108.855146] env[67169]: DEBUG oslo_concurrency.lockutils [None req-264b471c-0703-4a9d-a389-6d414da98ea7 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Acquired lock "refresh_cache-b554602b-2aae-4c1b-9385-4bef16a1dc5a" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2108.855309] env[67169]: DEBUG nova.network.neutron [None req-264b471c-0703-4a9d-a389-6d414da98ea7 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: b554602b-2aae-4c1b-9385-4bef16a1dc5a] Building network info cache for instance {{(pid=67169) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 2108.906545] env[67169]: DEBUG nova.network.neutron [None req-264b471c-0703-4a9d-a389-6d414da98ea7 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: b554602b-2aae-4c1b-9385-4bef16a1dc5a] Instance cache missing network info. 
{{(pid=67169) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 2109.130501] env[67169]: DEBUG nova.network.neutron [None req-264b471c-0703-4a9d-a389-6d414da98ea7 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: b554602b-2aae-4c1b-9385-4bef16a1dc5a] Updating instance_info_cache with network_info: [{"id": "73111163-1b97-4349-8278-c8362ad3be01", "address": "fa:16:3e:c7:4a:54", "network": {"id": "e1c693aa-d783-44b4-bbb3-c6efc6ccfa95", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1841152718-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "a5d2ec974f664a3a9407f7f3e86b4982", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "56398cc0-e39f-410f-8036-8c2a6870e26f", "external-id": "nsx-vlan-transportzone-612", "segmentation_id": 612, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap73111163-1b", "ovs_interfaceid": "73111163-1b97-4349-8278-c8362ad3be01", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2109.144677] env[67169]: DEBUG oslo_concurrency.lockutils [None req-264b471c-0703-4a9d-a389-6d414da98ea7 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Releasing lock "refresh_cache-b554602b-2aae-4c1b-9385-4bef16a1dc5a" {{(pid=67169) lock 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2109.145067] env[67169]: DEBUG nova.compute.manager [None req-264b471c-0703-4a9d-a389-6d414da98ea7 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: b554602b-2aae-4c1b-9385-4bef16a1dc5a] Instance network_info: |[{"id": "73111163-1b97-4349-8278-c8362ad3be01", "address": "fa:16:3e:c7:4a:54", "network": {"id": "e1c693aa-d783-44b4-bbb3-c6efc6ccfa95", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1841152718-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "a5d2ec974f664a3a9407f7f3e86b4982", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "56398cc0-e39f-410f-8036-8c2a6870e26f", "external-id": "nsx-vlan-transportzone-612", "segmentation_id": 612, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap73111163-1b", "ovs_interfaceid": "73111163-1b97-4349-8278-c8362ad3be01", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67169) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 2109.145753] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-264b471c-0703-4a9d-a389-6d414da98ea7 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: b554602b-2aae-4c1b-9385-4bef16a1dc5a] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:c7:4a:54', 'network_ref': {'type': 
'OpaqueNetwork', 'network-id': '56398cc0-e39f-410f-8036-8c2a6870e26f', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '73111163-1b97-4349-8278-c8362ad3be01', 'vif_model': 'vmxnet3'}] {{(pid=67169) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 2109.155060] env[67169]: DEBUG oslo.service.loopingcall [None req-264b471c-0703-4a9d-a389-6d414da98ea7 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67169) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2109.155060] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: b554602b-2aae-4c1b-9385-4bef16a1dc5a] Creating VM on the ESX host {{(pid=67169) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 2109.155060] env[67169]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-bdee532a-9b21-451b-8566-e801b45044db {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2109.175480] env[67169]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 2109.175480] env[67169]: value = "task-2819266" [ 2109.175480] env[67169]: _type = "Task" [ 2109.175480] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2109.183571] env[67169]: DEBUG oslo_vmware.api [-] Task: {'id': task-2819266, 'name': CreateVM_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2109.686476] env[67169]: DEBUG oslo_vmware.api [-] Task: {'id': task-2819266, 'name': CreateVM_Task, 'duration_secs': 0.280418} completed successfully. 
{{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2109.686881] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: b554602b-2aae-4c1b-9385-4bef16a1dc5a] Created VM on the ESX host {{(pid=67169) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 2109.687340] env[67169]: DEBUG oslo_concurrency.lockutils [None req-264b471c-0703-4a9d-a389-6d414da98ea7 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2109.687521] env[67169]: DEBUG oslo_concurrency.lockutils [None req-264b471c-0703-4a9d-a389-6d414da98ea7 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Acquired lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2109.687846] env[67169]: DEBUG oslo_concurrency.lockutils [None req-264b471c-0703-4a9d-a389-6d414da98ea7 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 2109.688111] env[67169]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-6d9b8638-b6e9-4ea8-ab49-ea0384e4dd62 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2109.692439] env[67169]: DEBUG oslo_vmware.api [None req-264b471c-0703-4a9d-a389-6d414da98ea7 tempest-ServerDiskConfigTestJSON-907081631 
tempest-ServerDiskConfigTestJSON-907081631-project-member] Waiting for the task: (returnval){ [ 2109.692439] env[67169]: value = "session[52518c09-e5f2-035d-4dc1-3e29ddf94015]52f46c69-3d8e-0f13-5f62-30466300e8e2" [ 2109.692439] env[67169]: _type = "Task" [ 2109.692439] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2109.699780] env[67169]: DEBUG oslo_vmware.api [None req-264b471c-0703-4a9d-a389-6d414da98ea7 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Task: {'id': session[52518c09-e5f2-035d-4dc1-3e29ddf94015]52f46c69-3d8e-0f13-5f62-30466300e8e2, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2110.202919] env[67169]: DEBUG oslo_concurrency.lockutils [None req-264b471c-0703-4a9d-a389-6d414da98ea7 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Releasing lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2110.203210] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-264b471c-0703-4a9d-a389-6d414da98ea7 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: b554602b-2aae-4c1b-9385-4bef16a1dc5a] Processing image 285931c9-8b83-4997-8c4d-6a79005e36ba {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 2110.203448] env[67169]: DEBUG oslo_concurrency.lockutils [None req-264b471c-0703-4a9d-a389-6d414da98ea7 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk" 
{{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2110.653588] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2110.658263] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2110.658450] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2110.731612] env[67169]: DEBUG nova.compute.manager [req-efb15ac1-f2c9-43f7-8224-257ef4cdb02b req-4ee0cd2a-a67f-4d70-90c9-592f017f472e service nova] [instance: b554602b-2aae-4c1b-9385-4bef16a1dc5a] Received event network-changed-73111163-1b97-4349-8278-c8362ad3be01 {{(pid=67169) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 2110.731925] env[67169]: DEBUG nova.compute.manager [req-efb15ac1-f2c9-43f7-8224-257ef4cdb02b req-4ee0cd2a-a67f-4d70-90c9-592f017f472e service nova] [instance: b554602b-2aae-4c1b-9385-4bef16a1dc5a] Refreshing instance network info cache due to event network-changed-73111163-1b97-4349-8278-c8362ad3be01. 
{{(pid=67169) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 2110.732121] env[67169]: DEBUG oslo_concurrency.lockutils [req-efb15ac1-f2c9-43f7-8224-257ef4cdb02b req-4ee0cd2a-a67f-4d70-90c9-592f017f472e service nova] Acquiring lock "refresh_cache-b554602b-2aae-4c1b-9385-4bef16a1dc5a" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2110.732184] env[67169]: DEBUG oslo_concurrency.lockutils [req-efb15ac1-f2c9-43f7-8224-257ef4cdb02b req-4ee0cd2a-a67f-4d70-90c9-592f017f472e service nova] Acquired lock "refresh_cache-b554602b-2aae-4c1b-9385-4bef16a1dc5a" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2110.732331] env[67169]: DEBUG nova.network.neutron [req-efb15ac1-f2c9-43f7-8224-257ef4cdb02b req-4ee0cd2a-a67f-4d70-90c9-592f017f472e service nova] [instance: b554602b-2aae-4c1b-9385-4bef16a1dc5a] Refreshing network info cache for port 73111163-1b97-4349-8278-c8362ad3be01 {{(pid=67169) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 2110.990797] env[67169]: DEBUG nova.network.neutron [req-efb15ac1-f2c9-43f7-8224-257ef4cdb02b req-4ee0cd2a-a67f-4d70-90c9-592f017f472e service nova] [instance: b554602b-2aae-4c1b-9385-4bef16a1dc5a] Updated VIF entry in instance network info cache for port 73111163-1b97-4349-8278-c8362ad3be01. 
{{(pid=67169) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 2110.991169] env[67169]: DEBUG nova.network.neutron [req-efb15ac1-f2c9-43f7-8224-257ef4cdb02b req-4ee0cd2a-a67f-4d70-90c9-592f017f472e service nova] [instance: b554602b-2aae-4c1b-9385-4bef16a1dc5a] Updating instance_info_cache with network_info: [{"id": "73111163-1b97-4349-8278-c8362ad3be01", "address": "fa:16:3e:c7:4a:54", "network": {"id": "e1c693aa-d783-44b4-bbb3-c6efc6ccfa95", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1841152718-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "a5d2ec974f664a3a9407f7f3e86b4982", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "56398cc0-e39f-410f-8036-8c2a6870e26f", "external-id": "nsx-vlan-transportzone-612", "segmentation_id": 612, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap73111163-1b", "ovs_interfaceid": "73111163-1b97-4349-8278-c8362ad3be01", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2111.000495] env[67169]: DEBUG oslo_concurrency.lockutils [req-efb15ac1-f2c9-43f7-8224-257ef4cdb02b req-4ee0cd2a-a67f-4d70-90c9-592f017f472e service nova] Releasing lock "refresh_cache-b554602b-2aae-4c1b-9385-4bef16a1dc5a" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2112.659613] env[67169]: DEBUG 
oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2112.671156] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2112.671384] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2112.671550] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2112.671709] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67169) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 2112.672911] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e0072362-581d-49cf-a8cb-82ad826bf8a1 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2112.681798] env[67169]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-170edddb-1711-4f2f-b59f-46fffba68c0c {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2112.697520] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b41e5b31-4f9f-4f70-9bd2-d3771504a289 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2112.704289] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b70413c5-6370-4575-b9d4-e096e69b54f1 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2112.734705] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181003MB free_disk=171GB free_vcpus=48 pci_devices=None {{(pid=67169) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 2112.734880] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2112.735100] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2112.806989] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance c05c3ec2-a68d-41b0-a199-fcfc84bb2deb actively managed 
on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2112.807280] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 9435574d-2128-4b20-ba92-ee2aba37d33b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2112.807425] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 6663b166-0d24-45a7-8c2c-e4e68dbe0005 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2112.807548] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 115d6c00-4259-4e87-aa00-90b576a63535 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2112.807663] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 220daf5b-b4fd-49b0-9098-c1f846d6e552 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2112.807777] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 68b94a43-eaa5-4023-8bf5-8cc647c2f098 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2112.807891] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 1ece950d-8b7f-4462-8138-10cbf43149ee actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2112.808012] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance b554602b-2aae-4c1b-9385-4bef16a1dc5a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2112.808211] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Total usable vcpus: 48, total allocated vcpus: 8 {{(pid=67169) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 2112.808348] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1536MB phys_disk=200GB used_disk=8GB total_vcpus=48 used_vcpus=8 pci_stats=[] {{(pid=67169) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 2112.930904] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a431d974-7bfb-4b10-86df-33ae50f806cf {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2112.939151] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2869e79c-d6a2-407c-adb0-c4ec5c7472ed {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2112.970554] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1e49fbda-db61-42d6-a0c1-faf1291aecbc {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2112.977911] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bf40e9f2-8e32-4fee-bbb1-bd6bfa295c8b {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2112.990901] env[67169]: DEBUG nova.compute.provider_tree [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Inventory has 
not changed in ProviderTree for provider: 6570906a-ac37-4859-b1f2-4bbacc48d3f3 {{(pid=67169) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2112.999641] env[67169]: DEBUG nova.scheduler.client.report [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Inventory has not changed for provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 171, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67169) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2113.014322] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67169) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 2113.014509] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.279s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2114.009260] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2144.575155] env[67169]: WARNING oslo_vmware.rw_handles [None req-4a94fa74-f862-4bac-971a-1a822064e0a7 tempest-ServersNegativeTestMultiTenantJSON-34809039 
tempest-ServersNegativeTestMultiTenantJSON-34809039-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2144.575155] env[67169]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2144.575155] env[67169]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2144.575155] env[67169]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2144.575155] env[67169]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2144.575155] env[67169]: ERROR oslo_vmware.rw_handles response.begin() [ 2144.575155] env[67169]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2144.575155] env[67169]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2144.575155] env[67169]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2144.575155] env[67169]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2144.575155] env[67169]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2144.575155] env[67169]: ERROR oslo_vmware.rw_handles [ 2144.575877] env[67169]: DEBUG nova.virt.vmwareapi.images [None req-4a94fa74-f862-4bac-971a-1a822064e0a7 tempest-ServersNegativeTestMultiTenantJSON-34809039 tempest-ServersNegativeTestMultiTenantJSON-34809039-project-member] [instance: c05c3ec2-a68d-41b0-a199-fcfc84bb2deb] Downloaded image file data 285931c9-8b83-4997-8c4d-6a79005e36ba to vmware_temp/d1cb0960-bfd3-4e9a-ab0e-c5e8cb669037/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk on the data store datastore2 {{(pid=67169) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2144.577873] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None 
req-4a94fa74-f862-4bac-971a-1a822064e0a7 tempest-ServersNegativeTestMultiTenantJSON-34809039 tempest-ServersNegativeTestMultiTenantJSON-34809039-project-member] [instance: c05c3ec2-a68d-41b0-a199-fcfc84bb2deb] Caching image {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2144.578169] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [None req-4a94fa74-f862-4bac-971a-1a822064e0a7 tempest-ServersNegativeTestMultiTenantJSON-34809039 tempest-ServersNegativeTestMultiTenantJSON-34809039-project-member] Copying Virtual Disk [datastore2] vmware_temp/d1cb0960-bfd3-4e9a-ab0e-c5e8cb669037/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk to [datastore2] vmware_temp/d1cb0960-bfd3-4e9a-ab0e-c5e8cb669037/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk {{(pid=67169) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2144.578470] env[67169]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-f9027093-9e46-42a4-a634-3211c546cbee {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2144.587060] env[67169]: DEBUG oslo_vmware.api [None req-4a94fa74-f862-4bac-971a-1a822064e0a7 tempest-ServersNegativeTestMultiTenantJSON-34809039 tempest-ServersNegativeTestMultiTenantJSON-34809039-project-member] Waiting for the task: (returnval){ [ 2144.587060] env[67169]: value = "task-2819267" [ 2144.587060] env[67169]: _type = "Task" [ 2144.587060] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2144.595103] env[67169]: DEBUG oslo_vmware.api [None req-4a94fa74-f862-4bac-971a-1a822064e0a7 tempest-ServersNegativeTestMultiTenantJSON-34809039 tempest-ServersNegativeTestMultiTenantJSON-34809039-project-member] Task: {'id': task-2819267, 'name': CopyVirtualDisk_Task} progress is 0%. 
{{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2145.096944] env[67169]: DEBUG oslo_vmware.exceptions [None req-4a94fa74-f862-4bac-971a-1a822064e0a7 tempest-ServersNegativeTestMultiTenantJSON-34809039 tempest-ServersNegativeTestMultiTenantJSON-34809039-project-member] Fault InvalidArgument not matched. {{(pid=67169) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2145.097259] env[67169]: DEBUG oslo_concurrency.lockutils [None req-4a94fa74-f862-4bac-971a-1a822064e0a7 tempest-ServersNegativeTestMultiTenantJSON-34809039 tempest-ServersNegativeTestMultiTenantJSON-34809039-project-member] Releasing lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2145.097825] env[67169]: ERROR nova.compute.manager [None req-4a94fa74-f862-4bac-971a-1a822064e0a7 tempest-ServersNegativeTestMultiTenantJSON-34809039 tempest-ServersNegativeTestMultiTenantJSON-34809039-project-member] [instance: c05c3ec2-a68d-41b0-a199-fcfc84bb2deb] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2145.097825] env[67169]: Faults: ['InvalidArgument'] [ 2145.097825] env[67169]: ERROR nova.compute.manager [instance: c05c3ec2-a68d-41b0-a199-fcfc84bb2deb] Traceback (most recent call last): [ 2145.097825] env[67169]: ERROR nova.compute.manager [instance: c05c3ec2-a68d-41b0-a199-fcfc84bb2deb] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 2145.097825] env[67169]: ERROR nova.compute.manager [instance: c05c3ec2-a68d-41b0-a199-fcfc84bb2deb] yield resources [ 2145.097825] env[67169]: ERROR nova.compute.manager [instance: c05c3ec2-a68d-41b0-a199-fcfc84bb2deb] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in 
_build_and_run_instance [ 2145.097825] env[67169]: ERROR nova.compute.manager [instance: c05c3ec2-a68d-41b0-a199-fcfc84bb2deb] self.driver.spawn(context, instance, image_meta, [ 2145.097825] env[67169]: ERROR nova.compute.manager [instance: c05c3ec2-a68d-41b0-a199-fcfc84bb2deb] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2145.097825] env[67169]: ERROR nova.compute.manager [instance: c05c3ec2-a68d-41b0-a199-fcfc84bb2deb] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2145.097825] env[67169]: ERROR nova.compute.manager [instance: c05c3ec2-a68d-41b0-a199-fcfc84bb2deb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2145.097825] env[67169]: ERROR nova.compute.manager [instance: c05c3ec2-a68d-41b0-a199-fcfc84bb2deb] self._fetch_image_if_missing(context, vi) [ 2145.097825] env[67169]: ERROR nova.compute.manager [instance: c05c3ec2-a68d-41b0-a199-fcfc84bb2deb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2145.097825] env[67169]: ERROR nova.compute.manager [instance: c05c3ec2-a68d-41b0-a199-fcfc84bb2deb] image_cache(vi, tmp_image_ds_loc) [ 2145.097825] env[67169]: ERROR nova.compute.manager [instance: c05c3ec2-a68d-41b0-a199-fcfc84bb2deb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2145.097825] env[67169]: ERROR nova.compute.manager [instance: c05c3ec2-a68d-41b0-a199-fcfc84bb2deb] vm_util.copy_virtual_disk( [ 2145.097825] env[67169]: ERROR nova.compute.manager [instance: c05c3ec2-a68d-41b0-a199-fcfc84bb2deb] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2145.097825] env[67169]: ERROR nova.compute.manager [instance: c05c3ec2-a68d-41b0-a199-fcfc84bb2deb] session._wait_for_task(vmdk_copy_task) [ 2145.097825] env[67169]: ERROR nova.compute.manager [instance: c05c3ec2-a68d-41b0-a199-fcfc84bb2deb] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in 
_wait_for_task [ 2145.097825] env[67169]: ERROR nova.compute.manager [instance: c05c3ec2-a68d-41b0-a199-fcfc84bb2deb] return self.wait_for_task(task_ref) [ 2145.097825] env[67169]: ERROR nova.compute.manager [instance: c05c3ec2-a68d-41b0-a199-fcfc84bb2deb] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2145.097825] env[67169]: ERROR nova.compute.manager [instance: c05c3ec2-a68d-41b0-a199-fcfc84bb2deb] return evt.wait() [ 2145.097825] env[67169]: ERROR nova.compute.manager [instance: c05c3ec2-a68d-41b0-a199-fcfc84bb2deb] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2145.097825] env[67169]: ERROR nova.compute.manager [instance: c05c3ec2-a68d-41b0-a199-fcfc84bb2deb] result = hub.switch() [ 2145.097825] env[67169]: ERROR nova.compute.manager [instance: c05c3ec2-a68d-41b0-a199-fcfc84bb2deb] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2145.097825] env[67169]: ERROR nova.compute.manager [instance: c05c3ec2-a68d-41b0-a199-fcfc84bb2deb] return self.greenlet.switch() [ 2145.097825] env[67169]: ERROR nova.compute.manager [instance: c05c3ec2-a68d-41b0-a199-fcfc84bb2deb] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2145.097825] env[67169]: ERROR nova.compute.manager [instance: c05c3ec2-a68d-41b0-a199-fcfc84bb2deb] self.f(*self.args, **self.kw) [ 2145.097825] env[67169]: ERROR nova.compute.manager [instance: c05c3ec2-a68d-41b0-a199-fcfc84bb2deb] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2145.097825] env[67169]: ERROR nova.compute.manager [instance: c05c3ec2-a68d-41b0-a199-fcfc84bb2deb] raise exceptions.translate_fault(task_info.error) [ 2145.097825] env[67169]: ERROR nova.compute.manager [instance: c05c3ec2-a68d-41b0-a199-fcfc84bb2deb] oslo_vmware.exceptions.VimFaultException: A specified 
parameter was not correct: fileType [ 2145.097825] env[67169]: ERROR nova.compute.manager [instance: c05c3ec2-a68d-41b0-a199-fcfc84bb2deb] Faults: ['InvalidArgument'] [ 2145.097825] env[67169]: ERROR nova.compute.manager [instance: c05c3ec2-a68d-41b0-a199-fcfc84bb2deb] [ 2145.098862] env[67169]: INFO nova.compute.manager [None req-4a94fa74-f862-4bac-971a-1a822064e0a7 tempest-ServersNegativeTestMultiTenantJSON-34809039 tempest-ServersNegativeTestMultiTenantJSON-34809039-project-member] [instance: c05c3ec2-a68d-41b0-a199-fcfc84bb2deb] Terminating instance [ 2145.099696] env[67169]: DEBUG oslo_concurrency.lockutils [None req-0a77388e-8384-4a55-b3b4-6cd525a7172b tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Acquired lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2145.099910] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-0a77388e-8384-4a55-b3b4-6cd525a7172b tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2145.100197] env[67169]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-71c1957d-7f5f-419c-9510-379d86bb5776 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2145.102590] env[67169]: DEBUG nova.compute.manager [None req-4a94fa74-f862-4bac-971a-1a822064e0a7 tempest-ServersNegativeTestMultiTenantJSON-34809039 tempest-ServersNegativeTestMultiTenantJSON-34809039-project-member] [instance: c05c3ec2-a68d-41b0-a199-fcfc84bb2deb] Start destroying the instance on the hypervisor. 
{{(pid=67169) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2145.102774] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-4a94fa74-f862-4bac-971a-1a822064e0a7 tempest-ServersNegativeTestMultiTenantJSON-34809039 tempest-ServersNegativeTestMultiTenantJSON-34809039-project-member] [instance: c05c3ec2-a68d-41b0-a199-fcfc84bb2deb] Destroying instance {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2145.103498] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4bdd4041-4e9d-413c-85f5-1e0545620132 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2145.110156] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-4a94fa74-f862-4bac-971a-1a822064e0a7 tempest-ServersNegativeTestMultiTenantJSON-34809039 tempest-ServersNegativeTestMultiTenantJSON-34809039-project-member] [instance: c05c3ec2-a68d-41b0-a199-fcfc84bb2deb] Unregistering the VM {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2145.110506] env[67169]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-06d25bf4-2124-49b9-9d02-77c374d4075c {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2145.112445] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-0a77388e-8384-4a55-b3b4-6cd525a7172b tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2145.112619] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-0a77388e-8384-4a55-b3b4-6cd525a7172b tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=67169) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2145.113560] env[67169]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-8d1b142a-d8a0-4d08-b894-1319eeaefaf8 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2145.118288] env[67169]: DEBUG oslo_vmware.api [None req-0a77388e-8384-4a55-b3b4-6cd525a7172b tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Waiting for the task: (returnval){ [ 2145.118288] env[67169]: value = "session[52518c09-e5f2-035d-4dc1-3e29ddf94015]52248467-86f6-a3f6-aa4e-6740d2911a78" [ 2145.118288] env[67169]: _type = "Task" [ 2145.118288] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2145.132426] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-0a77388e-8384-4a55-b3b4-6cd525a7172b tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: 9435574d-2128-4b20-ba92-ee2aba37d33b] Preparing fetch location {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2145.132648] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-0a77388e-8384-4a55-b3b4-6cd525a7172b tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Creating directory with path [datastore2] vmware_temp/16208284-3c40-4ac3-9492-9d2f79da9aa5/285931c9-8b83-4997-8c4d-6a79005e36ba {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2145.132854] env[67169]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-4e36a392-b268-4988-a616-f287f48f29cf {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2145.152067] env[67169]: DEBUG 
nova.virt.vmwareapi.ds_util [None req-0a77388e-8384-4a55-b3b4-6cd525a7172b tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Created directory with path [datastore2] vmware_temp/16208284-3c40-4ac3-9492-9d2f79da9aa5/285931c9-8b83-4997-8c4d-6a79005e36ba {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2145.152274] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-0a77388e-8384-4a55-b3b4-6cd525a7172b tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: 9435574d-2128-4b20-ba92-ee2aba37d33b] Fetch image to [datastore2] vmware_temp/16208284-3c40-4ac3-9492-9d2f79da9aa5/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2145.152451] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-0a77388e-8384-4a55-b3b4-6cd525a7172b tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: 9435574d-2128-4b20-ba92-ee2aba37d33b] Downloading image file data 285931c9-8b83-4997-8c4d-6a79005e36ba to [datastore2] vmware_temp/16208284-3c40-4ac3-9492-9d2f79da9aa5/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk on the data store datastore2 {{(pid=67169) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2145.153198] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e98b7149-5198-48db-9200-167d094aac26 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2145.159823] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6e148cdd-4628-4af8-a134-91c3dc010980 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2145.171335] env[67169]: DEBUG 
oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c9f4629f-dacc-458d-9250-c965bd9111a9 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2145.176103] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-4a94fa74-f862-4bac-971a-1a822064e0a7 tempest-ServersNegativeTestMultiTenantJSON-34809039 tempest-ServersNegativeTestMultiTenantJSON-34809039-project-member] [instance: c05c3ec2-a68d-41b0-a199-fcfc84bb2deb] Unregistered the VM {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2145.176306] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-4a94fa74-f862-4bac-971a-1a822064e0a7 tempest-ServersNegativeTestMultiTenantJSON-34809039 tempest-ServersNegativeTestMultiTenantJSON-34809039-project-member] [instance: c05c3ec2-a68d-41b0-a199-fcfc84bb2deb] Deleting contents of the VM from datastore datastore2 {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2145.176484] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-4a94fa74-f862-4bac-971a-1a822064e0a7 tempest-ServersNegativeTestMultiTenantJSON-34809039 tempest-ServersNegativeTestMultiTenantJSON-34809039-project-member] Deleting the datastore file [datastore2] c05c3ec2-a68d-41b0-a199-fcfc84bb2deb {{(pid=67169) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2145.177082] env[67169]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-64fd9e8b-7327-4e15-a141-db2e53d75c7d {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2145.209791] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-281360ab-c430-4ecb-9bb4-edd3405dca0c {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2145.212977] env[67169]: DEBUG 
oslo_vmware.api [None req-4a94fa74-f862-4bac-971a-1a822064e0a7 tempest-ServersNegativeTestMultiTenantJSON-34809039 tempest-ServersNegativeTestMultiTenantJSON-34809039-project-member] Waiting for the task: (returnval){ [ 2145.212977] env[67169]: value = "task-2819269" [ 2145.212977] env[67169]: _type = "Task" [ 2145.212977] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2145.218603] env[67169]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-a50d81d3-fc16-4627-bfc3-371458c0b0ee {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2145.223420] env[67169]: DEBUG oslo_vmware.api [None req-4a94fa74-f862-4bac-971a-1a822064e0a7 tempest-ServersNegativeTestMultiTenantJSON-34809039 tempest-ServersNegativeTestMultiTenantJSON-34809039-project-member] Task: {'id': task-2819269, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2145.254358] env[67169]: DEBUG nova.virt.vmwareapi.images [None req-0a77388e-8384-4a55-b3b4-6cd525a7172b tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: 9435574d-2128-4b20-ba92-ee2aba37d33b] Downloading image file data 285931c9-8b83-4997-8c4d-6a79005e36ba to the data store datastore2 {{(pid=67169) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2145.371180] env[67169]: DEBUG oslo_vmware.rw_handles [None req-0a77388e-8384-4a55-b3b4-6cd525a7172b tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/16208284-3c40-4ac3-9492-9d2f79da9aa5/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=67169) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2145.429984] env[67169]: DEBUG oslo_vmware.rw_handles [None req-0a77388e-8384-4a55-b3b4-6cd525a7172b tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Completed reading data from the image iterator. {{(pid=67169) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2145.430192] env[67169]: DEBUG oslo_vmware.rw_handles [None req-0a77388e-8384-4a55-b3b4-6cd525a7172b tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/16208284-3c40-4ac3-9492-9d2f79da9aa5/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=67169) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2145.722889] env[67169]: DEBUG oslo_vmware.api [None req-4a94fa74-f862-4bac-971a-1a822064e0a7 tempest-ServersNegativeTestMultiTenantJSON-34809039 tempest-ServersNegativeTestMultiTenantJSON-34809039-project-member] Task: {'id': task-2819269, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.066175} completed successfully. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2145.723261] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-4a94fa74-f862-4bac-971a-1a822064e0a7 tempest-ServersNegativeTestMultiTenantJSON-34809039 tempest-ServersNegativeTestMultiTenantJSON-34809039-project-member] Deleted the datastore file {{(pid=67169) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2145.723338] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-4a94fa74-f862-4bac-971a-1a822064e0a7 tempest-ServersNegativeTestMultiTenantJSON-34809039 tempest-ServersNegativeTestMultiTenantJSON-34809039-project-member] [instance: c05c3ec2-a68d-41b0-a199-fcfc84bb2deb] Deleted contents of the VM from datastore datastore2 {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2145.723507] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-4a94fa74-f862-4bac-971a-1a822064e0a7 tempest-ServersNegativeTestMultiTenantJSON-34809039 tempest-ServersNegativeTestMultiTenantJSON-34809039-project-member] [instance: c05c3ec2-a68d-41b0-a199-fcfc84bb2deb] Instance destroyed {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2145.723681] env[67169]: INFO nova.compute.manager [None req-4a94fa74-f862-4bac-971a-1a822064e0a7 tempest-ServersNegativeTestMultiTenantJSON-34809039 tempest-ServersNegativeTestMultiTenantJSON-34809039-project-member] [instance: c05c3ec2-a68d-41b0-a199-fcfc84bb2deb] Took 0.62 seconds to destroy the instance on the hypervisor. 
[ 2145.725869] env[67169]: DEBUG nova.compute.claims [None req-4a94fa74-f862-4bac-971a-1a822064e0a7 tempest-ServersNegativeTestMultiTenantJSON-34809039 tempest-ServersNegativeTestMultiTenantJSON-34809039-project-member] [instance: c05c3ec2-a68d-41b0-a199-fcfc84bb2deb] Aborting claim: {{(pid=67169) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 2145.726058] env[67169]: DEBUG oslo_concurrency.lockutils [None req-4a94fa74-f862-4bac-971a-1a822064e0a7 tempest-ServersNegativeTestMultiTenantJSON-34809039 tempest-ServersNegativeTestMultiTenantJSON-34809039-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2145.726280] env[67169]: DEBUG oslo_concurrency.lockutils [None req-4a94fa74-f862-4bac-971a-1a822064e0a7 tempest-ServersNegativeTestMultiTenantJSON-34809039 tempest-ServersNegativeTestMultiTenantJSON-34809039-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2145.866041] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1cc5305e-81f5-4f1e-b6e9-126dbba5bde7 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2145.873195] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cbab1da3-e179-43f7-baea-ce130c97bc29 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2145.903276] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6d3aadba-a1a7-4423-b10c-08ac2f82c08d {{(pid=67169) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2145.909875] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f2a9ad5e-5f61-4278-a331-be8b6b73c7f2 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2145.922618] env[67169]: DEBUG nova.compute.provider_tree [None req-4a94fa74-f862-4bac-971a-1a822064e0a7 tempest-ServersNegativeTestMultiTenantJSON-34809039 tempest-ServersNegativeTestMultiTenantJSON-34809039-project-member] Inventory has not changed in ProviderTree for provider: 6570906a-ac37-4859-b1f2-4bbacc48d3f3 {{(pid=67169) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2145.930746] env[67169]: DEBUG nova.scheduler.client.report [None req-4a94fa74-f862-4bac-971a-1a822064e0a7 tempest-ServersNegativeTestMultiTenantJSON-34809039 tempest-ServersNegativeTestMultiTenantJSON-34809039-project-member] Inventory has not changed for provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 171, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67169) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2145.953339] env[67169]: DEBUG oslo_concurrency.lockutils [None req-4a94fa74-f862-4bac-971a-1a822064e0a7 tempest-ServersNegativeTestMultiTenantJSON-34809039 tempest-ServersNegativeTestMultiTenantJSON-34809039-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.227s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 
2145.954144] env[67169]: ERROR nova.compute.manager [None req-4a94fa74-f862-4bac-971a-1a822064e0a7 tempest-ServersNegativeTestMultiTenantJSON-34809039 tempest-ServersNegativeTestMultiTenantJSON-34809039-project-member] [instance: c05c3ec2-a68d-41b0-a199-fcfc84bb2deb] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2145.954144] env[67169]: Faults: ['InvalidArgument'] [ 2145.954144] env[67169]: ERROR nova.compute.manager [instance: c05c3ec2-a68d-41b0-a199-fcfc84bb2deb] Traceback (most recent call last): [ 2145.954144] env[67169]: ERROR nova.compute.manager [instance: c05c3ec2-a68d-41b0-a199-fcfc84bb2deb] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2145.954144] env[67169]: ERROR nova.compute.manager [instance: c05c3ec2-a68d-41b0-a199-fcfc84bb2deb] self.driver.spawn(context, instance, image_meta, [ 2145.954144] env[67169]: ERROR nova.compute.manager [instance: c05c3ec2-a68d-41b0-a199-fcfc84bb2deb] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2145.954144] env[67169]: ERROR nova.compute.manager [instance: c05c3ec2-a68d-41b0-a199-fcfc84bb2deb] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2145.954144] env[67169]: ERROR nova.compute.manager [instance: c05c3ec2-a68d-41b0-a199-fcfc84bb2deb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2145.954144] env[67169]: ERROR nova.compute.manager [instance: c05c3ec2-a68d-41b0-a199-fcfc84bb2deb] self._fetch_image_if_missing(context, vi) [ 2145.954144] env[67169]: ERROR nova.compute.manager [instance: c05c3ec2-a68d-41b0-a199-fcfc84bb2deb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2145.954144] env[67169]: ERROR nova.compute.manager [instance: c05c3ec2-a68d-41b0-a199-fcfc84bb2deb] image_cache(vi, tmp_image_ds_loc) [ 2145.954144] env[67169]: ERROR nova.compute.manager [instance: 
c05c3ec2-a68d-41b0-a199-fcfc84bb2deb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2145.954144] env[67169]: ERROR nova.compute.manager [instance: c05c3ec2-a68d-41b0-a199-fcfc84bb2deb] vm_util.copy_virtual_disk( [ 2145.954144] env[67169]: ERROR nova.compute.manager [instance: c05c3ec2-a68d-41b0-a199-fcfc84bb2deb] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2145.954144] env[67169]: ERROR nova.compute.manager [instance: c05c3ec2-a68d-41b0-a199-fcfc84bb2deb] session._wait_for_task(vmdk_copy_task) [ 2145.954144] env[67169]: ERROR nova.compute.manager [instance: c05c3ec2-a68d-41b0-a199-fcfc84bb2deb] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2145.954144] env[67169]: ERROR nova.compute.manager [instance: c05c3ec2-a68d-41b0-a199-fcfc84bb2deb] return self.wait_for_task(task_ref) [ 2145.954144] env[67169]: ERROR nova.compute.manager [instance: c05c3ec2-a68d-41b0-a199-fcfc84bb2deb] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2145.954144] env[67169]: ERROR nova.compute.manager [instance: c05c3ec2-a68d-41b0-a199-fcfc84bb2deb] return evt.wait() [ 2145.954144] env[67169]: ERROR nova.compute.manager [instance: c05c3ec2-a68d-41b0-a199-fcfc84bb2deb] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2145.954144] env[67169]: ERROR nova.compute.manager [instance: c05c3ec2-a68d-41b0-a199-fcfc84bb2deb] result = hub.switch() [ 2145.954144] env[67169]: ERROR nova.compute.manager [instance: c05c3ec2-a68d-41b0-a199-fcfc84bb2deb] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2145.954144] env[67169]: ERROR nova.compute.manager [instance: c05c3ec2-a68d-41b0-a199-fcfc84bb2deb] return self.greenlet.switch() [ 2145.954144] env[67169]: ERROR nova.compute.manager [instance: c05c3ec2-a68d-41b0-a199-fcfc84bb2deb] 
File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2145.954144] env[67169]: ERROR nova.compute.manager [instance: c05c3ec2-a68d-41b0-a199-fcfc84bb2deb] self.f(*self.args, **self.kw) [ 2145.954144] env[67169]: ERROR nova.compute.manager [instance: c05c3ec2-a68d-41b0-a199-fcfc84bb2deb] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2145.954144] env[67169]: ERROR nova.compute.manager [instance: c05c3ec2-a68d-41b0-a199-fcfc84bb2deb] raise exceptions.translate_fault(task_info.error) [ 2145.954144] env[67169]: ERROR nova.compute.manager [instance: c05c3ec2-a68d-41b0-a199-fcfc84bb2deb] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2145.954144] env[67169]: ERROR nova.compute.manager [instance: c05c3ec2-a68d-41b0-a199-fcfc84bb2deb] Faults: ['InvalidArgument'] [ 2145.954144] env[67169]: ERROR nova.compute.manager [instance: c05c3ec2-a68d-41b0-a199-fcfc84bb2deb] [ 2145.955156] env[67169]: DEBUG nova.compute.utils [None req-4a94fa74-f862-4bac-971a-1a822064e0a7 tempest-ServersNegativeTestMultiTenantJSON-34809039 tempest-ServersNegativeTestMultiTenantJSON-34809039-project-member] [instance: c05c3ec2-a68d-41b0-a199-fcfc84bb2deb] VimFaultException {{(pid=67169) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2145.957081] env[67169]: DEBUG nova.compute.manager [None req-4a94fa74-f862-4bac-971a-1a822064e0a7 tempest-ServersNegativeTestMultiTenantJSON-34809039 tempest-ServersNegativeTestMultiTenantJSON-34809039-project-member] [instance: c05c3ec2-a68d-41b0-a199-fcfc84bb2deb] Build of instance c05c3ec2-a68d-41b0-a199-fcfc84bb2deb was re-scheduled: A specified parameter was not correct: fileType [ 2145.957081] env[67169]: Faults: ['InvalidArgument'] {{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 2145.957610] env[67169]: DEBUG nova.compute.manager [None 
req-4a94fa74-f862-4bac-971a-1a822064e0a7 tempest-ServersNegativeTestMultiTenantJSON-34809039 tempest-ServersNegativeTestMultiTenantJSON-34809039-project-member] [instance: c05c3ec2-a68d-41b0-a199-fcfc84bb2deb] Unplugging VIFs for instance {{(pid=67169) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 2145.957850] env[67169]: DEBUG nova.compute.manager [None req-4a94fa74-f862-4bac-971a-1a822064e0a7 tempest-ServersNegativeTestMultiTenantJSON-34809039 tempest-ServersNegativeTestMultiTenantJSON-34809039-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=67169) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 2145.958107] env[67169]: DEBUG nova.compute.manager [None req-4a94fa74-f862-4bac-971a-1a822064e0a7 tempest-ServersNegativeTestMultiTenantJSON-34809039 tempest-ServersNegativeTestMultiTenantJSON-34809039-project-member] [instance: c05c3ec2-a68d-41b0-a199-fcfc84bb2deb] Deallocating network for instance {{(pid=67169) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2145.958339] env[67169]: DEBUG nova.network.neutron [None req-4a94fa74-f862-4bac-971a-1a822064e0a7 tempest-ServersNegativeTestMultiTenantJSON-34809039 tempest-ServersNegativeTestMultiTenantJSON-34809039-project-member] [instance: c05c3ec2-a68d-41b0-a199-fcfc84bb2deb] deallocate_for_instance() {{(pid=67169) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2146.311033] env[67169]: DEBUG nova.network.neutron [None req-4a94fa74-f862-4bac-971a-1a822064e0a7 tempest-ServersNegativeTestMultiTenantJSON-34809039 tempest-ServersNegativeTestMultiTenantJSON-34809039-project-member] [instance: c05c3ec2-a68d-41b0-a199-fcfc84bb2deb] Updating instance_info_cache with network_info: [] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2146.326023] env[67169]: INFO nova.compute.manager [None 
req-4a94fa74-f862-4bac-971a-1a822064e0a7 tempest-ServersNegativeTestMultiTenantJSON-34809039 tempest-ServersNegativeTestMultiTenantJSON-34809039-project-member] [instance: c05c3ec2-a68d-41b0-a199-fcfc84bb2deb] Took 0.37 seconds to deallocate network for instance.
[ 2146.432529] env[67169]: INFO nova.scheduler.client.report [None req-4a94fa74-f862-4bac-971a-1a822064e0a7 tempest-ServersNegativeTestMultiTenantJSON-34809039 tempest-ServersNegativeTestMultiTenantJSON-34809039-project-member] Deleted allocations for instance c05c3ec2-a68d-41b0-a199-fcfc84bb2deb
[ 2146.457565] env[67169]: DEBUG oslo_concurrency.lockutils [None req-4a94fa74-f862-4bac-971a-1a822064e0a7 tempest-ServersNegativeTestMultiTenantJSON-34809039 tempest-ServersNegativeTestMultiTenantJSON-34809039-project-member] Lock "c05c3ec2-a68d-41b0-a199-fcfc84bb2deb" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 619.893s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 2146.457700] env[67169]: DEBUG oslo_concurrency.lockutils [None req-72da6616-256b-4aa9-b649-96703e0180c4 tempest-ServersNegativeTestMultiTenantJSON-34809039 tempest-ServersNegativeTestMultiTenantJSON-34809039-project-member] Lock "c05c3ec2-a68d-41b0-a199-fcfc84bb2deb" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 423.294s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 2146.457929] env[67169]: DEBUG oslo_concurrency.lockutils [None req-72da6616-256b-4aa9-b649-96703e0180c4 tempest-ServersNegativeTestMultiTenantJSON-34809039 tempest-ServersNegativeTestMultiTenantJSON-34809039-project-member] Acquiring lock "c05c3ec2-a68d-41b0-a199-fcfc84bb2deb-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 2146.459026] env[67169]: DEBUG oslo_concurrency.lockutils [None req-72da6616-256b-4aa9-b649-96703e0180c4 tempest-ServersNegativeTestMultiTenantJSON-34809039 tempest-ServersNegativeTestMultiTenantJSON-34809039-project-member] Lock "c05c3ec2-a68d-41b0-a199-fcfc84bb2deb-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 2146.459026] env[67169]: DEBUG oslo_concurrency.lockutils [None req-72da6616-256b-4aa9-b649-96703e0180c4 tempest-ServersNegativeTestMultiTenantJSON-34809039 tempest-ServersNegativeTestMultiTenantJSON-34809039-project-member] Lock "c05c3ec2-a68d-41b0-a199-fcfc84bb2deb-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 2146.462236] env[67169]: INFO nova.compute.manager [None req-72da6616-256b-4aa9-b649-96703e0180c4 tempest-ServersNegativeTestMultiTenantJSON-34809039 tempest-ServersNegativeTestMultiTenantJSON-34809039-project-member] [instance: c05c3ec2-a68d-41b0-a199-fcfc84bb2deb] Terminating instance
[ 2146.464189] env[67169]: DEBUG nova.compute.manager [None req-72da6616-256b-4aa9-b649-96703e0180c4 tempest-ServersNegativeTestMultiTenantJSON-34809039 tempest-ServersNegativeTestMultiTenantJSON-34809039-project-member] [instance: c05c3ec2-a68d-41b0-a199-fcfc84bb2deb] Start destroying the instance on the hypervisor. {{(pid=67169) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}}
[ 2146.464398] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-72da6616-256b-4aa9-b649-96703e0180c4 tempest-ServersNegativeTestMultiTenantJSON-34809039 tempest-ServersNegativeTestMultiTenantJSON-34809039-project-member] [instance: c05c3ec2-a68d-41b0-a199-fcfc84bb2deb] Destroying instance {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 2146.464667] env[67169]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-96f8a476-62b0-4fa6-8d4d-b7fa9df1677f {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2146.474283] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a82c805e-04d4-49ff-a29f-3c083f05762d {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2146.501422] env[67169]: WARNING nova.virt.vmwareapi.vmops [None req-72da6616-256b-4aa9-b649-96703e0180c4 tempest-ServersNegativeTestMultiTenantJSON-34809039 tempest-ServersNegativeTestMultiTenantJSON-34809039-project-member] [instance: c05c3ec2-a68d-41b0-a199-fcfc84bb2deb] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance c05c3ec2-a68d-41b0-a199-fcfc84bb2deb could not be found.
[ 2146.501624] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-72da6616-256b-4aa9-b649-96703e0180c4 tempest-ServersNegativeTestMultiTenantJSON-34809039 tempest-ServersNegativeTestMultiTenantJSON-34809039-project-member] [instance: c05c3ec2-a68d-41b0-a199-fcfc84bb2deb] Instance destroyed {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 2146.501800] env[67169]: INFO nova.compute.manager [None req-72da6616-256b-4aa9-b649-96703e0180c4 tempest-ServersNegativeTestMultiTenantJSON-34809039 tempest-ServersNegativeTestMultiTenantJSON-34809039-project-member] [instance: c05c3ec2-a68d-41b0-a199-fcfc84bb2deb] Took 0.04 seconds to destroy the instance on the hypervisor.
[ 2146.502048] env[67169]: DEBUG oslo.service.loopingcall [None req-72da6616-256b-4aa9-b649-96703e0180c4 tempest-ServersNegativeTestMultiTenantJSON-34809039 tempest-ServersNegativeTestMultiTenantJSON-34809039-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=67169) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 2146.502292] env[67169]: DEBUG nova.compute.manager [-] [instance: c05c3ec2-a68d-41b0-a199-fcfc84bb2deb] Deallocating network for instance {{(pid=67169) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}}
[ 2146.502388] env[67169]: DEBUG nova.network.neutron [-] [instance: c05c3ec2-a68d-41b0-a199-fcfc84bb2deb] deallocate_for_instance() {{(pid=67169) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 2146.532545] env[67169]: DEBUG nova.network.neutron [-] [instance: c05c3ec2-a68d-41b0-a199-fcfc84bb2deb] Updating instance_info_cache with network_info: [] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 2146.563720] env[67169]: INFO nova.compute.manager [-] [instance: c05c3ec2-a68d-41b0-a199-fcfc84bb2deb] Took 0.06 seconds to deallocate network for instance.
[ 2146.650138] env[67169]: DEBUG oslo_concurrency.lockutils [None req-72da6616-256b-4aa9-b649-96703e0180c4 tempest-ServersNegativeTestMultiTenantJSON-34809039 tempest-ServersNegativeTestMultiTenantJSON-34809039-project-member] Lock "c05c3ec2-a68d-41b0-a199-fcfc84bb2deb" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.192s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 2146.651011] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "c05c3ec2-a68d-41b0-a199-fcfc84bb2deb" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 342.964s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 2146.651227] env[67169]: INFO nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: c05c3ec2-a68d-41b0-a199-fcfc84bb2deb] During sync_power_state the instance has a pending task (deleting). Skip.
[ 2146.651426] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "c05c3ec2-a68d-41b0-a199-fcfc84bb2deb" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 2161.658107] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 2162.289120] env[67169]: DEBUG oslo_concurrency.lockutils [None req-26f56048-8678-4a58-9c6c-8689996827a2 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Acquiring lock "7486bbc9-6aa3-4880-9662-b3451b400bf8" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 2162.289367] env[67169]: DEBUG oslo_concurrency.lockutils [None req-26f56048-8678-4a58-9c6c-8689996827a2 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Lock "7486bbc9-6aa3-4880-9662-b3451b400bf8" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 2162.299536] env[67169]: DEBUG nova.compute.manager [None req-26f56048-8678-4a58-9c6c-8689996827a2 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: 7486bbc9-6aa3-4880-9662-b3451b400bf8] Starting instance... {{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}}
[ 2162.343377] env[67169]: DEBUG oslo_concurrency.lockutils [None req-26f56048-8678-4a58-9c6c-8689996827a2 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 2162.343620] env[67169]: DEBUG oslo_concurrency.lockutils [None req-26f56048-8678-4a58-9c6c-8689996827a2 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 2162.345047] env[67169]: INFO nova.compute.claims [None req-26f56048-8678-4a58-9c6c-8689996827a2 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: 7486bbc9-6aa3-4880-9662-b3451b400bf8] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 2162.491357] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-36db064e-eec6-4eb8-b12e-6e76aa51de32 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2162.499212] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3a35e9d8-ca3d-4dcd-9bd3-deff6ae51fad {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2162.528853] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f7fa9aeb-dd66-46ba-b448-982c52414d0b {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2162.536062] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3cbdb96d-d4fc-4eec-958a-3155dc9232d3 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2162.549660] env[67169]: DEBUG nova.compute.provider_tree [None req-26f56048-8678-4a58-9c6c-8689996827a2 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Inventory has not changed in ProviderTree for provider: 6570906a-ac37-4859-b1f2-4bbacc48d3f3 {{(pid=67169) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 2162.557924] env[67169]: DEBUG nova.scheduler.client.report [None req-26f56048-8678-4a58-9c6c-8689996827a2 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Inventory has not changed for provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 171, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67169) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 2162.572134] env[67169]: DEBUG oslo_concurrency.lockutils [None req-26f56048-8678-4a58-9c6c-8689996827a2 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.228s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 2162.572648] env[67169]: DEBUG nova.compute.manager [None req-26f56048-8678-4a58-9c6c-8689996827a2 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: 7486bbc9-6aa3-4880-9662-b3451b400bf8] Start building networks asynchronously for instance. {{(pid=67169) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}}
[ 2162.606937] env[67169]: DEBUG nova.compute.utils [None req-26f56048-8678-4a58-9c6c-8689996827a2 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Using /dev/sd instead of None {{(pid=67169) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 2162.608450] env[67169]: DEBUG nova.compute.manager [None req-26f56048-8678-4a58-9c6c-8689996827a2 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: 7486bbc9-6aa3-4880-9662-b3451b400bf8] Allocating IP information in the background. {{(pid=67169) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}}
[ 2162.608620] env[67169]: DEBUG nova.network.neutron [None req-26f56048-8678-4a58-9c6c-8689996827a2 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: 7486bbc9-6aa3-4880-9662-b3451b400bf8] allocate_for_instance() {{(pid=67169) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 2162.617867] env[67169]: DEBUG nova.compute.manager [None req-26f56048-8678-4a58-9c6c-8689996827a2 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: 7486bbc9-6aa3-4880-9662-b3451b400bf8] Start building block device mappings for instance. {{(pid=67169) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}}
[ 2162.682666] env[67169]: DEBUG nova.compute.manager [None req-26f56048-8678-4a58-9c6c-8689996827a2 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: 7486bbc9-6aa3-4880-9662-b3451b400bf8] Start spawning the instance on the hypervisor. {{(pid=67169) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}}
[ 2162.686014] env[67169]: DEBUG nova.policy [None req-26f56048-8678-4a58-9c6c-8689996827a2 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '615c1061ae884c3b91ce1b072249717c', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b1162bad4f2e4722aed4ff2c657e9dc9', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67169) authorize /opt/stack/nova/nova/policy.py:203}}
[ 2162.703274] env[67169]: DEBUG nova.virt.hardware [None req-26f56048-8678-4a58-9c6c-8689996827a2 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-03-06T19:55:10Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-03-06T19:54:55Z,direct_url=,disk_format='vmdk',id=285931c9-8b83-4997-8c4d-6a79005e36ba,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c31f6504bb73492890b262ff43fdf9bc',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-03-06T19:54:55Z,virtual_size=,visibility=), allow threads: False {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 2162.703525] env[67169]: DEBUG nova.virt.hardware [None req-26f56048-8678-4a58-9c6c-8689996827a2 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Flavor limits 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 2162.703694] env[67169]: DEBUG nova.virt.hardware [None req-26f56048-8678-4a58-9c6c-8689996827a2 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Image limits 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 2162.703881] env[67169]: DEBUG nova.virt.hardware [None req-26f56048-8678-4a58-9c6c-8689996827a2 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Flavor pref 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 2162.704041] env[67169]: DEBUG nova.virt.hardware [None req-26f56048-8678-4a58-9c6c-8689996827a2 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Image pref 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 2162.704195] env[67169]: DEBUG nova.virt.hardware [None req-26f56048-8678-4a58-9c6c-8689996827a2 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 2162.704400] env[67169]: DEBUG nova.virt.hardware [None req-26f56048-8678-4a58-9c6c-8689996827a2 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 2162.704559] env[67169]: DEBUG nova.virt.hardware [None req-26f56048-8678-4a58-9c6c-8689996827a2 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67169) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 2162.704726] env[67169]: DEBUG nova.virt.hardware [None req-26f56048-8678-4a58-9c6c-8689996827a2 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Got 1 possible topologies {{(pid=67169) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 2162.704881] env[67169]: DEBUG nova.virt.hardware [None req-26f56048-8678-4a58-9c6c-8689996827a2 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 2162.705063] env[67169]: DEBUG nova.virt.hardware [None req-26f56048-8678-4a58-9c6c-8689996827a2 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 2162.706108] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ba013b0f-107c-4204-ab35-f3e5e7ee9a25 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2162.713809] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0318d81d-8b28-42dc-a840-3e1bc5eb6439 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2162.965680] env[67169]: DEBUG nova.network.neutron [None req-26f56048-8678-4a58-9c6c-8689996827a2 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: 7486bbc9-6aa3-4880-9662-b3451b400bf8] Successfully created port: 881e0d16-58b4-4637-8041-2acf2de98caf {{(pid=67169) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 2163.653464] env[67169]: DEBUG nova.compute.manager [req-5ac40401-d74a-457b-9720-7f328c4bedda req-d5a18cba-5798-4f35-a966-e4eca032424f service nova] [instance: 7486bbc9-6aa3-4880-9662-b3451b400bf8] Received event network-vif-plugged-881e0d16-58b4-4637-8041-2acf2de98caf {{(pid=67169) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}}
[ 2163.653698] env[67169]: DEBUG oslo_concurrency.lockutils [req-5ac40401-d74a-457b-9720-7f328c4bedda req-d5a18cba-5798-4f35-a966-e4eca032424f service nova] Acquiring lock "7486bbc9-6aa3-4880-9662-b3451b400bf8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 2163.653962] env[67169]: DEBUG oslo_concurrency.lockutils [req-5ac40401-d74a-457b-9720-7f328c4bedda req-d5a18cba-5798-4f35-a966-e4eca032424f service nova] Lock "7486bbc9-6aa3-4880-9662-b3451b400bf8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 2163.654091] env[67169]: DEBUG oslo_concurrency.lockutils [req-5ac40401-d74a-457b-9720-7f328c4bedda req-d5a18cba-5798-4f35-a966-e4eca032424f service nova] Lock "7486bbc9-6aa3-4880-9662-b3451b400bf8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 2163.654235] env[67169]: DEBUG nova.compute.manager [req-5ac40401-d74a-457b-9720-7f328c4bedda req-d5a18cba-5798-4f35-a966-e4eca032424f service nova] [instance: 7486bbc9-6aa3-4880-9662-b3451b400bf8] No waiting events found dispatching network-vif-plugged-881e0d16-58b4-4637-8041-2acf2de98caf {{(pid=67169) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}}
[ 2163.654394] env[67169]: WARNING nova.compute.manager [req-5ac40401-d74a-457b-9720-7f328c4bedda req-d5a18cba-5798-4f35-a966-e4eca032424f service nova] [instance: 7486bbc9-6aa3-4880-9662-b3451b400bf8] Received unexpected event network-vif-plugged-881e0d16-58b4-4637-8041-2acf2de98caf for instance with vm_state building and task_state spawning.
[ 2163.693715] env[67169]: DEBUG nova.network.neutron [None req-26f56048-8678-4a58-9c6c-8689996827a2 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: 7486bbc9-6aa3-4880-9662-b3451b400bf8] Successfully updated port: 881e0d16-58b4-4637-8041-2acf2de98caf {{(pid=67169) _update_port /opt/stack/nova/nova/network/neutron.py:586}}
[ 2163.705627] env[67169]: DEBUG oslo_concurrency.lockutils [None req-26f56048-8678-4a58-9c6c-8689996827a2 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Acquiring lock "refresh_cache-7486bbc9-6aa3-4880-9662-b3451b400bf8" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 2163.705770] env[67169]: DEBUG oslo_concurrency.lockutils [None req-26f56048-8678-4a58-9c6c-8689996827a2 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Acquired lock "refresh_cache-7486bbc9-6aa3-4880-9662-b3451b400bf8" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 2163.705920] env[67169]: DEBUG nova.network.neutron [None req-26f56048-8678-4a58-9c6c-8689996827a2 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: 7486bbc9-6aa3-4880-9662-b3451b400bf8] Building network info cache for instance {{(pid=67169) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 2163.768500] env[67169]: DEBUG nova.network.neutron [None req-26f56048-8678-4a58-9c6c-8689996827a2 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: 7486bbc9-6aa3-4880-9662-b3451b400bf8] Instance cache missing network info. {{(pid=67169) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 2163.918921] env[67169]: DEBUG nova.network.neutron [None req-26f56048-8678-4a58-9c6c-8689996827a2 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: 7486bbc9-6aa3-4880-9662-b3451b400bf8] Updating instance_info_cache with network_info: [{"id": "881e0d16-58b4-4637-8041-2acf2de98caf", "address": "fa:16:3e:7f:ee:e5", "network": {"id": "05c41aa5-dcb7-46fa-ba23-2f4b7685b6a9", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1740060268-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "b1162bad4f2e4722aed4ff2c657e9dc9", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "24210a23-d8ac-4f4f-84ac-dc0636de9a72", "external-id": "nsx-vlan-transportzone-257", "segmentation_id": 257, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap881e0d16-58", "ovs_interfaceid": "881e0d16-58b4-4637-8041-2acf2de98caf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 2163.929838] env[67169]: DEBUG oslo_concurrency.lockutils [None req-26f56048-8678-4a58-9c6c-8689996827a2 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Releasing lock "refresh_cache-7486bbc9-6aa3-4880-9662-b3451b400bf8" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 2163.930123] env[67169]: DEBUG nova.compute.manager [None req-26f56048-8678-4a58-9c6c-8689996827a2 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: 7486bbc9-6aa3-4880-9662-b3451b400bf8] Instance network_info: |[{"id": "881e0d16-58b4-4637-8041-2acf2de98caf", "address": "fa:16:3e:7f:ee:e5", "network": {"id": "05c41aa5-dcb7-46fa-ba23-2f4b7685b6a9", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1740060268-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "b1162bad4f2e4722aed4ff2c657e9dc9", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "24210a23-d8ac-4f4f-84ac-dc0636de9a72", "external-id": "nsx-vlan-transportzone-257", "segmentation_id": 257, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap881e0d16-58", "ovs_interfaceid": "881e0d16-58b4-4637-8041-2acf2de98caf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67169) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}}
[ 2163.930492] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-26f56048-8678-4a58-9c6c-8689996827a2 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: 7486bbc9-6aa3-4880-9662-b3451b400bf8] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:7f:ee:e5', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '24210a23-d8ac-4f4f-84ac-dc0636de9a72', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '881e0d16-58b4-4637-8041-2acf2de98caf', 'vif_model': 'vmxnet3'}] {{(pid=67169) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}}
[ 2163.937969] env[67169]: DEBUG oslo.service.loopingcall [None req-26f56048-8678-4a58-9c6c-8689996827a2 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67169) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 2163.938416] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 7486bbc9-6aa3-4880-9662-b3451b400bf8] Creating VM on the ESX host {{(pid=67169) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}}
[ 2163.938645] env[67169]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-74b1e195-2347-4fee-b457-42f99d8d2be9 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2163.959173] env[67169]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){
[ 2163.959173] env[67169]: value = "task-2819270"
[ 2163.959173] env[67169]: _type = "Task"
[ 2163.959173] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 2163.967227] env[67169]: DEBUG oslo_vmware.api [-] Task: {'id': task-2819270, 'name': CreateVM_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 2164.469895] env[67169]: DEBUG oslo_vmware.api [-] Task: {'id': task-2819270, 'name': CreateVM_Task, 'duration_secs': 0.297972} completed successfully. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 2164.470079] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 7486bbc9-6aa3-4880-9662-b3451b400bf8] Created VM on the ESX host {{(pid=67169) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}}
[ 2164.470777] env[67169]: DEBUG oslo_concurrency.lockutils [None req-26f56048-8678-4a58-9c6c-8689996827a2 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 2164.470945] env[67169]: DEBUG oslo_concurrency.lockutils [None req-26f56048-8678-4a58-9c6c-8689996827a2 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Acquired lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 2164.471298] env[67169]: DEBUG oslo_concurrency.lockutils [None req-26f56048-8678-4a58-9c6c-8689996827a2 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}}
[ 2164.471550] env[67169]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-6c06f50c-5490-49e1-b568-1c905fc980b5 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2164.475788] env[67169]: DEBUG oslo_vmware.api [None req-26f56048-8678-4a58-9c6c-8689996827a2 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Waiting for the task: (returnval){
[ 2164.475788] env[67169]: value = "session[52518c09-e5f2-035d-4dc1-3e29ddf94015]524541ed-7656-dbb8-81ec-33d31a9dfcff"
[ 2164.475788] env[67169]: _type = "Task"
[ 2164.475788] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 2164.484302] env[67169]: DEBUG oslo_vmware.api [None req-26f56048-8678-4a58-9c6c-8689996827a2 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Task: {'id': session[52518c09-e5f2-035d-4dc1-3e29ddf94015]524541ed-7656-dbb8-81ec-33d31a9dfcff, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 2164.986241] env[67169]: DEBUG oslo_concurrency.lockutils [None req-26f56048-8678-4a58-9c6c-8689996827a2 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Releasing lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 2164.986607] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-26f56048-8678-4a58-9c6c-8689996827a2 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: 7486bbc9-6aa3-4880-9662-b3451b400bf8] Processing image 285931c9-8b83-4997-8c4d-6a79005e36ba {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}}
[ 2164.986659] env[67169]: DEBUG oslo_concurrency.lockutils [None req-26f56048-8678-4a58-9c6c-8689996827a2 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 2165.659989] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 2165.660218] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Starting heal instance info cache {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}}
[ 2165.660347] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Rebuilding the list of instances to heal {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}}
[ 2165.681417] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 9435574d-2128-4b20-ba92-ee2aba37d33b] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 2165.681655] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 6663b166-0d24-45a7-8c2c-e4e68dbe0005] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 2165.681655] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 115d6c00-4259-4e87-aa00-90b576a63535] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 2165.681781] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 220daf5b-b4fd-49b0-9098-c1f846d6e552] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 2165.681887] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 68b94a43-eaa5-4023-8bf5-8cc647c2f098] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 2165.682019] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 1ece950d-8b7f-4462-8138-10cbf43149ee] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 2165.682148] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: b554602b-2aae-4c1b-9385-4bef16a1dc5a] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 2165.682268] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 7486bbc9-6aa3-4880-9662-b3451b400bf8] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 2165.682428] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Didn't find any instances for network info cache update.
{{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 2165.724715] env[67169]: DEBUG nova.compute.manager [req-e9b80fe6-ef7c-40c7-ac15-63849531a055 req-db48b7d8-3656-44fd-b2e3-57579a273eb6 service nova] [instance: 7486bbc9-6aa3-4880-9662-b3451b400bf8] Received event network-changed-881e0d16-58b4-4637-8041-2acf2de98caf {{(pid=67169) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 2165.725144] env[67169]: DEBUG nova.compute.manager [req-e9b80fe6-ef7c-40c7-ac15-63849531a055 req-db48b7d8-3656-44fd-b2e3-57579a273eb6 service nova] [instance: 7486bbc9-6aa3-4880-9662-b3451b400bf8] Refreshing instance network info cache due to event network-changed-881e0d16-58b4-4637-8041-2acf2de98caf. {{(pid=67169) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 2165.725495] env[67169]: DEBUG oslo_concurrency.lockutils [req-e9b80fe6-ef7c-40c7-ac15-63849531a055 req-db48b7d8-3656-44fd-b2e3-57579a273eb6 service nova] Acquiring lock "refresh_cache-7486bbc9-6aa3-4880-9662-b3451b400bf8" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2165.725651] env[67169]: DEBUG oslo_concurrency.lockutils [req-e9b80fe6-ef7c-40c7-ac15-63849531a055 req-db48b7d8-3656-44fd-b2e3-57579a273eb6 service nova] Acquired lock "refresh_cache-7486bbc9-6aa3-4880-9662-b3451b400bf8" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2165.725818] env[67169]: DEBUG nova.network.neutron [req-e9b80fe6-ef7c-40c7-ac15-63849531a055 req-db48b7d8-3656-44fd-b2e3-57579a273eb6 service nova] [instance: 7486bbc9-6aa3-4880-9662-b3451b400bf8] Refreshing network info cache for port 881e0d16-58b4-4637-8041-2acf2de98caf {{(pid=67169) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 2166.144501] env[67169]: DEBUG nova.network.neutron [req-e9b80fe6-ef7c-40c7-ac15-63849531a055 req-db48b7d8-3656-44fd-b2e3-57579a273eb6 service 
nova] [instance: 7486bbc9-6aa3-4880-9662-b3451b400bf8] Updated VIF entry in instance network info cache for port 881e0d16-58b4-4637-8041-2acf2de98caf. {{(pid=67169) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 2166.144906] env[67169]: DEBUG nova.network.neutron [req-e9b80fe6-ef7c-40c7-ac15-63849531a055 req-db48b7d8-3656-44fd-b2e3-57579a273eb6 service nova] [instance: 7486bbc9-6aa3-4880-9662-b3451b400bf8] Updating instance_info_cache with network_info: [{"id": "881e0d16-58b4-4637-8041-2acf2de98caf", "address": "fa:16:3e:7f:ee:e5", "network": {"id": "05c41aa5-dcb7-46fa-ba23-2f4b7685b6a9", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1740060268-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "b1162bad4f2e4722aed4ff2c657e9dc9", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "24210a23-d8ac-4f4f-84ac-dc0636de9a72", "external-id": "nsx-vlan-transportzone-257", "segmentation_id": 257, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap881e0d16-58", "ovs_interfaceid": "881e0d16-58b4-4637-8041-2acf2de98caf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2166.153984] env[67169]: DEBUG oslo_concurrency.lockutils [req-e9b80fe6-ef7c-40c7-ac15-63849531a055 req-db48b7d8-3656-44fd-b2e3-57579a273eb6 service nova] Releasing lock "refresh_cache-7486bbc9-6aa3-4880-9662-b3451b400bf8" 
{{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2167.263033] env[67169]: DEBUG oslo_concurrency.lockutils [None req-7d6ef48b-29f6-45ff-bf22-07febd78659d tempest-ServerActionsTestOtherB-1774783308 tempest-ServerActionsTestOtherB-1774783308-project-member] Acquiring lock "1ece950d-8b7f-4462-8138-10cbf43149ee" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2167.658913] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2167.658913] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=67169) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 2169.659695] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2169.660143] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2171.653896] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2171.658517] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2171.658705] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2172.659173] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2172.671065] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] 
Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2172.671300] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2172.671471] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2172.671630] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67169) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 2172.672826] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b8fb268b-5003-462d-86b0-feb2f3b2e398 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2172.681746] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1af0f0f8-b265-4f48-a372-c46ee7c65635 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2172.695728] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2fa5ac21-9248-4ed8-9992-534a87f5776c {{(pid=67169) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2172.702509] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e0622eb3-db70-458c-872e-3f7d23e1fc25 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2172.733074] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181037MB free_disk=171GB free_vcpus=48 pci_devices=None {{(pid=67169) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 2172.733339] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2172.733402] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2172.798862] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 9435574d-2128-4b20-ba92-ee2aba37d33b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2172.799038] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 6663b166-0d24-45a7-8c2c-e4e68dbe0005 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2172.799170] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 115d6c00-4259-4e87-aa00-90b576a63535 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2172.799294] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 220daf5b-b4fd-49b0-9098-c1f846d6e552 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2172.799414] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 68b94a43-eaa5-4023-8bf5-8cc647c2f098 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2172.799532] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 1ece950d-8b7f-4462-8138-10cbf43149ee actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2172.799649] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance b554602b-2aae-4c1b-9385-4bef16a1dc5a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2172.799764] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 7486bbc9-6aa3-4880-9662-b3451b400bf8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2172.800040] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Total usable vcpus: 48, total allocated vcpus: 8 {{(pid=67169) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 2172.800103] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1536MB phys_disk=200GB used_disk=8GB total_vcpus=48 used_vcpus=8 pci_stats=[] {{(pid=67169) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 2172.906233] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d5a991ac-154e-4e26-bef7-7dea5b9824c1 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2172.914520] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0956faa9-aa49-415c-b08a-9bba37cb852c {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2172.945926] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8ee474ce-b3d2-40bd-89e3-fc9adb779949 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2172.953134] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3f18b029-dcae-4767-b800-10f34a454954 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2172.967421] env[67169]: DEBUG nova.compute.provider_tree [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Inventory has 
not changed in ProviderTree for provider: 6570906a-ac37-4859-b1f2-4bbacc48d3f3 {{(pid=67169) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2172.975767] env[67169]: DEBUG nova.scheduler.client.report [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Inventory has not changed for provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 171, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67169) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2172.991581] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67169) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 2172.991770] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.258s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2192.175530] env[67169]: WARNING oslo_vmware.rw_handles [None req-0a77388e-8384-4a55-b3b4-6cd525a7172b tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2192.175530] env[67169]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2192.175530] env[67169]: ERROR oslo_vmware.rw_handles File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2192.175530] env[67169]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2192.175530] env[67169]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2192.175530] env[67169]: ERROR oslo_vmware.rw_handles response.begin() [ 2192.175530] env[67169]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2192.175530] env[67169]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2192.175530] env[67169]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2192.175530] env[67169]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2192.175530] env[67169]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2192.175530] env[67169]: ERROR oslo_vmware.rw_handles [ 2192.176480] env[67169]: DEBUG nova.virt.vmwareapi.images [None req-0a77388e-8384-4a55-b3b4-6cd525a7172b tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: 9435574d-2128-4b20-ba92-ee2aba37d33b] Downloaded image file data 285931c9-8b83-4997-8c4d-6a79005e36ba to vmware_temp/16208284-3c40-4ac3-9492-9d2f79da9aa5/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk on the data store datastore2 {{(pid=67169) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2192.178040] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-0a77388e-8384-4a55-b3b4-6cd525a7172b tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: 9435574d-2128-4b20-ba92-ee2aba37d33b] Caching image {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2192.178296] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [None 
req-0a77388e-8384-4a55-b3b4-6cd525a7172b tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Copying Virtual Disk [datastore2] vmware_temp/16208284-3c40-4ac3-9492-9d2f79da9aa5/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk to [datastore2] vmware_temp/16208284-3c40-4ac3-9492-9d2f79da9aa5/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk {{(pid=67169) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2192.178588] env[67169]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-ff181a38-3af7-4a3f-9366-213579bfaa50 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2192.187330] env[67169]: DEBUG oslo_vmware.api [None req-0a77388e-8384-4a55-b3b4-6cd525a7172b tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Waiting for the task: (returnval){ [ 2192.187330] env[67169]: value = "task-2819271" [ 2192.187330] env[67169]: _type = "Task" [ 2192.187330] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2192.195276] env[67169]: DEBUG oslo_vmware.api [None req-0a77388e-8384-4a55-b3b4-6cd525a7172b tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Task: {'id': task-2819271, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2192.698036] env[67169]: DEBUG oslo_vmware.exceptions [None req-0a77388e-8384-4a55-b3b4-6cd525a7172b tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Fault InvalidArgument not matched. 
{{(pid=67169) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2192.698329] env[67169]: DEBUG oslo_concurrency.lockutils [None req-0a77388e-8384-4a55-b3b4-6cd525a7172b tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Releasing lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2192.698868] env[67169]: ERROR nova.compute.manager [None req-0a77388e-8384-4a55-b3b4-6cd525a7172b tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: 9435574d-2128-4b20-ba92-ee2aba37d33b] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2192.698868] env[67169]: Faults: ['InvalidArgument'] [ 2192.698868] env[67169]: ERROR nova.compute.manager [instance: 9435574d-2128-4b20-ba92-ee2aba37d33b] Traceback (most recent call last): [ 2192.698868] env[67169]: ERROR nova.compute.manager [instance: 9435574d-2128-4b20-ba92-ee2aba37d33b] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 2192.698868] env[67169]: ERROR nova.compute.manager [instance: 9435574d-2128-4b20-ba92-ee2aba37d33b] yield resources [ 2192.698868] env[67169]: ERROR nova.compute.manager [instance: 9435574d-2128-4b20-ba92-ee2aba37d33b] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2192.698868] env[67169]: ERROR nova.compute.manager [instance: 9435574d-2128-4b20-ba92-ee2aba37d33b] self.driver.spawn(context, instance, image_meta, [ 2192.698868] env[67169]: ERROR nova.compute.manager [instance: 9435574d-2128-4b20-ba92-ee2aba37d33b] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2192.698868] env[67169]: ERROR nova.compute.manager 
[instance: 9435574d-2128-4b20-ba92-ee2aba37d33b] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2192.698868] env[67169]: ERROR nova.compute.manager [instance: 9435574d-2128-4b20-ba92-ee2aba37d33b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2192.698868] env[67169]: ERROR nova.compute.manager [instance: 9435574d-2128-4b20-ba92-ee2aba37d33b] self._fetch_image_if_missing(context, vi) [ 2192.698868] env[67169]: ERROR nova.compute.manager [instance: 9435574d-2128-4b20-ba92-ee2aba37d33b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2192.698868] env[67169]: ERROR nova.compute.manager [instance: 9435574d-2128-4b20-ba92-ee2aba37d33b] image_cache(vi, tmp_image_ds_loc) [ 2192.698868] env[67169]: ERROR nova.compute.manager [instance: 9435574d-2128-4b20-ba92-ee2aba37d33b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2192.698868] env[67169]: ERROR nova.compute.manager [instance: 9435574d-2128-4b20-ba92-ee2aba37d33b] vm_util.copy_virtual_disk( [ 2192.698868] env[67169]: ERROR nova.compute.manager [instance: 9435574d-2128-4b20-ba92-ee2aba37d33b] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2192.698868] env[67169]: ERROR nova.compute.manager [instance: 9435574d-2128-4b20-ba92-ee2aba37d33b] session._wait_for_task(vmdk_copy_task) [ 2192.698868] env[67169]: ERROR nova.compute.manager [instance: 9435574d-2128-4b20-ba92-ee2aba37d33b] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2192.698868] env[67169]: ERROR nova.compute.manager [instance: 9435574d-2128-4b20-ba92-ee2aba37d33b] return self.wait_for_task(task_ref) [ 2192.698868] env[67169]: ERROR nova.compute.manager [instance: 9435574d-2128-4b20-ba92-ee2aba37d33b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2192.698868] env[67169]: ERROR 
nova.compute.manager [instance: 9435574d-2128-4b20-ba92-ee2aba37d33b] return evt.wait() [ 2192.698868] env[67169]: ERROR nova.compute.manager [instance: 9435574d-2128-4b20-ba92-ee2aba37d33b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2192.698868] env[67169]: ERROR nova.compute.manager [instance: 9435574d-2128-4b20-ba92-ee2aba37d33b] result = hub.switch() [ 2192.698868] env[67169]: ERROR nova.compute.manager [instance: 9435574d-2128-4b20-ba92-ee2aba37d33b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2192.698868] env[67169]: ERROR nova.compute.manager [instance: 9435574d-2128-4b20-ba92-ee2aba37d33b] return self.greenlet.switch() [ 2192.698868] env[67169]: ERROR nova.compute.manager [instance: 9435574d-2128-4b20-ba92-ee2aba37d33b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2192.698868] env[67169]: ERROR nova.compute.manager [instance: 9435574d-2128-4b20-ba92-ee2aba37d33b] self.f(*self.args, **self.kw) [ 2192.698868] env[67169]: ERROR nova.compute.manager [instance: 9435574d-2128-4b20-ba92-ee2aba37d33b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2192.698868] env[67169]: ERROR nova.compute.manager [instance: 9435574d-2128-4b20-ba92-ee2aba37d33b] raise exceptions.translate_fault(task_info.error) [ 2192.698868] env[67169]: ERROR nova.compute.manager [instance: 9435574d-2128-4b20-ba92-ee2aba37d33b] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2192.698868] env[67169]: ERROR nova.compute.manager [instance: 9435574d-2128-4b20-ba92-ee2aba37d33b] Faults: ['InvalidArgument'] [ 2192.698868] env[67169]: ERROR nova.compute.manager [instance: 9435574d-2128-4b20-ba92-ee2aba37d33b] [ 2192.699999] env[67169]: INFO nova.compute.manager [None req-0a77388e-8384-4a55-b3b4-6cd525a7172b 
tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: 9435574d-2128-4b20-ba92-ee2aba37d33b] Terminating instance [ 2192.700684] env[67169]: DEBUG oslo_concurrency.lockutils [None req-1e846be1-02cb-41a5-811a-2b9f52cbbee0 tempest-InstanceActionsNegativeTestJSON-1517850368 tempest-InstanceActionsNegativeTestJSON-1517850368-project-member] Acquired lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2192.700892] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-1e846be1-02cb-41a5-811a-2b9f52cbbee0 tempest-InstanceActionsNegativeTestJSON-1517850368 tempest-InstanceActionsNegativeTestJSON-1517850368-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2192.701150] env[67169]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-8512563d-87a7-41f3-adfe-92828eb80b61 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2192.703273] env[67169]: DEBUG nova.compute.manager [None req-0a77388e-8384-4a55-b3b4-6cd525a7172b tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: 9435574d-2128-4b20-ba92-ee2aba37d33b] Start destroying the instance on the hypervisor. 
{{(pid=67169) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2192.703463] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-0a77388e-8384-4a55-b3b4-6cd525a7172b tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: 9435574d-2128-4b20-ba92-ee2aba37d33b] Destroying instance {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2192.704200] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2f9535f6-749c-4133-abbc-457a1cbb2cf9 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2192.710855] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-0a77388e-8384-4a55-b3b4-6cd525a7172b tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: 9435574d-2128-4b20-ba92-ee2aba37d33b] Unregistering the VM {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2192.711796] env[67169]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-f3adf99d-25bc-4daa-b68f-0965180cea1c {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2192.713162] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-1e846be1-02cb-41a5-811a-2b9f52cbbee0 tempest-InstanceActionsNegativeTestJSON-1517850368 tempest-InstanceActionsNegativeTestJSON-1517850368-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2192.713333] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-1e846be1-02cb-41a5-811a-2b9f52cbbee0 tempest-InstanceActionsNegativeTestJSON-1517850368 tempest-InstanceActionsNegativeTestJSON-1517850368-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=67169) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2192.714022] env[67169]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-7754ad7b-97e5-4cc5-8352-f9838299e583 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2192.719172] env[67169]: DEBUG oslo_vmware.api [None req-1e846be1-02cb-41a5-811a-2b9f52cbbee0 tempest-InstanceActionsNegativeTestJSON-1517850368 tempest-InstanceActionsNegativeTestJSON-1517850368-project-member] Waiting for the task: (returnval){ [ 2192.719172] env[67169]: value = "session[52518c09-e5f2-035d-4dc1-3e29ddf94015]52ef1ceb-f300-ad23-85b2-27bd90ee010d" [ 2192.719172] env[67169]: _type = "Task" [ 2192.719172] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2192.726090] env[67169]: DEBUG oslo_vmware.api [None req-1e846be1-02cb-41a5-811a-2b9f52cbbee0 tempest-InstanceActionsNegativeTestJSON-1517850368 tempest-InstanceActionsNegativeTestJSON-1517850368-project-member] Task: {'id': session[52518c09-e5f2-035d-4dc1-3e29ddf94015]52ef1ceb-f300-ad23-85b2-27bd90ee010d, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2192.790662] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-0a77388e-8384-4a55-b3b4-6cd525a7172b tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: 9435574d-2128-4b20-ba92-ee2aba37d33b] Unregistered the VM {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2192.790903] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-0a77388e-8384-4a55-b3b4-6cd525a7172b tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: 9435574d-2128-4b20-ba92-ee2aba37d33b] Deleting contents of the VM from datastore datastore2 {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2192.791083] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-0a77388e-8384-4a55-b3b4-6cd525a7172b tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Deleting the datastore file [datastore2] 9435574d-2128-4b20-ba92-ee2aba37d33b {{(pid=67169) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2192.791363] env[67169]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-f5908cb5-1baf-41aa-8f08-a9587782240e {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2192.797436] env[67169]: DEBUG oslo_vmware.api [None req-0a77388e-8384-4a55-b3b4-6cd525a7172b tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Waiting for the task: (returnval){ [ 2192.797436] env[67169]: value = "task-2819273" [ 2192.797436] env[67169]: _type = "Task" [ 2192.797436] env[67169]: } to complete. 
{{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2192.804849] env[67169]: DEBUG oslo_vmware.api [None req-0a77388e-8384-4a55-b3b4-6cd525a7172b tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Task: {'id': task-2819273, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2193.229649] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-1e846be1-02cb-41a5-811a-2b9f52cbbee0 tempest-InstanceActionsNegativeTestJSON-1517850368 tempest-InstanceActionsNegativeTestJSON-1517850368-project-member] [instance: 6663b166-0d24-45a7-8c2c-e4e68dbe0005] Preparing fetch location {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2193.230077] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-1e846be1-02cb-41a5-811a-2b9f52cbbee0 tempest-InstanceActionsNegativeTestJSON-1517850368 tempest-InstanceActionsNegativeTestJSON-1517850368-project-member] Creating directory with path [datastore2] vmware_temp/d78b50f0-453a-4c27-b981-69e818930869/285931c9-8b83-4997-8c4d-6a79005e36ba {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2193.230453] env[67169]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-867c3d84-72e1-4768-8c84-35e6a42e2641 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2193.241489] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-1e846be1-02cb-41a5-811a-2b9f52cbbee0 tempest-InstanceActionsNegativeTestJSON-1517850368 tempest-InstanceActionsNegativeTestJSON-1517850368-project-member] Created directory with path [datastore2] vmware_temp/d78b50f0-453a-4c27-b981-69e818930869/285931c9-8b83-4997-8c4d-6a79005e36ba {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2193.241669] 
env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-1e846be1-02cb-41a5-811a-2b9f52cbbee0 tempest-InstanceActionsNegativeTestJSON-1517850368 tempest-InstanceActionsNegativeTestJSON-1517850368-project-member] [instance: 6663b166-0d24-45a7-8c2c-e4e68dbe0005] Fetch image to [datastore2] vmware_temp/d78b50f0-453a-4c27-b981-69e818930869/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2193.241840] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-1e846be1-02cb-41a5-811a-2b9f52cbbee0 tempest-InstanceActionsNegativeTestJSON-1517850368 tempest-InstanceActionsNegativeTestJSON-1517850368-project-member] [instance: 6663b166-0d24-45a7-8c2c-e4e68dbe0005] Downloading image file data 285931c9-8b83-4997-8c4d-6a79005e36ba to [datastore2] vmware_temp/d78b50f0-453a-4c27-b981-69e818930869/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk on the data store datastore2 {{(pid=67169) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2193.242531] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5d825a28-3a82-470e-b095-33fad7a1c112 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2193.248590] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f06f688a-9e58-4911-8803-2e74e16b956e {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2193.257157] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-13ea88e7-dede-4bda-b9ad-b6e982714af4 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2193.288823] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-48bf15e4-9752-47c5-b15c-a0fe7ec0fdbf {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2193.293427] env[67169]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-c571696e-3210-4e88-9f6b-7782ba636f51 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2193.305729] env[67169]: DEBUG oslo_vmware.api [None req-0a77388e-8384-4a55-b3b4-6cd525a7172b tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Task: {'id': task-2819273, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.074246} completed successfully. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2193.305958] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-0a77388e-8384-4a55-b3b4-6cd525a7172b tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Deleted the datastore file {{(pid=67169) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2193.307928] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-0a77388e-8384-4a55-b3b4-6cd525a7172b tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: 9435574d-2128-4b20-ba92-ee2aba37d33b] Deleted contents of the VM from datastore datastore2 {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2193.307928] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-0a77388e-8384-4a55-b3b4-6cd525a7172b tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: 9435574d-2128-4b20-ba92-ee2aba37d33b] Instance destroyed {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2193.307928] env[67169]: INFO nova.compute.manager [None 
req-0a77388e-8384-4a55-b3b4-6cd525a7172b tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: 9435574d-2128-4b20-ba92-ee2aba37d33b] Took 0.60 seconds to destroy the instance on the hypervisor. [ 2193.308962] env[67169]: DEBUG nova.compute.claims [None req-0a77388e-8384-4a55-b3b4-6cd525a7172b tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: 9435574d-2128-4b20-ba92-ee2aba37d33b] Aborting claim: {{(pid=67169) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 2193.309155] env[67169]: DEBUG oslo_concurrency.lockutils [None req-0a77388e-8384-4a55-b3b4-6cd525a7172b tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2193.309445] env[67169]: DEBUG oslo_concurrency.lockutils [None req-0a77388e-8384-4a55-b3b4-6cd525a7172b tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2193.314783] env[67169]: DEBUG nova.virt.vmwareapi.images [None req-1e846be1-02cb-41a5-811a-2b9f52cbbee0 tempest-InstanceActionsNegativeTestJSON-1517850368 tempest-InstanceActionsNegativeTestJSON-1517850368-project-member] [instance: 6663b166-0d24-45a7-8c2c-e4e68dbe0005] Downloading image file data 285931c9-8b83-4997-8c4d-6a79005e36ba to the data store datastore2 {{(pid=67169) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2193.369442] env[67169]: DEBUG oslo_vmware.rw_handles [None 
req-1e846be1-02cb-41a5-811a-2b9f52cbbee0 tempest-InstanceActionsNegativeTestJSON-1517850368 tempest-InstanceActionsNegativeTestJSON-1517850368-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/d78b50f0-453a-4c27-b981-69e818930869/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=67169) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2193.434847] env[67169]: DEBUG oslo_vmware.rw_handles [None req-1e846be1-02cb-41a5-811a-2b9f52cbbee0 tempest-InstanceActionsNegativeTestJSON-1517850368 tempest-InstanceActionsNegativeTestJSON-1517850368-project-member] Completed reading data from the image iterator. {{(pid=67169) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2193.435439] env[67169]: DEBUG oslo_vmware.rw_handles [None req-1e846be1-02cb-41a5-811a-2b9f52cbbee0 tempest-InstanceActionsNegativeTestJSON-1517850368 tempest-InstanceActionsNegativeTestJSON-1517850368-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/d78b50f0-453a-4c27-b981-69e818930869/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=67169) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2193.513048] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0b7b472a-567a-49fd-9133-411e79287d86 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2193.520485] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-31eda958-a22c-47d2-a633-5d1b7466e2ca {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2193.551014] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5975a4f8-a116-4aa4-9cf2-a58658486bb1 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2193.557628] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-67bafc5b-5b58-40cc-89ea-484062198d66 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2193.570876] env[67169]: DEBUG nova.compute.provider_tree [None req-0a77388e-8384-4a55-b3b4-6cd525a7172b tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Inventory has not changed in ProviderTree for provider: 6570906a-ac37-4859-b1f2-4bbacc48d3f3 {{(pid=67169) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2193.580056] env[67169]: DEBUG nova.scheduler.client.report [None req-0a77388e-8384-4a55-b3b4-6cd525a7172b tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Inventory has not changed for provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 
4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 171, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67169) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2193.592584] env[67169]: DEBUG oslo_concurrency.lockutils [None req-0a77388e-8384-4a55-b3b4-6cd525a7172b tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.283s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2193.593133] env[67169]: ERROR nova.compute.manager [None req-0a77388e-8384-4a55-b3b4-6cd525a7172b tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: 9435574d-2128-4b20-ba92-ee2aba37d33b] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2193.593133] env[67169]: Faults: ['InvalidArgument'] [ 2193.593133] env[67169]: ERROR nova.compute.manager [instance: 9435574d-2128-4b20-ba92-ee2aba37d33b] Traceback (most recent call last): [ 2193.593133] env[67169]: ERROR nova.compute.manager [instance: 9435574d-2128-4b20-ba92-ee2aba37d33b] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2193.593133] env[67169]: ERROR nova.compute.manager [instance: 9435574d-2128-4b20-ba92-ee2aba37d33b] self.driver.spawn(context, instance, image_meta, [ 2193.593133] env[67169]: ERROR nova.compute.manager [instance: 9435574d-2128-4b20-ba92-ee2aba37d33b] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2193.593133] env[67169]: ERROR nova.compute.manager [instance: 9435574d-2128-4b20-ba92-ee2aba37d33b] self._vmops.spawn(context, 
instance, image_meta, injected_files, [ 2193.593133] env[67169]: ERROR nova.compute.manager [instance: 9435574d-2128-4b20-ba92-ee2aba37d33b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2193.593133] env[67169]: ERROR nova.compute.manager [instance: 9435574d-2128-4b20-ba92-ee2aba37d33b] self._fetch_image_if_missing(context, vi) [ 2193.593133] env[67169]: ERROR nova.compute.manager [instance: 9435574d-2128-4b20-ba92-ee2aba37d33b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2193.593133] env[67169]: ERROR nova.compute.manager [instance: 9435574d-2128-4b20-ba92-ee2aba37d33b] image_cache(vi, tmp_image_ds_loc) [ 2193.593133] env[67169]: ERROR nova.compute.manager [instance: 9435574d-2128-4b20-ba92-ee2aba37d33b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2193.593133] env[67169]: ERROR nova.compute.manager [instance: 9435574d-2128-4b20-ba92-ee2aba37d33b] vm_util.copy_virtual_disk( [ 2193.593133] env[67169]: ERROR nova.compute.manager [instance: 9435574d-2128-4b20-ba92-ee2aba37d33b] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2193.593133] env[67169]: ERROR nova.compute.manager [instance: 9435574d-2128-4b20-ba92-ee2aba37d33b] session._wait_for_task(vmdk_copy_task) [ 2193.593133] env[67169]: ERROR nova.compute.manager [instance: 9435574d-2128-4b20-ba92-ee2aba37d33b] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2193.593133] env[67169]: ERROR nova.compute.manager [instance: 9435574d-2128-4b20-ba92-ee2aba37d33b] return self.wait_for_task(task_ref) [ 2193.593133] env[67169]: ERROR nova.compute.manager [instance: 9435574d-2128-4b20-ba92-ee2aba37d33b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2193.593133] env[67169]: ERROR nova.compute.manager [instance: 9435574d-2128-4b20-ba92-ee2aba37d33b] return evt.wait() [ 
2193.593133] env[67169]: ERROR nova.compute.manager [instance: 9435574d-2128-4b20-ba92-ee2aba37d33b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2193.593133] env[67169]: ERROR nova.compute.manager [instance: 9435574d-2128-4b20-ba92-ee2aba37d33b] result = hub.switch() [ 2193.593133] env[67169]: ERROR nova.compute.manager [instance: 9435574d-2128-4b20-ba92-ee2aba37d33b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2193.593133] env[67169]: ERROR nova.compute.manager [instance: 9435574d-2128-4b20-ba92-ee2aba37d33b] return self.greenlet.switch() [ 2193.593133] env[67169]: ERROR nova.compute.manager [instance: 9435574d-2128-4b20-ba92-ee2aba37d33b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2193.593133] env[67169]: ERROR nova.compute.manager [instance: 9435574d-2128-4b20-ba92-ee2aba37d33b] self.f(*self.args, **self.kw) [ 2193.593133] env[67169]: ERROR nova.compute.manager [instance: 9435574d-2128-4b20-ba92-ee2aba37d33b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2193.593133] env[67169]: ERROR nova.compute.manager [instance: 9435574d-2128-4b20-ba92-ee2aba37d33b] raise exceptions.translate_fault(task_info.error) [ 2193.593133] env[67169]: ERROR nova.compute.manager [instance: 9435574d-2128-4b20-ba92-ee2aba37d33b] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2193.593133] env[67169]: ERROR nova.compute.manager [instance: 9435574d-2128-4b20-ba92-ee2aba37d33b] Faults: ['InvalidArgument'] [ 2193.593133] env[67169]: ERROR nova.compute.manager [instance: 9435574d-2128-4b20-ba92-ee2aba37d33b] [ 2193.594063] env[67169]: DEBUG nova.compute.utils [None req-0a77388e-8384-4a55-b3b4-6cd525a7172b tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: 
9435574d-2128-4b20-ba92-ee2aba37d33b] VimFaultException {{(pid=67169) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2193.595196] env[67169]: DEBUG nova.compute.manager [None req-0a77388e-8384-4a55-b3b4-6cd525a7172b tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: 9435574d-2128-4b20-ba92-ee2aba37d33b] Build of instance 9435574d-2128-4b20-ba92-ee2aba37d33b was re-scheduled: A specified parameter was not correct: fileType [ 2193.595196] env[67169]: Faults: ['InvalidArgument'] {{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 2193.595568] env[67169]: DEBUG nova.compute.manager [None req-0a77388e-8384-4a55-b3b4-6cd525a7172b tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: 9435574d-2128-4b20-ba92-ee2aba37d33b] Unplugging VIFs for instance {{(pid=67169) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 2193.595738] env[67169]: DEBUG nova.compute.manager [None req-0a77388e-8384-4a55-b3b4-6cd525a7172b tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67169) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 2193.595909] env[67169]: DEBUG nova.compute.manager [None req-0a77388e-8384-4a55-b3b4-6cd525a7172b tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: 9435574d-2128-4b20-ba92-ee2aba37d33b] Deallocating network for instance {{(pid=67169) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2193.596082] env[67169]: DEBUG nova.network.neutron [None req-0a77388e-8384-4a55-b3b4-6cd525a7172b tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: 9435574d-2128-4b20-ba92-ee2aba37d33b] deallocate_for_instance() {{(pid=67169) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2194.137666] env[67169]: DEBUG nova.network.neutron [None req-0a77388e-8384-4a55-b3b4-6cd525a7172b tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: 9435574d-2128-4b20-ba92-ee2aba37d33b] Updating instance_info_cache with network_info: [] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2194.148676] env[67169]: INFO nova.compute.manager [None req-0a77388e-8384-4a55-b3b4-6cd525a7172b tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: 9435574d-2128-4b20-ba92-ee2aba37d33b] Took 0.55 seconds to deallocate network for instance. 
[ 2194.249278] env[67169]: INFO nova.scheduler.client.report [None req-0a77388e-8384-4a55-b3b4-6cd525a7172b tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Deleted allocations for instance 9435574d-2128-4b20-ba92-ee2aba37d33b [ 2194.270617] env[67169]: DEBUG oslo_concurrency.lockutils [None req-0a77388e-8384-4a55-b3b4-6cd525a7172b tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Lock "9435574d-2128-4b20-ba92-ee2aba37d33b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 620.065s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2194.270617] env[67169]: DEBUG oslo_concurrency.lockutils [None req-5368871f-3c60-4f1a-9c77-d999af4747d6 tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Lock "9435574d-2128-4b20-ba92-ee2aba37d33b" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 424.045s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2194.270827] env[67169]: DEBUG oslo_concurrency.lockutils [None req-5368871f-3c60-4f1a-9c77-d999af4747d6 tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Acquiring lock "9435574d-2128-4b20-ba92-ee2aba37d33b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2194.270890] env[67169]: DEBUG oslo_concurrency.lockutils [None req-5368871f-3c60-4f1a-9c77-d999af4747d6 tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Lock "9435574d-2128-4b20-ba92-ee2aba37d33b-events" acquired by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2194.271075] env[67169]: DEBUG oslo_concurrency.lockutils [None req-5368871f-3c60-4f1a-9c77-d999af4747d6 tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Lock "9435574d-2128-4b20-ba92-ee2aba37d33b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2194.272910] env[67169]: INFO nova.compute.manager [None req-5368871f-3c60-4f1a-9c77-d999af4747d6 tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: 9435574d-2128-4b20-ba92-ee2aba37d33b] Terminating instance [ 2194.274840] env[67169]: DEBUG nova.compute.manager [None req-5368871f-3c60-4f1a-9c77-d999af4747d6 tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: 9435574d-2128-4b20-ba92-ee2aba37d33b] Start destroying the instance on the hypervisor. 
{{(pid=67169) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2194.274940] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-5368871f-3c60-4f1a-9c77-d999af4747d6 tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: 9435574d-2128-4b20-ba92-ee2aba37d33b] Destroying instance {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2194.275385] env[67169]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-85060ed8-8efe-41a5-9356-89a35b6755af {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2194.284028] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d2894800-32c2-4c59-9172-21efa3a1b511 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2194.311378] env[67169]: WARNING nova.virt.vmwareapi.vmops [None req-5368871f-3c60-4f1a-9c77-d999af4747d6 tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: 9435574d-2128-4b20-ba92-ee2aba37d33b] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 9435574d-2128-4b20-ba92-ee2aba37d33b could not be found. 
[ 2194.311572] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-5368871f-3c60-4f1a-9c77-d999af4747d6 tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: 9435574d-2128-4b20-ba92-ee2aba37d33b] Instance destroyed {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2194.311746] env[67169]: INFO nova.compute.manager [None req-5368871f-3c60-4f1a-9c77-d999af4747d6 tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] [instance: 9435574d-2128-4b20-ba92-ee2aba37d33b] Took 0.04 seconds to destroy the instance on the hypervisor. [ 2194.311978] env[67169]: DEBUG oslo.service.loopingcall [None req-5368871f-3c60-4f1a-9c77-d999af4747d6 tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=67169) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2194.312202] env[67169]: DEBUG nova.compute.manager [-] [instance: 9435574d-2128-4b20-ba92-ee2aba37d33b] Deallocating network for instance {{(pid=67169) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2194.312299] env[67169]: DEBUG nova.network.neutron [-] [instance: 9435574d-2128-4b20-ba92-ee2aba37d33b] deallocate_for_instance() {{(pid=67169) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2194.340272] env[67169]: DEBUG nova.network.neutron [-] [instance: 9435574d-2128-4b20-ba92-ee2aba37d33b] Updating instance_info_cache with network_info: [] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2194.347948] env[67169]: INFO nova.compute.manager [-] [instance: 9435574d-2128-4b20-ba92-ee2aba37d33b] Took 0.04 seconds to deallocate network for instance. 
[ 2194.430157] env[67169]: DEBUG oslo_concurrency.lockutils [None req-5368871f-3c60-4f1a-9c77-d999af4747d6 tempest-AttachInterfacesTestJSON-1789409843 tempest-AttachInterfacesTestJSON-1789409843-project-member] Lock "9435574d-2128-4b20-ba92-ee2aba37d33b" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.159s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2194.431278] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "9435574d-2128-4b20-ba92-ee2aba37d33b" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 390.744s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2194.431278] env[67169]: INFO nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 9435574d-2128-4b20-ba92-ee2aba37d33b] During sync_power_state the instance has a pending task (deleting). Skip. 
[ 2194.431278] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "9435574d-2128-4b20-ba92-ee2aba37d33b" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 2211.867489] env[67169]: DEBUG oslo_concurrency.lockutils [None req-d38a8ef2-3e92-4000-be04-bb6044e64705 tempest-ServersV294TestFqdnHostnames-1376888705 tempest-ServersV294TestFqdnHostnames-1376888705-project-member] Acquiring lock "0dcfb66e-080f-4b51-9f3d-1a29aea0af4a" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 2211.867779] env[67169]: DEBUG oslo_concurrency.lockutils [None req-d38a8ef2-3e92-4000-be04-bb6044e64705 tempest-ServersV294TestFqdnHostnames-1376888705 tempest-ServersV294TestFqdnHostnames-1376888705-project-member] Lock "0dcfb66e-080f-4b51-9f3d-1a29aea0af4a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 2211.882600] env[67169]: DEBUG nova.compute.manager [None req-d38a8ef2-3e92-4000-be04-bb6044e64705 tempest-ServersV294TestFqdnHostnames-1376888705 tempest-ServersV294TestFqdnHostnames-1376888705-project-member] [instance: 0dcfb66e-080f-4b51-9f3d-1a29aea0af4a] Starting instance... {{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}}
[ 2211.938277] env[67169]: DEBUG oslo_concurrency.lockutils [None req-d38a8ef2-3e92-4000-be04-bb6044e64705 tempest-ServersV294TestFqdnHostnames-1376888705 tempest-ServersV294TestFqdnHostnames-1376888705-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 2211.938540] env[67169]: DEBUG oslo_concurrency.lockutils [None req-d38a8ef2-3e92-4000-be04-bb6044e64705 tempest-ServersV294TestFqdnHostnames-1376888705 tempest-ServersV294TestFqdnHostnames-1376888705-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 2211.940353] env[67169]: INFO nova.compute.claims [None req-d38a8ef2-3e92-4000-be04-bb6044e64705 tempest-ServersV294TestFqdnHostnames-1376888705 tempest-ServersV294TestFqdnHostnames-1376888705-project-member] [instance: 0dcfb66e-080f-4b51-9f3d-1a29aea0af4a] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 2212.097258] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4aa7e793-f214-4ff8-a5d1-2fd1713ff750 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2212.104737] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-509d1247-7c29-4581-9d94-cf61443cc93c {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2212.136038] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9cbf21b4-c661-499d-a2a5-5300abbe40fa {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2212.142847] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b1c18938-ff70-4d85-9b6e-b9fdafb206da {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2212.156850] env[67169]: DEBUG nova.compute.provider_tree [None req-d38a8ef2-3e92-4000-be04-bb6044e64705 tempest-ServersV294TestFqdnHostnames-1376888705 tempest-ServersV294TestFqdnHostnames-1376888705-project-member] Inventory has not changed in ProviderTree for provider: 6570906a-ac37-4859-b1f2-4bbacc48d3f3 {{(pid=67169) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 2212.165368] env[67169]: DEBUG nova.scheduler.client.report [None req-d38a8ef2-3e92-4000-be04-bb6044e64705 tempest-ServersV294TestFqdnHostnames-1376888705 tempest-ServersV294TestFqdnHostnames-1376888705-project-member] Inventory has not changed for provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 171, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67169) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 2212.179131] env[67169]: DEBUG oslo_concurrency.lockutils [None req-d38a8ef2-3e92-4000-be04-bb6044e64705 tempest-ServersV294TestFqdnHostnames-1376888705 tempest-ServersV294TestFqdnHostnames-1376888705-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.241s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 2212.179668] env[67169]: DEBUG nova.compute.manager [None req-d38a8ef2-3e92-4000-be04-bb6044e64705 tempest-ServersV294TestFqdnHostnames-1376888705 tempest-ServersV294TestFqdnHostnames-1376888705-project-member] [instance: 0dcfb66e-080f-4b51-9f3d-1a29aea0af4a] Start building networks asynchronously for instance. {{(pid=67169) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}}
[ 2212.219894] env[67169]: DEBUG nova.compute.utils [None req-d38a8ef2-3e92-4000-be04-bb6044e64705 tempest-ServersV294TestFqdnHostnames-1376888705 tempest-ServersV294TestFqdnHostnames-1376888705-project-member] Using /dev/sd instead of None {{(pid=67169) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 2212.221102] env[67169]: DEBUG nova.compute.manager [None req-d38a8ef2-3e92-4000-be04-bb6044e64705 tempest-ServersV294TestFqdnHostnames-1376888705 tempest-ServersV294TestFqdnHostnames-1376888705-project-member] [instance: 0dcfb66e-080f-4b51-9f3d-1a29aea0af4a] Allocating IP information in the background. {{(pid=67169) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}}
[ 2212.221276] env[67169]: DEBUG nova.network.neutron [None req-d38a8ef2-3e92-4000-be04-bb6044e64705 tempest-ServersV294TestFqdnHostnames-1376888705 tempest-ServersV294TestFqdnHostnames-1376888705-project-member] [instance: 0dcfb66e-080f-4b51-9f3d-1a29aea0af4a] allocate_for_instance() {{(pid=67169) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 2212.231021] env[67169]: DEBUG nova.compute.manager [None req-d38a8ef2-3e92-4000-be04-bb6044e64705 tempest-ServersV294TestFqdnHostnames-1376888705 tempest-ServersV294TestFqdnHostnames-1376888705-project-member] [instance: 0dcfb66e-080f-4b51-9f3d-1a29aea0af4a] Start building block device mappings for instance. {{(pid=67169) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}}
[ 2212.277792] env[67169]: DEBUG nova.policy [None req-d38a8ef2-3e92-4000-be04-bb6044e64705 tempest-ServersV294TestFqdnHostnames-1376888705 tempest-ServersV294TestFqdnHostnames-1376888705-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'fbd13f2cb06f49f2bf40d9f240d9d308', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd2bb49d543c3408383e541d3e4af0b8f', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67169) authorize /opt/stack/nova/nova/policy.py:203}}
[ 2212.301661] env[67169]: DEBUG nova.compute.manager [None req-d38a8ef2-3e92-4000-be04-bb6044e64705 tempest-ServersV294TestFqdnHostnames-1376888705 tempest-ServersV294TestFqdnHostnames-1376888705-project-member] [instance: 0dcfb66e-080f-4b51-9f3d-1a29aea0af4a] Start spawning the instance on the hypervisor. {{(pid=67169) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}}
[ 2212.330586] env[67169]: DEBUG nova.virt.hardware [None req-d38a8ef2-3e92-4000-be04-bb6044e64705 tempest-ServersV294TestFqdnHostnames-1376888705 tempest-ServersV294TestFqdnHostnames-1376888705-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-03-06T19:55:10Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-03-06T19:54:55Z,direct_url=,disk_format='vmdk',id=285931c9-8b83-4997-8c4d-6a79005e36ba,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c31f6504bb73492890b262ff43fdf9bc',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-03-06T19:54:55Z,virtual_size=,visibility=), allow threads: False {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 2212.330859] env[67169]: DEBUG nova.virt.hardware [None req-d38a8ef2-3e92-4000-be04-bb6044e64705 tempest-ServersV294TestFqdnHostnames-1376888705 tempest-ServersV294TestFqdnHostnames-1376888705-project-member] Flavor limits 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 2212.331040] env[67169]: DEBUG nova.virt.hardware [None req-d38a8ef2-3e92-4000-be04-bb6044e64705 tempest-ServersV294TestFqdnHostnames-1376888705 tempest-ServersV294TestFqdnHostnames-1376888705-project-member] Image limits 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 2212.331232] env[67169]: DEBUG nova.virt.hardware [None req-d38a8ef2-3e92-4000-be04-bb6044e64705 tempest-ServersV294TestFqdnHostnames-1376888705 tempest-ServersV294TestFqdnHostnames-1376888705-project-member] Flavor pref 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 2212.331376] env[67169]: DEBUG nova.virt.hardware [None req-d38a8ef2-3e92-4000-be04-bb6044e64705 tempest-ServersV294TestFqdnHostnames-1376888705 tempest-ServersV294TestFqdnHostnames-1376888705-project-member] Image pref 0:0:0 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 2212.331518] env[67169]: DEBUG nova.virt.hardware [None req-d38a8ef2-3e92-4000-be04-bb6044e64705 tempest-ServersV294TestFqdnHostnames-1376888705 tempest-ServersV294TestFqdnHostnames-1376888705-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67169) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 2212.331720] env[67169]: DEBUG nova.virt.hardware [None req-d38a8ef2-3e92-4000-be04-bb6044e64705 tempest-ServersV294TestFqdnHostnames-1376888705 tempest-ServersV294TestFqdnHostnames-1376888705-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 2212.331877] env[67169]: DEBUG nova.virt.hardware [None req-d38a8ef2-3e92-4000-be04-bb6044e64705 tempest-ServersV294TestFqdnHostnames-1376888705 tempest-ServersV294TestFqdnHostnames-1376888705-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67169) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 2212.332058] env[67169]: DEBUG nova.virt.hardware [None req-d38a8ef2-3e92-4000-be04-bb6044e64705 tempest-ServersV294TestFqdnHostnames-1376888705 tempest-ServersV294TestFqdnHostnames-1376888705-project-member] Got 1 possible topologies {{(pid=67169) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 2212.332226] env[67169]: DEBUG nova.virt.hardware [None req-d38a8ef2-3e92-4000-be04-bb6044e64705 tempest-ServersV294TestFqdnHostnames-1376888705 tempest-ServersV294TestFqdnHostnames-1376888705-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 2212.332397] env[67169]: DEBUG nova.virt.hardware [None req-d38a8ef2-3e92-4000-be04-bb6044e64705 tempest-ServersV294TestFqdnHostnames-1376888705 tempest-ServersV294TestFqdnHostnames-1376888705-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67169) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 2212.333290] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f94b60e6-db21-49cb-9cf5-ffad1942f76e {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2212.342178] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f81eea84-8c73-4070-b64d-f5f3c7c36955 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2212.802800] env[67169]: DEBUG nova.network.neutron [None req-d38a8ef2-3e92-4000-be04-bb6044e64705 tempest-ServersV294TestFqdnHostnames-1376888705 tempest-ServersV294TestFqdnHostnames-1376888705-project-member] [instance: 0dcfb66e-080f-4b51-9f3d-1a29aea0af4a] Successfully created port: 6e949afa-8ae3-482f-a681-00e02c8112e9 {{(pid=67169) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 2213.460043] env[67169]: DEBUG nova.compute.manager [req-460cb171-b828-4eed-9c85-4e8edac32dbb req-839d9783-432d-4aff-8a22-338d821e45c5 service nova] [instance: 0dcfb66e-080f-4b51-9f3d-1a29aea0af4a] Received event network-vif-plugged-6e949afa-8ae3-482f-a681-00e02c8112e9 {{(pid=67169) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}}
[ 2213.460532] env[67169]: DEBUG oslo_concurrency.lockutils [req-460cb171-b828-4eed-9c85-4e8edac32dbb req-839d9783-432d-4aff-8a22-338d821e45c5 service nova] Acquiring lock "0dcfb66e-080f-4b51-9f3d-1a29aea0af4a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 2213.460532] env[67169]: DEBUG oslo_concurrency.lockutils [req-460cb171-b828-4eed-9c85-4e8edac32dbb req-839d9783-432d-4aff-8a22-338d821e45c5 service nova] Lock "0dcfb66e-080f-4b51-9f3d-1a29aea0af4a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 2213.460652] env[67169]: DEBUG oslo_concurrency.lockutils [req-460cb171-b828-4eed-9c85-4e8edac32dbb req-839d9783-432d-4aff-8a22-338d821e45c5 service nova] Lock "0dcfb66e-080f-4b51-9f3d-1a29aea0af4a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 2213.460791] env[67169]: DEBUG nova.compute.manager [req-460cb171-b828-4eed-9c85-4e8edac32dbb req-839d9783-432d-4aff-8a22-338d821e45c5 service nova] [instance: 0dcfb66e-080f-4b51-9f3d-1a29aea0af4a] No waiting events found dispatching network-vif-plugged-6e949afa-8ae3-482f-a681-00e02c8112e9 {{(pid=67169) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}}
[ 2213.460954] env[67169]: WARNING nova.compute.manager [req-460cb171-b828-4eed-9c85-4e8edac32dbb req-839d9783-432d-4aff-8a22-338d821e45c5 service nova] [instance: 0dcfb66e-080f-4b51-9f3d-1a29aea0af4a] Received unexpected event network-vif-plugged-6e949afa-8ae3-482f-a681-00e02c8112e9 for instance with vm_state building and task_state spawning.
[ 2213.512096] env[67169]: DEBUG nova.network.neutron [None req-d38a8ef2-3e92-4000-be04-bb6044e64705 tempest-ServersV294TestFqdnHostnames-1376888705 tempest-ServersV294TestFqdnHostnames-1376888705-project-member] [instance: 0dcfb66e-080f-4b51-9f3d-1a29aea0af4a] Successfully updated port: 6e949afa-8ae3-482f-a681-00e02c8112e9 {{(pid=67169) _update_port /opt/stack/nova/nova/network/neutron.py:586}}
[ 2213.521929] env[67169]: DEBUG oslo_concurrency.lockutils [None req-d38a8ef2-3e92-4000-be04-bb6044e64705 tempest-ServersV294TestFqdnHostnames-1376888705 tempest-ServersV294TestFqdnHostnames-1376888705-project-member] Acquiring lock "refresh_cache-0dcfb66e-080f-4b51-9f3d-1a29aea0af4a" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 2213.522114] env[67169]: DEBUG oslo_concurrency.lockutils [None req-d38a8ef2-3e92-4000-be04-bb6044e64705 tempest-ServersV294TestFqdnHostnames-1376888705 tempest-ServersV294TestFqdnHostnames-1376888705-project-member] Acquired lock "refresh_cache-0dcfb66e-080f-4b51-9f3d-1a29aea0af4a" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 2213.522276] env[67169]: DEBUG nova.network.neutron [None req-d38a8ef2-3e92-4000-be04-bb6044e64705 tempest-ServersV294TestFqdnHostnames-1376888705 tempest-ServersV294TestFqdnHostnames-1376888705-project-member] [instance: 0dcfb66e-080f-4b51-9f3d-1a29aea0af4a] Building network info cache for instance {{(pid=67169) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 2213.558803] env[67169]: DEBUG nova.network.neutron [None req-d38a8ef2-3e92-4000-be04-bb6044e64705 tempest-ServersV294TestFqdnHostnames-1376888705 tempest-ServersV294TestFqdnHostnames-1376888705-project-member] [instance: 0dcfb66e-080f-4b51-9f3d-1a29aea0af4a] Instance cache missing network info. {{(pid=67169) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 2213.707952] env[67169]: DEBUG nova.network.neutron [None req-d38a8ef2-3e92-4000-be04-bb6044e64705 tempest-ServersV294TestFqdnHostnames-1376888705 tempest-ServersV294TestFqdnHostnames-1376888705-project-member] [instance: 0dcfb66e-080f-4b51-9f3d-1a29aea0af4a] Updating instance_info_cache with network_info: [{"id": "6e949afa-8ae3-482f-a681-00e02c8112e9", "address": "fa:16:3e:98:7e:b5", "network": {"id": "08ec08f8-9e69-4040-a9b8-ae4bb115e94b", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-111845408-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "d2bb49d543c3408383e541d3e4af0b8f", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "5ba07329-1d3e-4ba8-8774-d029262318c4", "external-id": "nsx-vlan-transportzone-534", "segmentation_id": 534, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap6e949afa-8a", "ovs_interfaceid": "6e949afa-8ae3-482f-a681-00e02c8112e9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 2213.718120] env[67169]: DEBUG oslo_concurrency.lockutils [None req-d38a8ef2-3e92-4000-be04-bb6044e64705 tempest-ServersV294TestFqdnHostnames-1376888705 tempest-ServersV294TestFqdnHostnames-1376888705-project-member] Releasing lock "refresh_cache-0dcfb66e-080f-4b51-9f3d-1a29aea0af4a" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 2213.718406] env[67169]: DEBUG nova.compute.manager [None req-d38a8ef2-3e92-4000-be04-bb6044e64705 tempest-ServersV294TestFqdnHostnames-1376888705 tempest-ServersV294TestFqdnHostnames-1376888705-project-member] [instance: 0dcfb66e-080f-4b51-9f3d-1a29aea0af4a] Instance network_info: |[{"id": "6e949afa-8ae3-482f-a681-00e02c8112e9", "address": "fa:16:3e:98:7e:b5", "network": {"id": "08ec08f8-9e69-4040-a9b8-ae4bb115e94b", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-111845408-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "d2bb49d543c3408383e541d3e4af0b8f", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "5ba07329-1d3e-4ba8-8774-d029262318c4", "external-id": "nsx-vlan-transportzone-534", "segmentation_id": 534, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap6e949afa-8a", "ovs_interfaceid": "6e949afa-8ae3-482f-a681-00e02c8112e9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67169) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}}
[ 2213.719042] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-d38a8ef2-3e92-4000-be04-bb6044e64705 tempest-ServersV294TestFqdnHostnames-1376888705 tempest-ServersV294TestFqdnHostnames-1376888705-project-member] [instance: 0dcfb66e-080f-4b51-9f3d-1a29aea0af4a] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:98:7e:b5', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '5ba07329-1d3e-4ba8-8774-d029262318c4', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '6e949afa-8ae3-482f-a681-00e02c8112e9', 'vif_model': 'vmxnet3'}] {{(pid=67169) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}}
[ 2213.726482] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [None req-d38a8ef2-3e92-4000-be04-bb6044e64705 tempest-ServersV294TestFqdnHostnames-1376888705 tempest-ServersV294TestFqdnHostnames-1376888705-project-member] Creating folder: Project (d2bb49d543c3408383e541d3e4af0b8f). Parent ref: group-v566843. {{(pid=67169) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 2213.726948] env[67169]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-2261ee80-7b79-4a4d-a370-1d9dee73da80 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2213.737962] env[67169]: INFO nova.virt.vmwareapi.vm_util [None req-d38a8ef2-3e92-4000-be04-bb6044e64705 tempest-ServersV294TestFqdnHostnames-1376888705 tempest-ServersV294TestFqdnHostnames-1376888705-project-member] Created folder: Project (d2bb49d543c3408383e541d3e4af0b8f) in parent group-v566843.
[ 2213.738159] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [None req-d38a8ef2-3e92-4000-be04-bb6044e64705 tempest-ServersV294TestFqdnHostnames-1376888705 tempest-ServersV294TestFqdnHostnames-1376888705-project-member] Creating folder: Instances. Parent ref: group-v566953. {{(pid=67169) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 2213.738384] env[67169]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-1242d944-71ce-493b-abd4-b12a2c34d5b5 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2213.746970] env[67169]: INFO nova.virt.vmwareapi.vm_util [None req-d38a8ef2-3e92-4000-be04-bb6044e64705 tempest-ServersV294TestFqdnHostnames-1376888705 tempest-ServersV294TestFqdnHostnames-1376888705-project-member] Created folder: Instances in parent group-v566953.
[ 2213.747314] env[67169]: DEBUG oslo.service.loopingcall [None req-d38a8ef2-3e92-4000-be04-bb6044e64705 tempest-ServersV294TestFqdnHostnames-1376888705 tempest-ServersV294TestFqdnHostnames-1376888705-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67169) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 2213.747573] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 0dcfb66e-080f-4b51-9f3d-1a29aea0af4a] Creating VM on the ESX host {{(pid=67169) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}}
[ 2213.747799] env[67169]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-9aff4e0c-6985-46e2-bc57-9815c9f37c8b {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2213.766014] env[67169]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){
[ 2213.766014] env[67169]: value = "task-2819276"
[ 2213.766014] env[67169]: _type = "Task"
[ 2213.766014] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 2213.772793] env[67169]: DEBUG oslo_vmware.api [-] Task: {'id': task-2819276, 'name': CreateVM_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 2214.276563] env[67169]: DEBUG oslo_vmware.api [-] Task: {'id': task-2819276, 'name': CreateVM_Task, 'duration_secs': 0.282778} completed successfully. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 2214.276757] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 0dcfb66e-080f-4b51-9f3d-1a29aea0af4a] Created VM on the ESX host {{(pid=67169) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}}
[ 2214.277449] env[67169]: DEBUG oslo_concurrency.lockutils [None req-d38a8ef2-3e92-4000-be04-bb6044e64705 tempest-ServersV294TestFqdnHostnames-1376888705 tempest-ServersV294TestFqdnHostnames-1376888705-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 2214.277619] env[67169]: DEBUG oslo_concurrency.lockutils [None req-d38a8ef2-3e92-4000-be04-bb6044e64705 tempest-ServersV294TestFqdnHostnames-1376888705 tempest-ServersV294TestFqdnHostnames-1376888705-project-member] Acquired lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 2214.277932] env[67169]: DEBUG oslo_concurrency.lockutils [None req-d38a8ef2-3e92-4000-be04-bb6044e64705 tempest-ServersV294TestFqdnHostnames-1376888705 tempest-ServersV294TestFqdnHostnames-1376888705-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}}
[ 2214.278207] env[67169]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-56698b7f-05ce-417f-9551-5d7682e39679 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2214.282520] env[67169]: DEBUG oslo_vmware.api [None req-d38a8ef2-3e92-4000-be04-bb6044e64705 tempest-ServersV294TestFqdnHostnames-1376888705 tempest-ServersV294TestFqdnHostnames-1376888705-project-member] Waiting for the task: (returnval){
[ 2214.282520] env[67169]: value = "session[52518c09-e5f2-035d-4dc1-3e29ddf94015]52c4823b-ceaa-b2f1-3a53-c43856f10f02"
[ 2214.282520] env[67169]: _type = "Task"
[ 2214.282520] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 2214.289769] env[67169]: DEBUG oslo_vmware.api [None req-d38a8ef2-3e92-4000-be04-bb6044e64705 tempest-ServersV294TestFqdnHostnames-1376888705 tempest-ServersV294TestFqdnHostnames-1376888705-project-member] Task: {'id': session[52518c09-e5f2-035d-4dc1-3e29ddf94015]52c4823b-ceaa-b2f1-3a53-c43856f10f02, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 2214.794555] env[67169]: DEBUG oslo_concurrency.lockutils [None req-d38a8ef2-3e92-4000-be04-bb6044e64705 tempest-ServersV294TestFqdnHostnames-1376888705 tempest-ServersV294TestFqdnHostnames-1376888705-project-member] Releasing lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 2214.794952] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-d38a8ef2-3e92-4000-be04-bb6044e64705 tempest-ServersV294TestFqdnHostnames-1376888705 tempest-ServersV294TestFqdnHostnames-1376888705-project-member] [instance: 0dcfb66e-080f-4b51-9f3d-1a29aea0af4a] Processing image 285931c9-8b83-4997-8c4d-6a79005e36ba {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}}
[ 2214.795043] env[67169]: DEBUG oslo_concurrency.lockutils [None req-d38a8ef2-3e92-4000-be04-bb6044e64705 tempest-ServersV294TestFqdnHostnames-1376888705 tempest-ServersV294TestFqdnHostnames-1376888705-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 2215.485344] env[67169]: DEBUG nova.compute.manager [req-d66fec1d-5687-4c4e-a350-1a4417d69a56 req-0ec6cba7-4fd2-481f-9aa3-24dfdc6a7a8a service nova] [instance: 0dcfb66e-080f-4b51-9f3d-1a29aea0af4a] Received event network-changed-6e949afa-8ae3-482f-a681-00e02c8112e9 {{(pid=67169) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}}
[ 2215.485533] env[67169]: DEBUG nova.compute.manager [req-d66fec1d-5687-4c4e-a350-1a4417d69a56 req-0ec6cba7-4fd2-481f-9aa3-24dfdc6a7a8a service nova] [instance: 0dcfb66e-080f-4b51-9f3d-1a29aea0af4a] Refreshing instance network info cache due to event network-changed-6e949afa-8ae3-482f-a681-00e02c8112e9. {{(pid=67169) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}}
[ 2215.485741] env[67169]: DEBUG oslo_concurrency.lockutils [req-d66fec1d-5687-4c4e-a350-1a4417d69a56 req-0ec6cba7-4fd2-481f-9aa3-24dfdc6a7a8a service nova] Acquiring lock "refresh_cache-0dcfb66e-080f-4b51-9f3d-1a29aea0af4a" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 2215.485888] env[67169]: DEBUG oslo_concurrency.lockutils [req-d66fec1d-5687-4c4e-a350-1a4417d69a56 req-0ec6cba7-4fd2-481f-9aa3-24dfdc6a7a8a service nova] Acquired lock "refresh_cache-0dcfb66e-080f-4b51-9f3d-1a29aea0af4a" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 2215.486066] env[67169]: DEBUG nova.network.neutron [req-d66fec1d-5687-4c4e-a350-1a4417d69a56 req-0ec6cba7-4fd2-481f-9aa3-24dfdc6a7a8a service nova] [instance: 0dcfb66e-080f-4b51-9f3d-1a29aea0af4a] Refreshing network info cache for port 6e949afa-8ae3-482f-a681-00e02c8112e9 {{(pid=67169) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}}
[ 2215.713802] env[67169]: DEBUG nova.network.neutron [req-d66fec1d-5687-4c4e-a350-1a4417d69a56 req-0ec6cba7-4fd2-481f-9aa3-24dfdc6a7a8a service nova] [instance: 0dcfb66e-080f-4b51-9f3d-1a29aea0af4a] Updated VIF entry in instance network info cache for port 6e949afa-8ae3-482f-a681-00e02c8112e9. {{(pid=67169) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}}
[ 2215.714187] env[67169]: DEBUG nova.network.neutron [req-d66fec1d-5687-4c4e-a350-1a4417d69a56 req-0ec6cba7-4fd2-481f-9aa3-24dfdc6a7a8a service nova] [instance: 0dcfb66e-080f-4b51-9f3d-1a29aea0af4a] Updating instance_info_cache with network_info: [{"id": "6e949afa-8ae3-482f-a681-00e02c8112e9", "address": "fa:16:3e:98:7e:b5", "network": {"id": "08ec08f8-9e69-4040-a9b8-ae4bb115e94b", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-111845408-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "d2bb49d543c3408383e541d3e4af0b8f", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "5ba07329-1d3e-4ba8-8774-d029262318c4", "external-id": "nsx-vlan-transportzone-534", "segmentation_id": 534, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap6e949afa-8a", "ovs_interfaceid": "6e949afa-8ae3-482f-a681-00e02c8112e9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 2215.723268] env[67169]: DEBUG oslo_concurrency.lockutils [req-d66fec1d-5687-4c4e-a350-1a4417d69a56 req-0ec6cba7-4fd2-481f-9aa3-24dfdc6a7a8a service nova] Releasing lock "refresh_cache-0dcfb66e-080f-4b51-9f3d-1a29aea0af4a" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 2223.991333] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 2227.658574] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 2227.658823] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Starting heal instance info cache {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}}
[ 2227.658889] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Rebuilding the list of instances to heal {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}}
[ 2227.679734] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 6663b166-0d24-45a7-8c2c-e4e68dbe0005] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 2227.679900] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 115d6c00-4259-4e87-aa00-90b576a63535] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 2227.680066] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 220daf5b-b4fd-49b0-9098-c1f846d6e552] Skipping network cache update for instance because it is Building.
{{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2227.680197] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 68b94a43-eaa5-4023-8bf5-8cc647c2f098] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2227.680323] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 1ece950d-8b7f-4462-8138-10cbf43149ee] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2227.680444] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: b554602b-2aae-4c1b-9385-4bef16a1dc5a] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2227.680563] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 7486bbc9-6aa3-4880-9662-b3451b400bf8] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2227.680681] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 0dcfb66e-080f-4b51-9f3d-1a29aea0af4a] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2227.680833] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Didn't find any instances for network info cache update. 
{{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 2228.659331] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2228.659684] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=67169) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 2231.660733] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2231.661126] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2231.661202] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2232.654470] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2232.658206] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task 
ComputeManager.update_available_resource {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2232.671434] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2232.671739] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2232.671848] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2232.671959] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67169) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 2232.673167] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2a61c1df-58c6-46b9-b11b-1053044f4424 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2232.682173] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8856d857-894b-4ef3-a1fe-3e4e37187886 {{(pid=67169) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2232.696198] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e5eb16a9-6cc0-484c-8508-2d5df4524ddf {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2232.702768] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-91ff71a7-6f49-414a-b843-39860c9627fc {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2232.733025] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181015MB free_disk=171GB free_vcpus=48 pci_devices=None {{(pid=67169) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 2232.733268] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2232.733407] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2232.802073] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 6663b166-0d24-45a7-8c2c-e4e68dbe0005 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2232.802246] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 115d6c00-4259-4e87-aa00-90b576a63535 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2232.802376] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 220daf5b-b4fd-49b0-9098-c1f846d6e552 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2232.802498] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 68b94a43-eaa5-4023-8bf5-8cc647c2f098 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2232.802616] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 1ece950d-8b7f-4462-8138-10cbf43149ee actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2232.802753] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance b554602b-2aae-4c1b-9385-4bef16a1dc5a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2232.802886] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 7486bbc9-6aa3-4880-9662-b3451b400bf8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2232.803043] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 0dcfb66e-080f-4b51-9f3d-1a29aea0af4a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2232.803249] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Total usable vcpus: 48, total allocated vcpus: 8 {{(pid=67169) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 2232.803389] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1536MB phys_disk=200GB used_disk=8GB total_vcpus=48 used_vcpus=8 pci_stats=[] {{(pid=67169) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 2232.910545] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-777bd350-2534-46bd-9394-cfd61eac1641 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2232.918508] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2a1aa131-e140-457a-8c59-e812c9aacd9d {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2232.948372] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0752718a-3383-4051-bdcb-bb836017afd9 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2232.955705] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0c651422-7b4b-4dd6-99f7-4710c1cf593d {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2232.969096] env[67169]: DEBUG nova.compute.provider_tree [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Inventory has 
not changed in ProviderTree for provider: 6570906a-ac37-4859-b1f2-4bbacc48d3f3 {{(pid=67169) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2232.977467] env[67169]: DEBUG nova.scheduler.client.report [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Inventory has not changed for provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 171, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67169) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2232.993389] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67169) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 2232.993570] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.260s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2233.994596] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2237.654401] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=67169) 
run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2239.228459] env[67169]: WARNING oslo_vmware.rw_handles [None req-1e846be1-02cb-41a5-811a-2b9f52cbbee0 tempest-InstanceActionsNegativeTestJSON-1517850368 tempest-InstanceActionsNegativeTestJSON-1517850368-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2239.228459] env[67169]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2239.228459] env[67169]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2239.228459] env[67169]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2239.228459] env[67169]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2239.228459] env[67169]: ERROR oslo_vmware.rw_handles response.begin() [ 2239.228459] env[67169]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2239.228459] env[67169]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2239.228459] env[67169]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2239.228459] env[67169]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2239.228459] env[67169]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2239.228459] env[67169]: ERROR oslo_vmware.rw_handles [ 2239.229062] env[67169]: DEBUG nova.virt.vmwareapi.images [None req-1e846be1-02cb-41a5-811a-2b9f52cbbee0 tempest-InstanceActionsNegativeTestJSON-1517850368 tempest-InstanceActionsNegativeTestJSON-1517850368-project-member] [instance: 6663b166-0d24-45a7-8c2c-e4e68dbe0005] Downloaded image file data 285931c9-8b83-4997-8c4d-6a79005e36ba to 
vmware_temp/d78b50f0-453a-4c27-b981-69e818930869/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk on the data store datastore2 {{(pid=67169) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2239.231164] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-1e846be1-02cb-41a5-811a-2b9f52cbbee0 tempest-InstanceActionsNegativeTestJSON-1517850368 tempest-InstanceActionsNegativeTestJSON-1517850368-project-member] [instance: 6663b166-0d24-45a7-8c2c-e4e68dbe0005] Caching image {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2239.231421] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [None req-1e846be1-02cb-41a5-811a-2b9f52cbbee0 tempest-InstanceActionsNegativeTestJSON-1517850368 tempest-InstanceActionsNegativeTestJSON-1517850368-project-member] Copying Virtual Disk [datastore2] vmware_temp/d78b50f0-453a-4c27-b981-69e818930869/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk to [datastore2] vmware_temp/d78b50f0-453a-4c27-b981-69e818930869/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk {{(pid=67169) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2239.231705] env[67169]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-d4724b54-5603-43fc-8a72-0d43e884ad7e {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2239.240389] env[67169]: DEBUG oslo_vmware.api [None req-1e846be1-02cb-41a5-811a-2b9f52cbbee0 tempest-InstanceActionsNegativeTestJSON-1517850368 tempest-InstanceActionsNegativeTestJSON-1517850368-project-member] Waiting for the task: (returnval){ [ 2239.240389] env[67169]: value = "task-2819277" [ 2239.240389] env[67169]: _type = "Task" [ 2239.240389] env[67169]: } to complete. 
{{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2239.247875] env[67169]: DEBUG oslo_vmware.api [None req-1e846be1-02cb-41a5-811a-2b9f52cbbee0 tempest-InstanceActionsNegativeTestJSON-1517850368 tempest-InstanceActionsNegativeTestJSON-1517850368-project-member] Task: {'id': task-2819277, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2239.750374] env[67169]: DEBUG oslo_vmware.exceptions [None req-1e846be1-02cb-41a5-811a-2b9f52cbbee0 tempest-InstanceActionsNegativeTestJSON-1517850368 tempest-InstanceActionsNegativeTestJSON-1517850368-project-member] Fault InvalidArgument not matched. {{(pid=67169) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2239.750643] env[67169]: DEBUG oslo_concurrency.lockutils [None req-1e846be1-02cb-41a5-811a-2b9f52cbbee0 tempest-InstanceActionsNegativeTestJSON-1517850368 tempest-InstanceActionsNegativeTestJSON-1517850368-project-member] Releasing lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2239.751228] env[67169]: ERROR nova.compute.manager [None req-1e846be1-02cb-41a5-811a-2b9f52cbbee0 tempest-InstanceActionsNegativeTestJSON-1517850368 tempest-InstanceActionsNegativeTestJSON-1517850368-project-member] [instance: 6663b166-0d24-45a7-8c2c-e4e68dbe0005] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2239.751228] env[67169]: Faults: ['InvalidArgument'] [ 2239.751228] env[67169]: ERROR nova.compute.manager [instance: 6663b166-0d24-45a7-8c2c-e4e68dbe0005] Traceback (most recent call last): [ 2239.751228] env[67169]: ERROR nova.compute.manager [instance: 
6663b166-0d24-45a7-8c2c-e4e68dbe0005] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 2239.751228] env[67169]: ERROR nova.compute.manager [instance: 6663b166-0d24-45a7-8c2c-e4e68dbe0005] yield resources [ 2239.751228] env[67169]: ERROR nova.compute.manager [instance: 6663b166-0d24-45a7-8c2c-e4e68dbe0005] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2239.751228] env[67169]: ERROR nova.compute.manager [instance: 6663b166-0d24-45a7-8c2c-e4e68dbe0005] self.driver.spawn(context, instance, image_meta, [ 2239.751228] env[67169]: ERROR nova.compute.manager [instance: 6663b166-0d24-45a7-8c2c-e4e68dbe0005] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2239.751228] env[67169]: ERROR nova.compute.manager [instance: 6663b166-0d24-45a7-8c2c-e4e68dbe0005] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2239.751228] env[67169]: ERROR nova.compute.manager [instance: 6663b166-0d24-45a7-8c2c-e4e68dbe0005] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2239.751228] env[67169]: ERROR nova.compute.manager [instance: 6663b166-0d24-45a7-8c2c-e4e68dbe0005] self._fetch_image_if_missing(context, vi) [ 2239.751228] env[67169]: ERROR nova.compute.manager [instance: 6663b166-0d24-45a7-8c2c-e4e68dbe0005] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2239.751228] env[67169]: ERROR nova.compute.manager [instance: 6663b166-0d24-45a7-8c2c-e4e68dbe0005] image_cache(vi, tmp_image_ds_loc) [ 2239.751228] env[67169]: ERROR nova.compute.manager [instance: 6663b166-0d24-45a7-8c2c-e4e68dbe0005] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2239.751228] env[67169]: ERROR nova.compute.manager [instance: 6663b166-0d24-45a7-8c2c-e4e68dbe0005] vm_util.copy_virtual_disk( [ 2239.751228] env[67169]: ERROR nova.compute.manager [instance: 6663b166-0d24-45a7-8c2c-e4e68dbe0005] File 
"/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2239.751228] env[67169]: ERROR nova.compute.manager [instance: 6663b166-0d24-45a7-8c2c-e4e68dbe0005] session._wait_for_task(vmdk_copy_task) [ 2239.751228] env[67169]: ERROR nova.compute.manager [instance: 6663b166-0d24-45a7-8c2c-e4e68dbe0005] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2239.751228] env[67169]: ERROR nova.compute.manager [instance: 6663b166-0d24-45a7-8c2c-e4e68dbe0005] return self.wait_for_task(task_ref) [ 2239.751228] env[67169]: ERROR nova.compute.manager [instance: 6663b166-0d24-45a7-8c2c-e4e68dbe0005] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2239.751228] env[67169]: ERROR nova.compute.manager [instance: 6663b166-0d24-45a7-8c2c-e4e68dbe0005] return evt.wait() [ 2239.751228] env[67169]: ERROR nova.compute.manager [instance: 6663b166-0d24-45a7-8c2c-e4e68dbe0005] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2239.751228] env[67169]: ERROR nova.compute.manager [instance: 6663b166-0d24-45a7-8c2c-e4e68dbe0005] result = hub.switch() [ 2239.751228] env[67169]: ERROR nova.compute.manager [instance: 6663b166-0d24-45a7-8c2c-e4e68dbe0005] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2239.751228] env[67169]: ERROR nova.compute.manager [instance: 6663b166-0d24-45a7-8c2c-e4e68dbe0005] return self.greenlet.switch() [ 2239.751228] env[67169]: ERROR nova.compute.manager [instance: 6663b166-0d24-45a7-8c2c-e4e68dbe0005] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2239.751228] env[67169]: ERROR nova.compute.manager [instance: 6663b166-0d24-45a7-8c2c-e4e68dbe0005] self.f(*self.args, **self.kw) [ 2239.751228] env[67169]: ERROR nova.compute.manager [instance: 6663b166-0d24-45a7-8c2c-e4e68dbe0005] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2239.751228] env[67169]: ERROR nova.compute.manager [instance: 6663b166-0d24-45a7-8c2c-e4e68dbe0005] raise exceptions.translate_fault(task_info.error) [ 2239.751228] env[67169]: ERROR nova.compute.manager [instance: 6663b166-0d24-45a7-8c2c-e4e68dbe0005] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2239.751228] env[67169]: ERROR nova.compute.manager [instance: 6663b166-0d24-45a7-8c2c-e4e68dbe0005] Faults: ['InvalidArgument'] [ 2239.751228] env[67169]: ERROR nova.compute.manager [instance: 6663b166-0d24-45a7-8c2c-e4e68dbe0005] [ 2239.752069] env[67169]: INFO nova.compute.manager [None req-1e846be1-02cb-41a5-811a-2b9f52cbbee0 tempest-InstanceActionsNegativeTestJSON-1517850368 tempest-InstanceActionsNegativeTestJSON-1517850368-project-member] [instance: 6663b166-0d24-45a7-8c2c-e4e68dbe0005] Terminating instance [ 2239.753112] env[67169]: DEBUG oslo_concurrency.lockutils [None req-ca15473a-8330-4e53-b6cb-311183da5250 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Acquired lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2239.753341] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-ca15473a-8330-4e53-b6cb-311183da5250 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2239.753580] env[67169]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-68906793-a75a-4bda-bf29-cd5840c7227f {{(pid=67169) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2239.755722] env[67169]: DEBUG nova.compute.manager [None req-1e846be1-02cb-41a5-811a-2b9f52cbbee0 tempest-InstanceActionsNegativeTestJSON-1517850368 tempest-InstanceActionsNegativeTestJSON-1517850368-project-member] [instance: 6663b166-0d24-45a7-8c2c-e4e68dbe0005] Start destroying the instance on the hypervisor. {{(pid=67169) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2239.755908] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-1e846be1-02cb-41a5-811a-2b9f52cbbee0 tempest-InstanceActionsNegativeTestJSON-1517850368 tempest-InstanceActionsNegativeTestJSON-1517850368-project-member] [instance: 6663b166-0d24-45a7-8c2c-e4e68dbe0005] Destroying instance {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2239.756618] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a054dcb9-e5c0-452c-bb08-48f703c0e7ea {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2239.763535] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-1e846be1-02cb-41a5-811a-2b9f52cbbee0 tempest-InstanceActionsNegativeTestJSON-1517850368 tempest-InstanceActionsNegativeTestJSON-1517850368-project-member] [instance: 6663b166-0d24-45a7-8c2c-e4e68dbe0005] Unregistering the VM {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2239.763778] env[67169]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-662355ad-2093-4200-ac7d-653a04b82110 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2239.765774] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-ca15473a-8330-4e53-b6cb-311183da5250 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Created directory with path 
[datastore2] devstack-image-cache_base {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2239.765917] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-ca15473a-8330-4e53-b6cb-311183da5250 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=67169) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2239.766838] env[67169]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-c8ac22c5-c34b-4bfc-8ae2-76d851f04b2d {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2239.771232] env[67169]: DEBUG oslo_vmware.api [None req-ca15473a-8330-4e53-b6cb-311183da5250 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Waiting for the task: (returnval){ [ 2239.771232] env[67169]: value = "session[52518c09-e5f2-035d-4dc1-3e29ddf94015]52accebc-f7c9-dd71-ecdb-3e1524acdaf8" [ 2239.771232] env[67169]: _type = "Task" [ 2239.771232] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2239.778745] env[67169]: DEBUG oslo_vmware.api [None req-ca15473a-8330-4e53-b6cb-311183da5250 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Task: {'id': session[52518c09-e5f2-035d-4dc1-3e29ddf94015]52accebc-f7c9-dd71-ecdb-3e1524acdaf8, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2239.835270] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-1e846be1-02cb-41a5-811a-2b9f52cbbee0 tempest-InstanceActionsNegativeTestJSON-1517850368 tempest-InstanceActionsNegativeTestJSON-1517850368-project-member] [instance: 6663b166-0d24-45a7-8c2c-e4e68dbe0005] Unregistered the VM {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2239.835501] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-1e846be1-02cb-41a5-811a-2b9f52cbbee0 tempest-InstanceActionsNegativeTestJSON-1517850368 tempest-InstanceActionsNegativeTestJSON-1517850368-project-member] [instance: 6663b166-0d24-45a7-8c2c-e4e68dbe0005] Deleting contents of the VM from datastore datastore2 {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2239.835680] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-1e846be1-02cb-41a5-811a-2b9f52cbbee0 tempest-InstanceActionsNegativeTestJSON-1517850368 tempest-InstanceActionsNegativeTestJSON-1517850368-project-member] Deleting the datastore file [datastore2] 6663b166-0d24-45a7-8c2c-e4e68dbe0005 {{(pid=67169) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2239.835947] env[67169]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-cc1ada22-9e65-4b3a-89ad-8ff98d05565f {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2239.842673] env[67169]: DEBUG oslo_vmware.api [None req-1e846be1-02cb-41a5-811a-2b9f52cbbee0 tempest-InstanceActionsNegativeTestJSON-1517850368 tempest-InstanceActionsNegativeTestJSON-1517850368-project-member] Waiting for the task: (returnval){ [ 2239.842673] env[67169]: value = "task-2819279" [ 2239.842673] env[67169]: _type = "Task" [ 2239.842673] env[67169]: } to complete. 
{{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2239.850296] env[67169]: DEBUG oslo_vmware.api [None req-1e846be1-02cb-41a5-811a-2b9f52cbbee0 tempest-InstanceActionsNegativeTestJSON-1517850368 tempest-InstanceActionsNegativeTestJSON-1517850368-project-member] Task: {'id': task-2819279, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2240.281332] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-ca15473a-8330-4e53-b6cb-311183da5250 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 115d6c00-4259-4e87-aa00-90b576a63535] Preparing fetch location {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2240.281638] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-ca15473a-8330-4e53-b6cb-311183da5250 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Creating directory with path [datastore2] vmware_temp/df91c7ab-6f98-4b81-a0c9-df82910147db/285931c9-8b83-4997-8c4d-6a79005e36ba {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2240.281812] env[67169]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-c07af981-2689-4834-ad9a-adb54174e732 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2240.292562] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-ca15473a-8330-4e53-b6cb-311183da5250 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Created directory with path [datastore2] vmware_temp/df91c7ab-6f98-4b81-a0c9-df82910147db/285931c9-8b83-4997-8c4d-6a79005e36ba {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2240.292779] env[67169]: DEBUG 
nova.virt.vmwareapi.vmops [None req-ca15473a-8330-4e53-b6cb-311183da5250 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 115d6c00-4259-4e87-aa00-90b576a63535] Fetch image to [datastore2] vmware_temp/df91c7ab-6f98-4b81-a0c9-df82910147db/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2240.292962] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-ca15473a-8330-4e53-b6cb-311183da5250 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 115d6c00-4259-4e87-aa00-90b576a63535] Downloading image file data 285931c9-8b83-4997-8c4d-6a79005e36ba to [datastore2] vmware_temp/df91c7ab-6f98-4b81-a0c9-df82910147db/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk on the data store datastore2 {{(pid=67169) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2240.293691] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cc3d8806-f3a8-4a4e-a3d8-43518b0cf679 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2240.299981] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-acbae08e-9a81-4f35-913c-c06b7447616e {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2240.309024] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0edc8b45-acba-4b0a-a0d6-d7a50d22efaf {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2240.339942] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-80eea475-8bda-4403-819d-84d4abee7445 {{(pid=67169) 
request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2240.347895] env[67169]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-ea68bf41-2322-404c-93dd-0915b1008a3d {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2240.352180] env[67169]: DEBUG oslo_vmware.api [None req-1e846be1-02cb-41a5-811a-2b9f52cbbee0 tempest-InstanceActionsNegativeTestJSON-1517850368 tempest-InstanceActionsNegativeTestJSON-1517850368-project-member] Task: {'id': task-2819279, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.078117} completed successfully. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2240.352718] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-1e846be1-02cb-41a5-811a-2b9f52cbbee0 tempest-InstanceActionsNegativeTestJSON-1517850368 tempest-InstanceActionsNegativeTestJSON-1517850368-project-member] Deleted the datastore file {{(pid=67169) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2240.352897] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-1e846be1-02cb-41a5-811a-2b9f52cbbee0 tempest-InstanceActionsNegativeTestJSON-1517850368 tempest-InstanceActionsNegativeTestJSON-1517850368-project-member] [instance: 6663b166-0d24-45a7-8c2c-e4e68dbe0005] Deleted contents of the VM from datastore datastore2 {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2240.353095] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-1e846be1-02cb-41a5-811a-2b9f52cbbee0 tempest-InstanceActionsNegativeTestJSON-1517850368 tempest-InstanceActionsNegativeTestJSON-1517850368-project-member] [instance: 6663b166-0d24-45a7-8c2c-e4e68dbe0005] Instance destroyed {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2240.353282] env[67169]: INFO nova.compute.manager [None 
req-1e846be1-02cb-41a5-811a-2b9f52cbbee0 tempest-InstanceActionsNegativeTestJSON-1517850368 tempest-InstanceActionsNegativeTestJSON-1517850368-project-member] [instance: 6663b166-0d24-45a7-8c2c-e4e68dbe0005] Took 0.60 seconds to destroy the instance on the hypervisor. [ 2240.355377] env[67169]: DEBUG nova.compute.claims [None req-1e846be1-02cb-41a5-811a-2b9f52cbbee0 tempest-InstanceActionsNegativeTestJSON-1517850368 tempest-InstanceActionsNegativeTestJSON-1517850368-project-member] [instance: 6663b166-0d24-45a7-8c2c-e4e68dbe0005] Aborting claim: {{(pid=67169) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 2240.355533] env[67169]: DEBUG oslo_concurrency.lockutils [None req-1e846be1-02cb-41a5-811a-2b9f52cbbee0 tempest-InstanceActionsNegativeTestJSON-1517850368 tempest-InstanceActionsNegativeTestJSON-1517850368-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2240.355770] env[67169]: DEBUG oslo_concurrency.lockutils [None req-1e846be1-02cb-41a5-811a-2b9f52cbbee0 tempest-InstanceActionsNegativeTestJSON-1517850368 tempest-InstanceActionsNegativeTestJSON-1517850368-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2240.371225] env[67169]: DEBUG nova.virt.vmwareapi.images [None req-ca15473a-8330-4e53-b6cb-311183da5250 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 115d6c00-4259-4e87-aa00-90b576a63535] Downloading image file data 285931c9-8b83-4997-8c4d-6a79005e36ba to the data store datastore2 {{(pid=67169) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2240.422682] env[67169]: DEBUG oslo_vmware.rw_handles [None 
req-ca15473a-8330-4e53-b6cb-311183da5250 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/df91c7ab-6f98-4b81-a0c9-df82910147db/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=67169) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2240.482346] env[67169]: DEBUG oslo_vmware.rw_handles [None req-ca15473a-8330-4e53-b6cb-311183da5250 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Completed reading data from the image iterator. {{(pid=67169) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2240.482535] env[67169]: DEBUG oslo_vmware.rw_handles [None req-ca15473a-8330-4e53-b6cb-311183da5250 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/df91c7ab-6f98-4b81-a0c9-df82910147db/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=67169) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2240.539471] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9858ea33-863a-40f7-842e-adc6979e9929 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2240.547209] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-89595273-e8ee-4f5a-8e09-9d7dcc662700 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2240.576335] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-628d7871-1506-40e2-ad50-19e374d129bd {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2240.583205] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-29692416-a5ce-4558-b2a9-cd5c1861babf {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2240.595846] env[67169]: DEBUG nova.compute.provider_tree [None req-1e846be1-02cb-41a5-811a-2b9f52cbbee0 tempest-InstanceActionsNegativeTestJSON-1517850368 tempest-InstanceActionsNegativeTestJSON-1517850368-project-member] Inventory has not changed in ProviderTree for provider: 6570906a-ac37-4859-b1f2-4bbacc48d3f3 {{(pid=67169) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2240.604021] env[67169]: DEBUG nova.scheduler.client.report [None req-1e846be1-02cb-41a5-811a-2b9f52cbbee0 tempest-InstanceActionsNegativeTestJSON-1517850368 tempest-InstanceActionsNegativeTestJSON-1517850368-project-member] Inventory has not changed for provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 
'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 171, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67169) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2240.616310] env[67169]: DEBUG oslo_concurrency.lockutils [None req-1e846be1-02cb-41a5-811a-2b9f52cbbee0 tempest-InstanceActionsNegativeTestJSON-1517850368 tempest-InstanceActionsNegativeTestJSON-1517850368-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.261s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2240.616843] env[67169]: ERROR nova.compute.manager [None req-1e846be1-02cb-41a5-811a-2b9f52cbbee0 tempest-InstanceActionsNegativeTestJSON-1517850368 tempest-InstanceActionsNegativeTestJSON-1517850368-project-member] [instance: 6663b166-0d24-45a7-8c2c-e4e68dbe0005] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2240.616843] env[67169]: Faults: ['InvalidArgument'] [ 2240.616843] env[67169]: ERROR nova.compute.manager [instance: 6663b166-0d24-45a7-8c2c-e4e68dbe0005] Traceback (most recent call last): [ 2240.616843] env[67169]: ERROR nova.compute.manager [instance: 6663b166-0d24-45a7-8c2c-e4e68dbe0005] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2240.616843] env[67169]: ERROR nova.compute.manager [instance: 6663b166-0d24-45a7-8c2c-e4e68dbe0005] self.driver.spawn(context, instance, image_meta, [ 2240.616843] env[67169]: ERROR nova.compute.manager [instance: 6663b166-0d24-45a7-8c2c-e4e68dbe0005] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2240.616843] env[67169]: ERROR nova.compute.manager [instance: 
6663b166-0d24-45a7-8c2c-e4e68dbe0005] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2240.616843] env[67169]: ERROR nova.compute.manager [instance: 6663b166-0d24-45a7-8c2c-e4e68dbe0005] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2240.616843] env[67169]: ERROR nova.compute.manager [instance: 6663b166-0d24-45a7-8c2c-e4e68dbe0005] self._fetch_image_if_missing(context, vi) [ 2240.616843] env[67169]: ERROR nova.compute.manager [instance: 6663b166-0d24-45a7-8c2c-e4e68dbe0005] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2240.616843] env[67169]: ERROR nova.compute.manager [instance: 6663b166-0d24-45a7-8c2c-e4e68dbe0005] image_cache(vi, tmp_image_ds_loc) [ 2240.616843] env[67169]: ERROR nova.compute.manager [instance: 6663b166-0d24-45a7-8c2c-e4e68dbe0005] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2240.616843] env[67169]: ERROR nova.compute.manager [instance: 6663b166-0d24-45a7-8c2c-e4e68dbe0005] vm_util.copy_virtual_disk( [ 2240.616843] env[67169]: ERROR nova.compute.manager [instance: 6663b166-0d24-45a7-8c2c-e4e68dbe0005] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2240.616843] env[67169]: ERROR nova.compute.manager [instance: 6663b166-0d24-45a7-8c2c-e4e68dbe0005] session._wait_for_task(vmdk_copy_task) [ 2240.616843] env[67169]: ERROR nova.compute.manager [instance: 6663b166-0d24-45a7-8c2c-e4e68dbe0005] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2240.616843] env[67169]: ERROR nova.compute.manager [instance: 6663b166-0d24-45a7-8c2c-e4e68dbe0005] return self.wait_for_task(task_ref) [ 2240.616843] env[67169]: ERROR nova.compute.manager [instance: 6663b166-0d24-45a7-8c2c-e4e68dbe0005] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2240.616843] env[67169]: ERROR nova.compute.manager 
[instance: 6663b166-0d24-45a7-8c2c-e4e68dbe0005] return evt.wait() [ 2240.616843] env[67169]: ERROR nova.compute.manager [instance: 6663b166-0d24-45a7-8c2c-e4e68dbe0005] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2240.616843] env[67169]: ERROR nova.compute.manager [instance: 6663b166-0d24-45a7-8c2c-e4e68dbe0005] result = hub.switch() [ 2240.616843] env[67169]: ERROR nova.compute.manager [instance: 6663b166-0d24-45a7-8c2c-e4e68dbe0005] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2240.616843] env[67169]: ERROR nova.compute.manager [instance: 6663b166-0d24-45a7-8c2c-e4e68dbe0005] return self.greenlet.switch() [ 2240.616843] env[67169]: ERROR nova.compute.manager [instance: 6663b166-0d24-45a7-8c2c-e4e68dbe0005] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2240.616843] env[67169]: ERROR nova.compute.manager [instance: 6663b166-0d24-45a7-8c2c-e4e68dbe0005] self.f(*self.args, **self.kw) [ 2240.616843] env[67169]: ERROR nova.compute.manager [instance: 6663b166-0d24-45a7-8c2c-e4e68dbe0005] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2240.616843] env[67169]: ERROR nova.compute.manager [instance: 6663b166-0d24-45a7-8c2c-e4e68dbe0005] raise exceptions.translate_fault(task_info.error) [ 2240.616843] env[67169]: ERROR nova.compute.manager [instance: 6663b166-0d24-45a7-8c2c-e4e68dbe0005] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2240.616843] env[67169]: ERROR nova.compute.manager [instance: 6663b166-0d24-45a7-8c2c-e4e68dbe0005] Faults: ['InvalidArgument'] [ 2240.616843] env[67169]: ERROR nova.compute.manager [instance: 6663b166-0d24-45a7-8c2c-e4e68dbe0005] [ 2240.617704] env[67169]: DEBUG nova.compute.utils [None req-1e846be1-02cb-41a5-811a-2b9f52cbbee0 
tempest-InstanceActionsNegativeTestJSON-1517850368 tempest-InstanceActionsNegativeTestJSON-1517850368-project-member] [instance: 6663b166-0d24-45a7-8c2c-e4e68dbe0005] VimFaultException {{(pid=67169) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2240.618789] env[67169]: DEBUG nova.compute.manager [None req-1e846be1-02cb-41a5-811a-2b9f52cbbee0 tempest-InstanceActionsNegativeTestJSON-1517850368 tempest-InstanceActionsNegativeTestJSON-1517850368-project-member] [instance: 6663b166-0d24-45a7-8c2c-e4e68dbe0005] Build of instance 6663b166-0d24-45a7-8c2c-e4e68dbe0005 was re-scheduled: A specified parameter was not correct: fileType [ 2240.618789] env[67169]: Faults: ['InvalidArgument'] {{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 2240.619169] env[67169]: DEBUG nova.compute.manager [None req-1e846be1-02cb-41a5-811a-2b9f52cbbee0 tempest-InstanceActionsNegativeTestJSON-1517850368 tempest-InstanceActionsNegativeTestJSON-1517850368-project-member] [instance: 6663b166-0d24-45a7-8c2c-e4e68dbe0005] Unplugging VIFs for instance {{(pid=67169) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 2240.619343] env[67169]: DEBUG nova.compute.manager [None req-1e846be1-02cb-41a5-811a-2b9f52cbbee0 tempest-InstanceActionsNegativeTestJSON-1517850368 tempest-InstanceActionsNegativeTestJSON-1517850368-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67169) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 2240.619511] env[67169]: DEBUG nova.compute.manager [None req-1e846be1-02cb-41a5-811a-2b9f52cbbee0 tempest-InstanceActionsNegativeTestJSON-1517850368 tempest-InstanceActionsNegativeTestJSON-1517850368-project-member] [instance: 6663b166-0d24-45a7-8c2c-e4e68dbe0005] Deallocating network for instance {{(pid=67169) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2240.619670] env[67169]: DEBUG nova.network.neutron [None req-1e846be1-02cb-41a5-811a-2b9f52cbbee0 tempest-InstanceActionsNegativeTestJSON-1517850368 tempest-InstanceActionsNegativeTestJSON-1517850368-project-member] [instance: 6663b166-0d24-45a7-8c2c-e4e68dbe0005] deallocate_for_instance() {{(pid=67169) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2241.165325] env[67169]: DEBUG nova.network.neutron [None req-1e846be1-02cb-41a5-811a-2b9f52cbbee0 tempest-InstanceActionsNegativeTestJSON-1517850368 tempest-InstanceActionsNegativeTestJSON-1517850368-project-member] [instance: 6663b166-0d24-45a7-8c2c-e4e68dbe0005] Updating instance_info_cache with network_info: [] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2241.181340] env[67169]: INFO nova.compute.manager [None req-1e846be1-02cb-41a5-811a-2b9f52cbbee0 tempest-InstanceActionsNegativeTestJSON-1517850368 tempest-InstanceActionsNegativeTestJSON-1517850368-project-member] [instance: 6663b166-0d24-45a7-8c2c-e4e68dbe0005] Took 0.56 seconds to deallocate network for instance. 
[ 2241.285633] env[67169]: INFO nova.scheduler.client.report [None req-1e846be1-02cb-41a5-811a-2b9f52cbbee0 tempest-InstanceActionsNegativeTestJSON-1517850368 tempest-InstanceActionsNegativeTestJSON-1517850368-project-member] Deleted allocations for instance 6663b166-0d24-45a7-8c2c-e4e68dbe0005 [ 2241.308956] env[67169]: DEBUG oslo_concurrency.lockutils [None req-1e846be1-02cb-41a5-811a-2b9f52cbbee0 tempest-InstanceActionsNegativeTestJSON-1517850368 tempest-InstanceActionsNegativeTestJSON-1517850368-project-member] Lock "6663b166-0d24-45a7-8c2c-e4e68dbe0005" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 590.268s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2241.308956] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "6663b166-0d24-45a7-8c2c-e4e68dbe0005" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 437.621s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2241.308956] env[67169]: INFO nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 6663b166-0d24-45a7-8c2c-e4e68dbe0005] During sync_power_state the instance has a pending task (spawning). Skip. 
[ 2241.309204] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "6663b166-0d24-45a7-8c2c-e4e68dbe0005" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2241.309604] env[67169]: DEBUG oslo_concurrency.lockutils [None req-7caf859a-096e-4f39-81d0-68bcaa845ae8 tempest-InstanceActionsNegativeTestJSON-1517850368 tempest-InstanceActionsNegativeTestJSON-1517850368-project-member] Lock "6663b166-0d24-45a7-8c2c-e4e68dbe0005" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 394.815s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2241.309815] env[67169]: DEBUG oslo_concurrency.lockutils [None req-7caf859a-096e-4f39-81d0-68bcaa845ae8 tempest-InstanceActionsNegativeTestJSON-1517850368 tempest-InstanceActionsNegativeTestJSON-1517850368-project-member] Acquiring lock "6663b166-0d24-45a7-8c2c-e4e68dbe0005-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2241.310025] env[67169]: DEBUG oslo_concurrency.lockutils [None req-7caf859a-096e-4f39-81d0-68bcaa845ae8 tempest-InstanceActionsNegativeTestJSON-1517850368 tempest-InstanceActionsNegativeTestJSON-1517850368-project-member] Lock "6663b166-0d24-45a7-8c2c-e4e68dbe0005-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2241.310194] env[67169]: DEBUG oslo_concurrency.lockutils [None req-7caf859a-096e-4f39-81d0-68bcaa845ae8 
tempest-InstanceActionsNegativeTestJSON-1517850368 tempest-InstanceActionsNegativeTestJSON-1517850368-project-member] Lock "6663b166-0d24-45a7-8c2c-e4e68dbe0005-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2241.312630] env[67169]: INFO nova.compute.manager [None req-7caf859a-096e-4f39-81d0-68bcaa845ae8 tempest-InstanceActionsNegativeTestJSON-1517850368 tempest-InstanceActionsNegativeTestJSON-1517850368-project-member] [instance: 6663b166-0d24-45a7-8c2c-e4e68dbe0005] Terminating instance [ 2241.314244] env[67169]: DEBUG nova.compute.manager [None req-7caf859a-096e-4f39-81d0-68bcaa845ae8 tempest-InstanceActionsNegativeTestJSON-1517850368 tempest-InstanceActionsNegativeTestJSON-1517850368-project-member] [instance: 6663b166-0d24-45a7-8c2c-e4e68dbe0005] Start destroying the instance on the hypervisor. {{(pid=67169) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2241.314431] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-7caf859a-096e-4f39-81d0-68bcaa845ae8 tempest-InstanceActionsNegativeTestJSON-1517850368 tempest-InstanceActionsNegativeTestJSON-1517850368-project-member] [instance: 6663b166-0d24-45a7-8c2c-e4e68dbe0005] Destroying instance {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2241.314692] env[67169]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-c9989405-c729-4962-adea-fd9f3aeaa5a2 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2241.324154] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b9ed05e2-8554-4478-b1c0-6738163e8d9d {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2241.351078] env[67169]: 
WARNING nova.virt.vmwareapi.vmops [None req-7caf859a-096e-4f39-81d0-68bcaa845ae8 tempest-InstanceActionsNegativeTestJSON-1517850368 tempest-InstanceActionsNegativeTestJSON-1517850368-project-member] [instance: 6663b166-0d24-45a7-8c2c-e4e68dbe0005] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 6663b166-0d24-45a7-8c2c-e4e68dbe0005 could not be found.
[ 2241.351275] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-7caf859a-096e-4f39-81d0-68bcaa845ae8 tempest-InstanceActionsNegativeTestJSON-1517850368 tempest-InstanceActionsNegativeTestJSON-1517850368-project-member] [instance: 6663b166-0d24-45a7-8c2c-e4e68dbe0005] Instance destroyed {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 2241.351447] env[67169]: INFO nova.compute.manager [None req-7caf859a-096e-4f39-81d0-68bcaa845ae8 tempest-InstanceActionsNegativeTestJSON-1517850368 tempest-InstanceActionsNegativeTestJSON-1517850368-project-member] [instance: 6663b166-0d24-45a7-8c2c-e4e68dbe0005] Took 0.04 seconds to destroy the instance on the hypervisor.
[ 2241.351686] env[67169]: DEBUG oslo.service.loopingcall [None req-7caf859a-096e-4f39-81d0-68bcaa845ae8 tempest-InstanceActionsNegativeTestJSON-1517850368 tempest-InstanceActionsNegativeTestJSON-1517850368-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=67169) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 2241.351906] env[67169]: DEBUG nova.compute.manager [-] [instance: 6663b166-0d24-45a7-8c2c-e4e68dbe0005] Deallocating network for instance {{(pid=67169) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}}
[ 2241.352011] env[67169]: DEBUG nova.network.neutron [-] [instance: 6663b166-0d24-45a7-8c2c-e4e68dbe0005] deallocate_for_instance() {{(pid=67169) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 2241.373911] env[67169]: DEBUG nova.network.neutron [-] [instance: 6663b166-0d24-45a7-8c2c-e4e68dbe0005] Updating instance_info_cache with network_info: [] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 2241.381639] env[67169]: INFO nova.compute.manager [-] [instance: 6663b166-0d24-45a7-8c2c-e4e68dbe0005] Took 0.03 seconds to deallocate network for instance.
[ 2241.462390] env[67169]: DEBUG oslo_concurrency.lockutils [None req-7caf859a-096e-4f39-81d0-68bcaa845ae8 tempest-InstanceActionsNegativeTestJSON-1517850368 tempest-InstanceActionsNegativeTestJSON-1517850368-project-member] Lock "6663b166-0d24-45a7-8c2c-e4e68dbe0005" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.153s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 2283.659972] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 2285.883966] env[67169]: WARNING oslo_vmware.rw_handles [None req-ca15473a-8330-4e53-b6cb-311183da5250 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response
[ 2285.883966] env[67169]: ERROR oslo_vmware.rw_handles Traceback (most recent call last):
[ 2285.883966] env[67169]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close
[ 2285.883966] env[67169]: ERROR oslo_vmware.rw_handles self._conn.getresponse()
[ 2285.883966] env[67169]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse
[ 2285.883966] env[67169]: ERROR oslo_vmware.rw_handles response.begin()
[ 2285.883966] env[67169]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin
[ 2285.883966] env[67169]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status()
[ 2285.883966] env[67169]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status
[ 2285.883966] env[67169]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without"
[ 2285.883966] env[67169]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response
[ 2285.883966] env[67169]: ERROR oslo_vmware.rw_handles
[ 2285.883966] env[67169]: DEBUG nova.virt.vmwareapi.images [None req-ca15473a-8330-4e53-b6cb-311183da5250 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 115d6c00-4259-4e87-aa00-90b576a63535] Downloaded image file data 285931c9-8b83-4997-8c4d-6a79005e36ba to vmware_temp/df91c7ab-6f98-4b81-a0c9-df82910147db/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk on the data store datastore2 {{(pid=67169) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}}
[ 2285.886151] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-ca15473a-8330-4e53-b6cb-311183da5250 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 115d6c00-4259-4e87-aa00-90b576a63535] Caching image {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}}
[ 2285.886404] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [None req-ca15473a-8330-4e53-b6cb-311183da5250 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Copying Virtual Disk [datastore2] vmware_temp/df91c7ab-6f98-4b81-a0c9-df82910147db/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk to [datastore2] vmware_temp/df91c7ab-6f98-4b81-a0c9-df82910147db/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk {{(pid=67169) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}}
[ 2285.886710] env[67169]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-5aef1562-2197-4be3-a81c-6832ad80a429 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2285.896024] env[67169]: DEBUG oslo_vmware.api [None req-ca15473a-8330-4e53-b6cb-311183da5250 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Waiting for the task: (returnval){
[ 2285.896024] env[67169]: value = "task-2819280"
[ 2285.896024] env[67169]: _type = "Task"
[ 2285.896024] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 2285.903126] env[67169]: DEBUG oslo_vmware.api [None req-ca15473a-8330-4e53-b6cb-311183da5250 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Task: {'id': task-2819280, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 2286.406584] env[67169]: DEBUG oslo_vmware.exceptions [None req-ca15473a-8330-4e53-b6cb-311183da5250 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Fault InvalidArgument not matched. {{(pid=67169) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}}
[ 2286.406809] env[67169]: DEBUG oslo_concurrency.lockutils [None req-ca15473a-8330-4e53-b6cb-311183da5250 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Releasing lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 2286.407398] env[67169]: ERROR nova.compute.manager [None req-ca15473a-8330-4e53-b6cb-311183da5250 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 115d6c00-4259-4e87-aa00-90b576a63535] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 2286.407398] env[67169]: Faults: ['InvalidArgument']
[ 2286.407398] env[67169]: ERROR nova.compute.manager [instance: 115d6c00-4259-4e87-aa00-90b576a63535] Traceback (most recent call last):
[ 2286.407398] env[67169]: ERROR nova.compute.manager [instance: 115d6c00-4259-4e87-aa00-90b576a63535] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources
[ 2286.407398] env[67169]: ERROR nova.compute.manager [instance: 115d6c00-4259-4e87-aa00-90b576a63535] yield resources
[ 2286.407398] env[67169]: ERROR nova.compute.manager [instance: 115d6c00-4259-4e87-aa00-90b576a63535] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 2286.407398] env[67169]: ERROR nova.compute.manager [instance: 115d6c00-4259-4e87-aa00-90b576a63535] self.driver.spawn(context, instance, image_meta,
[ 2286.407398] env[67169]: ERROR nova.compute.manager [instance: 115d6c00-4259-4e87-aa00-90b576a63535] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 2286.407398] env[67169]: ERROR nova.compute.manager [instance: 115d6c00-4259-4e87-aa00-90b576a63535] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 2286.407398] env[67169]: ERROR nova.compute.manager [instance: 115d6c00-4259-4e87-aa00-90b576a63535] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 2286.407398] env[67169]: ERROR nova.compute.manager [instance: 115d6c00-4259-4e87-aa00-90b576a63535] self._fetch_image_if_missing(context, vi)
[ 2286.407398] env[67169]: ERROR nova.compute.manager [instance: 115d6c00-4259-4e87-aa00-90b576a63535] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 2286.407398] env[67169]: ERROR nova.compute.manager [instance: 115d6c00-4259-4e87-aa00-90b576a63535] image_cache(vi, tmp_image_ds_loc)
[ 2286.407398] env[67169]: ERROR nova.compute.manager [instance: 115d6c00-4259-4e87-aa00-90b576a63535] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 2286.407398] env[67169]: ERROR nova.compute.manager [instance: 115d6c00-4259-4e87-aa00-90b576a63535] vm_util.copy_virtual_disk(
[ 2286.407398] env[67169]: ERROR nova.compute.manager [instance: 115d6c00-4259-4e87-aa00-90b576a63535] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 2286.407398] env[67169]: ERROR nova.compute.manager [instance: 115d6c00-4259-4e87-aa00-90b576a63535] session._wait_for_task(vmdk_copy_task)
[ 2286.407398] env[67169]: ERROR nova.compute.manager [instance: 115d6c00-4259-4e87-aa00-90b576a63535] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 2286.407398] env[67169]: ERROR nova.compute.manager [instance: 115d6c00-4259-4e87-aa00-90b576a63535] return self.wait_for_task(task_ref)
[ 2286.407398] env[67169]: ERROR nova.compute.manager [instance: 115d6c00-4259-4e87-aa00-90b576a63535] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 2286.407398] env[67169]: ERROR nova.compute.manager [instance: 115d6c00-4259-4e87-aa00-90b576a63535] return evt.wait()
[ 2286.407398] env[67169]: ERROR nova.compute.manager [instance: 115d6c00-4259-4e87-aa00-90b576a63535] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 2286.407398] env[67169]: ERROR nova.compute.manager [instance: 115d6c00-4259-4e87-aa00-90b576a63535] result = hub.switch()
[ 2286.407398] env[67169]: ERROR nova.compute.manager [instance: 115d6c00-4259-4e87-aa00-90b576a63535] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 2286.407398] env[67169]: ERROR nova.compute.manager [instance: 115d6c00-4259-4e87-aa00-90b576a63535] return self.greenlet.switch()
[ 2286.407398] env[67169]: ERROR nova.compute.manager [instance: 115d6c00-4259-4e87-aa00-90b576a63535] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 2286.407398] env[67169]: ERROR nova.compute.manager [instance: 115d6c00-4259-4e87-aa00-90b576a63535] self.f(*self.args, **self.kw)
[ 2286.407398] env[67169]: ERROR nova.compute.manager [instance: 115d6c00-4259-4e87-aa00-90b576a63535] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 2286.407398] env[67169]: ERROR nova.compute.manager [instance: 115d6c00-4259-4e87-aa00-90b576a63535] raise exceptions.translate_fault(task_info.error)
[ 2286.407398] env[67169]: ERROR nova.compute.manager [instance: 115d6c00-4259-4e87-aa00-90b576a63535] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 2286.407398] env[67169]: ERROR nova.compute.manager [instance: 115d6c00-4259-4e87-aa00-90b576a63535] Faults: ['InvalidArgument']
[ 2286.407398] env[67169]: ERROR nova.compute.manager [instance: 115d6c00-4259-4e87-aa00-90b576a63535]
[ 2286.408398] env[67169]: INFO nova.compute.manager [None req-ca15473a-8330-4e53-b6cb-311183da5250 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 115d6c00-4259-4e87-aa00-90b576a63535] Terminating instance
[ 2286.409235] env[67169]: DEBUG oslo_concurrency.lockutils [None req-1e4472df-f6c8-4cd8-9c36-fe0c36e8321c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Acquired lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 2286.409441] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-1e4472df-f6c8-4cd8-9c36-fe0c36e8321c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 2286.409679] env[67169]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-936886d2-026b-4805-a680-493170118241 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2286.411803] env[67169]: DEBUG nova.compute.manager [None req-ca15473a-8330-4e53-b6cb-311183da5250 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 115d6c00-4259-4e87-aa00-90b576a63535] Start destroying the instance on the hypervisor.
{{(pid=67169) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}}
[ 2286.411994] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-ca15473a-8330-4e53-b6cb-311183da5250 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 115d6c00-4259-4e87-aa00-90b576a63535] Destroying instance {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 2286.412694] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-009fd0df-1b48-4e2c-9e2d-0ab8401fe7ed {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2286.420343] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-ca15473a-8330-4e53-b6cb-311183da5250 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 115d6c00-4259-4e87-aa00-90b576a63535] Unregistering the VM {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}}
[ 2286.420550] env[67169]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-415f4558-193a-4175-94f0-069e5d3db92a {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2286.422858] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-1e4472df-f6c8-4cd8-9c36-fe0c36e8321c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 2286.423040] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-1e4472df-f6c8-4cd8-9c36-fe0c36e8321c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=67169) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}}
[ 2286.423680] env[67169]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-85daaf2d-1f86-4457-b979-7442175a6508 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2286.428029] env[67169]: DEBUG oslo_vmware.api [None req-1e4472df-f6c8-4cd8-9c36-fe0c36e8321c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Waiting for the task: (returnval){
[ 2286.428029] env[67169]: value = "session[52518c09-e5f2-035d-4dc1-3e29ddf94015]52e1cc93-5006-4a18-fef4-9f9540ff3a85"
[ 2286.428029] env[67169]: _type = "Task"
[ 2286.428029] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 2286.435168] env[67169]: DEBUG oslo_vmware.api [None req-1e4472df-f6c8-4cd8-9c36-fe0c36e8321c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Task: {'id': session[52518c09-e5f2-035d-4dc1-3e29ddf94015]52e1cc93-5006-4a18-fef4-9f9540ff3a85, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 2286.484247] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-ca15473a-8330-4e53-b6cb-311183da5250 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 115d6c00-4259-4e87-aa00-90b576a63535] Unregistered the VM {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}}
[ 2286.484458] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-ca15473a-8330-4e53-b6cb-311183da5250 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 115d6c00-4259-4e87-aa00-90b576a63535] Deleting contents of the VM from datastore datastore2 {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}}
[ 2286.484663] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-ca15473a-8330-4e53-b6cb-311183da5250 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Deleting the datastore file [datastore2] 115d6c00-4259-4e87-aa00-90b576a63535 {{(pid=67169) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}}
[ 2286.484930] env[67169]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-a6a71122-1d2a-4b48-8539-0dede67108c7 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2286.491616] env[67169]: DEBUG oslo_vmware.api [None req-ca15473a-8330-4e53-b6cb-311183da5250 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Waiting for the task: (returnval){
[ 2286.491616] env[67169]: value = "task-2819282"
[ 2286.491616] env[67169]: _type = "Task"
[ 2286.491616] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 2286.498971] env[67169]: DEBUG oslo_vmware.api [None req-ca15473a-8330-4e53-b6cb-311183da5250 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Task: {'id': task-2819282, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 2286.938269] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-1e4472df-f6c8-4cd8-9c36-fe0c36e8321c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: 220daf5b-b4fd-49b0-9098-c1f846d6e552] Preparing fetch location {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}}
[ 2286.938612] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-1e4472df-f6c8-4cd8-9c36-fe0c36e8321c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Creating directory with path [datastore2] vmware_temp/2bc436c1-aa78-4274-b2c3-d99ca50bcb7b/285931c9-8b83-4997-8c4d-6a79005e36ba {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 2286.938793] env[67169]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-3460fb5b-928c-4001-8414-056c0e5196f4 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2286.950292] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-1e4472df-f6c8-4cd8-9c36-fe0c36e8321c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Created directory with path [datastore2] vmware_temp/2bc436c1-aa78-4274-b2c3-d99ca50bcb7b/285931c9-8b83-4997-8c4d-6a79005e36ba {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 2286.950498] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-1e4472df-f6c8-4cd8-9c36-fe0c36e8321c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: 220daf5b-b4fd-49b0-9098-c1f846d6e552] Fetch image to [datastore2] vmware_temp/2bc436c1-aa78-4274-b2c3-d99ca50bcb7b/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}}
[ 2286.950752] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-1e4472df-f6c8-4cd8-9c36-fe0c36e8321c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: 220daf5b-b4fd-49b0-9098-c1f846d6e552] Downloading image file data 285931c9-8b83-4997-8c4d-6a79005e36ba to [datastore2] vmware_temp/2bc436c1-aa78-4274-b2c3-d99ca50bcb7b/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk on the data store datastore2 {{(pid=67169) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}}
[ 2286.951411] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-57e04ffe-6d68-4a60-a67b-cb76342acb1d {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2286.958161] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5701812c-d2f1-497e-80b8-a4fd16244b3e {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2286.966922] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2b7a438a-e385-483b-905d-36a8e2c830df {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2286.999490] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-89e63f13-cd3b-45b6-9007-d32c404e4174 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2287.007949] env[67169]: DEBUG oslo_vmware.api [None req-ca15473a-8330-4e53-b6cb-311183da5250 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Task: {'id': task-2819282, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.077386} completed successfully. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 2287.008459] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-ca15473a-8330-4e53-b6cb-311183da5250 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Deleted the datastore file {{(pid=67169) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}}
[ 2287.008731] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-ca15473a-8330-4e53-b6cb-311183da5250 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 115d6c00-4259-4e87-aa00-90b576a63535] Deleted contents of the VM from datastore datastore2 {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}}
[ 2287.008826] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-ca15473a-8330-4e53-b6cb-311183da5250 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 115d6c00-4259-4e87-aa00-90b576a63535] Instance destroyed {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 2287.008986] env[67169]: INFO nova.compute.manager [None req-ca15473a-8330-4e53-b6cb-311183da5250 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 115d6c00-4259-4e87-aa00-90b576a63535] Took 0.60 seconds to destroy the instance on the hypervisor.
[ 2287.010490] env[67169]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-bf7d671c-e112-498b-8bbd-0d2d97cbf5f5 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2287.012364] env[67169]: DEBUG nova.compute.claims [None req-ca15473a-8330-4e53-b6cb-311183da5250 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 115d6c00-4259-4e87-aa00-90b576a63535] Aborting claim: {{(pid=67169) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 2287.012536] env[67169]: DEBUG oslo_concurrency.lockutils [None req-ca15473a-8330-4e53-b6cb-311183da5250 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 2287.013046] env[67169]: DEBUG oslo_concurrency.lockutils [None req-ca15473a-8330-4e53-b6cb-311183da5250 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 2287.041303] env[67169]: DEBUG nova.virt.vmwareapi.images [None req-1e4472df-f6c8-4cd8-9c36-fe0c36e8321c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: 220daf5b-b4fd-49b0-9098-c1f846d6e552] Downloading image file data 285931c9-8b83-4997-8c4d-6a79005e36ba to the data store datastore2 {{(pid=67169) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}}
[ 2287.092195] env[67169]: DEBUG oslo_vmware.rw_handles [None req-1e4472df-f6c8-4cd8-9c36-fe0c36e8321c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/2bc436c1-aa78-4274-b2c3-d99ca50bcb7b/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=67169) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}}
[ 2287.150486] env[67169]: DEBUG oslo_vmware.rw_handles [None req-1e4472df-f6c8-4cd8-9c36-fe0c36e8321c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Completed reading data from the image iterator. {{(pid=67169) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}}
[ 2287.150680] env[67169]: DEBUG oslo_vmware.rw_handles [None req-1e4472df-f6c8-4cd8-9c36-fe0c36e8321c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/2bc436c1-aa78-4274-b2c3-d99ca50bcb7b/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=67169) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}}
[ 2287.202167] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5df4a7b7-ece8-4033-9901-a2ebd0c317c4 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2287.209652] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-756d874a-a214-4ffd-99b1-6c658237220a {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2287.239236] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f545b5a3-a149-4f99-b51c-b20dc8efc795 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2287.246036] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cad34e98-8c3c-448b-b80e-ce01a99b57bb {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2287.258577] env[67169]: DEBUG nova.compute.provider_tree [None req-ca15473a-8330-4e53-b6cb-311183da5250 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Inventory has not changed in ProviderTree for provider: 6570906a-ac37-4859-b1f2-4bbacc48d3f3 {{(pid=67169) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 2287.268611] env[67169]: DEBUG nova.scheduler.client.report [None req-ca15473a-8330-4e53-b6cb-311183da5250 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Inventory has not changed for provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 171, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67169) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 2287.284098] env[67169]: DEBUG oslo_concurrency.lockutils [None req-ca15473a-8330-4e53-b6cb-311183da5250 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.271s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 2287.284591] env[67169]: ERROR nova.compute.manager [None req-ca15473a-8330-4e53-b6cb-311183da5250 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 115d6c00-4259-4e87-aa00-90b576a63535] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 2287.284591] env[67169]: Faults: ['InvalidArgument']
[ 2287.284591] env[67169]: ERROR nova.compute.manager [instance: 115d6c00-4259-4e87-aa00-90b576a63535] Traceback (most recent call last):
[ 2287.284591] env[67169]: ERROR nova.compute.manager [instance: 115d6c00-4259-4e87-aa00-90b576a63535] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 2287.284591] env[67169]: ERROR nova.compute.manager [instance: 115d6c00-4259-4e87-aa00-90b576a63535] self.driver.spawn(context, instance, image_meta,
[ 2287.284591] env[67169]: ERROR nova.compute.manager [instance: 115d6c00-4259-4e87-aa00-90b576a63535] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 2287.284591] env[67169]: ERROR nova.compute.manager [instance: 115d6c00-4259-4e87-aa00-90b576a63535] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 2287.284591] env[67169]: ERROR nova.compute.manager [instance: 115d6c00-4259-4e87-aa00-90b576a63535] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 2287.284591] env[67169]: ERROR nova.compute.manager [instance: 115d6c00-4259-4e87-aa00-90b576a63535] self._fetch_image_if_missing(context, vi)
[ 2287.284591] env[67169]: ERROR nova.compute.manager [instance: 115d6c00-4259-4e87-aa00-90b576a63535] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 2287.284591] env[67169]: ERROR nova.compute.manager [instance: 115d6c00-4259-4e87-aa00-90b576a63535] image_cache(vi, tmp_image_ds_loc)
[ 2287.284591] env[67169]: ERROR nova.compute.manager [instance: 115d6c00-4259-4e87-aa00-90b576a63535] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 2287.284591] env[67169]: ERROR nova.compute.manager [instance: 115d6c00-4259-4e87-aa00-90b576a63535] vm_util.copy_virtual_disk(
[ 2287.284591] env[67169]: ERROR nova.compute.manager [instance: 115d6c00-4259-4e87-aa00-90b576a63535] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 2287.284591] env[67169]: ERROR nova.compute.manager [instance: 115d6c00-4259-4e87-aa00-90b576a63535] session._wait_for_task(vmdk_copy_task)
[ 2287.284591] env[67169]: ERROR nova.compute.manager [instance: 115d6c00-4259-4e87-aa00-90b576a63535] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 2287.284591] env[67169]: ERROR nova.compute.manager [instance: 115d6c00-4259-4e87-aa00-90b576a63535] return self.wait_for_task(task_ref)
[ 2287.284591] env[67169]: ERROR nova.compute.manager [instance: 115d6c00-4259-4e87-aa00-90b576a63535] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 2287.284591] env[67169]: ERROR nova.compute.manager [instance: 115d6c00-4259-4e87-aa00-90b576a63535] return evt.wait()
[ 2287.284591] env[67169]: ERROR nova.compute.manager [instance: 115d6c00-4259-4e87-aa00-90b576a63535] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 2287.284591] env[67169]: ERROR nova.compute.manager [instance: 115d6c00-4259-4e87-aa00-90b576a63535] result = hub.switch()
[ 2287.284591] env[67169]: ERROR nova.compute.manager [instance: 115d6c00-4259-4e87-aa00-90b576a63535] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 2287.284591] env[67169]: ERROR nova.compute.manager [instance: 115d6c00-4259-4e87-aa00-90b576a63535] return self.greenlet.switch()
[ 2287.284591] env[67169]: ERROR nova.compute.manager [instance: 115d6c00-4259-4e87-aa00-90b576a63535] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 2287.284591] env[67169]: ERROR nova.compute.manager [instance: 115d6c00-4259-4e87-aa00-90b576a63535] self.f(*self.args, **self.kw)
[ 2287.284591] env[67169]: ERROR nova.compute.manager [instance: 115d6c00-4259-4e87-aa00-90b576a63535] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 2287.284591] env[67169]: ERROR nova.compute.manager [instance: 115d6c00-4259-4e87-aa00-90b576a63535] raise exceptions.translate_fault(task_info.error)
[ 2287.284591] env[67169]: ERROR nova.compute.manager [instance: 115d6c00-4259-4e87-aa00-90b576a63535] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 2287.284591] env[67169]: ERROR nova.compute.manager [instance: 115d6c00-4259-4e87-aa00-90b576a63535] Faults: ['InvalidArgument']
[ 2287.284591] env[67169]: ERROR nova.compute.manager [instance: 115d6c00-4259-4e87-aa00-90b576a63535]
[ 2287.285399] env[67169]: DEBUG nova.compute.utils [None req-ca15473a-8330-4e53-b6cb-311183da5250 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 115d6c00-4259-4e87-aa00-90b576a63535] VimFaultException {{(pid=67169) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 2287.286621] env[67169]: DEBUG nova.compute.manager [None req-ca15473a-8330-4e53-b6cb-311183da5250 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 115d6c00-4259-4e87-aa00-90b576a63535] Build of instance 115d6c00-4259-4e87-aa00-90b576a63535 was re-scheduled: A specified parameter was not correct: fileType
[ 2287.286621] env[67169]: Faults: ['InvalidArgument'] {{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}}
[ 2287.286992] env[67169]: DEBUG nova.compute.manager [None req-ca15473a-8330-4e53-b6cb-311183da5250 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 115d6c00-4259-4e87-aa00-90b576a63535] Unplugging VIFs for instance {{(pid=67169) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}}
[ 2287.287178] env[67169]: DEBUG nova.compute.manager [None req-ca15473a-8330-4e53-b6cb-311183da5250 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=67169) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}}
[ 2287.287349] env[67169]: DEBUG nova.compute.manager [None req-ca15473a-8330-4e53-b6cb-311183da5250 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 115d6c00-4259-4e87-aa00-90b576a63535] Deallocating network for instance {{(pid=67169) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}}
[ 2287.287510] env[67169]: DEBUG nova.network.neutron [None req-ca15473a-8330-4e53-b6cb-311183da5250 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 115d6c00-4259-4e87-aa00-90b576a63535] deallocate_for_instance() {{(pid=67169) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 2287.561402] env[67169]: DEBUG nova.network.neutron [None req-ca15473a-8330-4e53-b6cb-311183da5250 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 115d6c00-4259-4e87-aa00-90b576a63535] Updating instance_info_cache with network_info: [] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 2287.573599] env[67169]: INFO nova.compute.manager [None req-ca15473a-8330-4e53-b6cb-311183da5250 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 115d6c00-4259-4e87-aa00-90b576a63535] Took 0.29 seconds to deallocate network for instance.
[ 2287.659221] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2287.659519] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Starting heal instance info cache {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 2287.659709] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Rebuilding the list of instances to heal {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 2287.673682] env[67169]: INFO nova.scheduler.client.report [None req-ca15473a-8330-4e53-b6cb-311183da5250 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Deleted allocations for instance 115d6c00-4259-4e87-aa00-90b576a63535 [ 2287.681558] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 220daf5b-b4fd-49b0-9098-c1f846d6e552] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2287.681718] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 68b94a43-eaa5-4023-8bf5-8cc647c2f098] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2287.681846] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 1ece950d-8b7f-4462-8138-10cbf43149ee] Skipping network cache update for instance because it is Building. 
{{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2287.681971] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: b554602b-2aae-4c1b-9385-4bef16a1dc5a] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2287.682107] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 7486bbc9-6aa3-4880-9662-b3451b400bf8] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2287.682230] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 0dcfb66e-080f-4b51-9f3d-1a29aea0af4a] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2287.682352] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Didn't find any instances for network info cache update. 
{{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 2287.695667] env[67169]: DEBUG oslo_concurrency.lockutils [None req-ca15473a-8330-4e53-b6cb-311183da5250 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Lock "115d6c00-4259-4e87-aa00-90b576a63535" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 573.570s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2287.695822] env[67169]: DEBUG oslo_concurrency.lockutils [None req-570e5988-7411-48a8-8041-cfa1d5b638f2 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Lock "115d6c00-4259-4e87-aa00-90b576a63535" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 377.781s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2287.696150] env[67169]: DEBUG oslo_concurrency.lockutils [None req-570e5988-7411-48a8-8041-cfa1d5b638f2 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Acquiring lock "115d6c00-4259-4e87-aa00-90b576a63535-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2287.696360] env[67169]: DEBUG oslo_concurrency.lockutils [None req-570e5988-7411-48a8-8041-cfa1d5b638f2 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Lock "115d6c00-4259-4e87-aa00-90b576a63535-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2287.696528] 
env[67169]: DEBUG oslo_concurrency.lockutils [None req-570e5988-7411-48a8-8041-cfa1d5b638f2 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Lock "115d6c00-4259-4e87-aa00-90b576a63535-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2287.698402] env[67169]: INFO nova.compute.manager [None req-570e5988-7411-48a8-8041-cfa1d5b638f2 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 115d6c00-4259-4e87-aa00-90b576a63535] Terminating instance [ 2287.701779] env[67169]: DEBUG nova.compute.manager [None req-570e5988-7411-48a8-8041-cfa1d5b638f2 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 115d6c00-4259-4e87-aa00-90b576a63535] Start destroying the instance on the hypervisor. 
{{(pid=67169) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2287.701779] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-570e5988-7411-48a8-8041-cfa1d5b638f2 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 115d6c00-4259-4e87-aa00-90b576a63535] Destroying instance {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2287.702699] env[67169]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-d4c7a0a7-1e8f-4a71-9b94-78402d7598ae {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2287.711194] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2c55073a-2640-43c5-a1db-5137582b6cba {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2287.737497] env[67169]: WARNING nova.virt.vmwareapi.vmops [None req-570e5988-7411-48a8-8041-cfa1d5b638f2 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 115d6c00-4259-4e87-aa00-90b576a63535] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 115d6c00-4259-4e87-aa00-90b576a63535 could not be found. 
[ 2287.737757] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-570e5988-7411-48a8-8041-cfa1d5b638f2 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 115d6c00-4259-4e87-aa00-90b576a63535] Instance destroyed {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2287.737882] env[67169]: INFO nova.compute.manager [None req-570e5988-7411-48a8-8041-cfa1d5b638f2 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] [instance: 115d6c00-4259-4e87-aa00-90b576a63535] Took 0.04 seconds to destroy the instance on the hypervisor. [ 2287.738134] env[67169]: DEBUG oslo.service.loopingcall [None req-570e5988-7411-48a8-8041-cfa1d5b638f2 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=67169) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2287.738619] env[67169]: DEBUG nova.compute.manager [-] [instance: 115d6c00-4259-4e87-aa00-90b576a63535] Deallocating network for instance {{(pid=67169) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2287.738714] env[67169]: DEBUG nova.network.neutron [-] [instance: 115d6c00-4259-4e87-aa00-90b576a63535] deallocate_for_instance() {{(pid=67169) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2287.762149] env[67169]: DEBUG nova.network.neutron [-] [instance: 115d6c00-4259-4e87-aa00-90b576a63535] Updating instance_info_cache with network_info: [] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2287.769818] env[67169]: INFO nova.compute.manager [-] [instance: 115d6c00-4259-4e87-aa00-90b576a63535] Took 0.03 seconds to deallocate network for instance. 
[ 2287.853743] env[67169]: DEBUG oslo_concurrency.lockutils [None req-570e5988-7411-48a8-8041-cfa1d5b638f2 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Lock "115d6c00-4259-4e87-aa00-90b576a63535" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.158s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2288.658522] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2288.658895] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=67169) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 2291.659337] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2292.659770] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2292.660135] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2292.660235] env[67169]: DEBUG oslo_service.periodic_task [None 
req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2292.675330] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2292.675606] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2292.675795] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2292.675961] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67169) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 2292.677141] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dedca0a0-392c-43b0-94e6-b486e19a7add {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2292.686066] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-694f876a-7ade-42a1-aa00-49dfce6fa0e3 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2292.700045] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1d25393e-464c-4cbc-a71f-2d53f4af1c71 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2292.706768] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ef72436a-35e4-43b8-a7ef-8b2e44b9f05e {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2292.737653] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181026MB free_disk=171GB free_vcpus=48 pci_devices=None {{(pid=67169) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 2292.737849] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2292.738019] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2292.801453] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 220daf5b-b4fd-49b0-9098-c1f846d6e552 actively managed on this compute host and has allocations in 
placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2292.801641] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 68b94a43-eaa5-4023-8bf5-8cc647c2f098 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2292.801796] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 1ece950d-8b7f-4462-8138-10cbf43149ee actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2292.801932] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance b554602b-2aae-4c1b-9385-4bef16a1dc5a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2292.802066] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 7486bbc9-6aa3-4880-9662-b3451b400bf8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2292.802189] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 0dcfb66e-080f-4b51-9f3d-1a29aea0af4a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2292.802377] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Total usable vcpus: 48, total allocated vcpus: 6 {{(pid=67169) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 2292.802516] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1280MB phys_disk=200GB used_disk=6GB total_vcpus=48 used_vcpus=6 pci_stats=[] {{(pid=67169) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 2292.893516] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-36640a84-d0a5-4595-bb5f-53c13b9839c3 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2292.901431] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-972dc9c8-f895-4e41-8fb3-4fca0a1d2758 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2292.931973] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b3341743-ef13-464c-bf4e-e82f097af3b4 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} 
[ 2292.939504] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c7ece2bd-2a48-4711-b834-0cc6ff43cba6 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2292.954024] env[67169]: DEBUG nova.compute.provider_tree [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Inventory has not changed in ProviderTree for provider: 6570906a-ac37-4859-b1f2-4bbacc48d3f3 {{(pid=67169) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2292.964811] env[67169]: DEBUG nova.scheduler.client.report [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Inventory has not changed for provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 171, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67169) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2292.980548] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67169) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 2292.980798] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.243s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2294.975832] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 
None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2294.976209] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2304.352718] env[67169]: DEBUG oslo_concurrency.lockutils [None req-8c76a667-d885-4962-a360-6d2c54b91836 tempest-ServerDiskConfigTestJSON-907081631 tempest-ServerDiskConfigTestJSON-907081631-project-member] Acquiring lock "b554602b-2aae-4c1b-9385-4bef16a1dc5a" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2335.902616] env[67169]: WARNING oslo_vmware.rw_handles [None req-1e4472df-f6c8-4cd8-9c36-fe0c36e8321c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2335.902616] env[67169]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2335.902616] env[67169]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2335.902616] env[67169]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2335.902616] env[67169]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2335.902616] env[67169]: ERROR oslo_vmware.rw_handles response.begin() [ 2335.902616] env[67169]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2335.902616] env[67169]: ERROR oslo_vmware.rw_handles version, status, 
reason = self._read_status() [ 2335.902616] env[67169]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2335.902616] env[67169]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2335.902616] env[67169]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2335.902616] env[67169]: ERROR oslo_vmware.rw_handles [ 2335.903136] env[67169]: DEBUG nova.virt.vmwareapi.images [None req-1e4472df-f6c8-4cd8-9c36-fe0c36e8321c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: 220daf5b-b4fd-49b0-9098-c1f846d6e552] Downloaded image file data 285931c9-8b83-4997-8c4d-6a79005e36ba to vmware_temp/2bc436c1-aa78-4274-b2c3-d99ca50bcb7b/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk on the data store datastore2 {{(pid=67169) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2335.905388] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-1e4472df-f6c8-4cd8-9c36-fe0c36e8321c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: 220daf5b-b4fd-49b0-9098-c1f846d6e552] Caching image {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2335.905648] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [None req-1e4472df-f6c8-4cd8-9c36-fe0c36e8321c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Copying Virtual Disk [datastore2] vmware_temp/2bc436c1-aa78-4274-b2c3-d99ca50bcb7b/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk to [datastore2] vmware_temp/2bc436c1-aa78-4274-b2c3-d99ca50bcb7b/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk {{(pid=67169) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2335.905931] env[67169]: DEBUG oslo_vmware.service [-] Invoking 
VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-55b32734-0506-4bac-98c0-2449f5142a93 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2335.913679] env[67169]: DEBUG oslo_vmware.api [None req-1e4472df-f6c8-4cd8-9c36-fe0c36e8321c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Waiting for the task: (returnval){ [ 2335.913679] env[67169]: value = "task-2819283" [ 2335.913679] env[67169]: _type = "Task" [ 2335.913679] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2335.921464] env[67169]: DEBUG oslo_vmware.api [None req-1e4472df-f6c8-4cd8-9c36-fe0c36e8321c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Task: {'id': task-2819283, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2336.424173] env[67169]: DEBUG oslo_vmware.exceptions [None req-1e4472df-f6c8-4cd8-9c36-fe0c36e8321c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Fault InvalidArgument not matched. 
{{(pid=67169) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2336.424423] env[67169]: DEBUG oslo_concurrency.lockutils [None req-1e4472df-f6c8-4cd8-9c36-fe0c36e8321c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Releasing lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2336.424984] env[67169]: ERROR nova.compute.manager [None req-1e4472df-f6c8-4cd8-9c36-fe0c36e8321c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: 220daf5b-b4fd-49b0-9098-c1f846d6e552] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2336.424984] env[67169]: Faults: ['InvalidArgument'] [ 2336.424984] env[67169]: ERROR nova.compute.manager [instance: 220daf5b-b4fd-49b0-9098-c1f846d6e552] Traceback (most recent call last): [ 2336.424984] env[67169]: ERROR nova.compute.manager [instance: 220daf5b-b4fd-49b0-9098-c1f846d6e552] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 2336.424984] env[67169]: ERROR nova.compute.manager [instance: 220daf5b-b4fd-49b0-9098-c1f846d6e552] yield resources [ 2336.424984] env[67169]: ERROR nova.compute.manager [instance: 220daf5b-b4fd-49b0-9098-c1f846d6e552] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2336.424984] env[67169]: ERROR nova.compute.manager [instance: 220daf5b-b4fd-49b0-9098-c1f846d6e552] self.driver.spawn(context, instance, image_meta, [ 2336.424984] env[67169]: ERROR nova.compute.manager [instance: 220daf5b-b4fd-49b0-9098-c1f846d6e552] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2336.424984] env[67169]: ERROR nova.compute.manager [instance: 
220daf5b-b4fd-49b0-9098-c1f846d6e552] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2336.424984] env[67169]: ERROR nova.compute.manager [instance: 220daf5b-b4fd-49b0-9098-c1f846d6e552] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2336.424984] env[67169]: ERROR nova.compute.manager [instance: 220daf5b-b4fd-49b0-9098-c1f846d6e552] self._fetch_image_if_missing(context, vi) [ 2336.424984] env[67169]: ERROR nova.compute.manager [instance: 220daf5b-b4fd-49b0-9098-c1f846d6e552] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2336.424984] env[67169]: ERROR nova.compute.manager [instance: 220daf5b-b4fd-49b0-9098-c1f846d6e552] image_cache(vi, tmp_image_ds_loc) [ 2336.424984] env[67169]: ERROR nova.compute.manager [instance: 220daf5b-b4fd-49b0-9098-c1f846d6e552] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2336.424984] env[67169]: ERROR nova.compute.manager [instance: 220daf5b-b4fd-49b0-9098-c1f846d6e552] vm_util.copy_virtual_disk( [ 2336.424984] env[67169]: ERROR nova.compute.manager [instance: 220daf5b-b4fd-49b0-9098-c1f846d6e552] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2336.424984] env[67169]: ERROR nova.compute.manager [instance: 220daf5b-b4fd-49b0-9098-c1f846d6e552] session._wait_for_task(vmdk_copy_task) [ 2336.424984] env[67169]: ERROR nova.compute.manager [instance: 220daf5b-b4fd-49b0-9098-c1f846d6e552] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2336.424984] env[67169]: ERROR nova.compute.manager [instance: 220daf5b-b4fd-49b0-9098-c1f846d6e552] return self.wait_for_task(task_ref) [ 2336.424984] env[67169]: ERROR nova.compute.manager [instance: 220daf5b-b4fd-49b0-9098-c1f846d6e552] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2336.424984] env[67169]: ERROR nova.compute.manager 
[instance: 220daf5b-b4fd-49b0-9098-c1f846d6e552] return evt.wait() [ 2336.424984] env[67169]: ERROR nova.compute.manager [instance: 220daf5b-b4fd-49b0-9098-c1f846d6e552] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2336.424984] env[67169]: ERROR nova.compute.manager [instance: 220daf5b-b4fd-49b0-9098-c1f846d6e552] result = hub.switch() [ 2336.424984] env[67169]: ERROR nova.compute.manager [instance: 220daf5b-b4fd-49b0-9098-c1f846d6e552] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2336.424984] env[67169]: ERROR nova.compute.manager [instance: 220daf5b-b4fd-49b0-9098-c1f846d6e552] return self.greenlet.switch() [ 2336.424984] env[67169]: ERROR nova.compute.manager [instance: 220daf5b-b4fd-49b0-9098-c1f846d6e552] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2336.424984] env[67169]: ERROR nova.compute.manager [instance: 220daf5b-b4fd-49b0-9098-c1f846d6e552] self.f(*self.args, **self.kw) [ 2336.424984] env[67169]: ERROR nova.compute.manager [instance: 220daf5b-b4fd-49b0-9098-c1f846d6e552] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2336.424984] env[67169]: ERROR nova.compute.manager [instance: 220daf5b-b4fd-49b0-9098-c1f846d6e552] raise exceptions.translate_fault(task_info.error) [ 2336.424984] env[67169]: ERROR nova.compute.manager [instance: 220daf5b-b4fd-49b0-9098-c1f846d6e552] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2336.424984] env[67169]: ERROR nova.compute.manager [instance: 220daf5b-b4fd-49b0-9098-c1f846d6e552] Faults: ['InvalidArgument'] [ 2336.424984] env[67169]: ERROR nova.compute.manager [instance: 220daf5b-b4fd-49b0-9098-c1f846d6e552] [ 2336.425775] env[67169]: INFO nova.compute.manager [None req-1e4472df-f6c8-4cd8-9c36-fe0c36e8321c tempest-DeleteServersTestJSON-867121436 
tempest-DeleteServersTestJSON-867121436-project-member] [instance: 220daf5b-b4fd-49b0-9098-c1f846d6e552] Terminating instance [ 2336.426832] env[67169]: DEBUG oslo_concurrency.lockutils [None req-0dac4879-9808-4647-9123-8b378a5e4530 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Acquired lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2336.427051] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-0dac4879-9808-4647-9123-8b378a5e4530 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2336.427286] env[67169]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-51458536-e2a0-4b3d-b60b-1151fbafc1c4 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2336.429368] env[67169]: DEBUG nova.compute.manager [None req-1e4472df-f6c8-4cd8-9c36-fe0c36e8321c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: 220daf5b-b4fd-49b0-9098-c1f846d6e552] Start destroying the instance on the hypervisor. 
{{(pid=67169) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2336.429553] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-1e4472df-f6c8-4cd8-9c36-fe0c36e8321c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: 220daf5b-b4fd-49b0-9098-c1f846d6e552] Destroying instance {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2336.430289] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f0e5916d-8a87-43d5-a97b-1db4670a84c0 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2336.437106] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-1e4472df-f6c8-4cd8-9c36-fe0c36e8321c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: 220daf5b-b4fd-49b0-9098-c1f846d6e552] Unregistering the VM {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2336.437337] env[67169]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-4c75110a-2df4-446e-a5b9-8a1590735ae4 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2336.439441] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-0dac4879-9808-4647-9123-8b378a5e4530 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2336.439615] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-0dac4879-9808-4647-9123-8b378a5e4530 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=67169) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2336.440553] env[67169]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-1bf22c79-120a-4ead-9929-503e6e5f1d75 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2336.444979] env[67169]: DEBUG oslo_vmware.api [None req-0dac4879-9808-4647-9123-8b378a5e4530 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Waiting for the task: (returnval){ [ 2336.444979] env[67169]: value = "session[52518c09-e5f2-035d-4dc1-3e29ddf94015]52def8fb-299b-713c-fead-394d0be4b193" [ 2336.444979] env[67169]: _type = "Task" [ 2336.444979] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2336.451955] env[67169]: DEBUG oslo_vmware.api [None req-0dac4879-9808-4647-9123-8b378a5e4530 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Task: {'id': session[52518c09-e5f2-035d-4dc1-3e29ddf94015]52def8fb-299b-713c-fead-394d0be4b193, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2336.505053] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-1e4472df-f6c8-4cd8-9c36-fe0c36e8321c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: 220daf5b-b4fd-49b0-9098-c1f846d6e552] Unregistered the VM {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2336.505053] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-1e4472df-f6c8-4cd8-9c36-fe0c36e8321c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: 220daf5b-b4fd-49b0-9098-c1f846d6e552] Deleting contents of the VM from datastore datastore2 {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2336.505247] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-1e4472df-f6c8-4cd8-9c36-fe0c36e8321c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Deleting the datastore file [datastore2] 220daf5b-b4fd-49b0-9098-c1f846d6e552 {{(pid=67169) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2336.505369] env[67169]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-fa681a67-fdc2-4116-b8d2-8d5473e8d039 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2336.510980] env[67169]: DEBUG oslo_vmware.api [None req-1e4472df-f6c8-4cd8-9c36-fe0c36e8321c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Waiting for the task: (returnval){ [ 2336.510980] env[67169]: value = "task-2819285" [ 2336.510980] env[67169]: _type = "Task" [ 2336.510980] env[67169]: } to complete. 
{{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2336.518202] env[67169]: DEBUG oslo_vmware.api [None req-1e4472df-f6c8-4cd8-9c36-fe0c36e8321c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Task: {'id': task-2819285, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2336.955368] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-0dac4879-9808-4647-9123-8b378a5e4530 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 68b94a43-eaa5-4023-8bf5-8cc647c2f098] Preparing fetch location {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2336.955766] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-0dac4879-9808-4647-9123-8b378a5e4530 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Creating directory with path [datastore2] vmware_temp/c2a3097e-501a-464d-8ab1-272408c39183/285931c9-8b83-4997-8c4d-6a79005e36ba {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2336.955858] env[67169]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-ca9891a2-2751-4bb3-83fe-9d0ed0c6267e {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2336.967290] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-0dac4879-9808-4647-9123-8b378a5e4530 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Created directory with path [datastore2] vmware_temp/c2a3097e-501a-464d-8ab1-272408c39183/285931c9-8b83-4997-8c4d-6a79005e36ba {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2336.967470] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-0dac4879-9808-4647-9123-8b378a5e4530 
tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 68b94a43-eaa5-4023-8bf5-8cc647c2f098] Fetch image to [datastore2] vmware_temp/c2a3097e-501a-464d-8ab1-272408c39183/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2336.967639] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-0dac4879-9808-4647-9123-8b378a5e4530 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 68b94a43-eaa5-4023-8bf5-8cc647c2f098] Downloading image file data 285931c9-8b83-4997-8c4d-6a79005e36ba to [datastore2] vmware_temp/c2a3097e-501a-464d-8ab1-272408c39183/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk on the data store datastore2 {{(pid=67169) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2336.968344] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c4836106-6e44-46ee-af43-0c58a7c2d2e0 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2336.974718] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-786f414a-5cee-4d3e-aeb6-1b2cdb5cbfb6 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2336.984552] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-74f21d95-70db-4f9f-89b9-57a04fdf344b {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2337.016873] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c3fbdb72-46f4-46d8-9e54-1d8dc8cacbb0 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2337.024790] 
env[67169]: DEBUG oslo_vmware.api [None req-1e4472df-f6c8-4cd8-9c36-fe0c36e8321c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Task: {'id': task-2819285, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.064478} completed successfully. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2337.024997] env[67169]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-df49b252-f10d-4574-bbe3-b57e3b468d8f {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2337.026575] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-1e4472df-f6c8-4cd8-9c36-fe0c36e8321c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Deleted the datastore file {{(pid=67169) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2337.026760] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-1e4472df-f6c8-4cd8-9c36-fe0c36e8321c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: 220daf5b-b4fd-49b0-9098-c1f846d6e552] Deleted contents of the VM from datastore datastore2 {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2337.026926] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-1e4472df-f6c8-4cd8-9c36-fe0c36e8321c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: 220daf5b-b4fd-49b0-9098-c1f846d6e552] Instance destroyed {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2337.027134] env[67169]: INFO nova.compute.manager [None req-1e4472df-f6c8-4cd8-9c36-fe0c36e8321c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: 220daf5b-b4fd-49b0-9098-c1f846d6e552] Took 0.60 seconds to destroy the instance on 
the hypervisor. [ 2337.029143] env[67169]: DEBUG nova.compute.claims [None req-1e4472df-f6c8-4cd8-9c36-fe0c36e8321c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: 220daf5b-b4fd-49b0-9098-c1f846d6e552] Aborting claim: {{(pid=67169) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 2337.029320] env[67169]: DEBUG oslo_concurrency.lockutils [None req-1e4472df-f6c8-4cd8-9c36-fe0c36e8321c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2337.029528] env[67169]: DEBUG oslo_concurrency.lockutils [None req-1e4472df-f6c8-4cd8-9c36-fe0c36e8321c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2337.047934] env[67169]: DEBUG nova.virt.vmwareapi.images [None req-0dac4879-9808-4647-9123-8b378a5e4530 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 68b94a43-eaa5-4023-8bf5-8cc647c2f098] Downloading image file data 285931c9-8b83-4997-8c4d-6a79005e36ba to the data store datastore2 {{(pid=67169) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2337.101170] env[67169]: DEBUG oslo_vmware.rw_handles [None req-0dac4879-9808-4647-9123-8b378a5e4530 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = 
https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/c2a3097e-501a-464d-8ab1-272408c39183/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=67169) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2337.160795] env[67169]: DEBUG oslo_vmware.rw_handles [None req-0dac4879-9808-4647-9123-8b378a5e4530 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Completed reading data from the image iterator. {{(pid=67169) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2337.160984] env[67169]: DEBUG oslo_vmware.rw_handles [None req-0dac4879-9808-4647-9123-8b378a5e4530 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/c2a3097e-501a-464d-8ab1-272408c39183/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=67169) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2337.202939] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bd5c6acc-f9d0-40ec-a449-0628620f40cc {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2337.211068] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-df5ee892-cd53-4173-b54f-93a1d96a4d7e {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2337.241607] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fda0f2b8-1872-45f5-aa7a-071cf149f19f {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2337.248924] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6b673bf8-0eb8-4769-9880-40ddb5da65c1 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2337.262090] env[67169]: DEBUG nova.compute.provider_tree [None req-1e4472df-f6c8-4cd8-9c36-fe0c36e8321c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Inventory has not changed in ProviderTree for provider: 6570906a-ac37-4859-b1f2-4bbacc48d3f3 {{(pid=67169) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2337.270754] env[67169]: DEBUG nova.scheduler.client.report [None req-1e4472df-f6c8-4cd8-9c36-fe0c36e8321c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Inventory has not changed for provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 
'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 171, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67169) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2337.284693] env[67169]: DEBUG oslo_concurrency.lockutils [None req-1e4472df-f6c8-4cd8-9c36-fe0c36e8321c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.255s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2337.285248] env[67169]: ERROR nova.compute.manager [None req-1e4472df-f6c8-4cd8-9c36-fe0c36e8321c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: 220daf5b-b4fd-49b0-9098-c1f846d6e552] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2337.285248] env[67169]: Faults: ['InvalidArgument'] [ 2337.285248] env[67169]: ERROR nova.compute.manager [instance: 220daf5b-b4fd-49b0-9098-c1f846d6e552] Traceback (most recent call last): [ 2337.285248] env[67169]: ERROR nova.compute.manager [instance: 220daf5b-b4fd-49b0-9098-c1f846d6e552] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2337.285248] env[67169]: ERROR nova.compute.manager [instance: 220daf5b-b4fd-49b0-9098-c1f846d6e552] self.driver.spawn(context, instance, image_meta, [ 2337.285248] env[67169]: ERROR nova.compute.manager [instance: 220daf5b-b4fd-49b0-9098-c1f846d6e552] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2337.285248] env[67169]: ERROR nova.compute.manager [instance: 220daf5b-b4fd-49b0-9098-c1f846d6e552] self._vmops.spawn(context, instance, image_meta, 
injected_files, [ 2337.285248] env[67169]: ERROR nova.compute.manager [instance: 220daf5b-b4fd-49b0-9098-c1f846d6e552] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2337.285248] env[67169]: ERROR nova.compute.manager [instance: 220daf5b-b4fd-49b0-9098-c1f846d6e552] self._fetch_image_if_missing(context, vi) [ 2337.285248] env[67169]: ERROR nova.compute.manager [instance: 220daf5b-b4fd-49b0-9098-c1f846d6e552] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2337.285248] env[67169]: ERROR nova.compute.manager [instance: 220daf5b-b4fd-49b0-9098-c1f846d6e552] image_cache(vi, tmp_image_ds_loc) [ 2337.285248] env[67169]: ERROR nova.compute.manager [instance: 220daf5b-b4fd-49b0-9098-c1f846d6e552] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2337.285248] env[67169]: ERROR nova.compute.manager [instance: 220daf5b-b4fd-49b0-9098-c1f846d6e552] vm_util.copy_virtual_disk( [ 2337.285248] env[67169]: ERROR nova.compute.manager [instance: 220daf5b-b4fd-49b0-9098-c1f846d6e552] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2337.285248] env[67169]: ERROR nova.compute.manager [instance: 220daf5b-b4fd-49b0-9098-c1f846d6e552] session._wait_for_task(vmdk_copy_task) [ 2337.285248] env[67169]: ERROR nova.compute.manager [instance: 220daf5b-b4fd-49b0-9098-c1f846d6e552] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2337.285248] env[67169]: ERROR nova.compute.manager [instance: 220daf5b-b4fd-49b0-9098-c1f846d6e552] return self.wait_for_task(task_ref) [ 2337.285248] env[67169]: ERROR nova.compute.manager [instance: 220daf5b-b4fd-49b0-9098-c1f846d6e552] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2337.285248] env[67169]: ERROR nova.compute.manager [instance: 220daf5b-b4fd-49b0-9098-c1f846d6e552] return evt.wait() [ 2337.285248] env[67169]: 
ERROR nova.compute.manager [instance: 220daf5b-b4fd-49b0-9098-c1f846d6e552] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2337.285248] env[67169]: ERROR nova.compute.manager [instance: 220daf5b-b4fd-49b0-9098-c1f846d6e552] result = hub.switch() [ 2337.285248] env[67169]: ERROR nova.compute.manager [instance: 220daf5b-b4fd-49b0-9098-c1f846d6e552] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2337.285248] env[67169]: ERROR nova.compute.manager [instance: 220daf5b-b4fd-49b0-9098-c1f846d6e552] return self.greenlet.switch() [ 2337.285248] env[67169]: ERROR nova.compute.manager [instance: 220daf5b-b4fd-49b0-9098-c1f846d6e552] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2337.285248] env[67169]: ERROR nova.compute.manager [instance: 220daf5b-b4fd-49b0-9098-c1f846d6e552] self.f(*self.args, **self.kw) [ 2337.285248] env[67169]: ERROR nova.compute.manager [instance: 220daf5b-b4fd-49b0-9098-c1f846d6e552] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2337.285248] env[67169]: ERROR nova.compute.manager [instance: 220daf5b-b4fd-49b0-9098-c1f846d6e552] raise exceptions.translate_fault(task_info.error) [ 2337.285248] env[67169]: ERROR nova.compute.manager [instance: 220daf5b-b4fd-49b0-9098-c1f846d6e552] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2337.285248] env[67169]: ERROR nova.compute.manager [instance: 220daf5b-b4fd-49b0-9098-c1f846d6e552] Faults: ['InvalidArgument'] [ 2337.285248] env[67169]: ERROR nova.compute.manager [instance: 220daf5b-b4fd-49b0-9098-c1f846d6e552] [ 2337.285982] env[67169]: DEBUG nova.compute.utils [None req-1e4472df-f6c8-4cd8-9c36-fe0c36e8321c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: 
220daf5b-b4fd-49b0-9098-c1f846d6e552] VimFaultException {{(pid=67169) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2337.287769] env[67169]: DEBUG nova.compute.manager [None req-1e4472df-f6c8-4cd8-9c36-fe0c36e8321c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: 220daf5b-b4fd-49b0-9098-c1f846d6e552] Build of instance 220daf5b-b4fd-49b0-9098-c1f846d6e552 was re-scheduled: A specified parameter was not correct: fileType [ 2337.287769] env[67169]: Faults: ['InvalidArgument'] {{(pid=67169) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 2337.288163] env[67169]: DEBUG nova.compute.manager [None req-1e4472df-f6c8-4cd8-9c36-fe0c36e8321c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: 220daf5b-b4fd-49b0-9098-c1f846d6e552] Unplugging VIFs for instance {{(pid=67169) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 2337.288336] env[67169]: DEBUG nova.compute.manager [None req-1e4472df-f6c8-4cd8-9c36-fe0c36e8321c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67169) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 2337.288507] env[67169]: DEBUG nova.compute.manager [None req-1e4472df-f6c8-4cd8-9c36-fe0c36e8321c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: 220daf5b-b4fd-49b0-9098-c1f846d6e552] Deallocating network for instance {{(pid=67169) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2337.288667] env[67169]: DEBUG nova.network.neutron [None req-1e4472df-f6c8-4cd8-9c36-fe0c36e8321c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: 220daf5b-b4fd-49b0-9098-c1f846d6e552] deallocate_for_instance() {{(pid=67169) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2337.591636] env[67169]: DEBUG nova.network.neutron [None req-1e4472df-f6c8-4cd8-9c36-fe0c36e8321c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: 220daf5b-b4fd-49b0-9098-c1f846d6e552] Updating instance_info_cache with network_info: [] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2337.603865] env[67169]: INFO nova.compute.manager [None req-1e4472df-f6c8-4cd8-9c36-fe0c36e8321c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: 220daf5b-b4fd-49b0-9098-c1f846d6e552] Took 0.31 seconds to deallocate network for instance. 
[ 2337.696142] env[67169]: INFO nova.scheduler.client.report [None req-1e4472df-f6c8-4cd8-9c36-fe0c36e8321c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Deleted allocations for instance 220daf5b-b4fd-49b0-9098-c1f846d6e552 [ 2337.713254] env[67169]: DEBUG oslo_concurrency.lockutils [None req-1e4472df-f6c8-4cd8-9c36-fe0c36e8321c tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Lock "220daf5b-b4fd-49b0-9098-c1f846d6e552" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 568.849s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2337.713489] env[67169]: DEBUG oslo_concurrency.lockutils [None req-26bdb564-efe7-44b4-9a28-af7e1fb00cfa tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Lock "220daf5b-b4fd-49b0-9098-c1f846d6e552" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 372.504s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2337.713709] env[67169]: DEBUG oslo_concurrency.lockutils [None req-26bdb564-efe7-44b4-9a28-af7e1fb00cfa tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Acquiring lock "220daf5b-b4fd-49b0-9098-c1f846d6e552-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2337.713911] env[67169]: DEBUG oslo_concurrency.lockutils [None req-26bdb564-efe7-44b4-9a28-af7e1fb00cfa tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Lock "220daf5b-b4fd-49b0-9098-c1f846d6e552-events" acquired by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2337.714173] env[67169]: DEBUG oslo_concurrency.lockutils [None req-26bdb564-efe7-44b4-9a28-af7e1fb00cfa tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Lock "220daf5b-b4fd-49b0-9098-c1f846d6e552-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2337.716014] env[67169]: INFO nova.compute.manager [None req-26bdb564-efe7-44b4-9a28-af7e1fb00cfa tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: 220daf5b-b4fd-49b0-9098-c1f846d6e552] Terminating instance [ 2337.717728] env[67169]: DEBUG nova.compute.manager [None req-26bdb564-efe7-44b4-9a28-af7e1fb00cfa tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: 220daf5b-b4fd-49b0-9098-c1f846d6e552] Start destroying the instance on the hypervisor. 
{{(pid=67169) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2337.717920] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-26bdb564-efe7-44b4-9a28-af7e1fb00cfa tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: 220daf5b-b4fd-49b0-9098-c1f846d6e552] Destroying instance {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2337.718390] env[67169]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-ca32ac2f-8920-46d7-b7c6-03a5c51b461f {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2337.727196] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1f0f2852-5a2c-46c3-b26b-e2e8db2eaa4e {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2337.752685] env[67169]: WARNING nova.virt.vmwareapi.vmops [None req-26bdb564-efe7-44b4-9a28-af7e1fb00cfa tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: 220daf5b-b4fd-49b0-9098-c1f846d6e552] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 220daf5b-b4fd-49b0-9098-c1f846d6e552 could not be found. [ 2337.752914] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-26bdb564-efe7-44b4-9a28-af7e1fb00cfa tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: 220daf5b-b4fd-49b0-9098-c1f846d6e552] Instance destroyed {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2337.753107] env[67169]: INFO nova.compute.manager [None req-26bdb564-efe7-44b4-9a28-af7e1fb00cfa tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] [instance: 220daf5b-b4fd-49b0-9098-c1f846d6e552] Took 0.04 seconds to destroy the instance on the hypervisor. 
[ 2337.753347] env[67169]: DEBUG oslo.service.loopingcall [None req-26bdb564-efe7-44b4-9a28-af7e1fb00cfa tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=67169) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2337.753553] env[67169]: DEBUG nova.compute.manager [-] [instance: 220daf5b-b4fd-49b0-9098-c1f846d6e552] Deallocating network for instance {{(pid=67169) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2337.753646] env[67169]: DEBUG nova.network.neutron [-] [instance: 220daf5b-b4fd-49b0-9098-c1f846d6e552] deallocate_for_instance() {{(pid=67169) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2337.776011] env[67169]: DEBUG nova.network.neutron [-] [instance: 220daf5b-b4fd-49b0-9098-c1f846d6e552] Updating instance_info_cache with network_info: [] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2337.784262] env[67169]: INFO nova.compute.manager [-] [instance: 220daf5b-b4fd-49b0-9098-c1f846d6e552] Took 0.03 seconds to deallocate network for instance. 
[ 2337.864669] env[67169]: DEBUG oslo_concurrency.lockutils [None req-26bdb564-efe7-44b4-9a28-af7e1fb00cfa tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Lock "220daf5b-b4fd-49b0-9098-c1f846d6e552" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.151s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2344.660352] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2345.660194] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2348.667078] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2348.667078] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Starting heal instance info cache {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 2348.667458] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Rebuilding the list of instances to heal {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 2348.683975] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 
68b94a43-eaa5-4023-8bf5-8cc647c2f098] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2348.684219] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 1ece950d-8b7f-4462-8138-10cbf43149ee] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2348.684354] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: b554602b-2aae-4c1b-9385-4bef16a1dc5a] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2348.684476] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 7486bbc9-6aa3-4880-9662-b3451b400bf8] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2348.684605] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] [instance: 0dcfb66e-080f-4b51-9f3d-1a29aea0af4a] Skipping network cache update for instance because it is Building. {{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2348.684728] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Didn't find any instances for network info cache update. 
{{(pid=67169) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 2349.659313] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2349.659699] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=67169) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 2353.659492] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2354.659313] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2354.659579] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2354.659869] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2354.671402] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Acquiring lock "compute_resources" by 
"nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2354.671613] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2354.671781] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2354.671935] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67169) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 2354.673100] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9fde7faa-ceec-4bb0-9e06-dc9ade1861ff {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2354.681870] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d0085471-76a0-402b-ab57-06914ef89ee5 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2354.695727] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eb0b3ec0-b64d-4de0-b6b7-cbd21ff4ebe4 {{(pid=67169) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2354.701780] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0971ab27-d8e3-4d4a-81ee-ea79ef94d6f5 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2354.729849] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181014MB free_disk=171GB free_vcpus=48 pci_devices=None {{(pid=67169) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 2354.729978] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2354.730143] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2354.784217] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 68b94a43-eaa5-4023-8bf5-8cc647c2f098 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2354.784392] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 1ece950d-8b7f-4462-8138-10cbf43149ee actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2354.784522] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance b554602b-2aae-4c1b-9385-4bef16a1dc5a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2354.784644] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 7486bbc9-6aa3-4880-9662-b3451b400bf8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2354.784761] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Instance 0dcfb66e-080f-4b51-9f3d-1a29aea0af4a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67169) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2354.784937] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Total usable vcpus: 48, total allocated vcpus: 5 {{(pid=67169) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 2354.785108] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1152MB phys_disk=200GB used_disk=5GB total_vcpus=48 used_vcpus=5 pci_stats=[] {{(pid=67169) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 2354.846297] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-09b82739-994a-46c9-b08e-55d8cbacbb4f {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2354.853774] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-97eee837-e975-4f45-8f56-dace5bc24d18 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2354.882472] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0eb69e5a-8ad6-47e5-82a9-125e791d50a1 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2354.889177] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-735d74cc-ecb8-4a25-9fb8-32bebfcd23ca {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2354.901868] env[67169]: DEBUG nova.compute.provider_tree [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Inventory has 
not changed in ProviderTree for provider: 6570906a-ac37-4859-b1f2-4bbacc48d3f3 {{(pid=67169) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2354.910230] env[67169]: DEBUG nova.scheduler.client.report [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Inventory has not changed for provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 171, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67169) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2354.922512] env[67169]: DEBUG nova.compute.resource_tracker [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67169) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 2354.922690] env[67169]: DEBUG oslo_concurrency.lockutils [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.193s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2355.922653] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2356.653297] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67169) 
run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2357.689346] env[67169]: DEBUG oslo_concurrency.lockutils [None req-d69e08a4-ba1f-49bf-9457-d791fc93b427 tempest-DeleteServersTestJSON-867121436 tempest-DeleteServersTestJSON-867121436-project-member] Acquiring lock "7486bbc9-6aa3-4880-9662-b3451b400bf8" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2359.653613] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2360.660070] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2360.660070] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Cleaning up deleted instances with incomplete migration {{(pid=67169) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11236}} [ 2365.667677] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2365.668279] env[67169]: DEBUG nova.compute.manager [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Cleaning up deleted instances {{(pid=67169) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11198}} [ 2365.679840] env[67169]: DEBUG nova.compute.manager 
[None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] There are 0 instances to clean {{(pid=67169) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11207}} [ 2382.343650] env[67169]: WARNING oslo_vmware.rw_handles [None req-0dac4879-9808-4647-9123-8b378a5e4530 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2382.343650] env[67169]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2382.343650] env[67169]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2382.343650] env[67169]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2382.343650] env[67169]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2382.343650] env[67169]: ERROR oslo_vmware.rw_handles response.begin() [ 2382.343650] env[67169]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2382.343650] env[67169]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2382.343650] env[67169]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2382.343650] env[67169]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2382.343650] env[67169]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2382.343650] env[67169]: ERROR oslo_vmware.rw_handles [ 2382.344263] env[67169]: DEBUG nova.virt.vmwareapi.images [None req-0dac4879-9808-4647-9123-8b378a5e4530 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 68b94a43-eaa5-4023-8bf5-8cc647c2f098] Downloaded image file data 285931c9-8b83-4997-8c4d-6a79005e36ba to 
vmware_temp/c2a3097e-501a-464d-8ab1-272408c39183/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk on the data store datastore2 {{(pid=67169) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2382.346356] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-0dac4879-9808-4647-9123-8b378a5e4530 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 68b94a43-eaa5-4023-8bf5-8cc647c2f098] Caching image {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2382.346602] env[67169]: DEBUG nova.virt.vmwareapi.vm_util [None req-0dac4879-9808-4647-9123-8b378a5e4530 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Copying Virtual Disk [datastore2] vmware_temp/c2a3097e-501a-464d-8ab1-272408c39183/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk to [datastore2] vmware_temp/c2a3097e-501a-464d-8ab1-272408c39183/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk {{(pid=67169) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2382.346941] env[67169]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-baaeb3ea-cf17-4ff5-8d08-3fd1a9d45fd5 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2382.354247] env[67169]: DEBUG oslo_vmware.api [None req-0dac4879-9808-4647-9123-8b378a5e4530 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Waiting for the task: (returnval){ [ 2382.354247] env[67169]: value = "task-2819286" [ 2382.354247] env[67169]: _type = "Task" [ 2382.354247] env[67169]: } to complete. 
{{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2382.361943] env[67169]: DEBUG oslo_vmware.api [None req-0dac4879-9808-4647-9123-8b378a5e4530 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Task: {'id': task-2819286, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2382.865091] env[67169]: DEBUG oslo_vmware.exceptions [None req-0dac4879-9808-4647-9123-8b378a5e4530 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Fault InvalidArgument not matched. {{(pid=67169) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2382.865340] env[67169]: DEBUG oslo_concurrency.lockutils [None req-0dac4879-9808-4647-9123-8b378a5e4530 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Releasing lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2382.865912] env[67169]: ERROR nova.compute.manager [None req-0dac4879-9808-4647-9123-8b378a5e4530 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 68b94a43-eaa5-4023-8bf5-8cc647c2f098] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2382.865912] env[67169]: Faults: ['InvalidArgument'] [ 2382.865912] env[67169]: ERROR nova.compute.manager [instance: 68b94a43-eaa5-4023-8bf5-8cc647c2f098] Traceback (most recent call last): [ 2382.865912] env[67169]: ERROR nova.compute.manager [instance: 68b94a43-eaa5-4023-8bf5-8cc647c2f098] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 2382.865912] env[67169]: ERROR nova.compute.manager 
[instance: 68b94a43-eaa5-4023-8bf5-8cc647c2f098] yield resources [ 2382.865912] env[67169]: ERROR nova.compute.manager [instance: 68b94a43-eaa5-4023-8bf5-8cc647c2f098] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2382.865912] env[67169]: ERROR nova.compute.manager [instance: 68b94a43-eaa5-4023-8bf5-8cc647c2f098] self.driver.spawn(context, instance, image_meta, [ 2382.865912] env[67169]: ERROR nova.compute.manager [instance: 68b94a43-eaa5-4023-8bf5-8cc647c2f098] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2382.865912] env[67169]: ERROR nova.compute.manager [instance: 68b94a43-eaa5-4023-8bf5-8cc647c2f098] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2382.865912] env[67169]: ERROR nova.compute.manager [instance: 68b94a43-eaa5-4023-8bf5-8cc647c2f098] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2382.865912] env[67169]: ERROR nova.compute.manager [instance: 68b94a43-eaa5-4023-8bf5-8cc647c2f098] self._fetch_image_if_missing(context, vi) [ 2382.865912] env[67169]: ERROR nova.compute.manager [instance: 68b94a43-eaa5-4023-8bf5-8cc647c2f098] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2382.865912] env[67169]: ERROR nova.compute.manager [instance: 68b94a43-eaa5-4023-8bf5-8cc647c2f098] image_cache(vi, tmp_image_ds_loc) [ 2382.865912] env[67169]: ERROR nova.compute.manager [instance: 68b94a43-eaa5-4023-8bf5-8cc647c2f098] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2382.865912] env[67169]: ERROR nova.compute.manager [instance: 68b94a43-eaa5-4023-8bf5-8cc647c2f098] vm_util.copy_virtual_disk( [ 2382.865912] env[67169]: ERROR nova.compute.manager [instance: 68b94a43-eaa5-4023-8bf5-8cc647c2f098] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2382.865912] env[67169]: ERROR nova.compute.manager [instance: 
68b94a43-eaa5-4023-8bf5-8cc647c2f098] session._wait_for_task(vmdk_copy_task) [ 2382.865912] env[67169]: ERROR nova.compute.manager [instance: 68b94a43-eaa5-4023-8bf5-8cc647c2f098] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2382.865912] env[67169]: ERROR nova.compute.manager [instance: 68b94a43-eaa5-4023-8bf5-8cc647c2f098] return self.wait_for_task(task_ref) [ 2382.865912] env[67169]: ERROR nova.compute.manager [instance: 68b94a43-eaa5-4023-8bf5-8cc647c2f098] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2382.865912] env[67169]: ERROR nova.compute.manager [instance: 68b94a43-eaa5-4023-8bf5-8cc647c2f098] return evt.wait() [ 2382.865912] env[67169]: ERROR nova.compute.manager [instance: 68b94a43-eaa5-4023-8bf5-8cc647c2f098] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2382.865912] env[67169]: ERROR nova.compute.manager [instance: 68b94a43-eaa5-4023-8bf5-8cc647c2f098] result = hub.switch() [ 2382.865912] env[67169]: ERROR nova.compute.manager [instance: 68b94a43-eaa5-4023-8bf5-8cc647c2f098] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2382.865912] env[67169]: ERROR nova.compute.manager [instance: 68b94a43-eaa5-4023-8bf5-8cc647c2f098] return self.greenlet.switch() [ 2382.865912] env[67169]: ERROR nova.compute.manager [instance: 68b94a43-eaa5-4023-8bf5-8cc647c2f098] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2382.865912] env[67169]: ERROR nova.compute.manager [instance: 68b94a43-eaa5-4023-8bf5-8cc647c2f098] self.f(*self.args, **self.kw) [ 2382.865912] env[67169]: ERROR nova.compute.manager [instance: 68b94a43-eaa5-4023-8bf5-8cc647c2f098] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2382.865912] env[67169]: ERROR nova.compute.manager 
[instance: 68b94a43-eaa5-4023-8bf5-8cc647c2f098] raise exceptions.translate_fault(task_info.error) [ 2382.865912] env[67169]: ERROR nova.compute.manager [instance: 68b94a43-eaa5-4023-8bf5-8cc647c2f098] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2382.865912] env[67169]: ERROR nova.compute.manager [instance: 68b94a43-eaa5-4023-8bf5-8cc647c2f098] Faults: ['InvalidArgument'] [ 2382.865912] env[67169]: ERROR nova.compute.manager [instance: 68b94a43-eaa5-4023-8bf5-8cc647c2f098] [ 2382.866869] env[67169]: INFO nova.compute.manager [None req-0dac4879-9808-4647-9123-8b378a5e4530 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 68b94a43-eaa5-4023-8bf5-8cc647c2f098] Terminating instance [ 2382.867783] env[67169]: DEBUG oslo_concurrency.lockutils [None req-a6acb6ba-6d57-4668-899b-80380a9809d2 tempest-ServerActionsTestOtherB-1774783308 tempest-ServerActionsTestOtherB-1774783308-project-member] Acquired lock "[datastore2] devstack-image-cache_base/285931c9-8b83-4997-8c4d-6a79005e36ba/285931c9-8b83-4997-8c4d-6a79005e36ba.vmdk" {{(pid=67169) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2382.868009] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-a6acb6ba-6d57-4668-899b-80380a9809d2 tempest-ServerActionsTestOtherB-1774783308 tempest-ServerActionsTestOtherB-1774783308-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2382.868251] env[67169]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-98f5cf9d-c486-43db-bd79-683345a6a7e5 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2382.870523] env[67169]: DEBUG nova.compute.manager [None req-0dac4879-9808-4647-9123-8b378a5e4530 tempest-ImagesTestJSON-1590952821 
tempest-ImagesTestJSON-1590952821-project-member] [instance: 68b94a43-eaa5-4023-8bf5-8cc647c2f098] Start destroying the instance on the hypervisor. {{(pid=67169) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2382.870734] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-0dac4879-9808-4647-9123-8b378a5e4530 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 68b94a43-eaa5-4023-8bf5-8cc647c2f098] Destroying instance {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2382.871435] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-283f1896-6174-48d0-9f19-9f0063fced47 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2382.878163] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-0dac4879-9808-4647-9123-8b378a5e4530 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 68b94a43-eaa5-4023-8bf5-8cc647c2f098] Unregistering the VM {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2382.878358] env[67169]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-a8e77b0d-c902-4d0e-994c-a0abc335132a {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2382.880871] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-a6acb6ba-6d57-4668-899b-80380a9809d2 tempest-ServerActionsTestOtherB-1774783308 tempest-ServerActionsTestOtherB-1774783308-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2382.881070] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-a6acb6ba-6d57-4668-899b-80380a9809d2 tempest-ServerActionsTestOtherB-1774783308 tempest-ServerActionsTestOtherB-1774783308-project-member] 
Folder [datastore2] devstack-image-cache_base created. {{(pid=67169) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2382.882026] env[67169]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-444c3d10-a26b-4342-96c0-18194b43e99c {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2382.886902] env[67169]: DEBUG oslo_vmware.api [None req-a6acb6ba-6d57-4668-899b-80380a9809d2 tempest-ServerActionsTestOtherB-1774783308 tempest-ServerActionsTestOtherB-1774783308-project-member] Waiting for the task: (returnval){ [ 2382.886902] env[67169]: value = "session[52518c09-e5f2-035d-4dc1-3e29ddf94015]5212c3da-b4d7-192c-60ab-5c5f9bdc2379" [ 2382.886902] env[67169]: _type = "Task" [ 2382.886902] env[67169]: } to complete. {{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2382.893572] env[67169]: DEBUG oslo_vmware.api [None req-a6acb6ba-6d57-4668-899b-80380a9809d2 tempest-ServerActionsTestOtherB-1774783308 tempest-ServerActionsTestOtherB-1774783308-project-member] Task: {'id': session[52518c09-e5f2-035d-4dc1-3e29ddf94015]5212c3da-b4d7-192c-60ab-5c5f9bdc2379, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2383.397386] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-a6acb6ba-6d57-4668-899b-80380a9809d2 tempest-ServerActionsTestOtherB-1774783308 tempest-ServerActionsTestOtherB-1774783308-project-member] [instance: 1ece950d-8b7f-4462-8138-10cbf43149ee] Preparing fetch location {{(pid=67169) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2383.397682] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-a6acb6ba-6d57-4668-899b-80380a9809d2 tempest-ServerActionsTestOtherB-1774783308 tempest-ServerActionsTestOtherB-1774783308-project-member] Creating directory with path [datastore2] vmware_temp/34a9d0df-dcc8-4906-98e0-d4f32d8f018a/285931c9-8b83-4997-8c4d-6a79005e36ba {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2383.397928] env[67169]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-260d559b-f3cc-4d57-a6fe-9b6919d19732 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2383.420726] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-a6acb6ba-6d57-4668-899b-80380a9809d2 tempest-ServerActionsTestOtherB-1774783308 tempest-ServerActionsTestOtherB-1774783308-project-member] Created directory with path [datastore2] vmware_temp/34a9d0df-dcc8-4906-98e0-d4f32d8f018a/285931c9-8b83-4997-8c4d-6a79005e36ba {{(pid=67169) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2383.420930] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-a6acb6ba-6d57-4668-899b-80380a9809d2 tempest-ServerActionsTestOtherB-1774783308 tempest-ServerActionsTestOtherB-1774783308-project-member] [instance: 1ece950d-8b7f-4462-8138-10cbf43149ee] Fetch image to [datastore2] vmware_temp/34a9d0df-dcc8-4906-98e0-d4f32d8f018a/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk {{(pid=67169) _fetch_image_if_missing 
/opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2383.421162] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-a6acb6ba-6d57-4668-899b-80380a9809d2 tempest-ServerActionsTestOtherB-1774783308 tempest-ServerActionsTestOtherB-1774783308-project-member] [instance: 1ece950d-8b7f-4462-8138-10cbf43149ee] Downloading image file data 285931c9-8b83-4997-8c4d-6a79005e36ba to [datastore2] vmware_temp/34a9d0df-dcc8-4906-98e0-d4f32d8f018a/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk on the data store datastore2 {{(pid=67169) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2383.421885] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b392eb28-080d-4923-a574-0c353facd249 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2383.429745] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8c74f6a0-e5b0-41f4-bd18-43585414a835 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2383.443547] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a9981793-da87-4594-adc7-59139daa91a4 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2383.483252] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c4464685-28bb-48a1-8b55-68e2a8b0046f {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2383.489964] env[67169]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-c74ee791-e656-4144-950c-65c95522a094 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2383.496668] 
env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-0dac4879-9808-4647-9123-8b378a5e4530 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 68b94a43-eaa5-4023-8bf5-8cc647c2f098] Unregistered the VM {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2383.496852] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-0dac4879-9808-4647-9123-8b378a5e4530 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 68b94a43-eaa5-4023-8bf5-8cc647c2f098] Deleting contents of the VM from datastore datastore2 {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2383.497129] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-0dac4879-9808-4647-9123-8b378a5e4530 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Deleting the datastore file [datastore2] 68b94a43-eaa5-4023-8bf5-8cc647c2f098 {{(pid=67169) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2383.497236] env[67169]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-d50551f1-7159-4c77-9ea7-347456c6dd39 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2383.503173] env[67169]: DEBUG oslo_vmware.api [None req-0dac4879-9808-4647-9123-8b378a5e4530 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Waiting for the task: (returnval){ [ 2383.503173] env[67169]: value = "task-2819288" [ 2383.503173] env[67169]: _type = "Task" [ 2383.503173] env[67169]: } to complete. 
{{(pid=67169) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2383.512842] env[67169]: DEBUG oslo_vmware.api [None req-0dac4879-9808-4647-9123-8b378a5e4530 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Task: {'id': task-2819288, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2383.514066] env[67169]: DEBUG nova.virt.vmwareapi.images [None req-a6acb6ba-6d57-4668-899b-80380a9809d2 tempest-ServerActionsTestOtherB-1774783308 tempest-ServerActionsTestOtherB-1774783308-project-member] [instance: 1ece950d-8b7f-4462-8138-10cbf43149ee] Downloading image file data 285931c9-8b83-4997-8c4d-6a79005e36ba to the data store datastore2 {{(pid=67169) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2383.564442] env[67169]: DEBUG oslo_vmware.rw_handles [None req-a6acb6ba-6d57-4668-899b-80380a9809d2 tempest-ServerActionsTestOtherB-1774783308 tempest-ServerActionsTestOtherB-1774783308-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/34a9d0df-dcc8-4906-98e0-d4f32d8f018a/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=67169) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2383.623056] env[67169]: DEBUG oslo_vmware.rw_handles [None req-a6acb6ba-6d57-4668-899b-80380a9809d2 tempest-ServerActionsTestOtherB-1774783308 tempest-ServerActionsTestOtherB-1774783308-project-member] Completed reading data from the image iterator. 
{{(pid=67169) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2383.623285] env[67169]: DEBUG oslo_vmware.rw_handles [None req-a6acb6ba-6d57-4668-899b-80380a9809d2 tempest-ServerActionsTestOtherB-1774783308 tempest-ServerActionsTestOtherB-1774783308-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/34a9d0df-dcc8-4906-98e0-d4f32d8f018a/285931c9-8b83-4997-8c4d-6a79005e36ba/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=67169) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2384.013548] env[67169]: DEBUG oslo_vmware.api [None req-0dac4879-9808-4647-9123-8b378a5e4530 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Task: {'id': task-2819288, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.067268} completed successfully. {{(pid=67169) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2384.013801] env[67169]: DEBUG nova.virt.vmwareapi.ds_util [None req-0dac4879-9808-4647-9123-8b378a5e4530 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Deleted the datastore file {{(pid=67169) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2384.013987] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-0dac4879-9808-4647-9123-8b378a5e4530 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 68b94a43-eaa5-4023-8bf5-8cc647c2f098] Deleted contents of the VM from datastore datastore2 {{(pid=67169) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2384.014232] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-0dac4879-9808-4647-9123-8b378a5e4530 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 68b94a43-eaa5-4023-8bf5-8cc647c2f098] Instance destroyed {{(pid=67169) destroy 
/opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2384.014474] env[67169]: INFO nova.compute.manager [None req-0dac4879-9808-4647-9123-8b378a5e4530 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 68b94a43-eaa5-4023-8bf5-8cc647c2f098] Took 1.14 seconds to destroy the instance on the hypervisor. [ 2384.016520] env[67169]: DEBUG nova.compute.claims [None req-0dac4879-9808-4647-9123-8b378a5e4530 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 68b94a43-eaa5-4023-8bf5-8cc647c2f098] Aborting claim: {{(pid=67169) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 2384.016685] env[67169]: DEBUG oslo_concurrency.lockutils [None req-0dac4879-9808-4647-9123-8b378a5e4530 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2384.016892] env[67169]: DEBUG oslo_concurrency.lockutils [None req-0dac4879-9808-4647-9123-8b378a5e4530 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2384.128661] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1b7c44d7-980e-42e0-82de-8306d6c67307 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2384.136091] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4cb3971c-76b7-40ba-a357-0a88bf4b2b9e {{(pid=67169) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2384.165186] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-47824ee3-abef-4b48-b5db-1f86ba8a2df9 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2384.172235] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a512d319-3ae2-48d8-a64a-3bd75339ba2b {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2384.186350] env[67169]: DEBUG nova.compute.provider_tree [None req-0dac4879-9808-4647-9123-8b378a5e4530 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Inventory has not changed in ProviderTree for provider: 6570906a-ac37-4859-b1f2-4bbacc48d3f3 {{(pid=67169) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2384.194450] env[67169]: DEBUG nova.scheduler.client.report [None req-0dac4879-9808-4647-9123-8b378a5e4530 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Inventory has not changed for provider 6570906a-ac37-4859-b1f2-4bbacc48d3f3 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 171, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67169) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2384.207363] env[67169]: DEBUG oslo_concurrency.lockutils [None req-0dac4879-9808-4647-9123-8b378a5e4530 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Lock "compute_resources" "released" by 
"nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.190s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2384.207883] env[67169]: ERROR nova.compute.manager [None req-0dac4879-9808-4647-9123-8b378a5e4530 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 68b94a43-eaa5-4023-8bf5-8cc647c2f098] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2384.207883] env[67169]: Faults: ['InvalidArgument'] [ 2384.207883] env[67169]: ERROR nova.compute.manager [instance: 68b94a43-eaa5-4023-8bf5-8cc647c2f098] Traceback (most recent call last): [ 2384.207883] env[67169]: ERROR nova.compute.manager [instance: 68b94a43-eaa5-4023-8bf5-8cc647c2f098] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2384.207883] env[67169]: ERROR nova.compute.manager [instance: 68b94a43-eaa5-4023-8bf5-8cc647c2f098] self.driver.spawn(context, instance, image_meta, [ 2384.207883] env[67169]: ERROR nova.compute.manager [instance: 68b94a43-eaa5-4023-8bf5-8cc647c2f098] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2384.207883] env[67169]: ERROR nova.compute.manager [instance: 68b94a43-eaa5-4023-8bf5-8cc647c2f098] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2384.207883] env[67169]: ERROR nova.compute.manager [instance: 68b94a43-eaa5-4023-8bf5-8cc647c2f098] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2384.207883] env[67169]: ERROR nova.compute.manager [instance: 68b94a43-eaa5-4023-8bf5-8cc647c2f098] self._fetch_image_if_missing(context, vi) [ 2384.207883] env[67169]: ERROR nova.compute.manager [instance: 68b94a43-eaa5-4023-8bf5-8cc647c2f098] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2384.207883] env[67169]: ERROR nova.compute.manager 
[instance: 68b94a43-eaa5-4023-8bf5-8cc647c2f098] image_cache(vi, tmp_image_ds_loc) [ 2384.207883] env[67169]: ERROR nova.compute.manager [instance: 68b94a43-eaa5-4023-8bf5-8cc647c2f098] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2384.207883] env[67169]: ERROR nova.compute.manager [instance: 68b94a43-eaa5-4023-8bf5-8cc647c2f098] vm_util.copy_virtual_disk( [ 2384.207883] env[67169]: ERROR nova.compute.manager [instance: 68b94a43-eaa5-4023-8bf5-8cc647c2f098] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2384.207883] env[67169]: ERROR nova.compute.manager [instance: 68b94a43-eaa5-4023-8bf5-8cc647c2f098] session._wait_for_task(vmdk_copy_task) [ 2384.207883] env[67169]: ERROR nova.compute.manager [instance: 68b94a43-eaa5-4023-8bf5-8cc647c2f098] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2384.207883] env[67169]: ERROR nova.compute.manager [instance: 68b94a43-eaa5-4023-8bf5-8cc647c2f098] return self.wait_for_task(task_ref) [ 2384.207883] env[67169]: ERROR nova.compute.manager [instance: 68b94a43-eaa5-4023-8bf5-8cc647c2f098] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2384.207883] env[67169]: ERROR nova.compute.manager [instance: 68b94a43-eaa5-4023-8bf5-8cc647c2f098] return evt.wait() [ 2384.207883] env[67169]: ERROR nova.compute.manager [instance: 68b94a43-eaa5-4023-8bf5-8cc647c2f098] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2384.207883] env[67169]: ERROR nova.compute.manager [instance: 68b94a43-eaa5-4023-8bf5-8cc647c2f098] result = hub.switch() [ 2384.207883] env[67169]: ERROR nova.compute.manager [instance: 68b94a43-eaa5-4023-8bf5-8cc647c2f098] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2384.207883] env[67169]: ERROR nova.compute.manager [instance: 
68b94a43-eaa5-4023-8bf5-8cc647c2f098] return self.greenlet.switch() [ 2384.207883] env[67169]: ERROR nova.compute.manager [instance: 68b94a43-eaa5-4023-8bf5-8cc647c2f098] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2384.207883] env[67169]: ERROR nova.compute.manager [instance: 68b94a43-eaa5-4023-8bf5-8cc647c2f098] self.f(*self.args, **self.kw) [ 2384.207883] env[67169]: ERROR nova.compute.manager [instance: 68b94a43-eaa5-4023-8bf5-8cc647c2f098] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2384.207883] env[67169]: ERROR nova.compute.manager [instance: 68b94a43-eaa5-4023-8bf5-8cc647c2f098] raise exceptions.translate_fault(task_info.error) [ 2384.207883] env[67169]: ERROR nova.compute.manager [instance: 68b94a43-eaa5-4023-8bf5-8cc647c2f098] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2384.207883] env[67169]: ERROR nova.compute.manager [instance: 68b94a43-eaa5-4023-8bf5-8cc647c2f098] Faults: ['InvalidArgument'] [ 2384.207883] env[67169]: ERROR nova.compute.manager [instance: 68b94a43-eaa5-4023-8bf5-8cc647c2f098] [ 2384.208921] env[67169]: DEBUG nova.compute.utils [None req-0dac4879-9808-4647-9123-8b378a5e4530 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 68b94a43-eaa5-4023-8bf5-8cc647c2f098] VimFaultException {{(pid=67169) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2384.209903] env[67169]: DEBUG nova.compute.manager [None req-0dac4879-9808-4647-9123-8b378a5e4530 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 68b94a43-eaa5-4023-8bf5-8cc647c2f098] Build of instance 68b94a43-eaa5-4023-8bf5-8cc647c2f098 was re-scheduled: A specified parameter was not correct: fileType [ 2384.209903] env[67169]: Faults: ['InvalidArgument'] {{(pid=67169) _do_build_and_run_instance 
/opt/stack/nova/nova/compute/manager.py:2454}} [ 2384.210292] env[67169]: DEBUG nova.compute.manager [None req-0dac4879-9808-4647-9123-8b378a5e4530 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 68b94a43-eaa5-4023-8bf5-8cc647c2f098] Unplugging VIFs for instance {{(pid=67169) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 2384.210462] env[67169]: DEBUG nova.compute.manager [None req-0dac4879-9808-4647-9123-8b378a5e4530 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=67169) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 2384.210629] env[67169]: DEBUG nova.compute.manager [None req-0dac4879-9808-4647-9123-8b378a5e4530 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 68b94a43-eaa5-4023-8bf5-8cc647c2f098] Deallocating network for instance {{(pid=67169) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2384.210789] env[67169]: DEBUG nova.network.neutron [None req-0dac4879-9808-4647-9123-8b378a5e4530 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 68b94a43-eaa5-4023-8bf5-8cc647c2f098] deallocate_for_instance() {{(pid=67169) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2384.547995] env[67169]: DEBUG nova.network.neutron [None req-0dac4879-9808-4647-9123-8b378a5e4530 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 68b94a43-eaa5-4023-8bf5-8cc647c2f098] Updating instance_info_cache with network_info: [] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2384.560843] env[67169]: INFO nova.compute.manager [None req-0dac4879-9808-4647-9123-8b378a5e4530 tempest-ImagesTestJSON-1590952821 
tempest-ImagesTestJSON-1590952821-project-member] [instance: 68b94a43-eaa5-4023-8bf5-8cc647c2f098] Took 0.35 seconds to deallocate network for instance. [ 2384.676097] env[67169]: INFO nova.scheduler.client.report [None req-0dac4879-9808-4647-9123-8b378a5e4530 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Deleted allocations for instance 68b94a43-eaa5-4023-8bf5-8cc647c2f098 [ 2384.694849] env[67169]: DEBUG oslo_concurrency.lockutils [None req-0dac4879-9808-4647-9123-8b378a5e4530 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Lock "68b94a43-eaa5-4023-8bf5-8cc647c2f098" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 572.312s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2384.695127] env[67169]: DEBUG oslo_concurrency.lockutils [None req-a8498e40-29cf-4df3-adff-05455c8fbe22 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Lock "68b94a43-eaa5-4023-8bf5-8cc647c2f098" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 375.309s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2384.695354] env[67169]: DEBUG oslo_concurrency.lockutils [None req-a8498e40-29cf-4df3-adff-05455c8fbe22 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Acquiring lock "68b94a43-eaa5-4023-8bf5-8cc647c2f098-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2384.695559] env[67169]: DEBUG oslo_concurrency.lockutils [None req-a8498e40-29cf-4df3-adff-05455c8fbe22 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Lock 
"68b94a43-eaa5-4023-8bf5-8cc647c2f098-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2384.695726] env[67169]: DEBUG oslo_concurrency.lockutils [None req-a8498e40-29cf-4df3-adff-05455c8fbe22 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Lock "68b94a43-eaa5-4023-8bf5-8cc647c2f098-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2384.698028] env[67169]: INFO nova.compute.manager [None req-a8498e40-29cf-4df3-adff-05455c8fbe22 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 68b94a43-eaa5-4023-8bf5-8cc647c2f098] Terminating instance [ 2384.699281] env[67169]: DEBUG nova.compute.manager [None req-a8498e40-29cf-4df3-adff-05455c8fbe22 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 68b94a43-eaa5-4023-8bf5-8cc647c2f098] Start destroying the instance on the hypervisor. 
{{(pid=67169) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2384.699472] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-a8498e40-29cf-4df3-adff-05455c8fbe22 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 68b94a43-eaa5-4023-8bf5-8cc647c2f098] Destroying instance {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2384.699935] env[67169]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-c5dc8553-0533-4a27-980a-674f596384d0 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2384.709197] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-23e0654f-1388-4dd1-ab71-4c54fc54ac2f {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2384.734526] env[67169]: WARNING nova.virt.vmwareapi.vmops [None req-a8498e40-29cf-4df3-adff-05455c8fbe22 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 68b94a43-eaa5-4023-8bf5-8cc647c2f098] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 68b94a43-eaa5-4023-8bf5-8cc647c2f098 could not be found. [ 2384.734736] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-a8498e40-29cf-4df3-adff-05455c8fbe22 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 68b94a43-eaa5-4023-8bf5-8cc647c2f098] Instance destroyed {{(pid=67169) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2384.734915] env[67169]: INFO nova.compute.manager [None req-a8498e40-29cf-4df3-adff-05455c8fbe22 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] [instance: 68b94a43-eaa5-4023-8bf5-8cc647c2f098] Took 0.04 seconds to destroy the instance on the hypervisor. 
[ 2384.735175] env[67169]: DEBUG oslo.service.loopingcall [None req-a8498e40-29cf-4df3-adff-05455c8fbe22 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=67169) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2384.735393] env[67169]: DEBUG nova.compute.manager [-] [instance: 68b94a43-eaa5-4023-8bf5-8cc647c2f098] Deallocating network for instance {{(pid=67169) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2384.735484] env[67169]: DEBUG nova.network.neutron [-] [instance: 68b94a43-eaa5-4023-8bf5-8cc647c2f098] deallocate_for_instance() {{(pid=67169) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2384.763285] env[67169]: DEBUG nova.network.neutron [-] [instance: 68b94a43-eaa5-4023-8bf5-8cc647c2f098] Updating instance_info_cache with network_info: [] {{(pid=67169) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2384.771027] env[67169]: INFO nova.compute.manager [-] [instance: 68b94a43-eaa5-4023-8bf5-8cc647c2f098] Took 0.04 seconds to deallocate network for instance. 
[ 2384.847903] env[67169]: DEBUG oslo_concurrency.lockutils [None req-a8498e40-29cf-4df3-adff-05455c8fbe22 tempest-ImagesTestJSON-1590952821 tempest-ImagesTestJSON-1590952821-project-member] Lock "68b94a43-eaa5-4023-8bf5-8cc647c2f098" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.153s {{(pid=67169) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2386.509064] env[67169]: DEBUG oslo_service.periodic_task [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Running periodic task ComputeManager._cleanup_running_deleted_instances {{(pid=67169) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2386.509064] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Getting list of instances from cluster (obj){ [ 2386.509064] env[67169]: value = "domain-c8" [ 2386.509064] env[67169]: _type = "ClusterComputeResource" [ 2386.509064] env[67169]: } {{(pid=67169) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 2386.509962] env[67169]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4efebdb4-1781-4800-89d3-3ff7846ce935 {{(pid=67169) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2386.522541] env[67169]: DEBUG nova.virt.vmwareapi.vmops [None req-021b568e-0fd7-4ea2-9d37-7744c8060ef2 None None] Got total of 4 instances {{(pid=67169) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}}