[ 549.802754] env[67849]: Modules with known eventlet monkey patching issues were imported prior to eventlet monkey patching: urllib3. This warning can usually be ignored if the caller is only importing and not executing nova code.
[ 550.436872] env[67899]: Modules with known eventlet monkey patching issues were imported prior to eventlet monkey patching: urllib3. This warning can usually be ignored if the caller is only importing and not executing nova code.
[ 551.774733] env[67899]: DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' {{(pid=67899) initialize /opt/stack/data/venv/lib/python3.10/site-packages/os_vif/__init__.py:44}}
[ 551.775115] env[67899]: DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' {{(pid=67899) initialize /opt/stack/data/venv/lib/python3.10/site-packages/os_vif/__init__.py:44}}
[ 551.775260] env[67899]: DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' {{(pid=67899) initialize /opt/stack/data/venv/lib/python3.10/site-packages/os_vif/__init__.py:44}}
[ 551.775521] env[67899]: INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
[ 551.975934] env[67899]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm {{(pid=67899) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}}
[ 551.986334] env[67899]: DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 0 in 0.010s {{(pid=67899) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}}
[ 552.096738] env[67899]: INFO nova.virt.driver [None req-febbcd7b-9d36-4c5b-9db3-0c3cab291f91 None None] Loading compute driver 'vmwareapi.VMwareVCDriver'
[ 552.169391] env[67899]: DEBUG oslo_concurrency.lockutils [-] Acquiring lock "oslo_vmware_api_lock" by "oslo_vmware.api.VMwareAPISession._create_session" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 552.169578] env[67899]: DEBUG oslo_concurrency.lockutils [-] Lock "oslo_vmware_api_lock" acquired by "oslo_vmware.api.VMwareAPISession._create_session" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 552.169682] env[67899]: DEBUG oslo_vmware.service [-] Creating suds client with soap_url='https://vc1.osci.c.eu-de-1.cloud.sap:443/sdk' and wsdl_url='https://vc1.osci.c.eu-de-1.cloud.sap:443/sdk/vimService.wsdl' {{(pid=67899) __init__ /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:242}}
[ 555.072593] env[67899]: DEBUG oslo_vmware.service [-] Invoking ServiceInstance.RetrieveServiceContent with opID=oslo.vmware-a35f6df7-6dca-482c-a1d0-263e3bd8efca {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 555.089063] env[67899]: DEBUG oslo_vmware.api [-] Logging into host: vc1.osci.c.eu-de-1.cloud.sap. {{(pid=67899) _create_session /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:242}}
[ 555.089290] env[67899]: DEBUG oslo_vmware.service [-] Invoking SessionManager.Login with opID=oslo.vmware-0bcc2494-2a01-4ddf-836a-5440c22c4001 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 555.113639] env[67899]: INFO oslo_vmware.api [-] Successfully established new session; session ID is c9c40.
[ 555.113831] env[67899]: DEBUG oslo_concurrency.lockutils [-] Lock "oslo_vmware_api_lock" "released" by "oslo_vmware.api.VMwareAPISession._create_session" :: held 2.944s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 555.114382] env[67899]: INFO nova.virt.vmwareapi.driver [None req-febbcd7b-9d36-4c5b-9db3-0c3cab291f91 None None] VMware vCenter version: 7.0.3
[ 555.117872] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1ec8f580-877f-43d5-82d0-087add0b45b7 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 555.135341] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c38807ec-ef6d-44f5-b153-70a9a3088254 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 555.141240] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e4f30a09-01ab-4b18-b40e-9f4012bf4fe0 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 555.147660] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-56c1ab53-9f23-4ff0-aabd-6a5df5387598 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 555.161110] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2886a787-c8cc-4a3c-aef6-f49ced5f15d4 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 555.166855] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-695b9e48-b24e-45cd-95a2-f6dfb3fcf7cc {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 555.197394] env[67899]: DEBUG oslo_vmware.service [-] Invoking ExtensionManager.FindExtension with opID=oslo.vmware-bde2f3b7-1291-45c4-927a-486b7b104a71 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 555.202223] env[67899]: DEBUG nova.virt.vmwareapi.driver [None req-febbcd7b-9d36-4c5b-9db3-0c3cab291f91 None None] Extension org.openstack.compute already exists. {{(pid=67899) _register_openstack_extension /opt/stack/nova/nova/virt/vmwareapi/driver.py:224}}
[ 555.204830] env[67899]: INFO nova.compute.provider_config [None req-febbcd7b-9d36-4c5b-9db3-0c3cab291f91 None None] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
[ 555.224062] env[67899]: DEBUG nova.context [None req-febbcd7b-9d36-4c5b-9db3-0c3cab291f91 None None] Found 2 cells: 00000000-0000-0000-0000-000000000000(cell0),e4a978d9-29ab-4e43-8f77-18ddaab56b18(cell1) {{(pid=67899) load_cells /opt/stack/nova/nova/context.py:464}}
[ 555.226108] env[67899]: DEBUG oslo_concurrency.lockutils [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 555.226338] env[67899]: DEBUG oslo_concurrency.lockutils [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 555.227030] env[67899]: DEBUG oslo_concurrency.lockutils [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 555.227442] env[67899]: DEBUG oslo_concurrency.lockutils [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] Acquiring lock "e4a978d9-29ab-4e43-8f77-18ddaab56b18" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 555.227636] env[67899]: DEBUG oslo_concurrency.lockutils [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] Lock "e4a978d9-29ab-4e43-8f77-18ddaab56b18" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 555.228778] env[67899]: DEBUG oslo_concurrency.lockutils [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] Lock "e4a978d9-29ab-4e43-8f77-18ddaab56b18" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 555.254987] env[67899]: INFO dbcounter [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] Registered counter for database nova_cell0
[ 555.263803] env[67899]: INFO dbcounter [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] Registered counter for database nova_cell1
[ 555.266738] env[67899]: DEBUG oslo_db.sqlalchemy.engines [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=67899) _check_effective_sql_mode /opt/stack/data/venv/lib/python3.10/site-packages/oslo_db/sqlalchemy/engines.py:342}}
[ 555.267102] env[67899]: DEBUG oslo_db.sqlalchemy.engines [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=67899) _check_effective_sql_mode /opt/stack/data/venv/lib/python3.10/site-packages/oslo_db/sqlalchemy/engines.py:342}}
[ 555.271596] env[67899]: DEBUG dbcounter [-] [67899] Writer thread running {{(pid=67899) stat_writer /opt/stack/data/venv/lib/python3.10/site-packages/dbcounter.py:102}}
[ 555.272475] env[67899]: DEBUG dbcounter [-] [67899] Writer thread running {{(pid=67899) stat_writer /opt/stack/data/venv/lib/python3.10/site-packages/dbcounter.py:102}}
[ 555.274947] env[67899]: ERROR nova.db.main.api [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] No DB access allowed in nova-compute:   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main
[ 555.274947] env[67899]:     result = function(*args, **kwargs)
[ 555.274947] env[67899]:   File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper
[ 555.274947] env[67899]:     return func(*args, **kwargs)
[ 555.274947] env[67899]:   File "/opt/stack/nova/nova/context.py", line 422, in gather_result
[ 555.274947] env[67899]:     result = fn(*args, **kwargs)
[ 555.274947] env[67899]:   File "/opt/stack/nova/nova/db/main/api.py", line 179, in wrapper
[ 555.274947] env[67899]:     return f(*args, **kwargs)
[ 555.274947] env[67899]:   File "/opt/stack/nova/nova/objects/service.py", line 548, in _db_service_get_minimum_version
[ 555.274947] env[67899]:     return db.service_get_minimum_version(context, binaries)
[ 555.274947] env[67899]:   File "/opt/stack/nova/nova/db/main/api.py", line 238, in wrapper
[ 555.274947] env[67899]:     _check_db_access()
[ 555.274947] env[67899]:   File "/opt/stack/nova/nova/db/main/api.py", line 188, in _check_db_access
[ 555.274947] env[67899]:     stacktrace = ''.join(traceback.format_stack())
[ 555.274947] env[67899]:
[ 555.275961] env[67899]: ERROR nova.db.main.api [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] No DB access allowed in nova-compute:   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main
[ 555.275961] env[67899]:     result = function(*args, **kwargs)
[ 555.275961] env[67899]:   File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper
[ 555.275961] env[67899]:     return func(*args, **kwargs)
[ 555.275961] env[67899]:   File "/opt/stack/nova/nova/context.py", line 422, in gather_result
[ 555.275961] env[67899]:     result = fn(*args, **kwargs)
[ 555.275961] env[67899]:   File "/opt/stack/nova/nova/db/main/api.py", line 179, in wrapper
[ 555.275961] env[67899]:     return f(*args, **kwargs)
[ 555.275961] env[67899]:   File "/opt/stack/nova/nova/objects/service.py", line 548, in _db_service_get_minimum_version
[ 555.275961] env[67899]:     return db.service_get_minimum_version(context, binaries)
[ 555.275961] env[67899]:   File "/opt/stack/nova/nova/db/main/api.py", line 238, in wrapper
[ 555.275961] env[67899]:     _check_db_access()
[ 555.275961] env[67899]:   File "/opt/stack/nova/nova/db/main/api.py", line 188, in _check_db_access
[ 555.275961] env[67899]:     stacktrace = ''.join(traceback.format_stack())
[ 555.275961] env[67899]:
[ 555.276393] env[67899]: WARNING nova.objects.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] Failed to get minimum service version for cell e4a978d9-29ab-4e43-8f77-18ddaab56b18
[ 555.276521] env[67899]: WARNING nova.objects.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] Failed to get minimum service version for cell 00000000-0000-0000-0000-000000000000
[ 555.276938] env[67899]: DEBUG oslo_concurrency.lockutils [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] Acquiring lock "singleton_lock" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 555.277116] env[67899]: DEBUG oslo_concurrency.lockutils [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] Acquired lock "singleton_lock" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 555.277369] env[67899]: DEBUG oslo_concurrency.lockutils [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] Releasing lock "singleton_lock" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 555.277718] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] Full set of CONF: {{(pid=67899) _wait_for_exit_or_signal /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/service.py:362}}
[ 555.277867] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] ******************************************************************************** {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2589}}
[ 555.277998] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] Configuration options gathered from: {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2590}}
[ 555.278148] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] command line args: ['--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-cpu-common.conf', '--config-file', '/etc/nova/nova-cpu-1.conf'] {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2591}}
[ 555.278345] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-cpu-common.conf', '/etc/nova/nova-cpu-1.conf'] {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2592}}
[ 555.278476] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] ================================================================================ {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2594}}
[ 555.278732] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] allow_resize_to_same_host = True {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.278920] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] arq_binding_timeout = 300 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.279108] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] backdoor_port = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.279290] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] backdoor_socket = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.279514] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] block_device_allocate_retries = 60 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.279695] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] block_device_allocate_retries_interval = 3 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.279872] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] cert = self.pem {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.280087] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] compute_driver = vmwareapi.VMwareVCDriver {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.280296] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] compute_monitors = [] {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.280504] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] config_dir = [] {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.280685] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] config_drive_format = iso9660 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.280823] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] config_file = ['/etc/nova/nova.conf', '/etc/nova/nova-cpu-common.conf', '/etc/nova/nova-cpu-1.conf'] {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.280988] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] config_source = [] {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.281169] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] console_host = devstack {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.281337] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] control_exchange = nova {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.281499] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] cpu_allocation_ratio = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.281679] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] daemon = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.281864] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] debug = True {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.282033] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] default_access_ip_network_name = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.282208] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] default_availability_zone = nova {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.282394] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] default_ephemeral_format = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.282564] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] default_green_pool_size = 1000 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.282816] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] default_log_levels = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.282985] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] default_schedule_zone = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.283165] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] disk_allocation_ratio = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.283330] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] enable_new_services = True {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.283510] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] enabled_apis = ['osapi_compute'] {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.283677] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] enabled_ssl_apis = [] {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.283838] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] flat_injected = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.283998] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] force_config_drive = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.284172] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] force_raw_images = True {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.284342] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] graceful_shutdown_timeout = 5 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.284503] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] heal_instance_info_cache_interval = 60 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.284740] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] host = cpu-1 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.284926] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] initial_cpu_allocation_ratio = 4.0 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.285108] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] initial_disk_allocation_ratio = 1.0 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.285279] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] initial_ram_allocation_ratio = 1.0 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.285528] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] injected_network_template = /opt/stack/nova/nova/virt/interfaces.template {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.285700] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] instance_build_timeout = 0 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.285863] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] instance_delete_interval = 300 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.286041] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] instance_format = [instance: %(uuid)s] {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.286215] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] instance_name_template = instance-%08x {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.286378] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] instance_usage_audit = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.286548] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] instance_usage_audit_period = month {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.286714] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] instance_uuid_format = [instance: %(uuid)s] {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.286881] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] instances_path = /opt/stack/data/nova/instances {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.287061] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] internal_service_availability_zone = internal {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.287224] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] key = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.287386] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] live_migration_retry_count = 30 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.287549] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] log_config_append = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.287732] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] log_date_format = %Y-%m-%d %H:%M:%S {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.287911] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] log_dir = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.288091] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] log_file = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.288225] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] log_options = True {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.288410] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] log_rotate_interval = 1 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.288591] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] log_rotate_interval_type = days {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.288762] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] log_rotation_type = none {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.288894] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] logging_context_format_string = %(color)s%(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(project_name)s %(user_name)s%(color)s] %(instance)s%(color)s%(message)s {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.289031] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] logging_debug_format_suffix = {{(pid=%(process)d) %(funcName)s %(pathname)s:%(lineno)d}} {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.289234] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] logging_default_format_string = %(color)s%(levelname)s %(name)s [-%(color)s] %(instance)s%(color)s%(message)s {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.289412] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] logging_exception_prefix = ERROR %(name)s %(instance)s {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.289542] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] logging_user_identity_format = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.289706] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] long_rpc_timeout = 1800 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.289867] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] max_concurrent_builds = 10 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.290034] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] max_concurrent_live_migrations = 1 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.290220] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] max_concurrent_snapshots = 5 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.290386] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] max_local_block_devices = 3 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.290547] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] max_logfile_count = 30 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.290704] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] max_logfile_size_mb = 200 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.290863] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] maximum_instance_delete_attempts = 5 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.291041] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] metadata_listen = 0.0.0.0 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.291214] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] metadata_listen_port = 8775 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.291404] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] metadata_workers = 2 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.291580] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] migrate_max_retries = -1 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.291749] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] mkisofs_cmd = genisoimage {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.291952] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] my_block_storage_ip = 10.180.1.21 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.292098] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] my_ip = 10.180.1.21 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.292274] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] network_allocate_retries = 0 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.292465] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.292635] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] osapi_compute_listen = 0.0.0.0 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.292797] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] osapi_compute_listen_port = 8774 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.292963] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] osapi_compute_unique_server_name_scope = {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.293145] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] osapi_compute_workers = 2 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.293310] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] password_length = 12 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.293473] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] periodic_enable = True {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.293633] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] periodic_fuzzy_delay = 60 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.293838] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] pointer_model = usbtablet {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.293962] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] preallocate_images = none {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.294135] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] publish_errors = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.294268] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] pybasedir = /opt/stack/nova {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.294446] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] ram_allocation_ratio = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.294623] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] rate_limit_burst = 0 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.294794] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] rate_limit_except_level = CRITICAL {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.294955] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] rate_limit_interval = 0 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.295132] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] reboot_timeout = 0 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.295296] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] reclaim_instance_interval = 0 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.295454] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] record = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.295624] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] reimage_timeout_per_gb = 60 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.295791] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] report_interval = 120 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.295951] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] rescue_timeout = 0 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.296123] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] reserved_host_cpus = 0 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.296286] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] reserved_host_disk_mb = 0 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.296445] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] reserved_host_memory_mb = 512 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.296602] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] reserved_huge_pages = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.296758] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] resize_confirm_window = 0 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.296915] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] resize_fs_using_block_device = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.297082] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] resume_guests_state_on_host_boot = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.297279] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] rootwrap_config = /etc/nova/rootwrap.conf {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.297449] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] rpc_response_timeout = 60 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.297609] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] run_external_periodic_tasks = True {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.297777] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] running_deleted_instance_action = reap {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.297937] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] running_deleted_instance_poll_interval = 1800 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.298112] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] running_deleted_instance_timeout = 0 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.298277] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] scheduler_instance_sync_interval = 120 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.298445] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] service_down_time = 720 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.298612] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] servicegroup_driver = db {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.298772] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] shelved_offload_time = 0 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.298930] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] shelved_poll_interval = 3600 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.299112] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] shutdown_timeout = 0 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.299302] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] source_is_ipv6 = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.299466] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] ssl_only = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.299713] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] state_path = /opt/stack/data/n-cpu-1 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.299879] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] sync_power_state_interval = 600 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.300053] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] sync_power_state_pool_size = 1000 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.300265] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] syslog_log_facility = LOG_USER {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.300435] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] tempdir = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.300599] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] timeout_nbd = 10 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.300767] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] transport_url = **** {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.300926] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] update_resources_interval = 0 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.301098] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] use_cow_images = True {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.301261] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] use_eventlog = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.301425] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] use_journal = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.301579] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] use_json = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.301736] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] use_rootwrap_daemon = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.301891] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] use_stderr = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.302359] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] use_syslog = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.302359] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] vcpu_pin_set = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.302359] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] vif_plugging_is_fatal = True {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.302540] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] vif_plugging_timeout = 300 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.302683] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] virt_mkfs = [] {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.302845] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] volume_usage_poll_interval = 0 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.303021] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] watch_log_file = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.303197] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] web = /usr/share/spice-html5 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 555.303400] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] oslo_concurrency.disable_process_locking = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.303692] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] oslo_concurrency.lock_path = /opt/stack/data/n-cpu-1 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.303871] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] oslo_messaging_metrics.metrics_buffer_size = 1000 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.304049] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] oslo_messaging_metrics.metrics_enabled = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.304222] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] oslo_messaging_metrics.metrics_process_name = {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.304393] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.304559] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.304737] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] api.auth_strategy = keystone {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.304903] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] api.compute_link_prefix = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.305095] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.305273] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] api.dhcp_domain = novalocal {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.305441] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] api.enable_instance_password = True {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.305604] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] api.glance_link_prefix = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.305766] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] api.instance_list_cells_batch_fixed_size = 100 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.305938] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] api.instance_list_cells_batch_strategy = distributed {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.306115] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] api.instance_list_per_project_cells = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.306303] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] api.list_records_by_skipping_down_cells = True {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.306478] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] api.local_metadata_per_cell = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.306649] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] api.max_limit = 1000 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.306818] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] api.metadata_cache_expiration = 15 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.306993] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] api.neutron_default_tenant_id = default {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.307180] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] api.use_forwarded_for = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.307345] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] api.use_neutron_default_nets = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.307513] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] api.vendordata_dynamic_connect_timeout = 5 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.307675] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] api.vendordata_dynamic_failure_fatal = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.307843] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] api.vendordata_dynamic_read_timeout = 5 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.308025] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] api.vendordata_dynamic_ssl_certfile = {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.308205] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] api.vendordata_dynamic_targets = [] {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.308382] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] api.vendordata_jsonfile_path = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.308546] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] api.vendordata_providers = ['StaticJSON'] {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.308737] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] cache.backend = dogpile.cache.memcached {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.308903] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] cache.backend_argument = **** {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.309083] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] cache.config_prefix = cache.oslo {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.309296] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] cache.dead_timeout = 60.0 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.309480] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] cache.debug_cache_backend = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.309645] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] cache.enable_retry_client = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.309808] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] cache.enable_socket_keepalive = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.309975] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] cache.enabled = True {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.310177] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] cache.expiration_time = 600 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.310358] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] cache.hashclient_retry_attempts = 2 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.310525] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] cache.hashclient_retry_delay = 1.0 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.310688] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] cache.memcache_dead_retry = 300 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.310853] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] cache.memcache_password = {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.311024] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] cache.memcache_pool_connection_get_timeout = 10 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.311200] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] cache.memcache_pool_flush_on_reconnect = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.311364] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] cache.memcache_pool_maxsize = 10 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.311524] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] cache.memcache_pool_unused_timeout = 60 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.311683] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] cache.memcache_sasl_enabled = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.311861] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] cache.memcache_servers = ['localhost:11211'] {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.312037] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] cache.memcache_socket_timeout = 1.0 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.312211] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] cache.memcache_username = {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.312402] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] cache.proxies = [] {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.312572] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] cache.retry_attempts = 2 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.312740] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] cache.retry_delay = 0.0 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.312901] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] cache.socket_keepalive_count = 1 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.313073] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] cache.socket_keepalive_idle = 1 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.313237] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] cache.socket_keepalive_interval = 1 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.313398] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] cache.tls_allowed_ciphers = None {{(pid=67899) log_opt_values
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.313554] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] cache.tls_cafile = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.313709] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] cache.tls_certfile = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.313870] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] cache.tls_enabled = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.314060] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] cache.tls_keyfile = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.314213] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] cinder.auth_section = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.314391] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] cinder.auth_type = password {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.314556] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] cinder.cafile = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.314733] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] cinder.catalog_info = volumev3::publicURL {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.314896] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] cinder.certfile = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.316225] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] cinder.collect_timing = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.316225] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] cinder.cross_az_attach = True {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.316225] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] cinder.debug = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.316225] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] cinder.endpoint_template = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.316225] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] cinder.http_retries = 3 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.316225] env[67899]: DEBUG oslo_service.service [None 
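
The cache.* values logged above map onto a dogpile.cache region, which is what oslo.cache builds internally from this group. A rough hand-built equivalent, assuming a memcached instance actually listening on localhost:11211 as logged:

    from dogpile.cache import make_region

    region = make_region().configure(
        'dogpile.cache.memcached',               # cache.backend
        expiration_time=600,                     # cache.expiration_time
        arguments={'url': ['localhost:11211']},  # cache.memcache_servers
    )

    region.set('greeting', 'hello')
    assert region.get('greeting') == 'hello'
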
req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] cinder.insecure = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.316435] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] cinder.keyfile = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.316435] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] cinder.os_region_name = RegionOne {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.316435] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] cinder.split_loggers = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.316517] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] cinder.timeout = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.316689] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] compute.consecutive_build_service_disable_threshold = 10 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.316852] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] compute.cpu_dedicated_set = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.317015] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] compute.cpu_shared_set = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.317187] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] compute.image_type_exclude_list = [] {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.317352] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] compute.live_migration_wait_for_vif_plug = True {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.317514] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] compute.max_concurrent_disk_ops = 0 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.317674] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] compute.max_disk_devices_to_attach = -1 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.317835] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] compute.packing_host_numa_cells_allocation_strategy = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.318013] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] compute.provider_config_location = /etc/nova/provider_config/ {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.318181] env[67899]: DEBUG oslo_service.service 
[None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] compute.resource_provider_association_refresh = 300 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.318365] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] compute.shutdown_retry_interval = 10 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.318559] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] compute.vmdk_allowed_types = ['streamOptimized', 'monolithicSparse'] {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.318743] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] conductor.workers = 2 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.318919] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] console.allowed_origins = [] {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.319093] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] console.ssl_ciphers = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.319294] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] console.ssl_minimum_version = default {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.319473] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] consoleauth.token_ttl = 600 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.319647] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] cyborg.cafile = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.319855] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] cyborg.certfile = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.320083] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] cyborg.collect_timing = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.320282] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] cyborg.connect_retries = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.320451] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] cyborg.connect_retry_delay = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.320630] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] cyborg.endpoint_override = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.320948] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] 
cyborg.insecure = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.321039] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] cyborg.keyfile = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.321846] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] cyborg.max_version = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.321846] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] cyborg.min_version = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.321846] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] cyborg.region_name = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.321846] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] cyborg.service_name = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.321846] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] cyborg.service_type = accelerator {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.322019] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] cyborg.split_loggers = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.322146] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] cyborg.status_code_retries = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.322329] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] cyborg.status_code_retry_delay = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.322492] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] cyborg.timeout = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.322715] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] cyborg.valid_interfaces = ['internal', 'public'] {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.322867] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] cyborg.version = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.323070] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] database.backend = sqlalchemy {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.323257] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] database.connection = **** {{(pid=67899) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.323432] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] database.connection_debug = 0 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.323604] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] database.connection_parameters = {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.323770] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] database.connection_recycle_time = 3600 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.323940] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] database.connection_trace = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.324176] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] database.db_inc_retry_interval = True {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.324284] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] database.db_max_retries = 20 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.324450] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] database.db_max_retry_interval = 10 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.324613] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] database.db_retry_interval = 1 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.324784] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] database.max_overflow = 50 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.324947] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] database.max_pool_size = 5 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.325129] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] database.max_retries = 10 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.325302] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] database.mysql_sql_mode = TRADITIONAL {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.325462] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] database.mysql_wsrep_sync_wait = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.325625] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] database.pool_timeout = None {{(pid=67899) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.325817] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] database.retry_interval = 10 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.325986] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] database.slave_connection = **** {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.326166] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] database.sqlite_synchronous = True {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.326332] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] database.use_db_reconnect = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.326513] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] api_database.backend = sqlalchemy {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.326688] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] api_database.connection = **** {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.326858] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] api_database.connection_debug = 0 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.327050] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] api_database.connection_parameters = {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.327201] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] api_database.connection_recycle_time = 3600 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.327368] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] api_database.connection_trace = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.327533] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] api_database.db_inc_retry_interval = True {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.327696] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] api_database.db_max_retries = 20 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.327858] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] api_database.db_max_retry_interval = 10 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.328033] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] api_database.db_retry_interval = 1 {{(pid=67899) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.328233] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] api_database.max_overflow = 50 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.328413] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] api_database.max_pool_size = 5 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.328584] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] api_database.max_retries = 10 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.328772] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] api_database.mysql_sql_mode = TRADITIONAL {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.328949] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] api_database.mysql_wsrep_sync_wait = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.329145] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] api_database.pool_timeout = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.329340] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] api_database.retry_interval = 10 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.329505] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] api_database.slave_connection = **** {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.329671] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] api_database.sqlite_synchronous = True {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.329846] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] devices.enabled_mdev_types = [] {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.330031] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] ephemeral_storage_encryption.cipher = aes-xts-plain64 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.330202] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] ephemeral_storage_encryption.enabled = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.330376] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] ephemeral_storage_encryption.key_size = 512 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.330548] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] glance.api_servers = None {{(pid=67899) log_opt_values 
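
The [database] and [api_database] groups above carry identical SQLAlchemy pool tuning: a base pool of 5 connections, an overflow of 50, and hourly connection recycling; the connection URLs themselves are masked. The same tuning expressed directly against SQLAlchemy, with a hypothetical URL and the PyMySQL driver assumed installed:

    from sqlalchemy import create_engine

    engine = create_engine(
        'mysql+pymysql://nova:secret@127.0.0.1/nova',  # placeholder; real value is ****
        pool_size=5,        # database.max_pool_size
        max_overflow=50,    # database.max_overflow
        pool_recycle=3600,  # database.connection_recycle_time
    )
    # No connection is opened until the engine is first used.
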
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.330713] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] glance.cafile = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.330876] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] glance.certfile = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.331056] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] glance.collect_timing = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.331223] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] glance.connect_retries = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.331387] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] glance.connect_retry_delay = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.331548] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] glance.debug = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.331716] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] glance.default_trusted_certificate_ids = [] {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.331907] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] glance.enable_certificate_validation = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.332086] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] glance.enable_rbd_download = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.332252] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] glance.endpoint_override = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.332423] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] glance.insecure = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.332588] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] glance.keyfile = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.332749] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] glance.max_version = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.332907] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] glance.min_version = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.333079] env[67899]: DEBUG 
oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] glance.num_retries = 3 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.333253] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] glance.rbd_ceph_conf = {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.333418] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] glance.rbd_connect_timeout = 5 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.333587] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] glance.rbd_pool = {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.333754] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] glance.rbd_user = {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.333913] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] glance.region_name = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.334080] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] glance.service_name = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.334286] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] glance.service_type = image {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.334410] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] glance.split_loggers = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.334569] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] glance.status_code_retries = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.334724] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] glance.status_code_retry_delay = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.334903] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] glance.timeout = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.335103] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] glance.valid_interfaces = ['internal', 'public'] {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.335273] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] glance.verify_glance_signatures = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.335436] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] glance.version = None {{(pid=67899) 
log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.335600] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] guestfs.debug = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.335772] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] hyperv.config_drive_cdrom = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.335934] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] hyperv.config_drive_inject_password = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.336112] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] hyperv.dynamic_memory_ratio = 1.0 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.336281] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] hyperv.enable_instance_metrics_collection = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.336445] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] hyperv.enable_remotefx = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.336613] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] hyperv.instances_path_share = {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.336780] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] hyperv.iscsi_initiator_list = [] {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.336939] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] hyperv.limit_cpu_features = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.337124] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] hyperv.mounted_disk_query_retry_count = 10 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.337309] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] hyperv.mounted_disk_query_retry_interval = 5 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.337476] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] hyperv.power_state_check_timeframe = 60 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.337647] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] hyperv.power_state_event_polling_interval = 2 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.337818] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] hyperv.qemu_img_cmd = qemu-img.exe {{(pid=67899) 
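
The glance.* group logged above is a standard keystoneauth1 client configuration: service_type selects the endpoint from the service catalog and valid_interfaces orders the endpoint interfaces to try. A hand-built equivalent, with placeholder credentials and auth URL (none of these appear unmasked in the dump):

    from keystoneauth1 import adapter, loading, session

    loader = loading.get_plugin_loader('password')
    auth = loader.load_from_options(
        auth_url='http://controller/identity/v3',  # placeholder
        username='nova', password='secret',        # placeholders
        project_name='service',
        user_domain_id='default', project_domain_id='default',
    )
    sess = session.Session(auth=auth)

    images = adapter.Adapter(sess, service_type='image',        # glance.service_type
                             interface=['internal', 'public'])  # glance.valid_interfaces
    print(images.get('/v2/images').json())

The cinder.*, ironic.*, cyborg.* and keystone.* groups in this dump follow the same keystoneauth option pattern, differing only in service_type, catalog hints, and retry settings.
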
log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.337980] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] hyperv.use_multipath_io = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.338160] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] hyperv.volume_attach_retry_count = 10 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.338325] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] hyperv.volume_attach_retry_interval = 5 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.338486] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] hyperv.vswitch_name = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.338647] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] hyperv.wait_soft_reboot_seconds = 60 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.338854] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] mks.enabled = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.339263] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] mks.mksproxy_base_url = http://127.0.0.1:6090/ {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.339466] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] image_cache.manager_interval = 2400 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.339638] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] image_cache.precache_concurrency = 1 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.339810] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] image_cache.remove_unused_base_images = True {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.339981] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] image_cache.remove_unused_original_minimum_age_seconds = 86400 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.340203] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] image_cache.remove_unused_resized_minimum_age_seconds = 3600 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.340404] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] image_cache.subdirectory_name = _base {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.340582] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] ironic.api_max_retries 
= 60 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.340748] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] ironic.api_retry_interval = 2 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.340912] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] ironic.auth_section = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.341101] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] ironic.auth_type = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.341268] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] ironic.cafile = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.341428] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] ironic.certfile = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.341595] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] ironic.collect_timing = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.341760] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] ironic.conductor_group = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.341918] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] ironic.connect_retries = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.342087] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] ironic.connect_retry_delay = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.342250] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] ironic.endpoint_override = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.342413] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] ironic.insecure = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.342570] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] ironic.keyfile = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.342729] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] ironic.max_version = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.342885] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] ironic.min_version = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.343057] env[67899]: DEBUG 
oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] ironic.peer_list = [] {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.343229] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] ironic.region_name = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.343428] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] ironic.serial_console_state_timeout = 10 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.343592] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] ironic.service_name = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.343765] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] ironic.service_type = baremetal {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.343928] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] ironic.split_loggers = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.344097] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] ironic.status_code_retries = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.344265] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] ironic.status_code_retry_delay = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.344423] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] ironic.timeout = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.344600] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] ironic.valid_interfaces = ['internal', 'public'] {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.344761] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] ironic.version = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.344946] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] key_manager.backend = nova.keymgr.conf_key_mgr.ConfKeyManager {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.345135] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] key_manager.fixed_key = **** {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.345322] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] barbican.auth_endpoint = http://localhost/identity/v3 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.345515] env[67899]: DEBUG oslo_service.service [None 
req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] barbican.barbican_api_version = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.345684] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] barbican.barbican_endpoint = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.345856] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] barbican.barbican_endpoint_type = public {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.346022] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] barbican.barbican_region_name = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.346186] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] barbican.cafile = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.346348] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] barbican.certfile = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.346509] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] barbican.collect_timing = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.346669] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] barbican.insecure = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.346825] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] barbican.keyfile = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.346987] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] barbican.number_of_retries = 60 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.347160] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] barbican.retry_delay = 1 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.347324] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] barbican.send_service_user_token = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.347482] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] barbican.split_loggers = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.347636] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] barbican.timeout = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.347793] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] barbican.verify_ssl = True {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.347948] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] barbican.verify_ssl_path = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.348139] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] barbican_service_user.auth_section = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.348310] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] barbican_service_user.auth_type = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.348468] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] barbican_service_user.cafile = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.348624] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] barbican_service_user.certfile = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.348786] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] barbican_service_user.collect_timing = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.348945] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] barbican_service_user.insecure = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.349144] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] barbican_service_user.keyfile = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.349318] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] barbican_service_user.split_loggers = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.349479] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] barbican_service_user.timeout = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.349648] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] vault.approle_role_id = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.349807] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] vault.approle_secret_id = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.349965] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] vault.cafile = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.350153] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] vault.certfile = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.350333] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] vault.collect_timing = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.350498] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] vault.insecure = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.350661] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] vault.keyfile = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.350836] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] vault.kv_mountpoint = secret {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.350997] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] vault.kv_path = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.351176] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] vault.kv_version = 2 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.351340] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] vault.namespace = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.351500] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] vault.root_token_id = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.351663] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] vault.split_loggers = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.351822] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] vault.ssl_ca_crt_file = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.351980] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] vault.timeout = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.352155] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] vault.use_ssl = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.352332] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] vault.vault_url = http://127.0.0.1:8200 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
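The records above (and everything that follows down to the service_user group) are oslo.config dumping every registered option, group by group, at DEBUG level during service startup; the repeated log_opt_values frame points at the same oslo_config/cfg.py call each time. A minimal sketch of that mechanism, using a hypothetical subset of the [vault] group rather than Nova's real registration code:

    import logging
    from oslo_config import cfg

    logging.basicConfig(level=logging.DEBUG)
    LOG = logging.getLogger(__name__)
    CONF = cfg.CONF

    # Hypothetical option subset; Nova registers its real options in its
    # own nova.conf.* modules.
    CONF.register_opts([
        cfg.StrOpt('vault_url', default='http://127.0.0.1:8200'),
        cfg.IntOpt('kv_version', default=2),
        cfg.BoolOpt('use_ssl', default=False),
    ], group='vault')

    CONF([], project='demo')  # parse (here: empty) command line / config files
    # Logs one "<group>.<option> = <value>" line per registered option,
    # e.g. "vault.kv_version = 2", which is exactly the shape seen above.
    CONF.log_opt_values(LOG, logging.DEBUG)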
[ 555.352501] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] keystone.auth_section = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.352668] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] keystone.auth_type = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.352824] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] keystone.cafile = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.352982] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] keystone.certfile = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.353156] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] keystone.collect_timing = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.353316] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] keystone.connect_retries = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.353475] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] keystone.connect_retry_delay = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.353631] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] keystone.endpoint_override = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.353791] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] keystone.insecure = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.353947] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] keystone.keyfile = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.354116] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] keystone.max_version = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.354274] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] keystone.min_version = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.354433] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] keystone.region_name = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.354591] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] keystone.service_name = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.354758] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] keystone.service_type = identity {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.354918] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] keystone.split_loggers = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.355088] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] keystone.status_code_retries = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.355246] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] keystone.status_code_retry_delay = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.355500] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] keystone.timeout = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.355747] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] keystone.valid_interfaces = ['internal', 'public'] {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.355931] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] keystone.version = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.356169] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] libvirt.connection_uri = {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.356341] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] libvirt.cpu_mode = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.356520] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] libvirt.cpu_model_extra_flags = [] {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.356724] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] libvirt.cpu_models = [] {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.356907] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] libvirt.cpu_power_governor_high = performance {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.357190] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] libvirt.cpu_power_governor_low = powersave {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.357394] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] libvirt.cpu_power_management = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.357609] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] libvirt.cpu_power_management_strategy = cpu_state {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.357801] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] libvirt.device_detach_attempts = 8 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.357973] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] libvirt.device_detach_timeout = 20 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.358163] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] libvirt.disk_cachemodes = [] {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.358363] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] libvirt.disk_prefix = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.358545] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] libvirt.enabled_perf_events = [] {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.358744] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] libvirt.file_backed_memory = 0 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.358964] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] libvirt.gid_maps = [] {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.359177] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] libvirt.hw_disk_discard = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.359349] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] libvirt.hw_machine_type = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.359524] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] libvirt.images_rbd_ceph_conf = {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.359691] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] libvirt.images_rbd_glance_copy_poll_interval = 15 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.359861] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] libvirt.images_rbd_glance_copy_timeout = 600 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.360038] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] libvirt.images_rbd_glance_store_name = {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.360213] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] libvirt.images_rbd_pool = rbd {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.360385] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] libvirt.images_type = default {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.360543] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] libvirt.images_volume_group = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.360705] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] libvirt.inject_key = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.360868] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] libvirt.inject_partition = -2 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.361039] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] libvirt.inject_password = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.361205] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] libvirt.iscsi_iface = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.361367] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] libvirt.iser_use_multipath = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.361539] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] libvirt.live_migration_bandwidth = 0 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.361698] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] libvirt.live_migration_completion_timeout = 800 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.361860] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] libvirt.live_migration_downtime = 500 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.362034] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] libvirt.live_migration_downtime_delay = 75 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.362209] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] libvirt.live_migration_downtime_steps = 10 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.362375] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] libvirt.live_migration_inbound_addr = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.362544] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] libvirt.live_migration_permit_auto_converge = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.362706] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] libvirt.live_migration_permit_post_copy = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.362865] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] libvirt.live_migration_scheme = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.363048] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] libvirt.live_migration_timeout_action = abort {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.363216] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] libvirt.live_migration_tunnelled = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.363377] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] libvirt.live_migration_uri = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.363542] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] libvirt.live_migration_with_native_tls = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.363703] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] libvirt.max_queues = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.363865] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] libvirt.mem_stats_period_seconds = 10 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.364033] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] libvirt.nfs_mount_options = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.364368] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] libvirt.nfs_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.364544] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] libvirt.num_aoe_discover_tries = 3 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.364712] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] libvirt.num_iser_scan_tries = 5 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.364875] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] libvirt.num_memory_encrypted_guests = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.365054] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] libvirt.num_nvme_discover_tries = 5 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.365224] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] libvirt.num_pcie_ports = 0 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.365393] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] libvirt.num_volume_scan_tries = 5 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.365559] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] libvirt.pmem_namespaces = [] {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.365719] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] libvirt.quobyte_client_cfg = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.366025] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] libvirt.quobyte_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.366203] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] libvirt.rbd_connect_timeout = 5 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.366372] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] libvirt.rbd_destroy_volume_retries = 12 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.366538] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] libvirt.rbd_destroy_volume_retry_interval = 5 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.366700] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] libvirt.rbd_secret_uuid = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.366858] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] libvirt.rbd_user = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.367031] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] libvirt.realtime_scheduler_priority = 1 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.367208] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] libvirt.remote_filesystem_transport = ssh {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.367369] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] libvirt.rescue_image_id = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.367530] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] libvirt.rescue_kernel_id = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.367689] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] libvirt.rescue_ramdisk_id = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.367860] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] libvirt.rng_dev_path = /dev/urandom {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.368028] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] libvirt.rx_queue_size = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.368204] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] libvirt.smbfs_mount_options = {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.368479] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] libvirt.smbfs_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.368652] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] libvirt.snapshot_compression = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.368817] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] libvirt.snapshot_image_format = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.369048] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] libvirt.snapshots_directory = /opt/stack/data/nova/instances/snapshots {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.369223] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] libvirt.sparse_logical_volumes = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.369390] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] libvirt.swtpm_enabled = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.369561] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] libvirt.swtpm_group = tss {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.369729] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] libvirt.swtpm_user = tss {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.369900] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] libvirt.sysinfo_serial = unique {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.370073] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] libvirt.tb_cache_size = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.370236] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] libvirt.tx_queue_size = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.370404] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] libvirt.uid_maps = [] {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.370567] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] libvirt.use_virtio_for_bridges = True {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.370738] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] libvirt.virt_type = kvm {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.370907] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] libvirt.volume_clear = zero {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.371152] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] libvirt.volume_clear_size = 0 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.371331] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] libvirt.volume_use_multipath = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.371494] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] libvirt.vzstorage_cache_path = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.371664] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] libvirt.vzstorage_log_path = /var/log/vstorage/%(cluster_name)s/nova.log.gz {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.371830] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] libvirt.vzstorage_mount_group = qemu {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.371996] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] libvirt.vzstorage_mount_opts = [] {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.372177] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] libvirt.vzstorage_mount_perms = 0770 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.372455] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] libvirt.vzstorage_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.372632] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] libvirt.vzstorage_mount_user = stack {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.372798] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] libvirt.wait_soft_reboot_seconds = 120 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
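Once parsed, every group in the dump above is exposed on the config object as an attribute namespace with typed values, so the defaults printed as text come back as Python types. A small sketch of that access pattern, using a hypothetical handful of [libvirt] options (the choices list here is illustrative, not Nova's exact set):

    from oslo_config import cfg

    CONF = cfg.CONF
    CONF.register_opts([
        cfg.StrOpt('virt_type', default='kvm',
                   choices=['kvm', 'qemu', 'lxc', 'parallels']),  # validated on set
        cfg.IntOpt('mem_stats_period_seconds', default=10),
        cfg.ListOpt('uid_maps', default=[]),
    ], group='libvirt')
    CONF([])

    assert CONF.libvirt.virt_type == 'kvm'              # str, checked against choices
    assert CONF.libvirt.mem_stats_period_seconds == 10  # int, not the string "10"
    assert CONF.libvirt.uid_maps == []                  # list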
[ 555.372972] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] neutron.auth_section = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.373162] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] neutron.auth_type = password {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.373327] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] neutron.cafile = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.373489] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] neutron.certfile = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.373652] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] neutron.collect_timing = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.373810] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] neutron.connect_retries = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.373969] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] neutron.connect_retry_delay = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.374149] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] neutron.default_floating_pool = public {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.374310] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] neutron.endpoint_override = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.374473] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] neutron.extension_sync_interval = 600 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.374634] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] neutron.http_retries = 3 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.374806] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] neutron.insecure = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.374952] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] neutron.keyfile = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.375122] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] neutron.max_version = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.375295] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] neutron.metadata_proxy_shared_secret = **** {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.375453] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] neutron.min_version = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.375622] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] neutron.ovs_bridge = br-int {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.375788] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] neutron.physnets = [] {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.375958] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] neutron.region_name = RegionOne {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.376140] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] neutron.service_metadata_proxy = True {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.376303] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] neutron.service_name = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.376470] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] neutron.service_type = network {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.376632] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] neutron.split_loggers = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.376789] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] neutron.status_code_retries = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.376946] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] neutron.status_code_retry_delay = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.377119] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] neutron.timeout = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.377298] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] neutron.valid_interfaces = ['internal', 'public'] {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.377460] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] neutron.version = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
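Values such as neutron.metadata_proxy_shared_secret above appear as **** because the option is registered with secret=True, which tells log_opt_values (and oslo.log's debug helpers) to mask it. A minimal sketch with an invented option name:

    import logging
    from oslo_config import cfg

    logging.basicConfig(level=logging.DEBUG)
    LOG = logging.getLogger(__name__)
    CONF = cfg.CONF

    # secret=True is the only detail that matters here; 'demo.shared_secret'
    # is a made-up option, not one of Nova's.
    CONF.register_opts([cfg.StrOpt('shared_secret', secret=True)], group='demo')
    CONF([])
    CONF.set_override('shared_secret', 'hunter2', group='demo')
    CONF.log_opt_values(LOG, logging.DEBUG)  # logs "demo.shared_secret = ****"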
[ 555.377637] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] notifications.bdms_in_notifications = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.377815] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] notifications.default_level = INFO {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.377992] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] notifications.notification_format = unversioned {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.378186] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] notifications.notify_on_state_change = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.378369] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] notifications.versioned_notifications_topics = ['versioned_notifications'] {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.378561] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] pci.alias = [] {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.378732] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] pci.device_spec = [] {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.378911] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] pci.report_in_placement = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.379104] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] placement.auth_section = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.379281] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] placement.auth_type = password {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.379453] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] placement.auth_url = http://10.180.1.21/identity {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.379615] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] placement.cafile = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.379777] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] placement.certfile = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.379938] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] placement.collect_timing = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.380108] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] placement.connect_retries = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.380272] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] placement.connect_retry_delay = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.380432] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] placement.default_domain_id = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.380591] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] placement.default_domain_name = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.380748] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] placement.domain_id = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.380906] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] placement.domain_name = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.381072] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] placement.endpoint_override = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.381239] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] placement.insecure = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.381397] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] placement.keyfile = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.381565] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] placement.max_version = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.381723] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] placement.min_version = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.381892] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] placement.password = **** {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.382061] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] placement.project_domain_id = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.382231] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] placement.project_domain_name = Default {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.382399] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] placement.project_id = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.382573] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] placement.project_name = service {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.382740] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] placement.region_name = RegionOne {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.382898] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] placement.service_name = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.383074] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] placement.service_type = placement {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.383241] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] placement.split_loggers = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.383399] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] placement.status_code_retries = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.383563] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] placement.status_code_retry_delay = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.383717] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] placement.system_scope = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.383873] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] placement.timeout = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.384042] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] placement.trust_id = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.384204] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] placement.user_domain_id = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.384376] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] placement.user_domain_name = Default {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.384535] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] placement.user_id = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.384707] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] placement.username = placement {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.384919] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] placement.valid_interfaces = ['internal', 'public'] {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.385056] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] placement.version = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
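The [placement] group just dumped is a standard keystoneauth1 section: auth plugin options (auth_type, auth_url, username, password), session options (cafile, insecure, timeout), and adapter options (service_type, region_name, valid_interfaces). Client code typically turns such a group into an authenticated session with keystoneauth1's loading helpers; a hedged sketch, not Nova's actual client construction:

    from keystoneauth1 import loading as ks_loading
    from oslo_config import cfg

    CONF = cfg.CONF
    GROUP = 'placement'

    # Register the standard auth/session/adapter options under [placement],
    # then parse a config file (hypothetical path) that fills them in.
    ks_loading.register_auth_conf_options(CONF, GROUP)
    ks_loading.register_session_conf_options(CONF, GROUP)
    ks_loading.register_adapter_conf_options(CONF, GROUP)
    CONF(['--config-file', 'nova.conf'])  # assumes a nova.conf with a [placement] section

    # Build an auth plugin, a session, and a service-scoped adapter from it.
    auth = ks_loading.load_auth_from_conf_options(CONF, GROUP)
    sess = ks_loading.load_session_from_conf_options(CONF, GROUP, auth=auth)
    adapter = ks_loading.load_adapter_from_conf_options(CONF, GROUP, session=sess)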
[ 555.385240] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] quota.cores = 20 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.385410] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] quota.count_usage_from_placement = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.385583] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] quota.driver = nova.quota.DbQuotaDriver {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.385754] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] quota.injected_file_content_bytes = 10240 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.385921] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] quota.injected_file_path_length = 255 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.386100] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] quota.injected_files = 5 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.386269] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] quota.instances = 10 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.386437] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] quota.key_pairs = 100 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.386602] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] quota.metadata_items = 128 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.386765] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] quota.ram = 51200 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.386927] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] quota.recheck_quota = True {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.387107] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] quota.server_group_members = 10 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.387272] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] quota.server_groups = 10 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.387444] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] rdp.enabled = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.387760] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] rdp.html5_proxy_base_url = http://127.0.0.1:6083/ {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.387945] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] scheduler.discover_hosts_in_cells_interval = -1 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.388128] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] scheduler.enable_isolated_aggregate_filtering = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.388298] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] scheduler.image_metadata_prefilter = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.388462] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] scheduler.limit_tenants_to_placement_aggregate = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.388628] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] scheduler.max_attempts = 3 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.388793] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] scheduler.max_placement_results = 1000 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.388958] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] scheduler.placement_aggregate_required_for_tenants = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.389147] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] scheduler.query_placement_for_image_type_support = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.389303] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] scheduler.query_placement_for_routed_network_aggregates = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.389478] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] scheduler.workers = 2 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.389652] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] filter_scheduler.aggregate_image_properties_isolation_namespace = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.389821] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] filter_scheduler.aggregate_image_properties_isolation_separator = . {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.390009] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.390184] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] filter_scheduler.build_failure_weight_multiplier = 1000000.0 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.390355] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] filter_scheduler.cpu_weight_multiplier = 1.0 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.390520] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.390684] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] filter_scheduler.disk_weight_multiplier = 1.0 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.390871] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter', 'SameHostFilter', 'DifferentHostFilter'] {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.391051] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] filter_scheduler.host_subset_size = 1 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.391228] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] filter_scheduler.hypervisor_version_weight_multiplier = 1.0 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.391390] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] filter_scheduler.image_properties_default_architecture = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.391555] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] filter_scheduler.io_ops_weight_multiplier = -1.0 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.391720] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] filter_scheduler.isolated_hosts = [] {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.391901] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] filter_scheduler.isolated_images = [] {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.392075] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] filter_scheduler.max_instances_per_host = 50 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.392241] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] filter_scheduler.max_io_ops_per_host = 8 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.392419] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] filter_scheduler.num_instances_weight_multiplier = 0.0 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.392594] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] filter_scheduler.pci_in_placement = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.392759] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] filter_scheduler.pci_weight_multiplier = 1.0 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.392922] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] filter_scheduler.ram_weight_multiplier = 1.0 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.393120] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.393293] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] filter_scheduler.shuffle_best_same_weighed_hosts = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.393460] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] filter_scheduler.soft_affinity_weight_multiplier = 1.0 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.393624] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.393788] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] filter_scheduler.track_instance_changes = True {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 555.393966] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.394321] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] metrics.weight_multiplier = 1.0 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.394485] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] metrics.weight_of_unavailable = -10000.0 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.394649] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] metrics.weight_setting = [] {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.394939] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] serial_console.base_url = ws://127.0.0.1:6083/ {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.395132] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] serial_console.enabled = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.395310] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] serial_console.port_range = 10000:20000 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.395481] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] serial_console.proxyclient_address = 127.0.0.1 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.395651] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] serial_console.serialproxy_host = 0.0.0.0 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.395819] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] serial_console.serialproxy_port = 6083 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.395988] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] service_user.auth_section = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.396176] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] service_user.auth_type = password {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.396339] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] service_user.cafile = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.396499] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] service_user.certfile = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.396660] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] service_user.collect_timing = False {{(pid=67899) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.396822] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] service_user.insecure = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.396980] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] service_user.keyfile = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.397176] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] service_user.send_service_user_token = True {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.397343] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] service_user.split_loggers = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.397501] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] service_user.timeout = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.397670] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] spice.agent_enabled = True {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.397833] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] spice.enabled = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.398132] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] spice.html5proxy_base_url = http://127.0.0.1:6082/spice_auto.html {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.398327] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] spice.html5proxy_host = 0.0.0.0 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.398495] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] spice.html5proxy_port = 6082 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.398655] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] spice.image_compression = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.398815] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] spice.jpeg_compression = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.398973] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] spice.playback_compression = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.399159] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] spice.server_listen = 127.0.0.1 {{(pid=67899) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.399329] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] spice.server_proxyclient_address = 127.0.0.1 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.399488] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] spice.streaming_mode = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.399646] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] spice.zlib_compression = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.399809] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] upgrade_levels.baseapi = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.399967] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] upgrade_levels.cert = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.400148] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] upgrade_levels.compute = auto {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.400311] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] upgrade_levels.conductor = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.400468] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] upgrade_levels.scheduler = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.400633] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] vendordata_dynamic_auth.auth_section = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.400796] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] vendordata_dynamic_auth.auth_type = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.400955] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] vendordata_dynamic_auth.cafile = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.401125] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] vendordata_dynamic_auth.certfile = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.401291] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] vendordata_dynamic_auth.collect_timing = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.401453] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] vendordata_dynamic_auth.insecure = False {{(pid=67899) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.401611] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] vendordata_dynamic_auth.keyfile = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.401771] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] vendordata_dynamic_auth.split_loggers = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.401928] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] vendordata_dynamic_auth.timeout = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.402122] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] vmware.api_retry_count = 10 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.402289] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] vmware.ca_file = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.402492] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] vmware.cache_prefix = devstack-image-cache {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.402670] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] vmware.cluster_name = testcl1 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.402840] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] vmware.connection_pool_size = 10 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.402999] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] vmware.console_delay_seconds = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.403183] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] vmware.datastore_regex = ^datastore.* {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.403390] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] vmware.host_ip = vc1.osci.c.eu-de-1.cloud.sap {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.403564] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] vmware.host_password = **** {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.403730] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] vmware.host_port = 443 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.403899] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] vmware.host_username = administrator@vsphere.local {{(pid=67899) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.404076] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] vmware.insecure = True {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.404241] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] vmware.integration_bridge = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.404408] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] vmware.maximum_objects = 100 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.404566] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] vmware.pbm_default_policy = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.404726] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] vmware.pbm_enabled = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.404883] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] vmware.pbm_wsdl_location = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.405059] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] vmware.serial_log_dir = /opt/vmware/vspc {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.405221] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] vmware.serial_port_proxy_uri = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.405380] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] vmware.serial_port_service_uri = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.405544] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] vmware.task_poll_interval = 0.5 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.405714] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] vmware.use_linked_clone = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.405881] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] vmware.vnc_keymap = en-us {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.406055] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] vmware.vnc_port = 5900 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.406223] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] vmware.vnc_port_total = 10000 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.406671] 
env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] vnc.auth_schemes = ['none'] {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.406671] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] vnc.enabled = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.406863] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] vnc.novncproxy_base_url = http://127.0.0.1:6080/vnc_auto.html {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.407058] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] vnc.novncproxy_host = 0.0.0.0 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.407232] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] vnc.novncproxy_port = 6080 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.407412] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] vnc.server_listen = 127.0.0.1 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.407583] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] vnc.server_proxyclient_address = 127.0.0.1 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.407744] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] vnc.vencrypt_ca_certs = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.407905] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] vnc.vencrypt_client_cert = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.408073] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] vnc.vencrypt_client_key = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.408250] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] workarounds.disable_compute_service_check_for_ffu = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.408420] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] workarounds.disable_deep_image_inspection = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.408582] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] workarounds.disable_fallback_pcpu_query = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.408746] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] workarounds.disable_group_policy_check_upcall = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 
555.408908] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] workarounds.disable_libvirt_livesnapshot = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.409086] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] workarounds.disable_rootwrap = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.409259] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] workarounds.enable_numa_live_migration = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.409451] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] workarounds.enable_qemu_monitor_announce_self = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.409618] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.409811] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] workarounds.handle_virt_lifecycle_events = True {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.409978] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] workarounds.libvirt_disable_apic = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.410155] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] workarounds.never_download_image_if_on_rbd = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.410338] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] workarounds.qemu_monitor_announce_self_count = 3 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.410523] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] workarounds.qemu_monitor_announce_self_interval = 1 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.410688] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] workarounds.reserve_disk_resource_for_image_cache = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.410852] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] workarounds.skip_cpu_compare_at_startup = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.411023] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] workarounds.skip_cpu_compare_on_dest = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.411190] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None 
None] workarounds.skip_hypervisor_version_check_on_lm = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.411347] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] workarounds.skip_reserve_in_use_ironic_nodes = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.411511] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] workarounds.unified_limits_count_pcpu_as_vcpu = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.411674] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.411859] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] wsgi.api_paste_config = /etc/nova/api-paste.ini {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.412038] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] wsgi.client_socket_timeout = 900 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.412209] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] wsgi.default_pool_size = 1000 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.412377] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] wsgi.keep_alive = True {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.412544] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] wsgi.max_header_line = 16384 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.412704] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] wsgi.secure_proxy_ssl_header = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.412864] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] wsgi.ssl_ca_file = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.413031] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] wsgi.ssl_cert_file = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.413193] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] wsgi.ssl_key_file = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.413358] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] wsgi.tcp_keepidle = 600 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.413530] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] 
wsgi.wsgi_log_format = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.413697] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] zvm.ca_file = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.413856] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] zvm.cloud_connector_url = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.414148] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] zvm.image_tmp_path = /opt/stack/data/n-cpu-1/images {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.414324] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] zvm.reachable_timeout = 300 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.414502] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] oslo_policy.enforce_new_defaults = True {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.414672] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] oslo_policy.enforce_scope = True {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.414846] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] oslo_policy.policy_default_rule = default {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.415035] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] oslo_policy.policy_dirs = ['policy.d'] {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.415213] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] oslo_policy.policy_file = policy.yaml {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.415390] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] oslo_policy.remote_content_type = application/x-www-form-urlencoded {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.415545] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] oslo_policy.remote_ssl_ca_crt_file = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.415702] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] oslo_policy.remote_ssl_client_crt_file = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.415860] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] oslo_policy.remote_ssl_client_key_file = None {{(pid=67899) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.416030] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] oslo_policy.remote_ssl_verify_server_crt = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.416203] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] oslo_versionedobjects.fatal_exception_format_errors = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.416380] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.416557] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] profiler.connection_string = messaging:// {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.416724] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] profiler.enabled = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.416892] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] profiler.es_doc_type = notification {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.417064] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] profiler.es_scroll_size = 10000 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.417235] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] profiler.es_scroll_time = 2m {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.417396] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] profiler.filter_error_trace = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.417564] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] profiler.hmac_keys = **** {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.417730] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] profiler.sentinel_service_name = mymaster {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.417893] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] profiler.socket_timeout = 0.1 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.418065] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] profiler.trace_requests = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.418225] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] profiler.trace_sqlalchemy = False {{(pid=67899) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.418409] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] profiler_jaeger.process_tags = {} {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.418570] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] profiler_jaeger.service_name_prefix = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.418729] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] profiler_otlp.service_name_prefix = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.418891] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] remote_debug.host = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.419062] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] remote_debug.port = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.419246] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] oslo_messaging_rabbit.amqp_auto_delete = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.419411] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] oslo_messaging_rabbit.amqp_durable_queues = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.419572] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] oslo_messaging_rabbit.conn_pool_min_size = 2 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.419733] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] oslo_messaging_rabbit.conn_pool_ttl = 1200 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.419891] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] oslo_messaging_rabbit.direct_mandatory_flag = True {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.420058] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] oslo_messaging_rabbit.enable_cancel_on_failover = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.420222] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] oslo_messaging_rabbit.heartbeat_in_pthread = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.420386] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] oslo_messaging_rabbit.heartbeat_rate = 2 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.420546] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] 
oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.420702] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] oslo_messaging_rabbit.kombu_compression = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.420870] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] oslo_messaging_rabbit.kombu_failover_strategy = round-robin {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.421043] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.421215] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.421383] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] oslo_messaging_rabbit.rabbit_ha_queues = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.421546] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] oslo_messaging_rabbit.rabbit_interval_max = 30 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.421717] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.421879] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.422049] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.422216] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.422381] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.422540] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] oslo_messaging_rabbit.rabbit_quorum_queue = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.422706] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] oslo_messaging_rabbit.rabbit_retry_backoff = 2 
{{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.422865] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] oslo_messaging_rabbit.rabbit_retry_interval = 1 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.423035] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.423206] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] oslo_messaging_rabbit.rpc_conn_pool_size = 30 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.423372] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] oslo_messaging_rabbit.ssl = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.423541] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] oslo_messaging_rabbit.ssl_ca_file = {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.423707] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] oslo_messaging_rabbit.ssl_cert_file = {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.423867] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] oslo_messaging_rabbit.ssl_enforce_fips_mode = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.424039] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] oslo_messaging_rabbit.ssl_key_file = {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.424210] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] oslo_messaging_rabbit.ssl_version = {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.424394] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] oslo_messaging_notifications.driver = ['messagingv2'] {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.424556] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] oslo_messaging_notifications.retry = -1 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.424736] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] oslo_messaging_notifications.topics = ['notifications'] {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.424907] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] oslo_messaging_notifications.transport_url = **** {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.425087] env[67899]: DEBUG oslo_service.service 
[None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] oslo_limit.auth_section = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.425255] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] oslo_limit.auth_type = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.425418] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] oslo_limit.cafile = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.425593] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] oslo_limit.certfile = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.425752] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] oslo_limit.collect_timing = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.425909] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] oslo_limit.connect_retries = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.426079] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] oslo_limit.connect_retry_delay = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.426239] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] oslo_limit.endpoint_id = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.426398] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] oslo_limit.endpoint_override = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.426555] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] oslo_limit.insecure = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.426708] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] oslo_limit.keyfile = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.426862] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] oslo_limit.max_version = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.427024] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] oslo_limit.min_version = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.427182] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] oslo_limit.region_name = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.427337] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] oslo_limit.service_name = None {{(pid=67899) 
log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.427489] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] oslo_limit.service_type = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.427647] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] oslo_limit.split_loggers = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.427803] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] oslo_limit.status_code_retries = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.427957] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] oslo_limit.status_code_retry_delay = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.428123] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] oslo_limit.timeout = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.428282] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] oslo_limit.valid_interfaces = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.428433] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] oslo_limit.version = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.428593] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] oslo_reports.file_event_handler = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.428754] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] oslo_reports.file_event_handler_interval = 1 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.428911] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] oslo_reports.log_dir = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.429097] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] vif_plug_linux_bridge_privileged.capabilities = [12] {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.429276] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] vif_plug_linux_bridge_privileged.group = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.429439] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] vif_plug_linux_bridge_privileged.helper_command = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.429613] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] vif_plug_linux_bridge_privileged.logger_name = 
oslo_privsep.daemon {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.429774] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] vif_plug_linux_bridge_privileged.thread_pool_size = 8 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.429930] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] vif_plug_linux_bridge_privileged.user = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.430111] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] vif_plug_ovs_privileged.capabilities = [12, 1] {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.430272] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] vif_plug_ovs_privileged.group = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.430427] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] vif_plug_ovs_privileged.helper_command = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.430590] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.430748] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] vif_plug_ovs_privileged.thread_pool_size = 8 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.430904] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] vif_plug_ovs_privileged.user = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.431081] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] os_vif_linux_bridge.flat_interface = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.431269] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] os_vif_linux_bridge.forward_bridge_interface = ['all'] {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.431446] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] os_vif_linux_bridge.iptables_bottom_regex = {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.431614] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] os_vif_linux_bridge.iptables_drop_action = DROP {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.431780] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] os_vif_linux_bridge.iptables_top_regex = {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.431940] 
env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] os_vif_linux_bridge.network_device_mtu = 1500 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.432114] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] os_vif_linux_bridge.use_ipv6 = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.432279] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] os_vif_linux_bridge.vlan_interface = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.432454] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] os_vif_ovs.default_qos_type = linux-noop {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.432621] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] os_vif_ovs.isolate_vif = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.432784] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] os_vif_ovs.network_device_mtu = 1500 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.432945] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] os_vif_ovs.ovs_vsctl_timeout = 120 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.433122] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] os_vif_ovs.ovsdb_connection = tcp:127.0.0.1:6640 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.433295] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] os_vif_ovs.ovsdb_interface = native {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.433455] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] os_vif_ovs.per_port_bridge = False {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.433620] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] os_brick.lock_path = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.433782] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] os_brick.wait_mpath_device_attempts = 4 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.433942] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] os_brick.wait_mpath_device_interval = 1 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.434126] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] privsep_osbrick.capabilities = [21] {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.434289] 
env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] privsep_osbrick.group = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.434444] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] privsep_osbrick.helper_command = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.434608] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] privsep_osbrick.logger_name = os_brick.privileged {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.434772] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] privsep_osbrick.thread_pool_size = 8 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.434930] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] privsep_osbrick.user = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.435112] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] nova_sys_admin.capabilities = [0, 1, 2, 3, 12, 21] {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.435274] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] nova_sys_admin.group = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.435429] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] nova_sys_admin.helper_command = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.435589] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] nova_sys_admin.logger_name = oslo_privsep.daemon {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.435748] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] nova_sys_admin.thread_pool_size = 8 {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.435901] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] nova_sys_admin.user = None {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 555.436039] env[67899]: DEBUG oslo_service.service [None req-a5fa3453-f543-4fee-b275-36d0bd54fe6f None None] ******************************************************************************** {{(pid=67899) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2613}} [ 555.436459] env[67899]: INFO nova.service [-] Starting compute node (version 0.0.1) [ 555.445822] env[67899]: WARNING nova.virt.vmwareapi.driver [None req-c681777f-82b0-45c8-8182-7e1cdab1b9ab None None] The vmwareapi driver is not tested by the OpenStack project nor does it have clear maintainer(s) and thus its quality can not be ensured. It should be considered experimental and may be removed in a future release. 
If you are using the driver in production please let us know via the openstack-discuss mailing list. [ 555.446267] env[67899]: INFO nova.virt.node [None req-c681777f-82b0-45c8-8182-7e1cdab1b9ab None None] Generated node identity fffa0b42-f65d-4394-a98c-0df038b9ed4b [ 555.446499] env[67899]: INFO nova.virt.node [None req-c681777f-82b0-45c8-8182-7e1cdab1b9ab None None] Wrote node identity fffa0b42-f65d-4394-a98c-0df038b9ed4b to /opt/stack/data/n-cpu-1/compute_id [ 555.459215] env[67899]: WARNING nova.compute.manager [None req-c681777f-82b0-45c8-8182-7e1cdab1b9ab None None] Compute nodes ['fffa0b42-f65d-4394-a98c-0df038b9ed4b'] for host cpu-1 were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning. [ 555.492563] env[67899]: INFO nova.compute.manager [None req-c681777f-82b0-45c8-8182-7e1cdab1b9ab None None] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host [ 555.515677] env[67899]: WARNING nova.compute.manager [None req-c681777f-82b0-45c8-8182-7e1cdab1b9ab None None] No compute node record found for host cpu-1. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host cpu-1 could not be found. [ 555.515904] env[67899]: DEBUG oslo_concurrency.lockutils [None req-c681777f-82b0-45c8-8182-7e1cdab1b9ab None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 555.516194] env[67899]: DEBUG oslo_concurrency.lockutils [None req-c681777f-82b0-45c8-8182-7e1cdab1b9ab None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 555.516335] env[67899]: DEBUG oslo_concurrency.lockutils [None req-c681777f-82b0-45c8-8182-7e1cdab1b9ab None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 555.516510] env[67899]: DEBUG nova.compute.resource_tracker [None req-c681777f-82b0-45c8-8182-7e1cdab1b9ab None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67899) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 555.518245] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3d0e600c-2d12-4042-9663-ee6057eb4f9e {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 555.527260] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9c827d35-6eac-4d80-aa1d-fac04b3fe875 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 555.540858] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4310d126-3467-4acc-a06b-10d2da0865e5 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 555.546868] env[67899]: DEBUG oslo_vmware.service [-] 
Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0a0f8f7a-4226-4d99-8969-fe7f3a0f22d6 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 555.579359] env[67899]: DEBUG nova.compute.resource_tracker [None req-c681777f-82b0-45c8-8182-7e1cdab1b9ab None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180920MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=67899) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 555.579480] env[67899]: DEBUG oslo_concurrency.lockutils [None req-c681777f-82b0-45c8-8182-7e1cdab1b9ab None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 555.579659] env[67899]: DEBUG oslo_concurrency.lockutils [None req-c681777f-82b0-45c8-8182-7e1cdab1b9ab None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 555.591762] env[67899]: WARNING nova.compute.resource_tracker [None req-c681777f-82b0-45c8-8182-7e1cdab1b9ab None None] No compute node record for cpu-1:fffa0b42-f65d-4394-a98c-0df038b9ed4b: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host fffa0b42-f65d-4394-a98c-0df038b9ed4b could not be found. [ 555.605042] env[67899]: INFO nova.compute.resource_tracker [None req-c681777f-82b0-45c8-8182-7e1cdab1b9ab None None] Compute node record created for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 with uuid: fffa0b42-f65d-4394-a98c-0df038b9ed4b [ 555.659080] env[67899]: DEBUG nova.compute.resource_tracker [None req-c681777f-82b0-45c8-8182-7e1cdab1b9ab None None] Total usable vcpus: 48, total allocated vcpus: 0 {{(pid=67899) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 555.659295] env[67899]: DEBUG nova.compute.resource_tracker [None req-c681777f-82b0-45c8-8182-7e1cdab1b9ab None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=512MB phys_disk=200GB used_disk=0GB total_vcpus=48 used_vcpus=0 pci_stats=[] {{(pid=67899) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 555.764620] env[67899]: INFO nova.scheduler.client.report [None req-c681777f-82b0-45c8-8182-7e1cdab1b9ab None None] [req-b955b901-a136-4d77-9807-5a29738e86cb] Created resource provider record via placement API for resource provider with UUID fffa0b42-f65d-4394-a98c-0df038b9ed4b and name domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28. 
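The "Acquiring lock / acquired / released :: held 0.364s" records around the resource tracker come from oslo.concurrency's lockutils helpers, which serialize ResourceTracker work on a process-local "compute_resources" semaphore and emit exactly these wait/held timings. A minimal sketch of the same pattern (function name here is illustrative, not Nova's actual code):

    from oslo_concurrency import lockutils

    # Process-local semaphore with the same name the resource tracker uses;
    # the synchronized decorator logs the acquire/wait/held DEBUG lines
    # seen above when logging is configured.
    COMPUTE_RESOURCES_SEMAPHORE = 'compute_resources'

    @lockutils.synchronized(COMPUTE_RESOURCES_SEMAPHORE)
    def update_available_resource():
        # Runs with the lock held, so concurrent instance claims, audits
        # and cache cleanups are serialized against each other.
        print('auditing resources under the lock')

    if __name__ == '__main__':
        update_available_resource()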
[ 555.781513] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-84ac45b7-66b2-4da0-865b-a069bb5846cb {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 555.788848] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-af0243f1-e05b-48ef-882c-757da04a88fb {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 555.819138] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1256bd7d-f62f-4ab7-9870-97ae389742ef {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 555.826163] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a1448a64-b71e-4705-aa4b-0f1f45ca9048 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 555.838846] env[67899]: DEBUG nova.compute.provider_tree [None req-c681777f-82b0-45c8-8182-7e1cdab1b9ab None None] Updating inventory in ProviderTree for provider fffa0b42-f65d-4394-a98c-0df038b9ed4b with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67899) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 555.875738] env[67899]: DEBUG nova.scheduler.client.report [None req-c681777f-82b0-45c8-8182-7e1cdab1b9ab None None] Updated inventory for provider fffa0b42-f65d-4394-a98c-0df038b9ed4b with generation 0 in Placement from set_inventory_for_provider using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67899) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:957}} [ 555.875974] env[67899]: DEBUG nova.compute.provider_tree [None req-c681777f-82b0-45c8-8182-7e1cdab1b9ab None None] Updating resource provider fffa0b42-f65d-4394-a98c-0df038b9ed4b generation from 0 to 1 during operation: update_inventory {{(pid=67899) _update_generation /opt/stack/nova/nova/compute/provider_tree.py:164}} [ 555.876143] env[67899]: DEBUG nova.compute.provider_tree [None req-c681777f-82b0-45c8-8182-7e1cdab1b9ab None None] Updating inventory in ProviderTree for provider fffa0b42-f65d-4394-a98c-0df038b9ed4b with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67899) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 555.925404] env[67899]: DEBUG nova.compute.provider_tree [None req-c681777f-82b0-45c8-8182-7e1cdab1b9ab None None] Updating resource 
provider fffa0b42-f65d-4394-a98c-0df038b9ed4b generation from 1 to 2 during operation: update_traits {{(pid=67899) _update_generation /opt/stack/nova/nova/compute/provider_tree.py:164}} [ 555.943455] env[67899]: DEBUG nova.compute.resource_tracker [None req-c681777f-82b0-45c8-8182-7e1cdab1b9ab None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67899) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 555.943655] env[67899]: DEBUG oslo_concurrency.lockutils [None req-c681777f-82b0-45c8-8182-7e1cdab1b9ab None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.364s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 555.943818] env[67899]: DEBUG nova.service [None req-c681777f-82b0-45c8-8182-7e1cdab1b9ab None None] Creating RPC server for service compute {{(pid=67899) start /opt/stack/nova/nova/service.py:182}} [ 555.956668] env[67899]: DEBUG nova.service [None req-c681777f-82b0-45c8-8182-7e1cdab1b9ab None None] Join ServiceGroup membership for this service compute {{(pid=67899) start /opt/stack/nova/nova/service.py:199}} [ 555.956771] env[67899]: DEBUG nova.servicegroup.drivers.db [None req-c681777f-82b0-45c8-8182-7e1cdab1b9ab None None] DB_Driver: join new ServiceGroup member cpu-1 to the compute group, service = {{(pid=67899) join /opt/stack/nova/nova/servicegroup/drivers/db.py:44}} [ 565.273056] env[67899]: DEBUG dbcounter [-] [67899] Writing DB stats nova_cell1:SELECT=1 {{(pid=67899) stat_writer /opt/stack/data/venv/lib/python3.10/site-packages/dbcounter.py:115}} [ 565.274373] env[67899]: DEBUG dbcounter [-] [67899] Writing DB stats nova_cell0:SELECT=1 {{(pid=67899) stat_writer /opt/stack/data/venv/lib/python3.10/site-packages/dbcounter.py:115}} [ 574.959195] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._sync_power_states {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 574.970012] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Getting list of instances from cluster (obj){ [ 574.970012] env[67899]: value = "domain-c8" [ 574.970012] env[67899]: _type = "ClusterComputeResource" [ 574.970012] env[67899]: } {{(pid=67899) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 574.971148] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8c9a202f-1631-41fa-af89-b243fd5cfeac {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 574.980253] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Got total of 0 instances {{(pid=67899) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 574.980475] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._cleanup_running_deleted_instances {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 574.980962] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Getting list of instances from cluster (obj){ [ 574.980962] 
env[67899]: value = "domain-c8" [ 574.980962] env[67899]: _type = "ClusterComputeResource" [ 574.980962] env[67899]: } {{(pid=67899) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 574.981691] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4cc0e191-f022-4878-a3cd-e69ebfee365d {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 574.989402] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Got total of 0 instances {{(pid=67899) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 600.663155] env[67899]: DEBUG oslo_concurrency.lockutils [None req-a1baea38-4329-40f0-8c18-ae18eafc1b0b tempest-ServersAdminNegativeTestJSON-448291398 tempest-ServersAdminNegativeTestJSON-448291398-project-member] Acquiring lock "91d5024f-9eac-4a56-b08f-c0f6a7eda775" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 600.663155] env[67899]: DEBUG oslo_concurrency.lockutils [None req-a1baea38-4329-40f0-8c18-ae18eafc1b0b tempest-ServersAdminNegativeTestJSON-448291398 tempest-ServersAdminNegativeTestJSON-448291398-project-member] Lock "91d5024f-9eac-4a56-b08f-c0f6a7eda775" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 600.699868] env[67899]: DEBUG nova.compute.manager [None req-a1baea38-4329-40f0-8c18-ae18eafc1b0b tempest-ServersAdminNegativeTestJSON-448291398 tempest-ServersAdminNegativeTestJSON-448291398-project-member] [instance: 91d5024f-9eac-4a56-b08f-c0f6a7eda775] Starting instance... 
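The _sync_power_states and _cleanup_running_deleted_instances runs above are oslo.service periodic tasks: the manager subclasses PeriodicTasks and a timer drives run_periodic_tasks. A hedged sketch of that mechanism (the class and the spacing value are illustrative, not Nova's real ComputeManager):

    from oslo_config import cfg
    from oslo_service import periodic_task

    CONF = cfg.CONF

    class Manager(periodic_task.PeriodicTasks):
        def __init__(self):
            super().__init__(CONF)

        @periodic_task.periodic_task(spacing=600, run_immediately=True)
        def _sync_power_states(self, context):
            # Compare the driver's view of VM power states with the DB;
            # the records above show this finding 0 instances on a fresh node.
            print('syncing power states')

    if __name__ == '__main__':
        Manager().run_periodic_tasks(context=None)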
{{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 600.848723] env[67899]: DEBUG oslo_concurrency.lockutils [None req-a1baea38-4329-40f0-8c18-ae18eafc1b0b tempest-ServersAdminNegativeTestJSON-448291398 tempest-ServersAdminNegativeTestJSON-448291398-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 600.848971] env[67899]: DEBUG oslo_concurrency.lockutils [None req-a1baea38-4329-40f0-8c18-ae18eafc1b0b tempest-ServersAdminNegativeTestJSON-448291398 tempest-ServersAdminNegativeTestJSON-448291398-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 600.850671] env[67899]: INFO nova.compute.claims [None req-a1baea38-4329-40f0-8c18-ae18eafc1b0b tempest-ServersAdminNegativeTestJSON-448291398 tempest-ServersAdminNegativeTestJSON-448291398-project-member] [instance: 91d5024f-9eac-4a56-b08f-c0f6a7eda775] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 600.940527] env[67899]: DEBUG oslo_concurrency.lockutils [None req-4103a3f2-1648-4fa7-ad2b-b1440fc415e3 tempest-InstanceActionsTestJSON-1459258803 tempest-InstanceActionsTestJSON-1459258803-project-member] Acquiring lock "6b16f08c-a470-4a9b-8096-05cec2e960cf" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 600.940783] env[67899]: DEBUG oslo_concurrency.lockutils [None req-4103a3f2-1648-4fa7-ad2b-b1440fc415e3 tempest-InstanceActionsTestJSON-1459258803 tempest-InstanceActionsTestJSON-1459258803-project-member] Lock "6b16f08c-a470-4a9b-8096-05cec2e960cf" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 600.962382] env[67899]: DEBUG nova.compute.manager [None req-4103a3f2-1648-4fa7-ad2b-b1440fc415e3 tempest-InstanceActionsTestJSON-1459258803 tempest-InstanceActionsTestJSON-1459258803-project-member] [instance: 6b16f08c-a470-4a9b-8096-05cec2e960cf] Starting instance... 
{{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 601.034139] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e0c4260d-b531-427e-b581-01f4c90e041d {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 601.045053] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-df9f9a0d-c1c2-403f-b022-912c042d8f0b {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 601.086096] env[67899]: DEBUG oslo_concurrency.lockutils [None req-4103a3f2-1648-4fa7-ad2b-b1440fc415e3 tempest-InstanceActionsTestJSON-1459258803 tempest-InstanceActionsTestJSON-1459258803-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 601.087993] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0249fbad-6435-4d9f-a971-166c84c421cb {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 601.096710] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f44d6e38-1890-4e9a-be07-1e4e0d7ba9be {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 601.114383] env[67899]: DEBUG nova.compute.provider_tree [None req-a1baea38-4329-40f0-8c18-ae18eafc1b0b tempest-ServersAdminNegativeTestJSON-448291398 tempest-ServersAdminNegativeTestJSON-448291398-project-member] Inventory has not changed in ProviderTree for provider: fffa0b42-f65d-4394-a98c-0df038b9ed4b {{(pid=67899) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 601.134110] env[67899]: DEBUG nova.scheduler.client.report [None req-a1baea38-4329-40f0-8c18-ae18eafc1b0b tempest-ServersAdminNegativeTestJSON-448291398 tempest-ServersAdminNegativeTestJSON-448291398-project-member] Inventory has not changed for provider fffa0b42-f65d-4394-a98c-0df038b9ed4b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67899) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 601.158916] env[67899]: DEBUG oslo_concurrency.lockutils [None req-a1baea38-4329-40f0-8c18-ae18eafc1b0b tempest-ServersAdminNegativeTestJSON-448291398 tempest-ServersAdminNegativeTestJSON-448291398-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.309s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 601.159760] env[67899]: DEBUG nova.compute.manager [None req-a1baea38-4329-40f0-8c18-ae18eafc1b0b tempest-ServersAdminNegativeTestJSON-448291398 tempest-ServersAdminNegativeTestJSON-448291398-project-member] [instance: 91d5024f-9eac-4a56-b08f-c0f6a7eda775] Start building networks asynchronously for instance. 
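The inventory dict repeated above is what placement stores per resource class; usable capacity is (total - reserved) * allocation_ratio, capped per allocation by max_unit. A quick check against the values in this log:

    # Inventory as reported above for provider fffa0b42-f65d-4394-a98c-0df038b9ed4b.
    inventory = {
        'VCPU':      {'total': 48,     'reserved': 0,   'allocation_ratio': 4.0, 'max_unit': 16},
        'MEMORY_MB': {'total': 196590, 'reserved': 512, 'allocation_ratio': 1.0, 'max_unit': 65530},
        'DISK_GB':   {'total': 400,    'reserved': 0,   'allocation_ratio': 1.0, 'max_unit': 94},
    }

    for rc, inv in inventory.items():
        capacity = (inv['total'] - inv['reserved']) * inv['allocation_ratio']
        print(f"{rc}: schedulable={capacity:.0f}, per-instance max={inv['max_unit']}")

    # VCPU: schedulable=192, per-instance max=16
    # MEMORY_MB: schedulable=196078, per-instance max=65530
    # DISK_GB: schedulable=400, per-instance max=94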
{{(pid=67899) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 601.161957] env[67899]: DEBUG oslo_concurrency.lockutils [None req-4103a3f2-1648-4fa7-ad2b-b1440fc415e3 tempest-InstanceActionsTestJSON-1459258803 tempest-InstanceActionsTestJSON-1459258803-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.076s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 601.163734] env[67899]: INFO nova.compute.claims [None req-4103a3f2-1648-4fa7-ad2b-b1440fc415e3 tempest-InstanceActionsTestJSON-1459258803 tempest-InstanceActionsTestJSON-1459258803-project-member] [instance: 6b16f08c-a470-4a9b-8096-05cec2e960cf] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 601.239112] env[67899]: DEBUG nova.compute.utils [None req-a1baea38-4329-40f0-8c18-ae18eafc1b0b tempest-ServersAdminNegativeTestJSON-448291398 tempest-ServersAdminNegativeTestJSON-448291398-project-member] Using /dev/sd instead of None {{(pid=67899) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 601.242064] env[67899]: DEBUG nova.compute.manager [None req-a1baea38-4329-40f0-8c18-ae18eafc1b0b tempest-ServersAdminNegativeTestJSON-448291398 tempest-ServersAdminNegativeTestJSON-448291398-project-member] [instance: 91d5024f-9eac-4a56-b08f-c0f6a7eda775] Allocating IP information in the background. {{(pid=67899) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 601.242341] env[67899]: DEBUG nova.network.neutron [None req-a1baea38-4329-40f0-8c18-ae18eafc1b0b tempest-ServersAdminNegativeTestJSON-448291398 tempest-ServersAdminNegativeTestJSON-448291398-project-member] [instance: 91d5024f-9eac-4a56-b08f-c0f6a7eda775] allocate_for_instance() {{(pid=67899) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 601.254852] env[67899]: DEBUG nova.compute.manager [None req-a1baea38-4329-40f0-8c18-ae18eafc1b0b tempest-ServersAdminNegativeTestJSON-448291398 tempest-ServersAdminNegativeTestJSON-448291398-project-member] [instance: 91d5024f-9eac-4a56-b08f-c0f6a7eda775] Start building block device mappings for instance. {{(pid=67899) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 601.331581] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4200c419-12f8-4f82-b207-03dc3df1113a {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 601.347243] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4e681138-381a-442d-8410-6a232f64b47e {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 601.351075] env[67899]: DEBUG nova.compute.manager [None req-a1baea38-4329-40f0-8c18-ae18eafc1b0b tempest-ServersAdminNegativeTestJSON-448291398 tempest-ServersAdminNegativeTestJSON-448291398-project-member] [instance: 91d5024f-9eac-4a56-b08f-c0f6a7eda775] Start spawning the instance on the hypervisor. 
{{(pid=67899) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 601.383693] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d984d49f-a5be-4b16-80b7-3787bde00c22 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 601.394261] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-78970ed1-b9f9-4e19-b380-ad00def2e17e {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 601.413693] env[67899]: DEBUG nova.compute.provider_tree [None req-4103a3f2-1648-4fa7-ad2b-b1440fc415e3 tempest-InstanceActionsTestJSON-1459258803 tempest-InstanceActionsTestJSON-1459258803-project-member] Inventory has not changed in ProviderTree for provider: fffa0b42-f65d-4394-a98c-0df038b9ed4b {{(pid=67899) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 601.425450] env[67899]: DEBUG nova.scheduler.client.report [None req-4103a3f2-1648-4fa7-ad2b-b1440fc415e3 tempest-InstanceActionsTestJSON-1459258803 tempest-InstanceActionsTestJSON-1459258803-project-member] Inventory has not changed for provider fffa0b42-f65d-4394-a98c-0df038b9ed4b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67899) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 601.456519] env[67899]: DEBUG oslo_concurrency.lockutils [None req-4103a3f2-1648-4fa7-ad2b-b1440fc415e3 tempest-InstanceActionsTestJSON-1459258803 tempest-InstanceActionsTestJSON-1459258803-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.295s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 601.457133] env[67899]: DEBUG nova.compute.manager [None req-4103a3f2-1648-4fa7-ad2b-b1440fc415e3 tempest-InstanceActionsTestJSON-1459258803 tempest-InstanceActionsTestJSON-1459258803-project-member] [instance: 6b16f08c-a470-4a9b-8096-05cec2e960cf] Start building networks asynchronously for instance. {{(pid=67899) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 601.512494] env[67899]: DEBUG nova.compute.utils [None req-4103a3f2-1648-4fa7-ad2b-b1440fc415e3 tempest-InstanceActionsTestJSON-1459258803 tempest-InstanceActionsTestJSON-1459258803-project-member] Using /dev/sd instead of None {{(pid=67899) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 601.513426] env[67899]: DEBUG nova.compute.manager [None req-4103a3f2-1648-4fa7-ad2b-b1440fc415e3 tempest-InstanceActionsTestJSON-1459258803 tempest-InstanceActionsTestJSON-1459258803-project-member] [instance: 6b16f08c-a470-4a9b-8096-05cec2e960cf] Allocating IP information in the background. 
{{(pid=67899) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 601.514174] env[67899]: DEBUG nova.network.neutron [None req-4103a3f2-1648-4fa7-ad2b-b1440fc415e3 tempest-InstanceActionsTestJSON-1459258803 tempest-InstanceActionsTestJSON-1459258803-project-member] [instance: 6b16f08c-a470-4a9b-8096-05cec2e960cf] allocate_for_instance() {{(pid=67899) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 601.531909] env[67899]: DEBUG nova.compute.manager [None req-4103a3f2-1648-4fa7-ad2b-b1440fc415e3 tempest-InstanceActionsTestJSON-1459258803 tempest-InstanceActionsTestJSON-1459258803-project-member] [instance: 6b16f08c-a470-4a9b-8096-05cec2e960cf] Start building block device mappings for instance. {{(pid=67899) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 601.607753] env[67899]: DEBUG nova.compute.manager [None req-4103a3f2-1648-4fa7-ad2b-b1440fc415e3 tempest-InstanceActionsTestJSON-1459258803 tempest-InstanceActionsTestJSON-1459258803-project-member] [instance: 6b16f08c-a470-4a9b-8096-05cec2e960cf] Start spawning the instance on the hypervisor. {{(pid=67899) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 601.679049] env[67899]: DEBUG oslo_concurrency.lockutils [None req-93795dd8-69ab-4791-b400-f6f8ae0cd063 tempest-ServerExternalEventsTest-552313903 tempest-ServerExternalEventsTest-552313903-project-member] Acquiring lock "267a1016-410e-4097-9523-6fcafc5f4eb0" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 601.679386] env[67899]: DEBUG oslo_concurrency.lockutils [None req-93795dd8-69ab-4791-b400-f6f8ae0cd063 tempest-ServerExternalEventsTest-552313903 tempest-ServerExternalEventsTest-552313903-project-member] Lock "267a1016-410e-4097-9523-6fcafc5f4eb0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 601.693403] env[67899]: DEBUG nova.compute.manager [None req-93795dd8-69ab-4791-b400-f6f8ae0cd063 tempest-ServerExternalEventsTest-552313903 tempest-ServerExternalEventsTest-552313903-project-member] [instance: 267a1016-410e-4097-9523-6fcafc5f4eb0] Starting instance... 
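"Allocating IP information in the background" with allocate_for_instance() means Nova is creating and binding Neutron ports asynchronously while block devices and the VM spec are prepared. Conceptually it reduces to a port-create call like the following python-neutronclient sketch; the endpoint, token and network id are placeholders, and a real deployment authenticates through keystoneauth:

    from neutronclient.v2_0 import client as neutron_client

    # Placeholder credentials for illustration only.
    neutron = neutron_client.Client(
        endpoint_url='http://127.0.0.1:9696', token='TOKEN')

    # One port per requested NIC; Nova then waits for the port to become
    # ACTIVE and folds the result into the instance's network_info cache.
    port = neutron.create_port({'port': {
        'network_id': 'NETWORK-UUID',  # placeholder
        'device_id': '91d5024f-9eac-4a56-b08f-c0f6a7eda775',
        'device_owner': 'compute:nova',
    }})
    print(port['port']['id'])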
{{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 601.754145] env[67899]: DEBUG oslo_concurrency.lockutils [None req-93795dd8-69ab-4791-b400-f6f8ae0cd063 tempest-ServerExternalEventsTest-552313903 tempest-ServerExternalEventsTest-552313903-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 601.754145] env[67899]: DEBUG oslo_concurrency.lockutils [None req-93795dd8-69ab-4791-b400-f6f8ae0cd063 tempest-ServerExternalEventsTest-552313903 tempest-ServerExternalEventsTest-552313903-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 601.754883] env[67899]: INFO nova.compute.claims [None req-93795dd8-69ab-4791-b400-f6f8ae0cd063 tempest-ServerExternalEventsTest-552313903 tempest-ServerExternalEventsTest-552313903-project-member] [instance: 267a1016-410e-4097-9523-6fcafc5f4eb0] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 601.888704] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c076bd51-7813-4432-b102-43f3e2e2290b {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 601.898503] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a74ecc39-01be-4994-9501-236348062f0a {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 601.934297] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1226e783-c75b-4ad5-81eb-74e777df624e {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 601.939841] env[67899]: DEBUG oslo_concurrency.lockutils [None req-f50326ac-209f-4dee-b08b-0e1e2789d90c tempest-ServerDiagnosticsNegativeTest-286286993 tempest-ServerDiagnosticsNegativeTest-286286993-project-member] Acquiring lock "7c092e03-5f1d-4e4a-98ca-8e6abddb3cdb" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 601.939841] env[67899]: DEBUG oslo_concurrency.lockutils [None req-f50326ac-209f-4dee-b08b-0e1e2789d90c tempest-ServerDiagnosticsNegativeTest-286286993 tempest-ServerDiagnosticsNegativeTest-286286993-project-member] Lock "7c092e03-5f1d-4e4a-98ca-8e6abddb3cdb" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 601.946415] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a85bda36-1544-4897-878c-7c80a82cfe35 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 601.963046] env[67899]: DEBUG nova.compute.provider_tree [None req-93795dd8-69ab-4791-b400-f6f8ae0cd063 tempest-ServerExternalEventsTest-552313903 tempest-ServerExternalEventsTest-552313903-project-member] Inventory has not changed in 
ProviderTree for provider: fffa0b42-f65d-4394-a98c-0df038b9ed4b {{(pid=67899) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 601.964930] env[67899]: DEBUG nova.compute.manager [None req-f50326ac-209f-4dee-b08b-0e1e2789d90c tempest-ServerDiagnosticsNegativeTest-286286993 tempest-ServerDiagnosticsNegativeTest-286286993-project-member] [instance: 7c092e03-5f1d-4e4a-98ca-8e6abddb3cdb] Starting instance... {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 601.979610] env[67899]: DEBUG nova.scheduler.client.report [None req-93795dd8-69ab-4791-b400-f6f8ae0cd063 tempest-ServerExternalEventsTest-552313903 tempest-ServerExternalEventsTest-552313903-project-member] Inventory has not changed for provider fffa0b42-f65d-4394-a98c-0df038b9ed4b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67899) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 601.995181] env[67899]: DEBUG nova.virt.hardware [None req-a1baea38-4329-40f0-8c18-ae18eafc1b0b tempest-ServersAdminNegativeTestJSON-448291398 tempest-ServersAdminNegativeTestJSON-448291398-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T22:07:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T22:07:13Z,direct_url=,disk_format='vmdk',id=c655a05a-4a40-4b3f-b609-3ba8116ad90f,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c1aaa2970e964d7b86557399120d12c1',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T22:07:14Z,virtual_size=,visibility=), allow threads: False {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 601.995181] env[67899]: DEBUG nova.virt.hardware [None req-a1baea38-4329-40f0-8c18-ae18eafc1b0b tempest-ServersAdminNegativeTestJSON-448291398 tempest-ServersAdminNegativeTestJSON-448291398-project-member] Flavor limits 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 601.995181] env[67899]: DEBUG nova.virt.hardware [None req-a1baea38-4329-40f0-8c18-ae18eafc1b0b tempest-ServersAdminNegativeTestJSON-448291398 tempest-ServersAdminNegativeTestJSON-448291398-project-member] Image limits 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 601.995339] env[67899]: DEBUG nova.virt.hardware [None req-a1baea38-4329-40f0-8c18-ae18eafc1b0b tempest-ServersAdminNegativeTestJSON-448291398 tempest-ServersAdminNegativeTestJSON-448291398-project-member] Flavor pref 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 601.995339] env[67899]: DEBUG nova.virt.hardware [None req-a1baea38-4329-40f0-8c18-ae18eafc1b0b tempest-ServersAdminNegativeTestJSON-448291398 tempest-ServersAdminNegativeTestJSON-448291398-project-member] Image pref 0:0:0 {{(pid=67899) 
get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 601.995339] env[67899]: DEBUG nova.virt.hardware [None req-a1baea38-4329-40f0-8c18-ae18eafc1b0b tempest-ServersAdminNegativeTestJSON-448291398 tempest-ServersAdminNegativeTestJSON-448291398-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 601.995457] env[67899]: DEBUG nova.virt.hardware [None req-a1baea38-4329-40f0-8c18-ae18eafc1b0b tempest-ServersAdminNegativeTestJSON-448291398 tempest-ServersAdminNegativeTestJSON-448291398-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 601.995754] env[67899]: DEBUG nova.virt.hardware [None req-a1baea38-4329-40f0-8c18-ae18eafc1b0b tempest-ServersAdminNegativeTestJSON-448291398 tempest-ServersAdminNegativeTestJSON-448291398-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67899) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 601.996864] env[67899]: DEBUG nova.virt.hardware [None req-a1baea38-4329-40f0-8c18-ae18eafc1b0b tempest-ServersAdminNegativeTestJSON-448291398 tempest-ServersAdminNegativeTestJSON-448291398-project-member] Got 1 possible topologies {{(pid=67899) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 601.996864] env[67899]: DEBUG nova.virt.hardware [None req-a1baea38-4329-40f0-8c18-ae18eafc1b0b tempest-ServersAdminNegativeTestJSON-448291398 tempest-ServersAdminNegativeTestJSON-448291398-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 601.996864] env[67899]: DEBUG nova.virt.hardware [None req-a1baea38-4329-40f0-8c18-ae18eafc1b0b tempest-ServersAdminNegativeTestJSON-448291398 tempest-ServersAdminNegativeTestJSON-448291398-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 601.999473] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8c2664d4-d057-4741-bf1d-e14f46eca5b8 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 602.010118] env[67899]: DEBUG oslo_concurrency.lockutils [None req-93795dd8-69ab-4791-b400-f6f8ae0cd063 tempest-ServerExternalEventsTest-552313903 tempest-ServerExternalEventsTest-552313903-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.256s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 602.010118] env[67899]: DEBUG nova.compute.manager [None req-93795dd8-69ab-4791-b400-f6f8ae0cd063 tempest-ServerExternalEventsTest-552313903 tempest-ServerExternalEventsTest-552313903-project-member] [instance: 267a1016-410e-4097-9523-6fcafc5f4eb0] Start building networks asynchronously for instance. 
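The nova.virt.hardware records above walk _get_desirable_cpu_topologies: "0:0:0" means no flavor or image limits were set, the limits default to 65536 each, and for the 1-vCPU m1.nano flavor the only (sockets, cores, threads) factorization is 1:1:1. A simplified sketch of that enumeration (not the full nova.virt.hardware logic):

    import itertools

    def possible_topologies(vcpus, max_sockets=65536, max_cores=65536, max_threads=65536):
        # Enumerate factorizations sockets * cores * threads == vcpus within
        # the limits, mirroring "Build topologies for 1 vcpu(s) 1:1:1" above.
        for s, c, t in itertools.product(range(1, vcpus + 1), repeat=3):
            if s * c * t == vcpus and s <= max_sockets and c <= max_cores and t <= max_threads:
                yield (s, c, t)

    print(list(possible_topologies(1)))   # [(1, 1, 1)]
    print(list(possible_topologies(4)))   # includes (1, 1, 4), (1, 2, 2), (2, 2, 1), ...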
{{(pid=67899) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 602.014074] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7511e2ec-2848-4fc9-8fb6-82e96c09f9f7 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 602.020456] env[67899]: DEBUG nova.virt.hardware [None req-4103a3f2-1648-4fa7-ad2b-b1440fc415e3 tempest-InstanceActionsTestJSON-1459258803 tempest-InstanceActionsTestJSON-1459258803-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T22:07:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T22:07:13Z,direct_url=,disk_format='vmdk',id=c655a05a-4a40-4b3f-b609-3ba8116ad90f,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c1aaa2970e964d7b86557399120d12c1',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T22:07:14Z,virtual_size=,visibility=), allow threads: False {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 602.020696] env[67899]: DEBUG nova.virt.hardware [None req-4103a3f2-1648-4fa7-ad2b-b1440fc415e3 tempest-InstanceActionsTestJSON-1459258803 tempest-InstanceActionsTestJSON-1459258803-project-member] Flavor limits 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 602.020845] env[67899]: DEBUG nova.virt.hardware [None req-4103a3f2-1648-4fa7-ad2b-b1440fc415e3 tempest-InstanceActionsTestJSON-1459258803 tempest-InstanceActionsTestJSON-1459258803-project-member] Image limits 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 602.021038] env[67899]: DEBUG nova.virt.hardware [None req-4103a3f2-1648-4fa7-ad2b-b1440fc415e3 tempest-InstanceActionsTestJSON-1459258803 tempest-InstanceActionsTestJSON-1459258803-project-member] Flavor pref 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 602.021194] env[67899]: DEBUG nova.virt.hardware [None req-4103a3f2-1648-4fa7-ad2b-b1440fc415e3 tempest-InstanceActionsTestJSON-1459258803 tempest-InstanceActionsTestJSON-1459258803-project-member] Image pref 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 602.021339] env[67899]: DEBUG nova.virt.hardware [None req-4103a3f2-1648-4fa7-ad2b-b1440fc415e3 tempest-InstanceActionsTestJSON-1459258803 tempest-InstanceActionsTestJSON-1459258803-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 602.021589] env[67899]: DEBUG nova.virt.hardware [None req-4103a3f2-1648-4fa7-ad2b-b1440fc415e3 tempest-InstanceActionsTestJSON-1459258803 tempest-InstanceActionsTestJSON-1459258803-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 602.021760] env[67899]: DEBUG nova.virt.hardware [None req-4103a3f2-1648-4fa7-ad2b-b1440fc415e3 
tempest-InstanceActionsTestJSON-1459258803 tempest-InstanceActionsTestJSON-1459258803-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67899) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 602.023857] env[67899]: DEBUG nova.virt.hardware [None req-4103a3f2-1648-4fa7-ad2b-b1440fc415e3 tempest-InstanceActionsTestJSON-1459258803 tempest-InstanceActionsTestJSON-1459258803-project-member] Got 1 possible topologies {{(pid=67899) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 602.023857] env[67899]: DEBUG nova.virt.hardware [None req-4103a3f2-1648-4fa7-ad2b-b1440fc415e3 tempest-InstanceActionsTestJSON-1459258803 tempest-InstanceActionsTestJSON-1459258803-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 602.023857] env[67899]: DEBUG nova.virt.hardware [None req-4103a3f2-1648-4fa7-ad2b-b1440fc415e3 tempest-InstanceActionsTestJSON-1459258803 tempest-InstanceActionsTestJSON-1459258803-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 602.023857] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0ad6a628-5b48-40f2-b46e-03cec0c61412 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 602.044023] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c78c3f2f-3d92-4ca6-8d91-9f5049b6c671 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 602.061050] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2a340173-1839-4b2d-ac0f-823333cad440 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 602.086423] env[67899]: DEBUG nova.compute.utils [None req-93795dd8-69ab-4791-b400-f6f8ae0cd063 tempest-ServerExternalEventsTest-552313903 tempest-ServerExternalEventsTest-552313903-project-member] Using /dev/sd instead of None {{(pid=67899) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 602.087892] env[67899]: DEBUG nova.compute.manager [None req-93795dd8-69ab-4791-b400-f6f8ae0cd063 tempest-ServerExternalEventsTest-552313903 tempest-ServerExternalEventsTest-552313903-project-member] [instance: 267a1016-410e-4097-9523-6fcafc5f4eb0] Allocating IP information in the background. 
{{(pid=67899) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 602.087892] env[67899]: DEBUG nova.network.neutron [None req-93795dd8-69ab-4791-b400-f6f8ae0cd063 tempest-ServerExternalEventsTest-552313903 tempest-ServerExternalEventsTest-552313903-project-member] [instance: 267a1016-410e-4097-9523-6fcafc5f4eb0] allocate_for_instance() {{(pid=67899) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 602.093355] env[67899]: DEBUG oslo_concurrency.lockutils [None req-f50326ac-209f-4dee-b08b-0e1e2789d90c tempest-ServerDiagnosticsNegativeTest-286286993 tempest-ServerDiagnosticsNegativeTest-286286993-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 602.093355] env[67899]: DEBUG oslo_concurrency.lockutils [None req-f50326ac-209f-4dee-b08b-0e1e2789d90c tempest-ServerDiagnosticsNegativeTest-286286993 tempest-ServerDiagnosticsNegativeTest-286286993-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 602.095252] env[67899]: INFO nova.compute.claims [None req-f50326ac-209f-4dee-b08b-0e1e2789d90c tempest-ServerDiagnosticsNegativeTest-286286993 tempest-ServerDiagnosticsNegativeTest-286286993-project-member] [instance: 7c092e03-5f1d-4e4a-98ca-8e6abddb3cdb] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 602.114636] env[67899]: DEBUG nova.compute.manager [None req-93795dd8-69ab-4791-b400-f6f8ae0cd063 tempest-ServerExternalEventsTest-552313903 tempest-ServerExternalEventsTest-552313903-project-member] [instance: 267a1016-410e-4097-9523-6fcafc5f4eb0] Start building block device mappings for instance. {{(pid=67899) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 602.202677] env[67899]: DEBUG nova.compute.manager [None req-93795dd8-69ab-4791-b400-f6f8ae0cd063 tempest-ServerExternalEventsTest-552313903 tempest-ServerExternalEventsTest-552313903-project-member] [instance: 267a1016-410e-4097-9523-6fcafc5f4eb0] Start spawning the instance on the hypervisor. 
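"Start spawning the instance on the hypervisor" hands off to the virt driver's spawn() method; here that is VMwareVCDriver, which builds the VM through the vSphere API (the PropertyCollector.RetrievePropertiesEx calls above are it reading cluster and datastore state). The driver contract looks roughly like this stub, which is not the VMware implementation:

    class StubDriver:
        def spawn(self, context, instance, image_meta, injected_files,
                  admin_password, allocations, network_info=None,
                  block_device_info=None):
            # A real driver creates the VM from image_meta, attaches the
            # NICs described by network_info, then powers the VM on.
            print(f"spawning {instance['uuid']} from image {image_meta['id']}")

    StubDriver().spawn(
        context=None,
        instance={'uuid': '6b16f08c-a470-4a9b-8096-05cec2e960cf'},
        image_meta={'id': 'c655a05a-4a40-4b3f-b609-3ba8116ad90f'},
        injected_files=[], admin_password=None, allocations={},
    )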
{{(pid=67899) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 602.246463] env[67899]: DEBUG nova.virt.hardware [None req-93795dd8-69ab-4791-b400-f6f8ae0cd063 tempest-ServerExternalEventsTest-552313903 tempest-ServerExternalEventsTest-552313903-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T22:07:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T22:07:13Z,direct_url=,disk_format='vmdk',id=c655a05a-4a40-4b3f-b609-3ba8116ad90f,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c1aaa2970e964d7b86557399120d12c1',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T22:07:14Z,virtual_size=,visibility=), allow threads: False {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 602.246715] env[67899]: DEBUG nova.virt.hardware [None req-93795dd8-69ab-4791-b400-f6f8ae0cd063 tempest-ServerExternalEventsTest-552313903 tempest-ServerExternalEventsTest-552313903-project-member] Flavor limits 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 602.262317] env[67899]: DEBUG nova.virt.hardware [None req-93795dd8-69ab-4791-b400-f6f8ae0cd063 tempest-ServerExternalEventsTest-552313903 tempest-ServerExternalEventsTest-552313903-project-member] Image limits 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 602.262317] env[67899]: DEBUG nova.virt.hardware [None req-93795dd8-69ab-4791-b400-f6f8ae0cd063 tempest-ServerExternalEventsTest-552313903 tempest-ServerExternalEventsTest-552313903-project-member] Flavor pref 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 602.262317] env[67899]: DEBUG nova.virt.hardware [None req-93795dd8-69ab-4791-b400-f6f8ae0cd063 tempest-ServerExternalEventsTest-552313903 tempest-ServerExternalEventsTest-552313903-project-member] Image pref 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 602.262317] env[67899]: DEBUG nova.virt.hardware [None req-93795dd8-69ab-4791-b400-f6f8ae0cd063 tempest-ServerExternalEventsTest-552313903 tempest-ServerExternalEventsTest-552313903-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 602.262317] env[67899]: DEBUG nova.virt.hardware [None req-93795dd8-69ab-4791-b400-f6f8ae0cd063 tempest-ServerExternalEventsTest-552313903 tempest-ServerExternalEventsTest-552313903-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 602.262819] env[67899]: DEBUG nova.virt.hardware [None req-93795dd8-69ab-4791-b400-f6f8ae0cd063 tempest-ServerExternalEventsTest-552313903 tempest-ServerExternalEventsTest-552313903-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67899) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 602.262819] env[67899]: DEBUG 
nova.virt.hardware [None req-93795dd8-69ab-4791-b400-f6f8ae0cd063 tempest-ServerExternalEventsTest-552313903 tempest-ServerExternalEventsTest-552313903-project-member] Got 1 possible topologies {{(pid=67899) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 602.262819] env[67899]: DEBUG nova.virt.hardware [None req-93795dd8-69ab-4791-b400-f6f8ae0cd063 tempest-ServerExternalEventsTest-552313903 tempest-ServerExternalEventsTest-552313903-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 602.262819] env[67899]: DEBUG nova.virt.hardware [None req-93795dd8-69ab-4791-b400-f6f8ae0cd063 tempest-ServerExternalEventsTest-552313903 tempest-ServerExternalEventsTest-552313903-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 602.262819] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-edaa8de7-8f73-4afb-a749-0c4d6e9bcd11 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 602.262983] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-741ba5f6-19de-456e-9b1f-b69334470a8e {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 602.312915] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1acd5284-cfb7-4ff7-b311-0ac83e8610e4 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 602.324339] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ecc85f5c-3bba-44b0-b80d-371589f5c7b8 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 602.368292] env[67899]: DEBUG nova.policy [None req-4103a3f2-1648-4fa7-ad2b-b1440fc415e3 tempest-InstanceActionsTestJSON-1459258803 tempest-InstanceActionsTestJSON-1459258803-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '74f219f6fb6e414cb8121d16b728d31c', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'a580c8dda76f4a63a2b5760345811296', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67899) authorize /opt/stack/nova/nova/policy.py:203}} [ 602.370540] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0d5f5471-eb78-40b8-a172-8aeaf88e27c2 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 602.378847] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3d1ca1e3-7b7f-47d1-b17c-04760eda2d20 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 602.391987] env[67899]: DEBUG nova.compute.provider_tree [None req-f50326ac-209f-4dee-b08b-0e1e2789d90c tempest-ServerDiagnosticsNegativeTest-286286993 
tempest-ServerDiagnosticsNegativeTest-286286993-project-member] Inventory has not changed in ProviderTree for provider: fffa0b42-f65d-4394-a98c-0df038b9ed4b {{(pid=67899) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 602.405326] env[67899]: DEBUG nova.scheduler.client.report [None req-f50326ac-209f-4dee-b08b-0e1e2789d90c tempest-ServerDiagnosticsNegativeTest-286286993 tempest-ServerDiagnosticsNegativeTest-286286993-project-member] Inventory has not changed for provider fffa0b42-f65d-4394-a98c-0df038b9ed4b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67899) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 602.426089] env[67899]: DEBUG oslo_concurrency.lockutils [None req-f50326ac-209f-4dee-b08b-0e1e2789d90c tempest-ServerDiagnosticsNegativeTest-286286993 tempest-ServerDiagnosticsNegativeTest-286286993-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.333s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 602.426702] env[67899]: DEBUG nova.compute.manager [None req-f50326ac-209f-4dee-b08b-0e1e2789d90c tempest-ServerDiagnosticsNegativeTest-286286993 tempest-ServerDiagnosticsNegativeTest-286286993-project-member] [instance: 7c092e03-5f1d-4e4a-98ca-8e6abddb3cdb] Start building networks asynchronously for instance. {{(pid=67899) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 602.465701] env[67899]: DEBUG nova.policy [None req-93795dd8-69ab-4791-b400-f6f8ae0cd063 tempest-ServerExternalEventsTest-552313903 tempest-ServerExternalEventsTest-552313903-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'de7ced0dc8dc49efa1b4387bd1e0c75b', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '18d38113058c46019696931adc72474f', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67899) authorize /opt/stack/nova/nova/policy.py:203}} [ 602.479226] env[67899]: DEBUG nova.compute.utils [None req-f50326ac-209f-4dee-b08b-0e1e2789d90c tempest-ServerDiagnosticsNegativeTest-286286993 tempest-ServerDiagnosticsNegativeTest-286286993-project-member] Using /dev/sd instead of None {{(pid=67899) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 602.481214] env[67899]: DEBUG nova.compute.manager [None req-f50326ac-209f-4dee-b08b-0e1e2789d90c tempest-ServerDiagnosticsNegativeTest-286286993 tempest-ServerDiagnosticsNegativeTest-286286993-project-member] [instance: 7c092e03-5f1d-4e4a-98ca-8e6abddb3cdb] Allocating IP information in the background. 
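
The "Inventory has not changed" lines above show the scheduler report client diffing its locally generated inventory against the last copy it sent to placement and skipping the HTTP update on a match. A sketch of that short-circuit; placement_put() is a hypothetical stand-in for the real placement call:

    def placement_put(path, body):
        """Hypothetical placement API call; stubbed for the sketch."""
        print("PUT %s" % path)

    def set_inventory_for_provider(cache, provider_uuid, inventory):
        # cache: provider UUID -> inventory dict as last sent, e.g.
        # {'VCPU': {'total': 48, 'allocation_ratio': 4.0, ...}, ...}
        if cache.get(provider_uuid) == inventory:
            # No placement round trip needed -- the case logged above.
            return False
        placement_put("/resource_providers/%s/inventories" % provider_uuid,
                      {"inventories": inventory})
        cache[provider_uuid] = inventory
        return True
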
{{(pid=67899) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 602.481214] env[67899]: DEBUG nova.network.neutron [None req-f50326ac-209f-4dee-b08b-0e1e2789d90c tempest-ServerDiagnosticsNegativeTest-286286993 tempest-ServerDiagnosticsNegativeTest-286286993-project-member] [instance: 7c092e03-5f1d-4e4a-98ca-8e6abddb3cdb] allocate_for_instance() {{(pid=67899) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 602.504160] env[67899]: DEBUG nova.compute.manager [None req-f50326ac-209f-4dee-b08b-0e1e2789d90c tempest-ServerDiagnosticsNegativeTest-286286993 tempest-ServerDiagnosticsNegativeTest-286286993-project-member] [instance: 7c092e03-5f1d-4e4a-98ca-8e6abddb3cdb] Start building block device mappings for instance. {{(pid=67899) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 602.510333] env[67899]: DEBUG nova.policy [None req-a1baea38-4329-40f0-8c18-ae18eafc1b0b tempest-ServersAdminNegativeTestJSON-448291398 tempest-ServersAdminNegativeTestJSON-448291398-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '68ba26b8886b4f8283865ef8d1a3c689', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '61c3040dc8fd4d5d924946d6997e4c34', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67899) authorize /opt/stack/nova/nova/policy.py:203}} [ 602.605686] env[67899]: DEBUG nova.compute.manager [None req-f50326ac-209f-4dee-b08b-0e1e2789d90c tempest-ServerDiagnosticsNegativeTest-286286993 tempest-ServerDiagnosticsNegativeTest-286286993-project-member] [instance: 7c092e03-5f1d-4e4a-98ca-8e6abddb3cdb] Start spawning the instance on the hypervisor. 
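
"Start building networks asynchronously" followed by "Allocating IP information in the background" means port allocation overlaps the block device mapping work, and the spawn only blocks on the network result when it actually needs it. Nova drives this with eventlet greenthreads; a rough stdlib approximation, with both callables left as placeholders:

    from concurrent.futures import ThreadPoolExecutor

    def build_resources(allocate_for_instance, build_block_device_mappings):
        with ThreadPoolExecutor(max_workers=1) as pool:
            nwinfo = pool.submit(allocate_for_instance)   # background Neutron work
            bdms = build_block_device_mappings()          # overlapped foreground work
            return nwinfo.result(), bdms                  # join before spawning
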
{{(pid=67899) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 602.644168] env[67899]: DEBUG nova.virt.hardware [None req-f50326ac-209f-4dee-b08b-0e1e2789d90c tempest-ServerDiagnosticsNegativeTest-286286993 tempest-ServerDiagnosticsNegativeTest-286286993-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T22:07:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T22:07:13Z,direct_url=,disk_format='vmdk',id=c655a05a-4a40-4b3f-b609-3ba8116ad90f,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c1aaa2970e964d7b86557399120d12c1',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T22:07:14Z,virtual_size=,visibility=), allow threads: False {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 602.647597] env[67899]: DEBUG nova.virt.hardware [None req-f50326ac-209f-4dee-b08b-0e1e2789d90c tempest-ServerDiagnosticsNegativeTest-286286993 tempest-ServerDiagnosticsNegativeTest-286286993-project-member] Flavor limits 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 602.648210] env[67899]: DEBUG nova.virt.hardware [None req-f50326ac-209f-4dee-b08b-0e1e2789d90c tempest-ServerDiagnosticsNegativeTest-286286993 tempest-ServerDiagnosticsNegativeTest-286286993-project-member] Image limits 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 602.648623] env[67899]: DEBUG nova.virt.hardware [None req-f50326ac-209f-4dee-b08b-0e1e2789d90c tempest-ServerDiagnosticsNegativeTest-286286993 tempest-ServerDiagnosticsNegativeTest-286286993-project-member] Flavor pref 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 602.650255] env[67899]: DEBUG nova.virt.hardware [None req-f50326ac-209f-4dee-b08b-0e1e2789d90c tempest-ServerDiagnosticsNegativeTest-286286993 tempest-ServerDiagnosticsNegativeTest-286286993-project-member] Image pref 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 602.650255] env[67899]: DEBUG nova.virt.hardware [None req-f50326ac-209f-4dee-b08b-0e1e2789d90c tempest-ServerDiagnosticsNegativeTest-286286993 tempest-ServerDiagnosticsNegativeTest-286286993-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 602.650255] env[67899]: DEBUG nova.virt.hardware [None req-f50326ac-209f-4dee-b08b-0e1e2789d90c tempest-ServerDiagnosticsNegativeTest-286286993 tempest-ServerDiagnosticsNegativeTest-286286993-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 602.650255] env[67899]: DEBUG nova.virt.hardware [None req-f50326ac-209f-4dee-b08b-0e1e2789d90c tempest-ServerDiagnosticsNegativeTest-286286993 tempest-ServerDiagnosticsNegativeTest-286286993-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67899) _get_possible_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:471}} [ 602.650255] env[67899]: DEBUG nova.virt.hardware [None req-f50326ac-209f-4dee-b08b-0e1e2789d90c tempest-ServerDiagnosticsNegativeTest-286286993 tempest-ServerDiagnosticsNegativeTest-286286993-project-member] Got 1 possible topologies {{(pid=67899) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 602.650613] env[67899]: DEBUG nova.virt.hardware [None req-f50326ac-209f-4dee-b08b-0e1e2789d90c tempest-ServerDiagnosticsNegativeTest-286286993 tempest-ServerDiagnosticsNegativeTest-286286993-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 602.650613] env[67899]: DEBUG nova.virt.hardware [None req-f50326ac-209f-4dee-b08b-0e1e2789d90c tempest-ServerDiagnosticsNegativeTest-286286993 tempest-ServerDiagnosticsNegativeTest-286286993-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 602.652028] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c0a9618b-103e-40fc-bfda-71c2832a4485 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 602.668993] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c4456e6a-1362-49bd-ba34-13c7f49da355 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 602.781166] env[67899]: DEBUG nova.policy [None req-f50326ac-209f-4dee-b08b-0e1e2789d90c tempest-ServerDiagnosticsNegativeTest-286286993 tempest-ServerDiagnosticsNegativeTest-286286993-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '6e8c08a28a9c43f392b391654bc48efd', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c0cffa0c599a4e99aedb4d49b1dda58d', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67899) authorize /opt/stack/nova/nova/policy.py:203}} [ 602.997049] env[67899]: DEBUG oslo_concurrency.lockutils [None req-4e823511-7455-43ab-874e-9acc64f4c5fe tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Acquiring lock "195a4a1e-3da7-4a69-a679-869346368195" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 602.997238] env[67899]: DEBUG oslo_concurrency.lockutils [None req-4e823511-7455-43ab-874e-9acc64f4c5fe tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Lock "195a4a1e-3da7-4a69-a679-869346368195" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 603.023129] env[67899]: DEBUG nova.compute.manager [None req-4e823511-7455-43ab-874e-9acc64f4c5fe tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 195a4a1e-3da7-4a69-a679-869346368195] Starting
instance... {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 603.117068] env[67899]: DEBUG oslo_concurrency.lockutils [None req-4e823511-7455-43ab-874e-9acc64f4c5fe tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 603.117068] env[67899]: DEBUG oslo_concurrency.lockutils [None req-4e823511-7455-43ab-874e-9acc64f4c5fe tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 603.119618] env[67899]: INFO nova.compute.claims [None req-4e823511-7455-43ab-874e-9acc64f4c5fe tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 195a4a1e-3da7-4a69-a679-869346368195] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 603.335497] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-484bffdf-6bfb-4d22-9510-b5deef5d6e7d {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 603.348020] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f9b76829-9e2a-4405-8e15-7f914d34af88 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 603.379902] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b3894638-9dc3-40bf-9f6b-10e3fa9f1a1d {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 603.388783] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6af3ca98-00e0-43ae-a929-0839cd243c86 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 603.404643] env[67899]: DEBUG nova.compute.provider_tree [None req-4e823511-7455-43ab-874e-9acc64f4c5fe tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Inventory has not changed in ProviderTree for provider: fffa0b42-f65d-4394-a98c-0df038b9ed4b {{(pid=67899) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 603.420230] env[67899]: DEBUG nova.scheduler.client.report [None req-4e823511-7455-43ab-874e-9acc64f4c5fe tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Inventory has not changed for provider fffa0b42-f65d-4394-a98c-0df038b9ed4b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67899) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 603.438464] env[67899]: DEBUG oslo_concurrency.lockutils [None req-4e823511-7455-43ab-874e-9acc64f4c5fe tempest-ImagesTestJSON-610999726 
tempest-ImagesTestJSON-610999726-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.321s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 603.438782] env[67899]: DEBUG nova.compute.manager [None req-4e823511-7455-43ab-874e-9acc64f4c5fe tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 195a4a1e-3da7-4a69-a679-869346368195] Start building networks asynchronously for instance. {{(pid=67899) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 603.511417] env[67899]: DEBUG nova.compute.utils [None req-4e823511-7455-43ab-874e-9acc64f4c5fe tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Using /dev/sd instead of None {{(pid=67899) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 603.512928] env[67899]: DEBUG nova.compute.manager [None req-4e823511-7455-43ab-874e-9acc64f4c5fe tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 195a4a1e-3da7-4a69-a679-869346368195] Allocating IP information in the background. {{(pid=67899) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 603.513124] env[67899]: DEBUG nova.network.neutron [None req-4e823511-7455-43ab-874e-9acc64f4c5fe tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 195a4a1e-3da7-4a69-a679-869346368195] allocate_for_instance() {{(pid=67899) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 603.533075] env[67899]: DEBUG nova.compute.manager [None req-4e823511-7455-43ab-874e-9acc64f4c5fe tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 195a4a1e-3da7-4a69-a679-869346368195] Start building block device mappings for instance. {{(pid=67899) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 603.631687] env[67899]: DEBUG nova.compute.manager [None req-4e823511-7455-43ab-874e-9acc64f4c5fe tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 195a4a1e-3da7-4a69-a679-869346368195] Start spawning the instance on the hypervisor. 
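
The "Using /dev/sd instead of None" lines come from device-name generation: when the boot request names no device, the code falls back to the /dev/sd prefix and takes the next free letter. A simplified sketch of that fallback (not Nova's full get_next_device_name):

    import string

    def next_device_name(used, requested_prefix=None):
        prefix = requested_prefix or "/dev/sd"   # the 'instead of None' case above
        for letter in string.ascii_lowercase:
            candidate = prefix + letter
            if candidate not in used:
                return candidate
        raise ValueError("no free device names under %s" % prefix)

    # e.g. next_device_name({"/dev/sda"}) -> "/dev/sdb"
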
{{(pid=67899) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 603.669026] env[67899]: DEBUG nova.virt.hardware [None req-4e823511-7455-43ab-874e-9acc64f4c5fe tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T22:07:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T22:07:13Z,direct_url=,disk_format='vmdk',id=c655a05a-4a40-4b3f-b609-3ba8116ad90f,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c1aaa2970e964d7b86557399120d12c1',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T22:07:14Z,virtual_size=,visibility=), allow threads: False {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 603.670138] env[67899]: DEBUG nova.virt.hardware [None req-4e823511-7455-43ab-874e-9acc64f4c5fe tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Flavor limits 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 603.670138] env[67899]: DEBUG nova.virt.hardware [None req-4e823511-7455-43ab-874e-9acc64f4c5fe tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Image limits 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 603.670138] env[67899]: DEBUG nova.virt.hardware [None req-4e823511-7455-43ab-874e-9acc64f4c5fe tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Flavor pref 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 603.670138] env[67899]: DEBUG nova.virt.hardware [None req-4e823511-7455-43ab-874e-9acc64f4c5fe tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Image pref 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 603.670138] env[67899]: DEBUG nova.virt.hardware [None req-4e823511-7455-43ab-874e-9acc64f4c5fe tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 603.670403] env[67899]: DEBUG nova.virt.hardware [None req-4e823511-7455-43ab-874e-9acc64f4c5fe tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 603.670403] env[67899]: DEBUG nova.virt.hardware [None req-4e823511-7455-43ab-874e-9acc64f4c5fe tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67899) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 603.670403] env[67899]: DEBUG nova.virt.hardware [None req-4e823511-7455-43ab-874e-9acc64f4c5fe tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Got 1 possible 
topologies {{(pid=67899) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 603.670489] env[67899]: DEBUG nova.virt.hardware [None req-4e823511-7455-43ab-874e-9acc64f4c5fe tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 603.670638] env[67899]: DEBUG nova.virt.hardware [None req-4e823511-7455-43ab-874e-9acc64f4c5fe tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 603.672609] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1d4678cf-9c6d-4046-8632-8c2c832ffcae {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 603.680993] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3b82bc82-42ae-4fde-ae08-a98992be47c7 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 603.784761] env[67899]: DEBUG nova.policy [None req-4e823511-7455-43ab-874e-9acc64f4c5fe tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '8f6e68af5f7147f9a8080d720a834a56', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'a6ddbe6f15c6436197b1b073170d78cf', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67899) authorize /opt/stack/nova/nova/policy.py:203}} [ 603.951878] env[67899]: DEBUG nova.network.neutron [None req-4103a3f2-1648-4fa7-ad2b-b1440fc415e3 tempest-InstanceActionsTestJSON-1459258803 tempest-InstanceActionsTestJSON-1459258803-project-member] [instance: 6b16f08c-a470-4a9b-8096-05cec2e960cf] Successfully created port: 7635b493-83e2-4b87-8a77-98a2c22b1996 {{(pid=67899) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 604.490704] env[67899]: DEBUG nova.network.neutron [None req-93795dd8-69ab-4791-b400-f6f8ae0cd063 tempest-ServerExternalEventsTest-552313903 tempest-ServerExternalEventsTest-552313903-project-member] [instance: 267a1016-410e-4097-9523-6fcafc5f4eb0] Successfully created port: 1a746cdf-ff24-4ef0-9ddb-d3b5d728f7dd {{(pid=67899) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 604.623473] env[67899]: DEBUG nova.network.neutron [None req-a1baea38-4329-40f0-8c18-ae18eafc1b0b tempest-ServersAdminNegativeTestJSON-448291398 tempest-ServersAdminNegativeTestJSON-448291398-project-member] [instance: 91d5024f-9eac-4a56-b08f-c0f6a7eda775] Successfully created port: 99943a77-7035-4bb2-9dfe-6103d3652711 {{(pid=67899) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 605.307483] env[67899]: DEBUG nova.network.neutron [None req-f50326ac-209f-4dee-b08b-0e1e2789d90c tempest-ServerDiagnosticsNegativeTest-286286993 tempest-ServerDiagnosticsNegativeTest-286286993-project-member] [instance: 7c092e03-5f1d-4e4a-98ca-8e6abddb3cdb] Successfully created port: 
5ff265f3-c9aa-4197-a03b-dfc6233cd032 {{(pid=67899) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 606.018469] env[67899]: DEBUG nova.network.neutron [None req-4e823511-7455-43ab-874e-9acc64f4c5fe tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 195a4a1e-3da7-4a69-a679-869346368195] Successfully created port: a2679f4f-910e-4df6-a70d-1f2d27ac9c89 {{(pid=67899) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 608.227370] env[67899]: DEBUG nova.network.neutron [None req-4103a3f2-1648-4fa7-ad2b-b1440fc415e3 tempest-InstanceActionsTestJSON-1459258803 tempest-InstanceActionsTestJSON-1459258803-project-member] [instance: 6b16f08c-a470-4a9b-8096-05cec2e960cf] Successfully updated port: 7635b493-83e2-4b87-8a77-98a2c22b1996 {{(pid=67899) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 608.250321] env[67899]: DEBUG oslo_concurrency.lockutils [None req-4103a3f2-1648-4fa7-ad2b-b1440fc415e3 tempest-InstanceActionsTestJSON-1459258803 tempest-InstanceActionsTestJSON-1459258803-project-member] Acquiring lock "refresh_cache-6b16f08c-a470-4a9b-8096-05cec2e960cf" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 608.250496] env[67899]: DEBUG oslo_concurrency.lockutils [None req-4103a3f2-1648-4fa7-ad2b-b1440fc415e3 tempest-InstanceActionsTestJSON-1459258803 tempest-InstanceActionsTestJSON-1459258803-project-member] Acquired lock "refresh_cache-6b16f08c-a470-4a9b-8096-05cec2e960cf" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 608.252800] env[67899]: DEBUG nova.network.neutron [None req-4103a3f2-1648-4fa7-ad2b-b1440fc415e3 tempest-InstanceActionsTestJSON-1459258803 tempest-InstanceActionsTestJSON-1459258803-project-member] [instance: 6b16f08c-a470-4a9b-8096-05cec2e960cf] Building network info cache for instance {{(pid=67899) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 608.439026] env[67899]: DEBUG nova.network.neutron [None req-4103a3f2-1648-4fa7-ad2b-b1440fc415e3 tempest-InstanceActionsTestJSON-1459258803 tempest-InstanceActionsTestJSON-1459258803-project-member] [instance: 6b16f08c-a470-4a9b-8096-05cec2e960cf] Instance cache missing network info. 
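
The "Successfully created port" / "Successfully updated port" pairs reflect a two-step Neutron flow: a minimal port is created early in the build, then updated once binding details such as the target host and device id are known. A sketch against a hypothetical client exposing create_port()/update_port() in the neutronclient style:

    def create_port_minimal(neutron, network_id, project_id):
        port = neutron.create_port(
            {"port": {"network_id": network_id,
                      "tenant_id": project_id}})["port"]
        return port["id"]   # logged as 'Successfully created port: <id>'

    def bind_port(neutron, port_id, host, instance_uuid):
        # Second step: attach binding details once the host is decided.
        return neutron.update_port(
            port_id,
            {"port": {"binding:host_id": host,
                      "device_id": instance_uuid,
                      "device_owner": "compute:nova"}})["port"]
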
{{(pid=67899) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 608.843913] env[67899]: DEBUG nova.network.neutron [None req-f50326ac-209f-4dee-b08b-0e1e2789d90c tempest-ServerDiagnosticsNegativeTest-286286993 tempest-ServerDiagnosticsNegativeTest-286286993-project-member] [instance: 7c092e03-5f1d-4e4a-98ca-8e6abddb3cdb] Successfully updated port: 5ff265f3-c9aa-4197-a03b-dfc6233cd032 {{(pid=67899) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 608.859017] env[67899]: DEBUG nova.network.neutron [None req-93795dd8-69ab-4791-b400-f6f8ae0cd063 tempest-ServerExternalEventsTest-552313903 tempest-ServerExternalEventsTest-552313903-project-member] [instance: 267a1016-410e-4097-9523-6fcafc5f4eb0] Successfully updated port: 1a746cdf-ff24-4ef0-9ddb-d3b5d728f7dd {{(pid=67899) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 608.862639] env[67899]: DEBUG oslo_concurrency.lockutils [None req-f50326ac-209f-4dee-b08b-0e1e2789d90c tempest-ServerDiagnosticsNegativeTest-286286993 tempest-ServerDiagnosticsNegativeTest-286286993-project-member] Acquiring lock "refresh_cache-7c092e03-5f1d-4e4a-98ca-8e6abddb3cdb" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 608.862785] env[67899]: DEBUG oslo_concurrency.lockutils [None req-f50326ac-209f-4dee-b08b-0e1e2789d90c tempest-ServerDiagnosticsNegativeTest-286286993 tempest-ServerDiagnosticsNegativeTest-286286993-project-member] Acquired lock "refresh_cache-7c092e03-5f1d-4e4a-98ca-8e6abddb3cdb" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 608.862936] env[67899]: DEBUG nova.network.neutron [None req-f50326ac-209f-4dee-b08b-0e1e2789d90c tempest-ServerDiagnosticsNegativeTest-286286993 tempest-ServerDiagnosticsNegativeTest-286286993-project-member] [instance: 7c092e03-5f1d-4e4a-98ca-8e6abddb3cdb] Building network info cache for instance {{(pid=67899) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 608.868298] env[67899]: DEBUG oslo_concurrency.lockutils [None req-93795dd8-69ab-4791-b400-f6f8ae0cd063 tempest-ServerExternalEventsTest-552313903 tempest-ServerExternalEventsTest-552313903-project-member] Acquiring lock "refresh_cache-267a1016-410e-4097-9523-6fcafc5f4eb0" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 608.868442] env[67899]: DEBUG oslo_concurrency.lockutils [None req-93795dd8-69ab-4791-b400-f6f8ae0cd063 tempest-ServerExternalEventsTest-552313903 tempest-ServerExternalEventsTest-552313903-project-member] Acquired lock "refresh_cache-267a1016-410e-4097-9523-6fcafc5f4eb0" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 608.868586] env[67899]: DEBUG nova.network.neutron [None req-93795dd8-69ab-4791-b400-f6f8ae0cd063 tempest-ServerExternalEventsTest-552313903 tempest-ServerExternalEventsTest-552313903-project-member] [instance: 267a1016-410e-4097-9523-6fcafc5f4eb0] Building network info cache for instance {{(pid=67899) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 608.958337] env[67899]: DEBUG nova.network.neutron [None req-a1baea38-4329-40f0-8c18-ae18eafc1b0b tempest-ServersAdminNegativeTestJSON-448291398 tempest-ServersAdminNegativeTestJSON-448291398-project-member] [instance: 91d5024f-9eac-4a56-b08f-c0f6a7eda775] Successfully updated port: 99943a77-7035-4bb2-9dfe-6103d3652711 {{(pid=67899) _update_port 
/opt/stack/nova/nova/network/neutron.py:586}} [ 608.975851] env[67899]: DEBUG oslo_concurrency.lockutils [None req-a1baea38-4329-40f0-8c18-ae18eafc1b0b tempest-ServersAdminNegativeTestJSON-448291398 tempest-ServersAdminNegativeTestJSON-448291398-project-member] Acquiring lock "refresh_cache-91d5024f-9eac-4a56-b08f-c0f6a7eda775" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 608.977941] env[67899]: DEBUG oslo_concurrency.lockutils [None req-a1baea38-4329-40f0-8c18-ae18eafc1b0b tempest-ServersAdminNegativeTestJSON-448291398 tempest-ServersAdminNegativeTestJSON-448291398-project-member] Acquired lock "refresh_cache-91d5024f-9eac-4a56-b08f-c0f6a7eda775" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 608.977941] env[67899]: DEBUG nova.network.neutron [None req-a1baea38-4329-40f0-8c18-ae18eafc1b0b tempest-ServersAdminNegativeTestJSON-448291398 tempest-ServersAdminNegativeTestJSON-448291398-project-member] [instance: 91d5024f-9eac-4a56-b08f-c0f6a7eda775] Building network info cache for instance {{(pid=67899) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 608.996873] env[67899]: DEBUG nova.network.neutron [None req-93795dd8-69ab-4791-b400-f6f8ae0cd063 tempest-ServerExternalEventsTest-552313903 tempest-ServerExternalEventsTest-552313903-project-member] [instance: 267a1016-410e-4097-9523-6fcafc5f4eb0] Instance cache missing network info. {{(pid=67899) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 609.087263] env[67899]: DEBUG nova.network.neutron [None req-f50326ac-209f-4dee-b08b-0e1e2789d90c tempest-ServerDiagnosticsNegativeTest-286286993 tempest-ServerDiagnosticsNegativeTest-286286993-project-member] [instance: 7c092e03-5f1d-4e4a-98ca-8e6abddb3cdb] Instance cache missing network info. {{(pid=67899) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 609.190071] env[67899]: DEBUG nova.network.neutron [None req-a1baea38-4329-40f0-8c18-ae18eafc1b0b tempest-ServersAdminNegativeTestJSON-448291398 tempest-ServersAdminNegativeTestJSON-448291398-project-member] [instance: 91d5024f-9eac-4a56-b08f-c0f6a7eda775] Instance cache missing network info. 
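
Each build above takes a per-instance "refresh_cache-<uuid>" lock before rebuilding the network info cache, so two workers never interleave cache rebuilds for the same instance while still proceeding in parallel for different ones. The naming convention in a small sketch (cache here is a plain dict; Nova persists the real thing in its database):

    from oslo_concurrency import lockutils

    def refresh_nw_cache(instance_uuid, fetch_nw_info, cache):
        with lockutils.lock("refresh_cache-%s" % instance_uuid):
            nw_info = fetch_nw_info(instance_uuid)  # e.g. a Neutron query
            cache[instance_uuid] = nw_info
            return nw_info
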
{{(pid=67899) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 609.262031] env[67899]: DEBUG nova.network.neutron [None req-4e823511-7455-43ab-874e-9acc64f4c5fe tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 195a4a1e-3da7-4a69-a679-869346368195] Successfully updated port: a2679f4f-910e-4df6-a70d-1f2d27ac9c89 {{(pid=67899) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 609.279792] env[67899]: DEBUG oslo_concurrency.lockutils [None req-4e823511-7455-43ab-874e-9acc64f4c5fe tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Acquiring lock "refresh_cache-195a4a1e-3da7-4a69-a679-869346368195" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 609.279792] env[67899]: DEBUG oslo_concurrency.lockutils [None req-4e823511-7455-43ab-874e-9acc64f4c5fe tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Acquired lock "refresh_cache-195a4a1e-3da7-4a69-a679-869346368195" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 609.279792] env[67899]: DEBUG nova.network.neutron [None req-4e823511-7455-43ab-874e-9acc64f4c5fe tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 195a4a1e-3da7-4a69-a679-869346368195] Building network info cache for instance {{(pid=67899) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 609.544053] env[67899]: DEBUG nova.compute.manager [req-0b804b53-a068-43c4-8a81-770953e8da8c req-7406f095-5ce0-4988-9592-3f5c323c7005 service nova] [instance: 6b16f08c-a470-4a9b-8096-05cec2e960cf] Received event network-vif-plugged-7635b493-83e2-4b87-8a77-98a2c22b1996 {{(pid=67899) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 609.544053] env[67899]: DEBUG oslo_concurrency.lockutils [req-0b804b53-a068-43c4-8a81-770953e8da8c req-7406f095-5ce0-4988-9592-3f5c323c7005 service nova] Acquiring lock "6b16f08c-a470-4a9b-8096-05cec2e960cf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 609.544053] env[67899]: DEBUG oslo_concurrency.lockutils [req-0b804b53-a068-43c4-8a81-770953e8da8c req-7406f095-5ce0-4988-9592-3f5c323c7005 service nova] Lock "6b16f08c-a470-4a9b-8096-05cec2e960cf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 609.544053] env[67899]: DEBUG oslo_concurrency.lockutils [req-0b804b53-a068-43c4-8a81-770953e8da8c req-7406f095-5ce0-4988-9592-3f5c323c7005 service nova] Lock "6b16f08c-a470-4a9b-8096-05cec2e960cf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 609.544278] env[67899]: DEBUG nova.compute.manager [req-0b804b53-a068-43c4-8a81-770953e8da8c req-7406f095-5ce0-4988-9592-3f5c323c7005 service nova] [instance: 6b16f08c-a470-4a9b-8096-05cec2e960cf] No waiting events found dispatching network-vif-plugged-7635b493-83e2-4b87-8a77-98a2c22b1996 {{(pid=67899) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 609.544278] env[67899]: WARNING nova.compute.manager
[req-0b804b53-a068-43c4-8a81-770953e8da8c req-7406f095-5ce0-4988-9592-3f5c323c7005 service nova] [instance: 6b16f08c-a470-4a9b-8096-05cec2e960cf] Received unexpected event network-vif-plugged-7635b493-83e2-4b87-8a77-98a2c22b1996 for instance with vm_state building and task_state spawning. [ 609.655043] env[67899]: DEBUG nova.network.neutron [None req-4e823511-7455-43ab-874e-9acc64f4c5fe tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 195a4a1e-3da7-4a69-a679-869346368195] Instance cache missing network info. {{(pid=67899) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 609.769806] env[67899]: DEBUG nova.network.neutron [None req-4103a3f2-1648-4fa7-ad2b-b1440fc415e3 tempest-InstanceActionsTestJSON-1459258803 tempest-InstanceActionsTestJSON-1459258803-project-member] [instance: 6b16f08c-a470-4a9b-8096-05cec2e960cf] Updating instance_info_cache with network_info: [{"id": "7635b493-83e2-4b87-8a77-98a2c22b1996", "address": "fa:16:3e:46:d8:22", "network": {"id": "dbaf0cef-1bfc-4edc-b865-01ffba1b9eba", "bridge": "br-int", "label": "tempest-InstanceActionsTestJSON-742213090-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "a580c8dda76f4a63a2b5760345811296", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "0746f464-a938-427b-ba02-600449df5070", "external-id": "nsx-vlan-transportzone-881", "segmentation_id": 881, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap7635b493-83", "ovs_interfaceid": "7635b493-83e2-4b87-8a77-98a2c22b1996", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 609.789537] env[67899]: DEBUG oslo_concurrency.lockutils [None req-4103a3f2-1648-4fa7-ad2b-b1440fc415e3 tempest-InstanceActionsTestJSON-1459258803 tempest-InstanceActionsTestJSON-1459258803-project-member] Releasing lock "refresh_cache-6b16f08c-a470-4a9b-8096-05cec2e960cf" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 609.790335] env[67899]: DEBUG nova.compute.manager [None req-4103a3f2-1648-4fa7-ad2b-b1440fc415e3 tempest-InstanceActionsTestJSON-1459258803 tempest-InstanceActionsTestJSON-1459258803-project-member] [instance: 6b16f08c-a470-4a9b-8096-05cec2e960cf] Instance network_info: |[{"id": "7635b493-83e2-4b87-8a77-98a2c22b1996", "address": "fa:16:3e:46:d8:22", "network": {"id": "dbaf0cef-1bfc-4edc-b865-01ffba1b9eba", "bridge": "br-int", "label": "tempest-InstanceActionsTestJSON-742213090-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "a580c8dda76f4a63a2b5760345811296", "mtu": 8950, "physical_network": "default", 
"tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "0746f464-a938-427b-ba02-600449df5070", "external-id": "nsx-vlan-transportzone-881", "segmentation_id": 881, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap7635b493-83", "ovs_interfaceid": "7635b493-83e2-4b87-8a77-98a2c22b1996", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67899) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 609.791146] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-4103a3f2-1648-4fa7-ad2b-b1440fc415e3 tempest-InstanceActionsTestJSON-1459258803 tempest-InstanceActionsTestJSON-1459258803-project-member] [instance: 6b16f08c-a470-4a9b-8096-05cec2e960cf] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:46:d8:22', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '0746f464-a938-427b-ba02-600449df5070', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '7635b493-83e2-4b87-8a77-98a2c22b1996', 'vif_model': 'vmxnet3'}] {{(pid=67899) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 609.805777] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [None req-4103a3f2-1648-4fa7-ad2b-b1440fc415e3 tempest-InstanceActionsTestJSON-1459258803 tempest-InstanceActionsTestJSON-1459258803-project-member] Creating folder: OpenStack. Parent ref: group-v4. {{(pid=67899) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 609.806613] env[67899]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-393ff5b3-2479-4936-8e9e-d1fb4e70b5ff {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 609.821419] env[67899]: INFO nova.virt.vmwareapi.vm_util [None req-4103a3f2-1648-4fa7-ad2b-b1440fc415e3 tempest-InstanceActionsTestJSON-1459258803 tempest-InstanceActionsTestJSON-1459258803-project-member] Created folder: OpenStack in parent group-v4. [ 609.821900] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [None req-4103a3f2-1648-4fa7-ad2b-b1440fc415e3 tempest-InstanceActionsTestJSON-1459258803 tempest-InstanceActionsTestJSON-1459258803-project-member] Creating folder: Project (a580c8dda76f4a63a2b5760345811296). Parent ref: group-v692900. {{(pid=67899) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 609.822285] env[67899]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-4ab48cbc-dce3-4c51-b15e-557bdfb491c8 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 609.833237] env[67899]: INFO nova.virt.vmwareapi.vm_util [None req-4103a3f2-1648-4fa7-ad2b-b1440fc415e3 tempest-InstanceActionsTestJSON-1459258803 tempest-InstanceActionsTestJSON-1459258803-project-member] Created folder: Project (a580c8dda76f4a63a2b5760345811296) in parent group-v692900. [ 609.833493] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [None req-4103a3f2-1648-4fa7-ad2b-b1440fc415e3 tempest-InstanceActionsTestJSON-1459258803 tempest-InstanceActionsTestJSON-1459258803-project-member] Creating folder: Instances. Parent ref: group-v692901. 
{{(pid=67899) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 609.833741] env[67899]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-94538c70-5c0b-4677-9dce-7b79c9917b04 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 609.843618] env[67899]: INFO nova.virt.vmwareapi.vm_util [None req-4103a3f2-1648-4fa7-ad2b-b1440fc415e3 tempest-InstanceActionsTestJSON-1459258803 tempest-InstanceActionsTestJSON-1459258803-project-member] Created folder: Instances in parent group-v692901. [ 609.843912] env[67899]: DEBUG oslo.service.loopingcall [None req-4103a3f2-1648-4fa7-ad2b-b1440fc415e3 tempest-InstanceActionsTestJSON-1459258803 tempest-InstanceActionsTestJSON-1459258803-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67899) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 609.844557] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 6b16f08c-a470-4a9b-8096-05cec2e960cf] Creating VM on the ESX host {{(pid=67899) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 609.844557] env[67899]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-6e0fd636-8c2d-4792-ba95-8298a2b25640 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 609.868796] env[67899]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 609.868796] env[67899]: value = "task-3467815" [ 609.868796] env[67899]: _type = "Task" [ 609.868796] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 609.877454] env[67899]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467815, 'name': CreateVM_Task} progress is 0%. 
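
The Folder.CreateFolder sequence above lays down the fixed OpenStack > Project (<project id>) > Instances hierarchy in vCenter the first time a project lands on this node. A sketch of the idempotent walk; session.call_method and session.find_child are hypothetical wrappers in the style of what vm_util drives through oslo.vmware, and DuplicateName stands in for the vSphere fault of the same name:

    class DuplicateName(Exception):
        """Stand-in for the vSphere DuplicateName fault."""

    def ensure_folder(session, parent_ref, name):
        # Invoke the vSphere CreateFolder method; if the folder already
        # exists, look it up instead of failing the build.
        try:
            return session.call_method("CreateFolder", parent_ref, name=name)
        except DuplicateName:
            return session.find_child(parent_ref, name)  # created earlier

    def ensure_instances_folder(session, vm_folder_ref, project_id):
        root = ensure_folder(session, vm_folder_ref, "OpenStack")
        project = ensure_folder(session, root, "Project (%s)" % project_id)
        return ensure_folder(session, project, "Instances")
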
{{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 609.931585] env[67899]: DEBUG nova.network.neutron [None req-93795dd8-69ab-4791-b400-f6f8ae0cd063 tempest-ServerExternalEventsTest-552313903 tempest-ServerExternalEventsTest-552313903-project-member] [instance: 267a1016-410e-4097-9523-6fcafc5f4eb0] Updating instance_info_cache with network_info: [{"id": "1a746cdf-ff24-4ef0-9ddb-d3b5d728f7dd", "address": "fa:16:3e:f4:b7:c8", "network": {"id": "87a2c8e7-d332-4028-bf39-1da2df6ff034", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.191", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "c1aaa2970e964d7b86557399120d12c1", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "2180b40f-2bb0-47da-ba80-c2fbe7f98af0", "external-id": "nsx-vlan-transportzone-970", "segmentation_id": 970, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1a746cdf-ff", "ovs_interfaceid": "1a746cdf-ff24-4ef0-9ddb-d3b5d728f7dd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 609.945932] env[67899]: DEBUG oslo_concurrency.lockutils [None req-93795dd8-69ab-4791-b400-f6f8ae0cd063 tempest-ServerExternalEventsTest-552313903 tempest-ServerExternalEventsTest-552313903-project-member] Releasing lock "refresh_cache-267a1016-410e-4097-9523-6fcafc5f4eb0" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 609.946257] env[67899]: DEBUG nova.compute.manager [None req-93795dd8-69ab-4791-b400-f6f8ae0cd063 tempest-ServerExternalEventsTest-552313903 tempest-ServerExternalEventsTest-552313903-project-member] [instance: 267a1016-410e-4097-9523-6fcafc5f4eb0] Instance network_info: |[{"id": "1a746cdf-ff24-4ef0-9ddb-d3b5d728f7dd", "address": "fa:16:3e:f4:b7:c8", "network": {"id": "87a2c8e7-d332-4028-bf39-1da2df6ff034", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.191", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "c1aaa2970e964d7b86557399120d12c1", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "2180b40f-2bb0-47da-ba80-c2fbe7f98af0", "external-id": "nsx-vlan-transportzone-970", "segmentation_id": 970, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1a746cdf-ff", "ovs_interfaceid": "1a746cdf-ff24-4ef0-9ddb-d3b5d728f7dd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67899) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 609.946634] env[67899]: DEBUG 
nova.virt.vmwareapi.vmops [None req-93795dd8-69ab-4791-b400-f6f8ae0cd063 tempest-ServerExternalEventsTest-552313903 tempest-ServerExternalEventsTest-552313903-project-member] [instance: 267a1016-410e-4097-9523-6fcafc5f4eb0] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:f4:b7:c8', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '2180b40f-2bb0-47da-ba80-c2fbe7f98af0', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '1a746cdf-ff24-4ef0-9ddb-d3b5d728f7dd', 'vif_model': 'vmxnet3'}] {{(pid=67899) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 609.958046] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [None req-93795dd8-69ab-4791-b400-f6f8ae0cd063 tempest-ServerExternalEventsTest-552313903 tempest-ServerExternalEventsTest-552313903-project-member] Creating folder: Project (18d38113058c46019696931adc72474f). Parent ref: group-v692900. {{(pid=67899) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 609.961212] env[67899]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-8030075e-baa8-4359-883c-0c71df4ae726 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 609.971800] env[67899]: INFO nova.virt.vmwareapi.vm_util [None req-93795dd8-69ab-4791-b400-f6f8ae0cd063 tempest-ServerExternalEventsTest-552313903 tempest-ServerExternalEventsTest-552313903-project-member] Created folder: Project (18d38113058c46019696931adc72474f) in parent group-v692900. [ 609.972017] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [None req-93795dd8-69ab-4791-b400-f6f8ae0cd063 tempest-ServerExternalEventsTest-552313903 tempest-ServerExternalEventsTest-552313903-project-member] Creating folder: Instances. Parent ref: group-v692904. {{(pid=67899) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 609.972253] env[67899]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-79afd29a-9e5f-4086-9c59-f9ae8e7664b4 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 609.981718] env[67899]: INFO nova.virt.vmwareapi.vm_util [None req-93795dd8-69ab-4791-b400-f6f8ae0cd063 tempest-ServerExternalEventsTest-552313903 tempest-ServerExternalEventsTest-552313903-project-member] Created folder: Instances in parent group-v692904. [ 609.981963] env[67899]: DEBUG oslo.service.loopingcall [None req-93795dd8-69ab-4791-b400-f6f8ae0cd063 tempest-ServerExternalEventsTest-552313903 tempest-ServerExternalEventsTest-552313903-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67899) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 609.982202] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 267a1016-410e-4097-9523-6fcafc5f4eb0] Creating VM on the ESX host {{(pid=67899) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 609.982365] env[67899]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-cacafe02-d6ed-49f7-b04b-ef353d948e8b {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 610.007376] env[67899]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 610.007376] env[67899]: value = "task-3467818" [ 610.007376] env[67899]: _type = "Task" [ 610.007376] env[67899]: } to complete. 
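
Both CreateVM_Task waits ("progress is 0%.", then polled to completion) follow the usual vSphere task pattern: the invoked method returns a Task reference immediately, and the caller polls its info until success or error. A stripped-down loop in the spirit of oslo.vmware's wait_for_task; get_task_info is a hypothetical accessor returning an object with .state, .progress, .result and .error:

    import time

    def wait_for_task(get_task_info, task_ref, interval=0.5):
        while True:
            info = get_task_info(task_ref)
            if info.state == "success":
                return info.result
            if info.state == "error":
                raise RuntimeError("task %s failed: %s" % (task_ref, info.error))
            print("Task: %s progress is %s%%." % (task_ref, info.progress or 0))
            time.sleep(interval)   # oslo.vmware drives this with a looping call
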
{{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 610.016179] env[67899]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467818, 'name': CreateVM_Task} progress is 0%. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 610.123295] env[67899]: DEBUG nova.network.neutron [None req-f50326ac-209f-4dee-b08b-0e1e2789d90c tempest-ServerDiagnosticsNegativeTest-286286993 tempest-ServerDiagnosticsNegativeTest-286286993-project-member] [instance: 7c092e03-5f1d-4e4a-98ca-8e6abddb3cdb] Updating instance_info_cache with network_info: [{"id": "5ff265f3-c9aa-4197-a03b-dfc6233cd032", "address": "fa:16:3e:da:9f:ad", "network": {"id": "87a2c8e7-d332-4028-bf39-1da2df6ff034", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.142", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "c1aaa2970e964d7b86557399120d12c1", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "2180b40f-2bb0-47da-ba80-c2fbe7f98af0", "external-id": "nsx-vlan-transportzone-970", "segmentation_id": 970, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap5ff265f3-c9", "ovs_interfaceid": "5ff265f3-c9aa-4197-a03b-dfc6233cd032", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 610.137909] env[67899]: DEBUG oslo_concurrency.lockutils [None req-f50326ac-209f-4dee-b08b-0e1e2789d90c tempest-ServerDiagnosticsNegativeTest-286286993 tempest-ServerDiagnosticsNegativeTest-286286993-project-member] Releasing lock "refresh_cache-7c092e03-5f1d-4e4a-98ca-8e6abddb3cdb" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 610.138352] env[67899]: DEBUG nova.compute.manager [None req-f50326ac-209f-4dee-b08b-0e1e2789d90c tempest-ServerDiagnosticsNegativeTest-286286993 tempest-ServerDiagnosticsNegativeTest-286286993-project-member] [instance: 7c092e03-5f1d-4e4a-98ca-8e6abddb3cdb] Instance network_info: |[{"id": "5ff265f3-c9aa-4197-a03b-dfc6233cd032", "address": "fa:16:3e:da:9f:ad", "network": {"id": "87a2c8e7-d332-4028-bf39-1da2df6ff034", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.142", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "c1aaa2970e964d7b86557399120d12c1", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "2180b40f-2bb0-47da-ba80-c2fbe7f98af0", "external-id": "nsx-vlan-transportzone-970", "segmentation_id": 970, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap5ff265f3-c9", "ovs_interfaceid": "5ff265f3-c9aa-4197-a03b-dfc6233cd032", "qbh_params": null, 
"qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67899) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 610.139045] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-f50326ac-209f-4dee-b08b-0e1e2789d90c tempest-ServerDiagnosticsNegativeTest-286286993 tempest-ServerDiagnosticsNegativeTest-286286993-project-member] [instance: 7c092e03-5f1d-4e4a-98ca-8e6abddb3cdb] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:da:9f:ad', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '2180b40f-2bb0-47da-ba80-c2fbe7f98af0', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '5ff265f3-c9aa-4197-a03b-dfc6233cd032', 'vif_model': 'vmxnet3'}] {{(pid=67899) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 610.147913] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [None req-f50326ac-209f-4dee-b08b-0e1e2789d90c tempest-ServerDiagnosticsNegativeTest-286286993 tempest-ServerDiagnosticsNegativeTest-286286993-project-member] Creating folder: Project (c0cffa0c599a4e99aedb4d49b1dda58d). Parent ref: group-v692900. {{(pid=67899) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 610.149050] env[67899]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-d7a23c04-146e-4e63-b38a-17b0db346f59 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 610.161975] env[67899]: INFO nova.virt.vmwareapi.vm_util [None req-f50326ac-209f-4dee-b08b-0e1e2789d90c tempest-ServerDiagnosticsNegativeTest-286286993 tempest-ServerDiagnosticsNegativeTest-286286993-project-member] Created folder: Project (c0cffa0c599a4e99aedb4d49b1dda58d) in parent group-v692900. [ 610.162183] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [None req-f50326ac-209f-4dee-b08b-0e1e2789d90c tempest-ServerDiagnosticsNegativeTest-286286993 tempest-ServerDiagnosticsNegativeTest-286286993-project-member] Creating folder: Instances. Parent ref: group-v692907. 
{{(pid=67899) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 610.163135] env[67899]: DEBUG nova.network.neutron [None req-a1baea38-4329-40f0-8c18-ae18eafc1b0b tempest-ServersAdminNegativeTestJSON-448291398 tempest-ServersAdminNegativeTestJSON-448291398-project-member] [instance: 91d5024f-9eac-4a56-b08f-c0f6a7eda775] Updating instance_info_cache with network_info: [{"id": "99943a77-7035-4bb2-9dfe-6103d3652711", "address": "fa:16:3e:a7:52:d0", "network": {"id": "87a2c8e7-d332-4028-bf39-1da2df6ff034", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.245", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "c1aaa2970e964d7b86557399120d12c1", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "2180b40f-2bb0-47da-ba80-c2fbe7f98af0", "external-id": "nsx-vlan-transportzone-970", "segmentation_id": 970, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap99943a77-70", "ovs_interfaceid": "99943a77-7035-4bb2-9dfe-6103d3652711", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 610.165101] env[67899]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-0360435b-87f6-4172-b533-c771668c8162 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 610.176965] env[67899]: INFO nova.virt.vmwareapi.vm_util [None req-f50326ac-209f-4dee-b08b-0e1e2789d90c tempest-ServerDiagnosticsNegativeTest-286286993 tempest-ServerDiagnosticsNegativeTest-286286993-project-member] Created folder: Instances in parent group-v692907. [ 610.177406] env[67899]: DEBUG oslo.service.loopingcall [None req-f50326ac-209f-4dee-b08b-0e1e2789d90c tempest-ServerDiagnosticsNegativeTest-286286993 tempest-ServerDiagnosticsNegativeTest-286286993-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=67899) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 610.177659] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 7c092e03-5f1d-4e4a-98ca-8e6abddb3cdb] Creating VM on the ESX host {{(pid=67899) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 610.177659] env[67899]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-e63f8ad4-6dd0-4bd8-af9f-7fad01a8fbce {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 610.193814] env[67899]: DEBUG oslo_concurrency.lockutils [None req-a1baea38-4329-40f0-8c18-ae18eafc1b0b tempest-ServersAdminNegativeTestJSON-448291398 tempest-ServersAdminNegativeTestJSON-448291398-project-member] Releasing lock "refresh_cache-91d5024f-9eac-4a56-b08f-c0f6a7eda775" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 610.194125] env[67899]: DEBUG nova.compute.manager [None req-a1baea38-4329-40f0-8c18-ae18eafc1b0b tempest-ServersAdminNegativeTestJSON-448291398 tempest-ServersAdminNegativeTestJSON-448291398-project-member] [instance: 91d5024f-9eac-4a56-b08f-c0f6a7eda775] Instance network_info: |[{"id": "99943a77-7035-4bb2-9dfe-6103d3652711", "address": "fa:16:3e:a7:52:d0", "network": {"id": "87a2c8e7-d332-4028-bf39-1da2df6ff034", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.245", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "c1aaa2970e964d7b86557399120d12c1", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "2180b40f-2bb0-47da-ba80-c2fbe7f98af0", "external-id": "nsx-vlan-transportzone-970", "segmentation_id": 970, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap99943a77-70", "ovs_interfaceid": "99943a77-7035-4bb2-9dfe-6103d3652711", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67899) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 610.194924] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-a1baea38-4329-40f0-8c18-ae18eafc1b0b tempest-ServersAdminNegativeTestJSON-448291398 tempest-ServersAdminNegativeTestJSON-448291398-project-member] [instance: 91d5024f-9eac-4a56-b08f-c0f6a7eda775] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:a7:52:d0', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '2180b40f-2bb0-47da-ba80-c2fbe7f98af0', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '99943a77-7035-4bb2-9dfe-6103d3652711', 'vif_model': 'vmxnet3'}] {{(pid=67899) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 610.203807] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [None req-a1baea38-4329-40f0-8c18-ae18eafc1b0b tempest-ServersAdminNegativeTestJSON-448291398 tempest-ServersAdminNegativeTestJSON-448291398-project-member] Creating folder: Project (61c3040dc8fd4d5d924946d6997e4c34). Parent ref: group-v692900. 
{{(pid=67899) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 610.205748] env[67899]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-982e7617-e12b-4a34-bb82-055bfdb88947 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 610.207558] env[67899]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 610.207558] env[67899]: value = "task-3467821" [ 610.207558] env[67899]: _type = "Task" [ 610.207558] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 610.219942] env[67899]: INFO nova.virt.vmwareapi.vm_util [None req-a1baea38-4329-40f0-8c18-ae18eafc1b0b tempest-ServersAdminNegativeTestJSON-448291398 tempest-ServersAdminNegativeTestJSON-448291398-project-member] Created folder: Project (61c3040dc8fd4d5d924946d6997e4c34) in parent group-v692900. [ 610.220192] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [None req-a1baea38-4329-40f0-8c18-ae18eafc1b0b tempest-ServersAdminNegativeTestJSON-448291398 tempest-ServersAdminNegativeTestJSON-448291398-project-member] Creating folder: Instances. Parent ref: group-v692909. {{(pid=67899) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 610.220442] env[67899]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-f4b90214-8830-4812-9dd7-1df4f7f97532 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 610.229321] env[67899]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467821, 'name': CreateVM_Task} progress is 10%. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 610.231269] env[67899]: INFO nova.virt.vmwareapi.vm_util [None req-a1baea38-4329-40f0-8c18-ae18eafc1b0b tempest-ServersAdminNegativeTestJSON-448291398 tempest-ServersAdminNegativeTestJSON-448291398-project-member] Created folder: Instances in parent group-v692909. [ 610.231562] env[67899]: DEBUG oslo.service.loopingcall [None req-a1baea38-4329-40f0-8c18-ae18eafc1b0b tempest-ServersAdminNegativeTestJSON-448291398 tempest-ServersAdminNegativeTestJSON-448291398-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67899) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 610.231768] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 91d5024f-9eac-4a56-b08f-c0f6a7eda775] Creating VM on the ESX host {{(pid=67899) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 610.231970] env[67899]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-d58f87cf-9061-45b2-952b-50702e5d7215 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 610.254863] env[67899]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 610.254863] env[67899]: value = "task-3467824" [ 610.254863] env[67899]: _type = "Task" [ 610.254863] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 610.270137] env[67899]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467824, 'name': CreateVM_Task} progress is 0%. 
{{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 610.379746] env[67899]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467815, 'name': CreateVM_Task, 'duration_secs': 0.293877} completed successfully. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 610.379940] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 6b16f08c-a470-4a9b-8096-05cec2e960cf] Created VM on the ESX host {{(pid=67899) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 610.410659] env[67899]: DEBUG oslo_vmware.service [None req-4103a3f2-1648-4fa7-ad2b-b1440fc415e3 tempest-InstanceActionsTestJSON-1459258803 tempest-InstanceActionsTestJSON-1459258803-project-member] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2a7062b3-f011-4481-82ad-4822b315fd63 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 610.417379] env[67899]: DEBUG oslo_concurrency.lockutils [None req-4103a3f2-1648-4fa7-ad2b-b1440fc415e3 tempest-InstanceActionsTestJSON-1459258803 tempest-InstanceActionsTestJSON-1459258803-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 610.417547] env[67899]: DEBUG oslo_concurrency.lockutils [None req-4103a3f2-1648-4fa7-ad2b-b1440fc415e3 tempest-InstanceActionsTestJSON-1459258803 tempest-InstanceActionsTestJSON-1459258803-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 610.418311] env[67899]: DEBUG oslo_concurrency.lockutils [None req-4103a3f2-1648-4fa7-ad2b-b1440fc415e3 tempest-InstanceActionsTestJSON-1459258803 tempest-InstanceActionsTestJSON-1459258803-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 610.418604] env[67899]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-7b5f8648-b097-4f6b-9aae-35ce1d88b477 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 610.423896] env[67899]: DEBUG oslo_vmware.api [None req-4103a3f2-1648-4fa7-ad2b-b1440fc415e3 tempest-InstanceActionsTestJSON-1459258803 tempest-InstanceActionsTestJSON-1459258803-project-member] Waiting for the task: (returnval){ [ 610.423896] env[67899]: value = "session[5298859c-589e-10f3-4f70-bf47a7ea371b]52f984ff-3786-3558-b422-ee0721ded901" [ 610.423896] env[67899]: _type = "Task" [ 610.423896] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 610.431647] env[67899]: DEBUG oslo_vmware.api [None req-4103a3f2-1648-4fa7-ad2b-b1440fc415e3 tempest-InstanceActionsTestJSON-1459258803 tempest-InstanceActionsTestJSON-1459258803-project-member] Task: {'id': session[5298859c-589e-10f3-4f70-bf47a7ea371b]52f984ff-3786-3558-b422-ee0721ded901, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 610.522056] env[67899]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467818, 'name': CreateVM_Task, 'duration_secs': 0.331893} completed successfully. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 610.522056] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 267a1016-410e-4097-9523-6fcafc5f4eb0] Created VM on the ESX host {{(pid=67899) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 610.522056] env[67899]: DEBUG oslo_concurrency.lockutils [None req-93795dd8-69ab-4791-b400-f6f8ae0cd063 tempest-ServerExternalEventsTest-552313903 tempest-ServerExternalEventsTest-552313903-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 610.616121] env[67899]: DEBUG nova.network.neutron [None req-4e823511-7455-43ab-874e-9acc64f4c5fe tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 195a4a1e-3da7-4a69-a679-869346368195] Updating instance_info_cache with network_info: [{"id": "a2679f4f-910e-4df6-a70d-1f2d27ac9c89", "address": "fa:16:3e:ec:49:f1", "network": {"id": "857be8e0-b3fa-4836-87d8-37b0af1d0354", "bridge": "br-int", "label": "tempest-ImagesTestJSON-566779850-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "a6ddbe6f15c6436197b1b073170d78cf", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "19598cc1-e105-4565-906a-09dde75e3fbe", "external-id": "nsx-vlan-transportzone-371", "segmentation_id": 371, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa2679f4f-91", "ovs_interfaceid": "a2679f4f-910e-4df6-a70d-1f2d27ac9c89", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 610.630571] env[67899]: DEBUG oslo_concurrency.lockutils [None req-4e823511-7455-43ab-874e-9acc64f4c5fe tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Releasing lock "refresh_cache-195a4a1e-3da7-4a69-a679-869346368195" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 610.630888] env[67899]: DEBUG nova.compute.manager [None req-4e823511-7455-43ab-874e-9acc64f4c5fe tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 195a4a1e-3da7-4a69-a679-869346368195] Instance network_info: |[{"id": "a2679f4f-910e-4df6-a70d-1f2d27ac9c89", "address": "fa:16:3e:ec:49:f1", "network": {"id": "857be8e0-b3fa-4836-87d8-37b0af1d0354", "bridge": "br-int", "label": "tempest-ImagesTestJSON-566779850-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "a6ddbe6f15c6436197b1b073170d78cf", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "19598cc1-e105-4565-906a-09dde75e3fbe", "external-id": "nsx-vlan-transportzone-371", "segmentation_id": 371, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa2679f4f-91", "ovs_interfaceid": "a2679f4f-910e-4df6-a70d-1f2d27ac9c89", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67899) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 610.631758] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-4e823511-7455-43ab-874e-9acc64f4c5fe tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 195a4a1e-3da7-4a69-a679-869346368195] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:ec:49:f1', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '19598cc1-e105-4565-906a-09dde75e3fbe', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'a2679f4f-910e-4df6-a70d-1f2d27ac9c89', 'vif_model': 'vmxnet3'}] {{(pid=67899) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 610.639369] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [None req-4e823511-7455-43ab-874e-9acc64f4c5fe tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Creating folder: Project (a6ddbe6f15c6436197b1b073170d78cf). Parent ref: group-v692900. {{(pid=67899) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 610.640395] env[67899]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-298db8bb-ddc0-45c5-978f-5c65a7277e17 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 610.650069] env[67899]: INFO nova.virt.vmwareapi.vm_util [None req-4e823511-7455-43ab-874e-9acc64f4c5fe tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Created folder: Project (a6ddbe6f15c6436197b1b073170d78cf) in parent group-v692900. [ 610.650185] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [None req-4e823511-7455-43ab-874e-9acc64f4c5fe tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Creating folder: Instances. Parent ref: group-v692913. {{(pid=67899) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 610.650621] env[67899]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-68ea2542-24db-4e32-a315-f6f2678dfad1 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 610.660262] env[67899]: INFO nova.virt.vmwareapi.vm_util [None req-4e823511-7455-43ab-874e-9acc64f4c5fe tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Created folder: Instances in parent group-v692913. [ 610.660499] env[67899]: DEBUG oslo.service.loopingcall [None req-4e823511-7455-43ab-874e-9acc64f4c5fe tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=67899) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 610.660690] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 195a4a1e-3da7-4a69-a679-869346368195] Creating VM on the ESX host {{(pid=67899) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 610.660893] env[67899]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-40dd47a1-43fe-4d7a-b26a-ceccd1f53847 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 610.680199] env[67899]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 610.680199] env[67899]: value = "task-3467827" [ 610.680199] env[67899]: _type = "Task" [ 610.680199] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 610.688244] env[67899]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467827, 'name': CreateVM_Task} progress is 0%. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 610.727447] env[67899]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467821, 'name': CreateVM_Task, 'duration_secs': 0.316379} completed successfully. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 610.727447] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 7c092e03-5f1d-4e4a-98ca-8e6abddb3cdb] Created VM on the ESX host {{(pid=67899) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 610.728314] env[67899]: DEBUG oslo_concurrency.lockutils [None req-f50326ac-209f-4dee-b08b-0e1e2789d90c tempest-ServerDiagnosticsNegativeTest-286286993 tempest-ServerDiagnosticsNegativeTest-286286993-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 610.769207] env[67899]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467824, 'name': CreateVM_Task, 'duration_secs': 0.309954} completed successfully. 
{{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 610.769392] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 91d5024f-9eac-4a56-b08f-c0f6a7eda775] Created VM on the ESX host {{(pid=67899) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 610.770159] env[67899]: DEBUG oslo_concurrency.lockutils [None req-a1baea38-4329-40f0-8c18-ae18eafc1b0b tempest-ServersAdminNegativeTestJSON-448291398 tempest-ServersAdminNegativeTestJSON-448291398-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 610.882779] env[67899]: DEBUG nova.compute.manager [req-71f2598d-7f01-41d7-8a1f-bf787774ffba req-08009af4-e9e4-41fb-9aec-85d882dfee13 service nova] [instance: 7c092e03-5f1d-4e4a-98ca-8e6abddb3cdb] Received event network-vif-plugged-5ff265f3-c9aa-4197-a03b-dfc6233cd032 {{(pid=67899) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 610.882779] env[67899]: DEBUG oslo_concurrency.lockutils [req-71f2598d-7f01-41d7-8a1f-bf787774ffba req-08009af4-e9e4-41fb-9aec-85d882dfee13 service nova] Acquiring lock "7c092e03-5f1d-4e4a-98ca-8e6abddb3cdb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 610.882993] env[67899]: DEBUG oslo_concurrency.lockutils [req-71f2598d-7f01-41d7-8a1f-bf787774ffba req-08009af4-e9e4-41fb-9aec-85d882dfee13 service nova] Lock "7c092e03-5f1d-4e4a-98ca-8e6abddb3cdb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 610.883174] env[67899]: DEBUG oslo_concurrency.lockutils [req-71f2598d-7f01-41d7-8a1f-bf787774ffba req-08009af4-e9e4-41fb-9aec-85d882dfee13 service nova] Lock "7c092e03-5f1d-4e4a-98ca-8e6abddb3cdb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 610.883329] env[67899]: DEBUG nova.compute.manager [req-71f2598d-7f01-41d7-8a1f-bf787774ffba req-08009af4-e9e4-41fb-9aec-85d882dfee13 service nova] [instance: 7c092e03-5f1d-4e4a-98ca-8e6abddb3cdb] No waiting events found dispatching network-vif-plugged-5ff265f3-c9aa-4197-a03b-dfc6233cd032 {{(pid=67899) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 610.886643] env[67899]: WARNING nova.compute.manager [req-71f2598d-7f01-41d7-8a1f-bf787774ffba req-08009af4-e9e4-41fb-9aec-85d882dfee13 service nova] [instance: 7c092e03-5f1d-4e4a-98ca-8e6abddb3cdb] Received unexpected event network-vif-plugged-5ff265f3-c9aa-4197-a03b-dfc6233cd032 for instance with vm_state building and task_state spawning. 
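[editor's note] The lock churn around "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" in the records above and below is oslo.concurrency's lockutils pattern: each build request serializes access to the shared image-cache path before probing or populating it. A minimal sketch of that pattern follows — the lock name is copied from the trace, but fetch_vmdk_if_missing() and its body are hypothetical stand-ins, not nova's actual code:

    # Sketch only: serializing image-cache access with oslo.concurrency,
    # as the surrounding log records show. CACHE_LOCK is taken verbatim
    # from the trace; fetch_vmdk_if_missing() is an invented helper.
    from oslo_concurrency import lockutils

    CACHE_LOCK = ("[datastore1] devstack-image-cache_base/"
                  "c655a05a-4a40-4b3f-b609-3ba8116ad90f")

    def fetch_vmdk_if_missing():
        # lockutils.lock() emits the "Acquiring lock" / "Acquired lock" /
        # "Releasing lock" DEBUG lines seen here (lockutils.py:312/315/333).
        with lockutils.lock(CACHE_LOCK):
            pass  # probe the datastore cache; download the image only if absent

Holding one named lock per cached image is what lets the four concurrent tempest builds in this trace share a single downloaded VMDK instead of fetching it four times.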
[ 610.935780] env[67899]: DEBUG oslo_concurrency.lockutils [None req-4103a3f2-1648-4fa7-ad2b-b1440fc415e3 tempest-InstanceActionsTestJSON-1459258803 tempest-InstanceActionsTestJSON-1459258803-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 610.936132] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-4103a3f2-1648-4fa7-ad2b-b1440fc415e3 tempest-InstanceActionsTestJSON-1459258803 tempest-InstanceActionsTestJSON-1459258803-project-member] [instance: 6b16f08c-a470-4a9b-8096-05cec2e960cf] Processing image c655a05a-4a40-4b3f-b609-3ba8116ad90f {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 610.936431] env[67899]: DEBUG oslo_concurrency.lockutils [None req-4103a3f2-1648-4fa7-ad2b-b1440fc415e3 tempest-InstanceActionsTestJSON-1459258803 tempest-InstanceActionsTestJSON-1459258803-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 610.936582] env[67899]: DEBUG oslo_concurrency.lockutils [None req-4103a3f2-1648-4fa7-ad2b-b1440fc415e3 tempest-InstanceActionsTestJSON-1459258803 tempest-InstanceActionsTestJSON-1459258803-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 610.937093] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-4103a3f2-1648-4fa7-ad2b-b1440fc415e3 tempest-InstanceActionsTestJSON-1459258803 tempest-InstanceActionsTestJSON-1459258803-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 610.937400] env[67899]: DEBUG oslo_concurrency.lockutils [None req-93795dd8-69ab-4791-b400-f6f8ae0cd063 tempest-ServerExternalEventsTest-552313903 tempest-ServerExternalEventsTest-552313903-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 610.937724] env[67899]: DEBUG oslo_concurrency.lockutils [None req-93795dd8-69ab-4791-b400-f6f8ae0cd063 tempest-ServerExternalEventsTest-552313903 tempest-ServerExternalEventsTest-552313903-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 610.937964] env[67899]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-8a6ab1e5-df82-45c0-b677-ea2aa24b9f9b {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 610.940371] env[67899]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-691cfdc1-1ae6-49f6-ae8a-64d5e2d4cafc {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 610.947416] env[67899]: DEBUG oslo_vmware.api [None req-93795dd8-69ab-4791-b400-f6f8ae0cd063 tempest-ServerExternalEventsTest-552313903 
tempest-ServerExternalEventsTest-552313903-project-member] Waiting for the task: (returnval){ [ 610.947416] env[67899]: value = "session[5298859c-589e-10f3-4f70-bf47a7ea371b]52f65c0e-73ec-dad6-b3c0-f131b28f9741" [ 610.947416] env[67899]: _type = "Task" [ 610.947416] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 610.952752] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-4103a3f2-1648-4fa7-ad2b-b1440fc415e3 tempest-InstanceActionsTestJSON-1459258803 tempest-InstanceActionsTestJSON-1459258803-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 610.952905] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-4103a3f2-1648-4fa7-ad2b-b1440fc415e3 tempest-InstanceActionsTestJSON-1459258803 tempest-InstanceActionsTestJSON-1459258803-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67899) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 610.954187] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d6232b03-4d81-4e89-b30a-478ae5bd1546 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 610.962932] env[67899]: DEBUG oslo_vmware.api [None req-93795dd8-69ab-4791-b400-f6f8ae0cd063 tempest-ServerExternalEventsTest-552313903 tempest-ServerExternalEventsTest-552313903-project-member] Task: {'id': session[5298859c-589e-10f3-4f70-bf47a7ea371b]52f65c0e-73ec-dad6-b3c0-f131b28f9741, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 610.967817] env[67899]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-ded3e8b5-1337-4215-a126-19f54a825b69 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 610.973849] env[67899]: DEBUG oslo_vmware.api [None req-4103a3f2-1648-4fa7-ad2b-b1440fc415e3 tempest-InstanceActionsTestJSON-1459258803 tempest-InstanceActionsTestJSON-1459258803-project-member] Waiting for the task: (returnval){ [ 610.973849] env[67899]: value = "session[5298859c-589e-10f3-4f70-bf47a7ea371b]5287b883-1a3b-797b-e812-6daf7a4e168a" [ 610.973849] env[67899]: _type = "Task" [ 610.973849] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 610.984796] env[67899]: DEBUG oslo_vmware.api [None req-4103a3f2-1648-4fa7-ad2b-b1440fc415e3 tempest-InstanceActionsTestJSON-1459258803 tempest-InstanceActionsTestJSON-1459258803-project-member] Task: {'id': session[5298859c-589e-10f3-4f70-bf47a7ea371b]5287b883-1a3b-797b-e812-6daf7a4e168a, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 611.192559] env[67899]: DEBUG oslo_concurrency.lockutils [None req-66773c7e-ea1c-4033-b311-33b753d740c6 tempest-AttachInterfacesUnderV243Test-1421382693 tempest-AttachInterfacesUnderV243Test-1421382693-project-member] Acquiring lock "7a19bcfd-5544-4688-8edb-e12c567979ae" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 611.193089] env[67899]: DEBUG oslo_concurrency.lockutils [None req-66773c7e-ea1c-4033-b311-33b753d740c6 tempest-AttachInterfacesUnderV243Test-1421382693 tempest-AttachInterfacesUnderV243Test-1421382693-project-member] Lock "7a19bcfd-5544-4688-8edb-e12c567979ae" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 611.203965] env[67899]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467827, 'name': CreateVM_Task, 'duration_secs': 0.295991} completed successfully. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 611.204335] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 195a4a1e-3da7-4a69-a679-869346368195] Created VM on the ESX host {{(pid=67899) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 611.205566] env[67899]: DEBUG oslo_concurrency.lockutils [None req-4e823511-7455-43ab-874e-9acc64f4c5fe tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 611.206467] env[67899]: DEBUG nova.compute.manager [None req-66773c7e-ea1c-4033-b311-33b753d740c6 tempest-AttachInterfacesUnderV243Test-1421382693 tempest-AttachInterfacesUnderV243Test-1421382693-project-member] [instance: 7a19bcfd-5544-4688-8edb-e12c567979ae] Starting instance... 
{{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 611.276557] env[67899]: DEBUG oslo_concurrency.lockutils [None req-66773c7e-ea1c-4033-b311-33b753d740c6 tempest-AttachInterfacesUnderV243Test-1421382693 tempest-AttachInterfacesUnderV243Test-1421382693-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 611.276810] env[67899]: DEBUG oslo_concurrency.lockutils [None req-66773c7e-ea1c-4033-b311-33b753d740c6 tempest-AttachInterfacesUnderV243Test-1421382693 tempest-AttachInterfacesUnderV243Test-1421382693-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 611.279047] env[67899]: INFO nova.compute.claims [None req-66773c7e-ea1c-4033-b311-33b753d740c6 tempest-AttachInterfacesUnderV243Test-1421382693 tempest-AttachInterfacesUnderV243Test-1421382693-project-member] [instance: 7a19bcfd-5544-4688-8edb-e12c567979ae] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 611.472323] env[67899]: DEBUG oslo_concurrency.lockutils [None req-93795dd8-69ab-4791-b400-f6f8ae0cd063 tempest-ServerExternalEventsTest-552313903 tempest-ServerExternalEventsTest-552313903-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 611.472323] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-93795dd8-69ab-4791-b400-f6f8ae0cd063 tempest-ServerExternalEventsTest-552313903 tempest-ServerExternalEventsTest-552313903-project-member] [instance: 267a1016-410e-4097-9523-6fcafc5f4eb0] Processing image c655a05a-4a40-4b3f-b609-3ba8116ad90f {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 611.472323] env[67899]: DEBUG oslo_concurrency.lockutils [None req-93795dd8-69ab-4791-b400-f6f8ae0cd063 tempest-ServerExternalEventsTest-552313903 tempest-ServerExternalEventsTest-552313903-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 611.472323] env[67899]: DEBUG oslo_concurrency.lockutils [None req-f50326ac-209f-4dee-b08b-0e1e2789d90c tempest-ServerDiagnosticsNegativeTest-286286993 tempest-ServerDiagnosticsNegativeTest-286286993-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 611.472516] env[67899]: DEBUG oslo_concurrency.lockutils [None req-f50326ac-209f-4dee-b08b-0e1e2789d90c tempest-ServerDiagnosticsNegativeTest-286286993 tempest-ServerDiagnosticsNegativeTest-286286993-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 611.472516] env[67899]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-85c20ee1-4ac5-4481-8758-699d4f01d937 
{{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 611.496167] env[67899]: DEBUG oslo_vmware.api [None req-f50326ac-209f-4dee-b08b-0e1e2789d90c tempest-ServerDiagnosticsNegativeTest-286286993 tempest-ServerDiagnosticsNegativeTest-286286993-project-member] Waiting for the task: (returnval){ [ 611.496167] env[67899]: value = "session[5298859c-589e-10f3-4f70-bf47a7ea371b]521e9042-6e9d-e814-40e6-bb090c131a6b" [ 611.496167] env[67899]: _type = "Task" [ 611.496167] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 611.496167] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-4103a3f2-1648-4fa7-ad2b-b1440fc415e3 tempest-InstanceActionsTestJSON-1459258803 tempest-InstanceActionsTestJSON-1459258803-project-member] [instance: 6b16f08c-a470-4a9b-8096-05cec2e960cf] Preparing fetch location {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 611.496167] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-4103a3f2-1648-4fa7-ad2b-b1440fc415e3 tempest-InstanceActionsTestJSON-1459258803 tempest-InstanceActionsTestJSON-1459258803-project-member] Creating directory with path [datastore1] vmware_temp/26a609a8-404f-4ca4-8454-a3606d468c8b/c655a05a-4a40-4b3f-b609-3ba8116ad90f {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 611.499492] env[67899]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-beae6281-06e3-48fd-ba97-b9c895167f5d {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 611.515068] env[67899]: DEBUG oslo_concurrency.lockutils [None req-f50326ac-209f-4dee-b08b-0e1e2789d90c tempest-ServerDiagnosticsNegativeTest-286286993 tempest-ServerDiagnosticsNegativeTest-286286993-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 611.515248] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-f50326ac-209f-4dee-b08b-0e1e2789d90c tempest-ServerDiagnosticsNegativeTest-286286993 tempest-ServerDiagnosticsNegativeTest-286286993-project-member] [instance: 7c092e03-5f1d-4e4a-98ca-8e6abddb3cdb] Processing image c655a05a-4a40-4b3f-b609-3ba8116ad90f {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 611.515619] env[67899]: DEBUG oslo_concurrency.lockutils [None req-f50326ac-209f-4dee-b08b-0e1e2789d90c tempest-ServerDiagnosticsNegativeTest-286286993 tempest-ServerDiagnosticsNegativeTest-286286993-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 611.515619] env[67899]: DEBUG oslo_concurrency.lockutils [None req-a1baea38-4329-40f0-8c18-ae18eafc1b0b tempest-ServersAdminNegativeTestJSON-448291398 tempest-ServersAdminNegativeTestJSON-448291398-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 611.515887] env[67899]: DEBUG oslo_concurrency.lockutils [None req-a1baea38-4329-40f0-8c18-ae18eafc1b0b tempest-ServersAdminNegativeTestJSON-448291398 
tempest-ServersAdminNegativeTestJSON-448291398-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 611.516187] env[67899]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-b7280703-f65f-4110-8dff-fbcb35074aef {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 611.525484] env[67899]: DEBUG oslo_vmware.api [None req-a1baea38-4329-40f0-8c18-ae18eafc1b0b tempest-ServersAdminNegativeTestJSON-448291398 tempest-ServersAdminNegativeTestJSON-448291398-project-member] Waiting for the task: (returnval){ [ 611.525484] env[67899]: value = "session[5298859c-589e-10f3-4f70-bf47a7ea371b]52d29c73-ee1a-b8e8-27cb-5a4effe40134" [ 611.525484] env[67899]: _type = "Task" [ 611.525484] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 611.527285] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4891e856-d391-46e5-ba8c-deb420e1a135 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 611.537814] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-4103a3f2-1648-4fa7-ad2b-b1440fc415e3 tempest-InstanceActionsTestJSON-1459258803 tempest-InstanceActionsTestJSON-1459258803-project-member] Created directory with path [datastore1] vmware_temp/26a609a8-404f-4ca4-8454-a3606d468c8b/c655a05a-4a40-4b3f-b609-3ba8116ad90f {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 611.538073] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-4103a3f2-1648-4fa7-ad2b-b1440fc415e3 tempest-InstanceActionsTestJSON-1459258803 tempest-InstanceActionsTestJSON-1459258803-project-member] [instance: 6b16f08c-a470-4a9b-8096-05cec2e960cf] Fetch image to [datastore1] vmware_temp/26a609a8-404f-4ca4-8454-a3606d468c8b/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 611.538353] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-4103a3f2-1648-4fa7-ad2b-b1440fc415e3 tempest-InstanceActionsTestJSON-1459258803 tempest-InstanceActionsTestJSON-1459258803-project-member] [instance: 6b16f08c-a470-4a9b-8096-05cec2e960cf] Downloading image file data c655a05a-4a40-4b3f-b609-3ba8116ad90f to [datastore1] vmware_temp/26a609a8-404f-4ca4-8454-a3606d468c8b/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk on the data store datastore1 {{(pid=67899) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 611.540572] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-164f21c8-f60a-487c-99d2-8a1a4be118dd {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 611.549223] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-583be0e9-1f68-4caf-a4c8-7e5e8326c0fd {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 611.562561] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c9715b3f-2b5b-4449-a813-cac1a4f43435 {{(pid=67899) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 611.595618] env[67899]: DEBUG oslo_concurrency.lockutils [None req-a1baea38-4329-40f0-8c18-ae18eafc1b0b tempest-ServersAdminNegativeTestJSON-448291398 tempest-ServersAdminNegativeTestJSON-448291398-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 611.596524] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-a1baea38-4329-40f0-8c18-ae18eafc1b0b tempest-ServersAdminNegativeTestJSON-448291398 tempest-ServersAdminNegativeTestJSON-448291398-project-member] [instance: 91d5024f-9eac-4a56-b08f-c0f6a7eda775] Processing image c655a05a-4a40-4b3f-b609-3ba8116ad90f {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 611.596524] env[67899]: DEBUG oslo_concurrency.lockutils [None req-a1baea38-4329-40f0-8c18-ae18eafc1b0b tempest-ServersAdminNegativeTestJSON-448291398 tempest-ServersAdminNegativeTestJSON-448291398-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 611.597135] env[67899]: DEBUG oslo_concurrency.lockutils [None req-4e823511-7455-43ab-874e-9acc64f4c5fe tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 611.597466] env[67899]: DEBUG oslo_concurrency.lockutils [None req-4e823511-7455-43ab-874e-9acc64f4c5fe tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 611.598382] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7d3c116f-023c-4224-a562-c0d5b74b271c {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 611.605396] env[67899]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-d49b1e43-121e-4dfb-8787-773529059e1f {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 611.608572] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-69102139-5fa4-4cc1-9585-8e06985c8c2c {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 611.617309] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7d4c45d9-2225-4d13-80e2-a43340a4200b {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 611.624417] env[67899]: DEBUG oslo_vmware.api [None req-4e823511-7455-43ab-874e-9acc64f4c5fe tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Waiting for the task: (returnval){ [ 611.624417] env[67899]: value = 
"session[5298859c-589e-10f3-4f70-bf47a7ea371b]524f61eb-5426-7c4e-b668-6917c3f3f4a4" [ 611.624417] env[67899]: _type = "Task" [ 611.624417] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 611.652478] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c4ac0758-3eb1-4b2d-b11c-4e4f9c7fc98f {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 611.662725] env[67899]: DEBUG nova.compute.provider_tree [None req-66773c7e-ea1c-4033-b311-33b753d740c6 tempest-AttachInterfacesUnderV243Test-1421382693 tempest-AttachInterfacesUnderV243Test-1421382693-project-member] Inventory has not changed in ProviderTree for provider: fffa0b42-f65d-4394-a98c-0df038b9ed4b {{(pid=67899) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 611.671275] env[67899]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-28482d63-1fea-41c1-920f-f734ed477571 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 611.673623] env[67899]: DEBUG oslo_concurrency.lockutils [None req-4e823511-7455-43ab-874e-9acc64f4c5fe tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 611.673830] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-4e823511-7455-43ab-874e-9acc64f4c5fe tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 195a4a1e-3da7-4a69-a679-869346368195] Processing image c655a05a-4a40-4b3f-b609-3ba8116ad90f {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 611.674041] env[67899]: DEBUG oslo_concurrency.lockutils [None req-4e823511-7455-43ab-874e-9acc64f4c5fe tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 611.675749] env[67899]: DEBUG nova.scheduler.client.report [None req-66773c7e-ea1c-4033-b311-33b753d740c6 tempest-AttachInterfacesUnderV243Test-1421382693 tempest-AttachInterfacesUnderV243Test-1421382693-project-member] Inventory has not changed for provider fffa0b42-f65d-4394-a98c-0df038b9ed4b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67899) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 611.703794] env[67899]: DEBUG oslo_concurrency.lockutils [None req-66773c7e-ea1c-4033-b311-33b753d740c6 tempest-AttachInterfacesUnderV243Test-1421382693 tempest-AttachInterfacesUnderV243Test-1421382693-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.427s {{(pid=67899) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 611.704340] env[67899]: DEBUG nova.compute.manager [None req-66773c7e-ea1c-4033-b311-33b753d740c6 tempest-AttachInterfacesUnderV243Test-1421382693 tempest-AttachInterfacesUnderV243Test-1421382693-project-member] [instance: 7a19bcfd-5544-4688-8edb-e12c567979ae] Start building networks asynchronously for instance. {{(pid=67899) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 611.710625] env[67899]: DEBUG nova.virt.vmwareapi.images [None req-4103a3f2-1648-4fa7-ad2b-b1440fc415e3 tempest-InstanceActionsTestJSON-1459258803 tempest-InstanceActionsTestJSON-1459258803-project-member] [instance: 6b16f08c-a470-4a9b-8096-05cec2e960cf] Downloading image file data c655a05a-4a40-4b3f-b609-3ba8116ad90f to the data store datastore1 {{(pid=67899) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 611.758126] env[67899]: DEBUG nova.compute.utils [None req-66773c7e-ea1c-4033-b311-33b753d740c6 tempest-AttachInterfacesUnderV243Test-1421382693 tempest-AttachInterfacesUnderV243Test-1421382693-project-member] Using /dev/sd instead of None {{(pid=67899) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 611.760931] env[67899]: DEBUG nova.compute.manager [None req-66773c7e-ea1c-4033-b311-33b753d740c6 tempest-AttachInterfacesUnderV243Test-1421382693 tempest-AttachInterfacesUnderV243Test-1421382693-project-member] [instance: 7a19bcfd-5544-4688-8edb-e12c567979ae] Allocating IP information in the background. {{(pid=67899) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 611.761276] env[67899]: DEBUG nova.network.neutron [None req-66773c7e-ea1c-4033-b311-33b753d740c6 tempest-AttachInterfacesUnderV243Test-1421382693 tempest-AttachInterfacesUnderV243Test-1421382693-project-member] [instance: 7a19bcfd-5544-4688-8edb-e12c567979ae] allocate_for_instance() {{(pid=67899) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 611.778501] env[67899]: DEBUG nova.compute.manager [None req-66773c7e-ea1c-4033-b311-33b753d740c6 tempest-AttachInterfacesUnderV243Test-1421382693 tempest-AttachInterfacesUnderV243Test-1421382693-project-member] [instance: 7a19bcfd-5544-4688-8edb-e12c567979ae] Start building block device mappings for instance. {{(pid=67899) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 611.801857] env[67899]: DEBUG oslo_vmware.rw_handles [None req-4103a3f2-1648-4fa7-ad2b-b1440fc415e3 tempest-InstanceActionsTestJSON-1459258803 tempest-InstanceActionsTestJSON-1459258803-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/26a609a8-404f-4ca4-8454-a3606d468c8b/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67899) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 611.876193] env[67899]: DEBUG oslo_vmware.rw_handles [None req-4103a3f2-1648-4fa7-ad2b-b1440fc415e3 tempest-InstanceActionsTestJSON-1459258803 tempest-InstanceActionsTestJSON-1459258803-project-member] Completed reading data from the image iterator. 
{{(pid=67899) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 611.876377] env[67899]: DEBUG oslo_vmware.rw_handles [None req-4103a3f2-1648-4fa7-ad2b-b1440fc415e3 tempest-InstanceActionsTestJSON-1459258803 tempest-InstanceActionsTestJSON-1459258803-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/26a609a8-404f-4ca4-8454-a3606d468c8b/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67899) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 611.918880] env[67899]: DEBUG nova.policy [None req-66773c7e-ea1c-4033-b311-33b753d740c6 tempest-AttachInterfacesUnderV243Test-1421382693 tempest-AttachInterfacesUnderV243Test-1421382693-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e9700240bd4840a58cf8735f9fcaca32', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'f54b1e55adf64b4c84219f033093c126', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67899) authorize /opt/stack/nova/nova/policy.py:203}} [ 611.928645] env[67899]: DEBUG nova.compute.manager [None req-66773c7e-ea1c-4033-b311-33b753d740c6 tempest-AttachInterfacesUnderV243Test-1421382693 tempest-AttachInterfacesUnderV243Test-1421382693-project-member] [instance: 7a19bcfd-5544-4688-8edb-e12c567979ae] Start spawning the instance on the hypervisor. {{(pid=67899) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 611.953356] env[67899]: DEBUG nova.virt.hardware [None req-66773c7e-ea1c-4033-b311-33b753d740c6 tempest-AttachInterfacesUnderV243Test-1421382693 tempest-AttachInterfacesUnderV243Test-1421382693-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T22:07:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T22:07:13Z,direct_url=,disk_format='vmdk',id=c655a05a-4a40-4b3f-b609-3ba8116ad90f,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c1aaa2970e964d7b86557399120d12c1',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T22:07:14Z,virtual_size=,visibility=), allow threads: False {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 611.953594] env[67899]: DEBUG nova.virt.hardware [None req-66773c7e-ea1c-4033-b311-33b753d740c6 tempest-AttachInterfacesUnderV243Test-1421382693 tempest-AttachInterfacesUnderV243Test-1421382693-project-member] Flavor limits 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 611.953746] env[67899]: DEBUG nova.virt.hardware [None req-66773c7e-ea1c-4033-b311-33b753d740c6 tempest-AttachInterfacesUnderV243Test-1421382693 tempest-AttachInterfacesUnderV243Test-1421382693-project-member] Image limits 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 611.953919] env[67899]: DEBUG 
nova.virt.hardware [None req-66773c7e-ea1c-4033-b311-33b753d740c6 tempest-AttachInterfacesUnderV243Test-1421382693 tempest-AttachInterfacesUnderV243Test-1421382693-project-member] Flavor pref 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 611.954089] env[67899]: DEBUG nova.virt.hardware [None req-66773c7e-ea1c-4033-b311-33b753d740c6 tempest-AttachInterfacesUnderV243Test-1421382693 tempest-AttachInterfacesUnderV243Test-1421382693-project-member] Image pref 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 611.954245] env[67899]: DEBUG nova.virt.hardware [None req-66773c7e-ea1c-4033-b311-33b753d740c6 tempest-AttachInterfacesUnderV243Test-1421382693 tempest-AttachInterfacesUnderV243Test-1421382693-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 611.954449] env[67899]: DEBUG nova.virt.hardware [None req-66773c7e-ea1c-4033-b311-33b753d740c6 tempest-AttachInterfacesUnderV243Test-1421382693 tempest-AttachInterfacesUnderV243Test-1421382693-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 611.954599] env[67899]: DEBUG nova.virt.hardware [None req-66773c7e-ea1c-4033-b311-33b753d740c6 tempest-AttachInterfacesUnderV243Test-1421382693 tempest-AttachInterfacesUnderV243Test-1421382693-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67899) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 611.954759] env[67899]: DEBUG nova.virt.hardware [None req-66773c7e-ea1c-4033-b311-33b753d740c6 tempest-AttachInterfacesUnderV243Test-1421382693 tempest-AttachInterfacesUnderV243Test-1421382693-project-member] Got 1 possible topologies {{(pid=67899) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 611.954913] env[67899]: DEBUG nova.virt.hardware [None req-66773c7e-ea1c-4033-b311-33b753d740c6 tempest-AttachInterfacesUnderV243Test-1421382693 tempest-AttachInterfacesUnderV243Test-1421382693-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 611.955097] env[67899]: DEBUG nova.virt.hardware [None req-66773c7e-ea1c-4033-b311-33b753d740c6 tempest-AttachInterfacesUnderV243Test-1421382693 tempest-AttachInterfacesUnderV243Test-1421382693-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 611.955969] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-788af456-54d4-4467-b11a-713fa69f7dd2 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 611.967076] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-02e88f04-4bd7-48d8-b904-8f6a16627a79 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 612.007564] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._check_instance_build_time 
{{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 612.007846] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 612.008186] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Starting heal instance info cache {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 612.008253] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Rebuilding the list of instances to heal {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 612.034318] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 91d5024f-9eac-4a56-b08f-c0f6a7eda775] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 612.034448] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 6b16f08c-a470-4a9b-8096-05cec2e960cf] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 612.034516] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 267a1016-410e-4097-9523-6fcafc5f4eb0] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 612.034656] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 7c092e03-5f1d-4e4a-98ca-8e6abddb3cdb] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 612.034911] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 195a4a1e-3da7-4a69-a679-869346368195] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 612.034911] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 7a19bcfd-5544-4688-8edb-e12c567979ae] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 612.036159] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Didn't find any instances for network info cache update. 
{{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 612.036751] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 612.037183] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 612.037540] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 612.037793] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 612.037961] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 612.038580] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 612.039701] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=67899) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 612.039701] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager.update_available_resource {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 612.054579] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 612.054690] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 612.056024] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 612.056024] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67899) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 612.056888] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-164e3c09-a128-4dda-9869-8fd22c80fc0a {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 612.070314] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-80db0f80-7447-4061-818c-fdc96f697d01 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 612.089755] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-19987f64-8c66-4a77-83c6-2f659acef1db {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 612.100030] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d3e4fd7f-ac3c-42a9-ab6f-d6f66fadee2a {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 612.140318] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180920MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=67899) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 612.140318] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 612.142266] 
env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 612.245730] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 91d5024f-9eac-4a56-b08f-c0f6a7eda775 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 612.246314] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 7c092e03-5f1d-4e4a-98ca-8e6abddb3cdb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 612.246642] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 6b16f08c-a470-4a9b-8096-05cec2e960cf actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 612.246759] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 267a1016-410e-4097-9523-6fcafc5f4eb0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 612.246878] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 195a4a1e-3da7-4a69-a679-869346368195 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 612.246993] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 7a19bcfd-5544-4688-8edb-e12c567979ae actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 612.247201] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Total usable vcpus: 48, total allocated vcpus: 6 {{(pid=67899) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 612.247344] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1280MB phys_disk=200GB used_disk=6GB total_vcpus=48 used_vcpus=6 pci_stats=[] {{(pid=67899) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 612.395618] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9ef0d7f6-9e6b-4428-9de2-62a3f3192ea1 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 612.405914] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-991c5c54-8359-4fdd-902c-2ccfb0f04d4f {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 612.439867] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c46f5319-1582-44dc-bfb8-01e175774845 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 612.447630] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-786fa7ad-303e-439f-9699-f09b4e7e8628 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 612.462205] env[67899]: DEBUG nova.compute.provider_tree [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Inventory has not changed in ProviderTree for provider: fffa0b42-f65d-4394-a98c-0df038b9ed4b {{(pid=67899) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 612.470503] env[67899]: DEBUG nova.scheduler.client.report [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Inventory has not changed for provider fffa0b42-f65d-4394-a98c-0df038b9ed4b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67899) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 612.511150] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67899) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 612.511383] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.371s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 613.170484] env[67899]: DEBUG nova.network.neutron [None 
req-66773c7e-ea1c-4033-b311-33b753d740c6 tempest-AttachInterfacesUnderV243Test-1421382693 tempest-AttachInterfacesUnderV243Test-1421382693-project-member] [instance: 7a19bcfd-5544-4688-8edb-e12c567979ae] Successfully created port: e401ded8-39ef-421e-8ea4-d3c958ad9f58 {{(pid=67899) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 613.277303] env[67899]: DEBUG nova.compute.manager [req-19ab0ff1-9629-40ef-82b2-e350bb73d960 req-6ffa4413-4b53-4edd-ab64-ed0f42dd18ac service nova] [instance: 267a1016-410e-4097-9523-6fcafc5f4eb0] Received event network-vif-plugged-1a746cdf-ff24-4ef0-9ddb-d3b5d728f7dd {{(pid=67899) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 613.277573] env[67899]: DEBUG oslo_concurrency.lockutils [req-19ab0ff1-9629-40ef-82b2-e350bb73d960 req-6ffa4413-4b53-4edd-ab64-ed0f42dd18ac service nova] Acquiring lock "267a1016-410e-4097-9523-6fcafc5f4eb0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 613.277838] env[67899]: DEBUG oslo_concurrency.lockutils [req-19ab0ff1-9629-40ef-82b2-e350bb73d960 req-6ffa4413-4b53-4edd-ab64-ed0f42dd18ac service nova] Lock "267a1016-410e-4097-9523-6fcafc5f4eb0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 613.280778] env[67899]: DEBUG oslo_concurrency.lockutils [req-19ab0ff1-9629-40ef-82b2-e350bb73d960 req-6ffa4413-4b53-4edd-ab64-ed0f42dd18ac service nova] Lock "267a1016-410e-4097-9523-6fcafc5f4eb0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 613.280778] env[67899]: DEBUG nova.compute.manager [req-19ab0ff1-9629-40ef-82b2-e350bb73d960 req-6ffa4413-4b53-4edd-ab64-ed0f42dd18ac service nova] [instance: 267a1016-410e-4097-9523-6fcafc5f4eb0] No waiting events found dispatching network-vif-plugged-1a746cdf-ff24-4ef0-9ddb-d3b5d728f7dd {{(pid=67899) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 613.280778] env[67899]: WARNING nova.compute.manager [req-19ab0ff1-9629-40ef-82b2-e350bb73d960 req-6ffa4413-4b53-4edd-ab64-ed0f42dd18ac service nova] [instance: 267a1016-410e-4097-9523-6fcafc5f4eb0] Received unexpected event network-vif-plugged-1a746cdf-ff24-4ef0-9ddb-d3b5d728f7dd for instance with vm_state building and task_state spawning. 
[ 613.280778] env[67899]: DEBUG nova.compute.manager [req-19ab0ff1-9629-40ef-82b2-e350bb73d960 req-6ffa4413-4b53-4edd-ab64-ed0f42dd18ac service nova] [instance: 91d5024f-9eac-4a56-b08f-c0f6a7eda775] Received event network-vif-plugged-99943a77-7035-4bb2-9dfe-6103d3652711 {{(pid=67899) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 613.281315] env[67899]: DEBUG oslo_concurrency.lockutils [req-19ab0ff1-9629-40ef-82b2-e350bb73d960 req-6ffa4413-4b53-4edd-ab64-ed0f42dd18ac service nova] Acquiring lock "91d5024f-9eac-4a56-b08f-c0f6a7eda775-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 613.281315] env[67899]: DEBUG oslo_concurrency.lockutils [req-19ab0ff1-9629-40ef-82b2-e350bb73d960 req-6ffa4413-4b53-4edd-ab64-ed0f42dd18ac service nova] Lock "91d5024f-9eac-4a56-b08f-c0f6a7eda775-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 613.281315] env[67899]: DEBUG oslo_concurrency.lockutils [req-19ab0ff1-9629-40ef-82b2-e350bb73d960 req-6ffa4413-4b53-4edd-ab64-ed0f42dd18ac service nova] Lock "91d5024f-9eac-4a56-b08f-c0f6a7eda775-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 613.281315] env[67899]: DEBUG nova.compute.manager [req-19ab0ff1-9629-40ef-82b2-e350bb73d960 req-6ffa4413-4b53-4edd-ab64-ed0f42dd18ac service nova] [instance: 91d5024f-9eac-4a56-b08f-c0f6a7eda775] No waiting events found dispatching network-vif-plugged-99943a77-7035-4bb2-9dfe-6103d3652711 {{(pid=67899) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 613.281465] env[67899]: WARNING nova.compute.manager [req-19ab0ff1-9629-40ef-82b2-e350bb73d960 req-6ffa4413-4b53-4edd-ab64-ed0f42dd18ac service nova] [instance: 91d5024f-9eac-4a56-b08f-c0f6a7eda775] Received unexpected event network-vif-plugged-99943a77-7035-4bb2-9dfe-6103d3652711 for instance with vm_state building and task_state spawning. [ 613.281465] env[67899]: DEBUG nova.compute.manager [req-19ab0ff1-9629-40ef-82b2-e350bb73d960 req-6ffa4413-4b53-4edd-ab64-ed0f42dd18ac service nova] [instance: 6b16f08c-a470-4a9b-8096-05cec2e960cf] Received event network-changed-7635b493-83e2-4b87-8a77-98a2c22b1996 {{(pid=67899) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 613.281465] env[67899]: DEBUG nova.compute.manager [req-19ab0ff1-9629-40ef-82b2-e350bb73d960 req-6ffa4413-4b53-4edd-ab64-ed0f42dd18ac service nova] [instance: 6b16f08c-a470-4a9b-8096-05cec2e960cf] Refreshing instance network info cache due to event network-changed-7635b493-83e2-4b87-8a77-98a2c22b1996. 
{{(pid=67899) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 613.281465] env[67899]: DEBUG oslo_concurrency.lockutils [req-19ab0ff1-9629-40ef-82b2-e350bb73d960 req-6ffa4413-4b53-4edd-ab64-ed0f42dd18ac service nova] Acquiring lock "refresh_cache-6b16f08c-a470-4a9b-8096-05cec2e960cf" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 613.281465] env[67899]: DEBUG oslo_concurrency.lockutils [req-19ab0ff1-9629-40ef-82b2-e350bb73d960 req-6ffa4413-4b53-4edd-ab64-ed0f42dd18ac service nova] Acquired lock "refresh_cache-6b16f08c-a470-4a9b-8096-05cec2e960cf" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 613.283468] env[67899]: DEBUG nova.network.neutron [req-19ab0ff1-9629-40ef-82b2-e350bb73d960 req-6ffa4413-4b53-4edd-ab64-ed0f42dd18ac service nova] [instance: 6b16f08c-a470-4a9b-8096-05cec2e960cf] Refreshing network info cache for port 7635b493-83e2-4b87-8a77-98a2c22b1996 {{(pid=67899) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 614.541732] env[67899]: DEBUG nova.compute.manager [req-d5844946-e6c3-4d49-9a68-0d5a973df45d req-648ab501-a662-4fdc-aded-6f9eb0fcf587 service nova] [instance: 195a4a1e-3da7-4a69-a679-869346368195] Received event network-vif-plugged-a2679f4f-910e-4df6-a70d-1f2d27ac9c89 {{(pid=67899) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 614.542837] env[67899]: DEBUG oslo_concurrency.lockutils [req-d5844946-e6c3-4d49-9a68-0d5a973df45d req-648ab501-a662-4fdc-aded-6f9eb0fcf587 service nova] Acquiring lock "195a4a1e-3da7-4a69-a679-869346368195-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 614.543182] env[67899]: DEBUG oslo_concurrency.lockutils [req-d5844946-e6c3-4d49-9a68-0d5a973df45d req-648ab501-a662-4fdc-aded-6f9eb0fcf587 service nova] Lock "195a4a1e-3da7-4a69-a679-869346368195-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 614.543397] env[67899]: DEBUG oslo_concurrency.lockutils [req-d5844946-e6c3-4d49-9a68-0d5a973df45d req-648ab501-a662-4fdc-aded-6f9eb0fcf587 service nova] Lock "195a4a1e-3da7-4a69-a679-869346368195-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 614.543668] env[67899]: DEBUG nova.compute.manager [req-d5844946-e6c3-4d49-9a68-0d5a973df45d req-648ab501-a662-4fdc-aded-6f9eb0fcf587 service nova] [instance: 195a4a1e-3da7-4a69-a679-869346368195] No waiting events found dispatching network-vif-plugged-a2679f4f-910e-4df6-a70d-1f2d27ac9c89 {{(pid=67899) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 614.543881] env[67899]: WARNING nova.compute.manager [req-d5844946-e6c3-4d49-9a68-0d5a973df45d req-648ab501-a662-4fdc-aded-6f9eb0fcf587 service nova] [instance: 195a4a1e-3da7-4a69-a679-869346368195] Received unexpected event network-vif-plugged-a2679f4f-910e-4df6-a70d-1f2d27ac9c89 for instance with vm_state building and task_state spawning. 
[ 614.544106] env[67899]: DEBUG nova.compute.manager [req-d5844946-e6c3-4d49-9a68-0d5a973df45d req-648ab501-a662-4fdc-aded-6f9eb0fcf587 service nova] [instance: 7c092e03-5f1d-4e4a-98ca-8e6abddb3cdb] Received event network-changed-5ff265f3-c9aa-4197-a03b-dfc6233cd032 {{(pid=67899) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 614.544337] env[67899]: DEBUG nova.compute.manager [req-d5844946-e6c3-4d49-9a68-0d5a973df45d req-648ab501-a662-4fdc-aded-6f9eb0fcf587 service nova] [instance: 7c092e03-5f1d-4e4a-98ca-8e6abddb3cdb] Refreshing instance network info cache due to event network-changed-5ff265f3-c9aa-4197-a03b-dfc6233cd032. {{(pid=67899) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 614.544542] env[67899]: DEBUG oslo_concurrency.lockutils [req-d5844946-e6c3-4d49-9a68-0d5a973df45d req-648ab501-a662-4fdc-aded-6f9eb0fcf587 service nova] Acquiring lock "refresh_cache-7c092e03-5f1d-4e4a-98ca-8e6abddb3cdb" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 614.544718] env[67899]: DEBUG oslo_concurrency.lockutils [req-d5844946-e6c3-4d49-9a68-0d5a973df45d req-648ab501-a662-4fdc-aded-6f9eb0fcf587 service nova] Acquired lock "refresh_cache-7c092e03-5f1d-4e4a-98ca-8e6abddb3cdb" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 614.544912] env[67899]: DEBUG nova.network.neutron [req-d5844946-e6c3-4d49-9a68-0d5a973df45d req-648ab501-a662-4fdc-aded-6f9eb0fcf587 service nova] [instance: 7c092e03-5f1d-4e4a-98ca-8e6abddb3cdb] Refreshing network info cache for port 5ff265f3-c9aa-4197-a03b-dfc6233cd032 {{(pid=67899) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 614.814897] env[67899]: DEBUG nova.network.neutron [None req-66773c7e-ea1c-4033-b311-33b753d740c6 tempest-AttachInterfacesUnderV243Test-1421382693 tempest-AttachInterfacesUnderV243Test-1421382693-project-member] [instance: 7a19bcfd-5544-4688-8edb-e12c567979ae] Successfully updated port: e401ded8-39ef-421e-8ea4-d3c958ad9f58 {{(pid=67899) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 614.830971] env[67899]: DEBUG oslo_concurrency.lockutils [None req-66773c7e-ea1c-4033-b311-33b753d740c6 tempest-AttachInterfacesUnderV243Test-1421382693 tempest-AttachInterfacesUnderV243Test-1421382693-project-member] Acquiring lock "refresh_cache-7a19bcfd-5544-4688-8edb-e12c567979ae" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 614.831229] env[67899]: DEBUG oslo_concurrency.lockutils [None req-66773c7e-ea1c-4033-b311-33b753d740c6 tempest-AttachInterfacesUnderV243Test-1421382693 tempest-AttachInterfacesUnderV243Test-1421382693-project-member] Acquired lock "refresh_cache-7a19bcfd-5544-4688-8edb-e12c567979ae" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 614.831429] env[67899]: DEBUG nova.network.neutron [None req-66773c7e-ea1c-4033-b311-33b753d740c6 tempest-AttachInterfacesUnderV243Test-1421382693 tempest-AttachInterfacesUnderV243Test-1421382693-project-member] [instance: 7a19bcfd-5544-4688-8edb-e12c567979ae] Building network info cache for instance {{(pid=67899) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 614.878028] env[67899]: DEBUG nova.network.neutron [None req-66773c7e-ea1c-4033-b311-33b753d740c6 tempest-AttachInterfacesUnderV243Test-1421382693 tempest-AttachInterfacesUnderV243Test-1421382693-project-member] 
[instance: 7a19bcfd-5544-4688-8edb-e12c567979ae] Instance cache missing network info. {{(pid=67899) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 614.988428] env[67899]: DEBUG nova.network.neutron [req-19ab0ff1-9629-40ef-82b2-e350bb73d960 req-6ffa4413-4b53-4edd-ab64-ed0f42dd18ac service nova] [instance: 6b16f08c-a470-4a9b-8096-05cec2e960cf] Updated VIF entry in instance network info cache for port 7635b493-83e2-4b87-8a77-98a2c22b1996. {{(pid=67899) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 614.988812] env[67899]: DEBUG nova.network.neutron [req-19ab0ff1-9629-40ef-82b2-e350bb73d960 req-6ffa4413-4b53-4edd-ab64-ed0f42dd18ac service nova] [instance: 6b16f08c-a470-4a9b-8096-05cec2e960cf] Updating instance_info_cache with network_info: [{"id": "7635b493-83e2-4b87-8a77-98a2c22b1996", "address": "fa:16:3e:46:d8:22", "network": {"id": "dbaf0cef-1bfc-4edc-b865-01ffba1b9eba", "bridge": "br-int", "label": "tempest-InstanceActionsTestJSON-742213090-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "a580c8dda76f4a63a2b5760345811296", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "0746f464-a938-427b-ba02-600449df5070", "external-id": "nsx-vlan-transportzone-881", "segmentation_id": 881, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap7635b493-83", "ovs_interfaceid": "7635b493-83e2-4b87-8a77-98a2c22b1996", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 614.999641] env[67899]: DEBUG oslo_concurrency.lockutils [req-19ab0ff1-9629-40ef-82b2-e350bb73d960 req-6ffa4413-4b53-4edd-ab64-ed0f42dd18ac service nova] Releasing lock "refresh_cache-6b16f08c-a470-4a9b-8096-05cec2e960cf" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 614.999910] env[67899]: DEBUG nova.compute.manager [req-19ab0ff1-9629-40ef-82b2-e350bb73d960 req-6ffa4413-4b53-4edd-ab64-ed0f42dd18ac service nova] [instance: 267a1016-410e-4097-9523-6fcafc5f4eb0] Received event network-changed-1a746cdf-ff24-4ef0-9ddb-d3b5d728f7dd {{(pid=67899) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 615.000309] env[67899]: DEBUG nova.compute.manager [req-19ab0ff1-9629-40ef-82b2-e350bb73d960 req-6ffa4413-4b53-4edd-ab64-ed0f42dd18ac service nova] [instance: 267a1016-410e-4097-9523-6fcafc5f4eb0] Refreshing instance network info cache due to event network-changed-1a746cdf-ff24-4ef0-9ddb-d3b5d728f7dd. 
{{(pid=67899) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 615.000309] env[67899]: DEBUG oslo_concurrency.lockutils [req-19ab0ff1-9629-40ef-82b2-e350bb73d960 req-6ffa4413-4b53-4edd-ab64-ed0f42dd18ac service nova] Acquiring lock "refresh_cache-267a1016-410e-4097-9523-6fcafc5f4eb0" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 615.000453] env[67899]: DEBUG oslo_concurrency.lockutils [req-19ab0ff1-9629-40ef-82b2-e350bb73d960 req-6ffa4413-4b53-4edd-ab64-ed0f42dd18ac service nova] Acquired lock "refresh_cache-267a1016-410e-4097-9523-6fcafc5f4eb0" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 615.000608] env[67899]: DEBUG nova.network.neutron [req-19ab0ff1-9629-40ef-82b2-e350bb73d960 req-6ffa4413-4b53-4edd-ab64-ed0f42dd18ac service nova] [instance: 267a1016-410e-4097-9523-6fcafc5f4eb0] Refreshing network info cache for port 1a746cdf-ff24-4ef0-9ddb-d3b5d728f7dd {{(pid=67899) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 615.155800] env[67899]: DEBUG nova.network.neutron [None req-66773c7e-ea1c-4033-b311-33b753d740c6 tempest-AttachInterfacesUnderV243Test-1421382693 tempest-AttachInterfacesUnderV243Test-1421382693-project-member] [instance: 7a19bcfd-5544-4688-8edb-e12c567979ae] Updating instance_info_cache with network_info: [{"id": "e401ded8-39ef-421e-8ea4-d3c958ad9f58", "address": "fa:16:3e:57:a4:3f", "network": {"id": "ee680f55-4f5e-4b19-8b86-905c0110eca6", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-626501513-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "f54b1e55adf64b4c84219f033093c126", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ee9ce73d-4ee8-4b28-b7d3-3a5735039627", "external-id": "cl2-zone-465", "segmentation_id": 465, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape401ded8-39", "ovs_interfaceid": "e401ded8-39ef-421e-8ea4-d3c958ad9f58", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 615.177816] env[67899]: DEBUG oslo_concurrency.lockutils [None req-66773c7e-ea1c-4033-b311-33b753d740c6 tempest-AttachInterfacesUnderV243Test-1421382693 tempest-AttachInterfacesUnderV243Test-1421382693-project-member] Releasing lock "refresh_cache-7a19bcfd-5544-4688-8edb-e12c567979ae" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 615.178436] env[67899]: DEBUG nova.compute.manager [None req-66773c7e-ea1c-4033-b311-33b753d740c6 tempest-AttachInterfacesUnderV243Test-1421382693 tempest-AttachInterfacesUnderV243Test-1421382693-project-member] [instance: 7a19bcfd-5544-4688-8edb-e12c567979ae] Instance network_info: |[{"id": "e401ded8-39ef-421e-8ea4-d3c958ad9f58", "address": "fa:16:3e:57:a4:3f", "network": {"id": "ee680f55-4f5e-4b19-8b86-905c0110eca6", "bridge": "br-int", "label": 
"tempest-AttachInterfacesUnderV243Test-626501513-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "f54b1e55adf64b4c84219f033093c126", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ee9ce73d-4ee8-4b28-b7d3-3a5735039627", "external-id": "cl2-zone-465", "segmentation_id": 465, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape401ded8-39", "ovs_interfaceid": "e401ded8-39ef-421e-8ea4-d3c958ad9f58", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67899) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 615.184030] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-66773c7e-ea1c-4033-b311-33b753d740c6 tempest-AttachInterfacesUnderV243Test-1421382693 tempest-AttachInterfacesUnderV243Test-1421382693-project-member] [instance: 7a19bcfd-5544-4688-8edb-e12c567979ae] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:57:a4:3f', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'ee9ce73d-4ee8-4b28-b7d3-3a5735039627', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'e401ded8-39ef-421e-8ea4-d3c958ad9f58', 'vif_model': 'vmxnet3'}] {{(pid=67899) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 615.194751] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [None req-66773c7e-ea1c-4033-b311-33b753d740c6 tempest-AttachInterfacesUnderV243Test-1421382693 tempest-AttachInterfacesUnderV243Test-1421382693-project-member] Creating folder: Project (f54b1e55adf64b4c84219f033093c126). Parent ref: group-v692900. {{(pid=67899) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 615.195880] env[67899]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-9d4a0dfd-4770-4662-8f19-dbf500bca052 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 615.217846] env[67899]: INFO nova.virt.vmwareapi.vm_util [None req-66773c7e-ea1c-4033-b311-33b753d740c6 tempest-AttachInterfacesUnderV243Test-1421382693 tempest-AttachInterfacesUnderV243Test-1421382693-project-member] Created folder: Project (f54b1e55adf64b4c84219f033093c126) in parent group-v692900. [ 615.217846] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [None req-66773c7e-ea1c-4033-b311-33b753d740c6 tempest-AttachInterfacesUnderV243Test-1421382693 tempest-AttachInterfacesUnderV243Test-1421382693-project-member] Creating folder: Instances. Parent ref: group-v692916. 
{{(pid=67899) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 615.217846] env[67899]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-90aae811-0eff-45a2-9136-c0ffc972db2a {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 615.227041] env[67899]: INFO nova.virt.vmwareapi.vm_util [None req-66773c7e-ea1c-4033-b311-33b753d740c6 tempest-AttachInterfacesUnderV243Test-1421382693 tempest-AttachInterfacesUnderV243Test-1421382693-project-member] Created folder: Instances in parent group-v692916. [ 615.227128] env[67899]: DEBUG oslo.service.loopingcall [None req-66773c7e-ea1c-4033-b311-33b753d740c6 tempest-AttachInterfacesUnderV243Test-1421382693 tempest-AttachInterfacesUnderV243Test-1421382693-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67899) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 615.227292] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 7a19bcfd-5544-4688-8edb-e12c567979ae] Creating VM on the ESX host {{(pid=67899) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 615.227573] env[67899]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-54ef7c88-9ce6-4920-b82d-54b1bb20ceb8 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 615.250759] env[67899]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 615.250759] env[67899]: value = "task-3467830" [ 615.250759] env[67899]: _type = "Task" [ 615.250759] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 615.259506] env[67899]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467830, 'name': CreateVM_Task} progress is 0%. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 615.768926] env[67899]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467830, 'name': CreateVM_Task, 'duration_secs': 0.323145} completed successfully. 
{{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 615.769385] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 7a19bcfd-5544-4688-8edb-e12c567979ae] Created VM on the ESX host {{(pid=67899) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 615.769996] env[67899]: DEBUG oslo_concurrency.lockutils [None req-66773c7e-ea1c-4033-b311-33b753d740c6 tempest-AttachInterfacesUnderV243Test-1421382693 tempest-AttachInterfacesUnderV243Test-1421382693-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 615.772068] env[67899]: DEBUG oslo_concurrency.lockutils [None req-66773c7e-ea1c-4033-b311-33b753d740c6 tempest-AttachInterfacesUnderV243Test-1421382693 tempest-AttachInterfacesUnderV243Test-1421382693-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 615.772068] env[67899]: DEBUG oslo_concurrency.lockutils [None req-66773c7e-ea1c-4033-b311-33b753d740c6 tempest-AttachInterfacesUnderV243Test-1421382693 tempest-AttachInterfacesUnderV243Test-1421382693-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 615.772068] env[67899]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-de065f9c-5014-4f59-b184-4426834770df {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 615.777846] env[67899]: DEBUG oslo_vmware.api [None req-66773c7e-ea1c-4033-b311-33b753d740c6 tempest-AttachInterfacesUnderV243Test-1421382693 tempest-AttachInterfacesUnderV243Test-1421382693-project-member] Waiting for the task: (returnval){ [ 615.777846] env[67899]: value = "session[5298859c-589e-10f3-4f70-bf47a7ea371b]5253da22-3e4c-dd2a-28dd-c70c0d08640a" [ 615.777846] env[67899]: _type = "Task" [ 615.777846] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 615.788873] env[67899]: DEBUG oslo_vmware.api [None req-66773c7e-ea1c-4033-b311-33b753d740c6 tempest-AttachInterfacesUnderV243Test-1421382693 tempest-AttachInterfacesUnderV243Test-1421382693-project-member] Task: {'id': session[5298859c-589e-10f3-4f70-bf47a7ea371b]5253da22-3e4c-dd2a-28dd-c70c0d08640a, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 616.296840] env[67899]: DEBUG oslo_concurrency.lockutils [None req-66773c7e-ea1c-4033-b311-33b753d740c6 tempest-AttachInterfacesUnderV243Test-1421382693 tempest-AttachInterfacesUnderV243Test-1421382693-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 616.296840] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-66773c7e-ea1c-4033-b311-33b753d740c6 tempest-AttachInterfacesUnderV243Test-1421382693 tempest-AttachInterfacesUnderV243Test-1421382693-project-member] [instance: 7a19bcfd-5544-4688-8edb-e12c567979ae] Processing image c655a05a-4a40-4b3f-b609-3ba8116ad90f {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 616.296840] env[67899]: DEBUG oslo_concurrency.lockutils [None req-66773c7e-ea1c-4033-b311-33b753d740c6 tempest-AttachInterfacesUnderV243Test-1421382693 tempest-AttachInterfacesUnderV243Test-1421382693-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 616.349168] env[67899]: DEBUG nova.network.neutron [req-d5844946-e6c3-4d49-9a68-0d5a973df45d req-648ab501-a662-4fdc-aded-6f9eb0fcf587 service nova] [instance: 7c092e03-5f1d-4e4a-98ca-8e6abddb3cdb] Updated VIF entry in instance network info cache for port 5ff265f3-c9aa-4197-a03b-dfc6233cd032. {{(pid=67899) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 616.349507] env[67899]: DEBUG nova.network.neutron [req-d5844946-e6c3-4d49-9a68-0d5a973df45d req-648ab501-a662-4fdc-aded-6f9eb0fcf587 service nova] [instance: 7c092e03-5f1d-4e4a-98ca-8e6abddb3cdb] Updating instance_info_cache with network_info: [{"id": "5ff265f3-c9aa-4197-a03b-dfc6233cd032", "address": "fa:16:3e:da:9f:ad", "network": {"id": "87a2c8e7-d332-4028-bf39-1da2df6ff034", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.142", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "c1aaa2970e964d7b86557399120d12c1", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "2180b40f-2bb0-47da-ba80-c2fbe7f98af0", "external-id": "nsx-vlan-transportzone-970", "segmentation_id": 970, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap5ff265f3-c9", "ovs_interfaceid": "5ff265f3-c9aa-4197-a03b-dfc6233cd032", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 616.361787] env[67899]: DEBUG oslo_concurrency.lockutils [req-d5844946-e6c3-4d49-9a68-0d5a973df45d req-648ab501-a662-4fdc-aded-6f9eb0fcf587 service nova] Releasing lock "refresh_cache-7c092e03-5f1d-4e4a-98ca-8e6abddb3cdb" {{(pid=67899) lock 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 616.361787] env[67899]: DEBUG nova.compute.manager [req-d5844946-e6c3-4d49-9a68-0d5a973df45d req-648ab501-a662-4fdc-aded-6f9eb0fcf587 service nova] [instance: 195a4a1e-3da7-4a69-a679-869346368195] Received event network-changed-a2679f4f-910e-4df6-a70d-1f2d27ac9c89 {{(pid=67899) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 616.361787] env[67899]: DEBUG nova.compute.manager [req-d5844946-e6c3-4d49-9a68-0d5a973df45d req-648ab501-a662-4fdc-aded-6f9eb0fcf587 service nova] [instance: 195a4a1e-3da7-4a69-a679-869346368195] Refreshing instance network info cache due to event network-changed-a2679f4f-910e-4df6-a70d-1f2d27ac9c89. {{(pid=67899) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 616.361787] env[67899]: DEBUG oslo_concurrency.lockutils [req-d5844946-e6c3-4d49-9a68-0d5a973df45d req-648ab501-a662-4fdc-aded-6f9eb0fcf587 service nova] Acquiring lock "refresh_cache-195a4a1e-3da7-4a69-a679-869346368195" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 616.361787] env[67899]: DEBUG oslo_concurrency.lockutils [req-d5844946-e6c3-4d49-9a68-0d5a973df45d req-648ab501-a662-4fdc-aded-6f9eb0fcf587 service nova] Acquired lock "refresh_cache-195a4a1e-3da7-4a69-a679-869346368195" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 616.362195] env[67899]: DEBUG nova.network.neutron [req-d5844946-e6c3-4d49-9a68-0d5a973df45d req-648ab501-a662-4fdc-aded-6f9eb0fcf587 service nova] [instance: 195a4a1e-3da7-4a69-a679-869346368195] Refreshing network info cache for port a2679f4f-910e-4df6-a70d-1f2d27ac9c89 {{(pid=67899) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 616.581051] env[67899]: DEBUG nova.network.neutron [req-19ab0ff1-9629-40ef-82b2-e350bb73d960 req-6ffa4413-4b53-4edd-ab64-ed0f42dd18ac service nova] [instance: 267a1016-410e-4097-9523-6fcafc5f4eb0] Updated VIF entry in instance network info cache for port 1a746cdf-ff24-4ef0-9ddb-d3b5d728f7dd. 
{{(pid=67899) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 616.581627] env[67899]: DEBUG nova.network.neutron [req-19ab0ff1-9629-40ef-82b2-e350bb73d960 req-6ffa4413-4b53-4edd-ab64-ed0f42dd18ac service nova] [instance: 267a1016-410e-4097-9523-6fcafc5f4eb0] Updating instance_info_cache with network_info: [{"id": "1a746cdf-ff24-4ef0-9ddb-d3b5d728f7dd", "address": "fa:16:3e:f4:b7:c8", "network": {"id": "87a2c8e7-d332-4028-bf39-1da2df6ff034", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.191", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "c1aaa2970e964d7b86557399120d12c1", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "2180b40f-2bb0-47da-ba80-c2fbe7f98af0", "external-id": "nsx-vlan-transportzone-970", "segmentation_id": 970, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1a746cdf-ff", "ovs_interfaceid": "1a746cdf-ff24-4ef0-9ddb-d3b5d728f7dd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 616.593947] env[67899]: DEBUG oslo_concurrency.lockutils [req-19ab0ff1-9629-40ef-82b2-e350bb73d960 req-6ffa4413-4b53-4edd-ab64-ed0f42dd18ac service nova] Releasing lock "refresh_cache-267a1016-410e-4097-9523-6fcafc5f4eb0" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 616.594264] env[67899]: DEBUG nova.compute.manager [req-19ab0ff1-9629-40ef-82b2-e350bb73d960 req-6ffa4413-4b53-4edd-ab64-ed0f42dd18ac service nova] [instance: 91d5024f-9eac-4a56-b08f-c0f6a7eda775] Received event network-changed-99943a77-7035-4bb2-9dfe-6103d3652711 {{(pid=67899) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 616.594438] env[67899]: DEBUG nova.compute.manager [req-19ab0ff1-9629-40ef-82b2-e350bb73d960 req-6ffa4413-4b53-4edd-ab64-ed0f42dd18ac service nova] [instance: 91d5024f-9eac-4a56-b08f-c0f6a7eda775] Refreshing instance network info cache due to event network-changed-99943a77-7035-4bb2-9dfe-6103d3652711. 
{{(pid=67899) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 616.600928] env[67899]: DEBUG oslo_concurrency.lockutils [req-19ab0ff1-9629-40ef-82b2-e350bb73d960 req-6ffa4413-4b53-4edd-ab64-ed0f42dd18ac service nova] Acquiring lock "refresh_cache-91d5024f-9eac-4a56-b08f-c0f6a7eda775" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 616.600928] env[67899]: DEBUG oslo_concurrency.lockutils [req-19ab0ff1-9629-40ef-82b2-e350bb73d960 req-6ffa4413-4b53-4edd-ab64-ed0f42dd18ac service nova] Acquired lock "refresh_cache-91d5024f-9eac-4a56-b08f-c0f6a7eda775" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 616.600928] env[67899]: DEBUG nova.network.neutron [req-19ab0ff1-9629-40ef-82b2-e350bb73d960 req-6ffa4413-4b53-4edd-ab64-ed0f42dd18ac service nova] [instance: 91d5024f-9eac-4a56-b08f-c0f6a7eda775] Refreshing network info cache for port 99943a77-7035-4bb2-9dfe-6103d3652711 {{(pid=67899) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 617.710745] env[67899]: DEBUG nova.network.neutron [req-d5844946-e6c3-4d49-9a68-0d5a973df45d req-648ab501-a662-4fdc-aded-6f9eb0fcf587 service nova] [instance: 195a4a1e-3da7-4a69-a679-869346368195] Updated VIF entry in instance network info cache for port a2679f4f-910e-4df6-a70d-1f2d27ac9c89. {{(pid=67899) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 617.710745] env[67899]: DEBUG nova.network.neutron [req-d5844946-e6c3-4d49-9a68-0d5a973df45d req-648ab501-a662-4fdc-aded-6f9eb0fcf587 service nova] [instance: 195a4a1e-3da7-4a69-a679-869346368195] Updating instance_info_cache with network_info: [{"id": "a2679f4f-910e-4df6-a70d-1f2d27ac9c89", "address": "fa:16:3e:ec:49:f1", "network": {"id": "857be8e0-b3fa-4836-87d8-37b0af1d0354", "bridge": "br-int", "label": "tempest-ImagesTestJSON-566779850-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "a6ddbe6f15c6436197b1b073170d78cf", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "19598cc1-e105-4565-906a-09dde75e3fbe", "external-id": "nsx-vlan-transportzone-371", "segmentation_id": 371, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa2679f4f-91", "ovs_interfaceid": "a2679f4f-910e-4df6-a70d-1f2d27ac9c89", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 617.723456] env[67899]: DEBUG oslo_concurrency.lockutils [req-d5844946-e6c3-4d49-9a68-0d5a973df45d req-648ab501-a662-4fdc-aded-6f9eb0fcf587 service nova] Releasing lock "refresh_cache-195a4a1e-3da7-4a69-a679-869346368195" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 617.773321] env[67899]: DEBUG nova.network.neutron [req-19ab0ff1-9629-40ef-82b2-e350bb73d960 req-6ffa4413-4b53-4edd-ab64-ed0f42dd18ac service nova] [instance: 91d5024f-9eac-4a56-b08f-c0f6a7eda775] Updated VIF entry in 
instance network info cache for port 99943a77-7035-4bb2-9dfe-6103d3652711. {{(pid=67899) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 617.773321] env[67899]: DEBUG nova.network.neutron [req-19ab0ff1-9629-40ef-82b2-e350bb73d960 req-6ffa4413-4b53-4edd-ab64-ed0f42dd18ac service nova] [instance: 91d5024f-9eac-4a56-b08f-c0f6a7eda775] Updating instance_info_cache with network_info: [{"id": "99943a77-7035-4bb2-9dfe-6103d3652711", "address": "fa:16:3e:a7:52:d0", "network": {"id": "87a2c8e7-d332-4028-bf39-1da2df6ff034", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.245", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "c1aaa2970e964d7b86557399120d12c1", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "2180b40f-2bb0-47da-ba80-c2fbe7f98af0", "external-id": "nsx-vlan-transportzone-970", "segmentation_id": 970, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap99943a77-70", "ovs_interfaceid": "99943a77-7035-4bb2-9dfe-6103d3652711", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 617.782070] env[67899]: DEBUG oslo_concurrency.lockutils [req-19ab0ff1-9629-40ef-82b2-e350bb73d960 req-6ffa4413-4b53-4edd-ab64-ed0f42dd18ac service nova] Releasing lock "refresh_cache-91d5024f-9eac-4a56-b08f-c0f6a7eda775" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 619.324458] env[67899]: DEBUG nova.compute.manager [req-9d51a339-16ad-4c50-8e21-d37305423284 req-abe12eb4-04be-476e-91ae-25dc5c98b6de service nova] [instance: 7a19bcfd-5544-4688-8edb-e12c567979ae] Received event network-vif-plugged-e401ded8-39ef-421e-8ea4-d3c958ad9f58 {{(pid=67899) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 619.324458] env[67899]: DEBUG oslo_concurrency.lockutils [req-9d51a339-16ad-4c50-8e21-d37305423284 req-abe12eb4-04be-476e-91ae-25dc5c98b6de service nova] Acquiring lock "7a19bcfd-5544-4688-8edb-e12c567979ae-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 619.324458] env[67899]: DEBUG oslo_concurrency.lockutils [req-9d51a339-16ad-4c50-8e21-d37305423284 req-abe12eb4-04be-476e-91ae-25dc5c98b6de service nova] Lock "7a19bcfd-5544-4688-8edb-e12c567979ae-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 619.324458] env[67899]: DEBUG oslo_concurrency.lockutils [req-9d51a339-16ad-4c50-8e21-d37305423284 req-abe12eb4-04be-476e-91ae-25dc5c98b6de service nova] Lock "7a19bcfd-5544-4688-8edb-e12c567979ae-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 
619.329086] env[67899]: DEBUG nova.compute.manager [req-9d51a339-16ad-4c50-8e21-d37305423284 req-abe12eb4-04be-476e-91ae-25dc5c98b6de service nova] [instance: 7a19bcfd-5544-4688-8edb-e12c567979ae] No waiting events found dispatching network-vif-plugged-e401ded8-39ef-421e-8ea4-d3c958ad9f58 {{(pid=67899) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 619.329086] env[67899]: WARNING nova.compute.manager [req-9d51a339-16ad-4c50-8e21-d37305423284 req-abe12eb4-04be-476e-91ae-25dc5c98b6de service nova] [instance: 7a19bcfd-5544-4688-8edb-e12c567979ae] Received unexpected event network-vif-plugged-e401ded8-39ef-421e-8ea4-d3c958ad9f58 for instance with vm_state building and task_state spawning. [ 619.329086] env[67899]: DEBUG nova.compute.manager [req-9d51a339-16ad-4c50-8e21-d37305423284 req-abe12eb4-04be-476e-91ae-25dc5c98b6de service nova] [instance: 7a19bcfd-5544-4688-8edb-e12c567979ae] Received event network-changed-e401ded8-39ef-421e-8ea4-d3c958ad9f58 {{(pid=67899) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 619.329086] env[67899]: DEBUG nova.compute.manager [req-9d51a339-16ad-4c50-8e21-d37305423284 req-abe12eb4-04be-476e-91ae-25dc5c98b6de service nova] [instance: 7a19bcfd-5544-4688-8edb-e12c567979ae] Refreshing instance network info cache due to event network-changed-e401ded8-39ef-421e-8ea4-d3c958ad9f58. {{(pid=67899) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 619.329086] env[67899]: DEBUG oslo_concurrency.lockutils [req-9d51a339-16ad-4c50-8e21-d37305423284 req-abe12eb4-04be-476e-91ae-25dc5c98b6de service nova] Acquiring lock "refresh_cache-7a19bcfd-5544-4688-8edb-e12c567979ae" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 619.329412] env[67899]: DEBUG oslo_concurrency.lockutils [req-9d51a339-16ad-4c50-8e21-d37305423284 req-abe12eb4-04be-476e-91ae-25dc5c98b6de service nova] Acquired lock "refresh_cache-7a19bcfd-5544-4688-8edb-e12c567979ae" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 619.329412] env[67899]: DEBUG nova.network.neutron [req-9d51a339-16ad-4c50-8e21-d37305423284 req-abe12eb4-04be-476e-91ae-25dc5c98b6de service nova] [instance: 7a19bcfd-5544-4688-8edb-e12c567979ae] Refreshing network info cache for port e401ded8-39ef-421e-8ea4-d3c958ad9f58 {{(pid=67899) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 620.171802] env[67899]: DEBUG nova.network.neutron [req-9d51a339-16ad-4c50-8e21-d37305423284 req-abe12eb4-04be-476e-91ae-25dc5c98b6de service nova] [instance: 7a19bcfd-5544-4688-8edb-e12c567979ae] Updated VIF entry in instance network info cache for port e401ded8-39ef-421e-8ea4-d3c958ad9f58. 
{{(pid=67899) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 620.172167] env[67899]: DEBUG nova.network.neutron [req-9d51a339-16ad-4c50-8e21-d37305423284 req-abe12eb4-04be-476e-91ae-25dc5c98b6de service nova] [instance: 7a19bcfd-5544-4688-8edb-e12c567979ae] Updating instance_info_cache with network_info: [{"id": "e401ded8-39ef-421e-8ea4-d3c958ad9f58", "address": "fa:16:3e:57:a4:3f", "network": {"id": "ee680f55-4f5e-4b19-8b86-905c0110eca6", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-626501513-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "f54b1e55adf64b4c84219f033093c126", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ee9ce73d-4ee8-4b28-b7d3-3a5735039627", "external-id": "cl2-zone-465", "segmentation_id": 465, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape401ded8-39", "ovs_interfaceid": "e401ded8-39ef-421e-8ea4-d3c958ad9f58", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 620.184090] env[67899]: DEBUG oslo_concurrency.lockutils [req-9d51a339-16ad-4c50-8e21-d37305423284 req-abe12eb4-04be-476e-91ae-25dc5c98b6de service nova] Releasing lock "refresh_cache-7a19bcfd-5544-4688-8edb-e12c567979ae" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 621.144960] env[67899]: DEBUG oslo_concurrency.lockutils [None req-a5a53ea7-13aa-44a2-a12d-ac8f3fe9dd29 tempest-ImagesNegativeTestJSON-1293308713 tempest-ImagesNegativeTestJSON-1293308713-project-member] Acquiring lock "84cbacaa-08d2-4297-8777-150f433e4c04" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 621.145839] env[67899]: DEBUG oslo_concurrency.lockutils [None req-a5a53ea7-13aa-44a2-a12d-ac8f3fe9dd29 tempest-ImagesNegativeTestJSON-1293308713 tempest-ImagesNegativeTestJSON-1293308713-project-member] Lock "84cbacaa-08d2-4297-8777-150f433e4c04" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 621.183828] env[67899]: DEBUG nova.compute.manager [None req-a5a53ea7-13aa-44a2-a12d-ac8f3fe9dd29 tempest-ImagesNegativeTestJSON-1293308713 tempest-ImagesNegativeTestJSON-1293308713-project-member] [instance: 84cbacaa-08d2-4297-8777-150f433e4c04] Starting instance... 
{{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 621.272174] env[67899]: DEBUG oslo_concurrency.lockutils [None req-a5a53ea7-13aa-44a2-a12d-ac8f3fe9dd29 tempest-ImagesNegativeTestJSON-1293308713 tempest-ImagesNegativeTestJSON-1293308713-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 621.272174] env[67899]: DEBUG oslo_concurrency.lockutils [None req-a5a53ea7-13aa-44a2-a12d-ac8f3fe9dd29 tempest-ImagesNegativeTestJSON-1293308713 tempest-ImagesNegativeTestJSON-1293308713-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 621.277372] env[67899]: INFO nova.compute.claims [None req-a5a53ea7-13aa-44a2-a12d-ac8f3fe9dd29 tempest-ImagesNegativeTestJSON-1293308713 tempest-ImagesNegativeTestJSON-1293308713-project-member] [instance: 84cbacaa-08d2-4297-8777-150f433e4c04] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 621.497836] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6f406a43-8ad7-4cf4-a13b-3bb3d2d70da4 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 621.508596] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-78ef76dd-3741-4e60-976f-83aa46b1db06 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 621.557359] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4efb602e-c54b-4d35-ad5c-eef23b39867c {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 621.565815] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-95316a16-0909-4804-a2c0-500e170db50d {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 621.584864] env[67899]: DEBUG nova.compute.provider_tree [None req-a5a53ea7-13aa-44a2-a12d-ac8f3fe9dd29 tempest-ImagesNegativeTestJSON-1293308713 tempest-ImagesNegativeTestJSON-1293308713-project-member] Inventory has not changed in ProviderTree for provider: fffa0b42-f65d-4394-a98c-0df038b9ed4b {{(pid=67899) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 621.606834] env[67899]: DEBUG nova.scheduler.client.report [None req-a5a53ea7-13aa-44a2-a12d-ac8f3fe9dd29 tempest-ImagesNegativeTestJSON-1293308713 tempest-ImagesNegativeTestJSON-1293308713-project-member] Inventory has not changed for provider fffa0b42-f65d-4394-a98c-0df038b9ed4b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67899) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 621.630211] env[67899]: DEBUG oslo_concurrency.lockutils [None 
req-a5a53ea7-13aa-44a2-a12d-ac8f3fe9dd29 tempest-ImagesNegativeTestJSON-1293308713 tempest-ImagesNegativeTestJSON-1293308713-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.358s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 621.630645] env[67899]: DEBUG nova.compute.manager [None req-a5a53ea7-13aa-44a2-a12d-ac8f3fe9dd29 tempest-ImagesNegativeTestJSON-1293308713 tempest-ImagesNegativeTestJSON-1293308713-project-member] [instance: 84cbacaa-08d2-4297-8777-150f433e4c04] Start building networks asynchronously for instance. {{(pid=67899) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 621.678046] env[67899]: DEBUG nova.compute.utils [None req-a5a53ea7-13aa-44a2-a12d-ac8f3fe9dd29 tempest-ImagesNegativeTestJSON-1293308713 tempest-ImagesNegativeTestJSON-1293308713-project-member] Using /dev/sd instead of None {{(pid=67899) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 621.679922] env[67899]: DEBUG nova.compute.manager [None req-a5a53ea7-13aa-44a2-a12d-ac8f3fe9dd29 tempest-ImagesNegativeTestJSON-1293308713 tempest-ImagesNegativeTestJSON-1293308713-project-member] [instance: 84cbacaa-08d2-4297-8777-150f433e4c04] Allocating IP information in the background. {{(pid=67899) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 621.680120] env[67899]: DEBUG nova.network.neutron [None req-a5a53ea7-13aa-44a2-a12d-ac8f3fe9dd29 tempest-ImagesNegativeTestJSON-1293308713 tempest-ImagesNegativeTestJSON-1293308713-project-member] [instance: 84cbacaa-08d2-4297-8777-150f433e4c04] allocate_for_instance() {{(pid=67899) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 621.695012] env[67899]: DEBUG nova.compute.manager [None req-a5a53ea7-13aa-44a2-a12d-ac8f3fe9dd29 tempest-ImagesNegativeTestJSON-1293308713 tempest-ImagesNegativeTestJSON-1293308713-project-member] [instance: 84cbacaa-08d2-4297-8777-150f433e4c04] Start building block device mappings for instance. {{(pid=67899) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 621.790577] env[67899]: DEBUG nova.compute.manager [None req-a5a53ea7-13aa-44a2-a12d-ac8f3fe9dd29 tempest-ImagesNegativeTestJSON-1293308713 tempest-ImagesNegativeTestJSON-1293308713-project-member] [instance: 84cbacaa-08d2-4297-8777-150f433e4c04] Start spawning the instance on the hypervisor. 
{{(pid=67899) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 621.817311] env[67899]: DEBUG nova.virt.hardware [None req-a5a53ea7-13aa-44a2-a12d-ac8f3fe9dd29 tempest-ImagesNegativeTestJSON-1293308713 tempest-ImagesNegativeTestJSON-1293308713-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T22:07:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T22:07:13Z,direct_url=,disk_format='vmdk',id=c655a05a-4a40-4b3f-b609-3ba8116ad90f,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c1aaa2970e964d7b86557399120d12c1',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T22:07:14Z,virtual_size=,visibility=), allow threads: False {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 621.817859] env[67899]: DEBUG nova.virt.hardware [None req-a5a53ea7-13aa-44a2-a12d-ac8f3fe9dd29 tempest-ImagesNegativeTestJSON-1293308713 tempest-ImagesNegativeTestJSON-1293308713-project-member] Flavor limits 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 621.818059] env[67899]: DEBUG nova.virt.hardware [None req-a5a53ea7-13aa-44a2-a12d-ac8f3fe9dd29 tempest-ImagesNegativeTestJSON-1293308713 tempest-ImagesNegativeTestJSON-1293308713-project-member] Image limits 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 621.818524] env[67899]: DEBUG nova.virt.hardware [None req-a5a53ea7-13aa-44a2-a12d-ac8f3fe9dd29 tempest-ImagesNegativeTestJSON-1293308713 tempest-ImagesNegativeTestJSON-1293308713-project-member] Flavor pref 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 621.818524] env[67899]: DEBUG nova.virt.hardware [None req-a5a53ea7-13aa-44a2-a12d-ac8f3fe9dd29 tempest-ImagesNegativeTestJSON-1293308713 tempest-ImagesNegativeTestJSON-1293308713-project-member] Image pref 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 621.818524] env[67899]: DEBUG nova.virt.hardware [None req-a5a53ea7-13aa-44a2-a12d-ac8f3fe9dd29 tempest-ImagesNegativeTestJSON-1293308713 tempest-ImagesNegativeTestJSON-1293308713-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 621.818950] env[67899]: DEBUG nova.virt.hardware [None req-a5a53ea7-13aa-44a2-a12d-ac8f3fe9dd29 tempest-ImagesNegativeTestJSON-1293308713 tempest-ImagesNegativeTestJSON-1293308713-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 621.819142] env[67899]: DEBUG nova.virt.hardware [None req-a5a53ea7-13aa-44a2-a12d-ac8f3fe9dd29 tempest-ImagesNegativeTestJSON-1293308713 tempest-ImagesNegativeTestJSON-1293308713-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67899) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 621.819319] env[67899]: DEBUG nova.virt.hardware [None 
req-a5a53ea7-13aa-44a2-a12d-ac8f3fe9dd29 tempest-ImagesNegativeTestJSON-1293308713 tempest-ImagesNegativeTestJSON-1293308713-project-member] Got 1 possible topologies {{(pid=67899) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 621.819657] env[67899]: DEBUG nova.virt.hardware [None req-a5a53ea7-13aa-44a2-a12d-ac8f3fe9dd29 tempest-ImagesNegativeTestJSON-1293308713 tempest-ImagesNegativeTestJSON-1293308713-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 621.819853] env[67899]: DEBUG nova.virt.hardware [None req-a5a53ea7-13aa-44a2-a12d-ac8f3fe9dd29 tempest-ImagesNegativeTestJSON-1293308713 tempest-ImagesNegativeTestJSON-1293308713-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 621.821110] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-20f0aa63-e63e-4d8b-8de2-3164cd94d2cc {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 621.831048] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9b5d7b6b-d17d-46d4-a140-d965d52d42ec {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 622.041832] env[67899]: DEBUG nova.policy [None req-a5a53ea7-13aa-44a2-a12d-ac8f3fe9dd29 tempest-ImagesNegativeTestJSON-1293308713 tempest-ImagesNegativeTestJSON-1293308713-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c8a53f3f69eb4bb6aed6ada53b563c5c', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '973f7550e4e84a51b7bde37eadcb6d38', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67899) authorize /opt/stack/nova/nova/policy.py:203}} [ 623.525049] env[67899]: DEBUG nova.network.neutron [None req-a5a53ea7-13aa-44a2-a12d-ac8f3fe9dd29 tempest-ImagesNegativeTestJSON-1293308713 tempest-ImagesNegativeTestJSON-1293308713-project-member] [instance: 84cbacaa-08d2-4297-8777-150f433e4c04] Successfully created port: 45ead234-af7b-4897-824f-9fdc82e9c69e {{(pid=67899) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 624.941568] env[67899]: DEBUG oslo_concurrency.lockutils [None req-4613260e-d1e4-4cb4-8d4e-d3da4a59a597 tempest-ServersTestMultiNic-1718995190 tempest-ServersTestMultiNic-1718995190-project-member] Acquiring lock "c29ae4c5-cc93-480c-8d60-96f6acba4346" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 624.941889] env[67899]: DEBUG oslo_concurrency.lockutils [None req-4613260e-d1e4-4cb4-8d4e-d3da4a59a597 tempest-ServersTestMultiNic-1718995190 tempest-ServersTestMultiNic-1718995190-project-member] Lock "c29ae4c5-cc93-480c-8d60-96f6acba4346" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67899) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 624.960033] env[67899]: DEBUG nova.compute.manager [None req-4613260e-d1e4-4cb4-8d4e-d3da4a59a597 tempest-ServersTestMultiNic-1718995190 tempest-ServersTestMultiNic-1718995190-project-member] [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] Starting instance... {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 625.037406] env[67899]: DEBUG oslo_concurrency.lockutils [None req-4613260e-d1e4-4cb4-8d4e-d3da4a59a597 tempest-ServersTestMultiNic-1718995190 tempest-ServersTestMultiNic-1718995190-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 625.037992] env[67899]: DEBUG oslo_concurrency.lockutils [None req-4613260e-d1e4-4cb4-8d4e-d3da4a59a597 tempest-ServersTestMultiNic-1718995190 tempest-ServersTestMultiNic-1718995190-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 625.039982] env[67899]: INFO nova.compute.claims [None req-4613260e-d1e4-4cb4-8d4e-d3da4a59a597 tempest-ServersTestMultiNic-1718995190 tempest-ServersTestMultiNic-1718995190-project-member] [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 625.285019] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-98dac842-a0ff-4b7c-b0b2-6152f648de10 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 625.295303] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-63ab14c0-ffd2-47c2-a62b-2a9986433135 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 625.344219] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cdf5bbac-d0af-4a98-be3b-1f118bf3d22c {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 625.356730] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ee1bf458-012f-4768-84db-de652025218b {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 625.928185] env[67899]: DEBUG nova.compute.provider_tree [None req-4613260e-d1e4-4cb4-8d4e-d3da4a59a597 tempest-ServersTestMultiNic-1718995190 tempest-ServersTestMultiNic-1718995190-project-member] Inventory has not changed in ProviderTree for provider: fffa0b42-f65d-4394-a98c-0df038b9ed4b {{(pid=67899) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 625.930113] env[67899]: DEBUG nova.network.neutron [None req-a5a53ea7-13aa-44a2-a12d-ac8f3fe9dd29 tempest-ImagesNegativeTestJSON-1293308713 tempest-ImagesNegativeTestJSON-1293308713-project-member] [instance: 84cbacaa-08d2-4297-8777-150f433e4c04] Successfully updated port: 45ead234-af7b-4897-824f-9fdc82e9c69e {{(pid=67899) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 625.943855] env[67899]: DEBUG nova.scheduler.client.report [None req-4613260e-d1e4-4cb4-8d4e-d3da4a59a597 
tempest-ServersTestMultiNic-1718995190 tempest-ServersTestMultiNic-1718995190-project-member] Inventory has not changed for provider fffa0b42-f65d-4394-a98c-0df038b9ed4b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67899) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 625.947159] env[67899]: DEBUG oslo_concurrency.lockutils [None req-a5a53ea7-13aa-44a2-a12d-ac8f3fe9dd29 tempest-ImagesNegativeTestJSON-1293308713 tempest-ImagesNegativeTestJSON-1293308713-project-member] Acquiring lock "refresh_cache-84cbacaa-08d2-4297-8777-150f433e4c04" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 625.947388] env[67899]: DEBUG oslo_concurrency.lockutils [None req-a5a53ea7-13aa-44a2-a12d-ac8f3fe9dd29 tempest-ImagesNegativeTestJSON-1293308713 tempest-ImagesNegativeTestJSON-1293308713-project-member] Acquired lock "refresh_cache-84cbacaa-08d2-4297-8777-150f433e4c04" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 625.947481] env[67899]: DEBUG nova.network.neutron [None req-a5a53ea7-13aa-44a2-a12d-ac8f3fe9dd29 tempest-ImagesNegativeTestJSON-1293308713 tempest-ImagesNegativeTestJSON-1293308713-project-member] [instance: 84cbacaa-08d2-4297-8777-150f433e4c04] Building network info cache for instance {{(pid=67899) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 625.966296] env[67899]: DEBUG oslo_concurrency.lockutils [None req-4613260e-d1e4-4cb4-8d4e-d3da4a59a597 tempest-ServersTestMultiNic-1718995190 tempest-ServersTestMultiNic-1718995190-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.929s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 625.966813] env[67899]: DEBUG nova.compute.manager [None req-4613260e-d1e4-4cb4-8d4e-d3da4a59a597 tempest-ServersTestMultiNic-1718995190 tempest-ServersTestMultiNic-1718995190-project-member] [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] Start building networks asynchronously for instance. {{(pid=67899) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 626.017437] env[67899]: DEBUG nova.compute.utils [None req-4613260e-d1e4-4cb4-8d4e-d3da4a59a597 tempest-ServersTestMultiNic-1718995190 tempest-ServersTestMultiNic-1718995190-project-member] Using /dev/sd instead of None {{(pid=67899) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 626.017805] env[67899]: DEBUG nova.compute.manager [None req-4613260e-d1e4-4cb4-8d4e-d3da4a59a597 tempest-ServersTestMultiNic-1718995190 tempest-ServersTestMultiNic-1718995190-project-member] [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] Allocating IP information in the background. 
{{(pid=67899) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 626.017975] env[67899]: DEBUG nova.network.neutron [None req-4613260e-d1e4-4cb4-8d4e-d3da4a59a597 tempest-ServersTestMultiNic-1718995190 tempest-ServersTestMultiNic-1718995190-project-member] [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] allocate_for_instance() {{(pid=67899) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 626.030561] env[67899]: DEBUG nova.compute.manager [None req-4613260e-d1e4-4cb4-8d4e-d3da4a59a597 tempest-ServersTestMultiNic-1718995190 tempest-ServersTestMultiNic-1718995190-project-member] [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] Start building block device mappings for instance. {{(pid=67899) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 626.091770] env[67899]: DEBUG nova.network.neutron [None req-a5a53ea7-13aa-44a2-a12d-ac8f3fe9dd29 tempest-ImagesNegativeTestJSON-1293308713 tempest-ImagesNegativeTestJSON-1293308713-project-member] [instance: 84cbacaa-08d2-4297-8777-150f433e4c04] Instance cache missing network info. {{(pid=67899) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 626.133164] env[67899]: DEBUG nova.compute.manager [None req-4613260e-d1e4-4cb4-8d4e-d3da4a59a597 tempest-ServersTestMultiNic-1718995190 tempest-ServersTestMultiNic-1718995190-project-member] [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] Start spawning the instance on the hypervisor. {{(pid=67899) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 626.169119] env[67899]: DEBUG nova.virt.hardware [None req-4613260e-d1e4-4cb4-8d4e-d3da4a59a597 tempest-ServersTestMultiNic-1718995190 tempest-ServersTestMultiNic-1718995190-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T22:07:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T22:07:13Z,direct_url=,disk_format='vmdk',id=c655a05a-4a40-4b3f-b609-3ba8116ad90f,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c1aaa2970e964d7b86557399120d12c1',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T22:07:14Z,virtual_size=,visibility=), allow threads: False {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 626.169280] env[67899]: DEBUG nova.virt.hardware [None req-4613260e-d1e4-4cb4-8d4e-d3da4a59a597 tempest-ServersTestMultiNic-1718995190 tempest-ServersTestMultiNic-1718995190-project-member] Flavor limits 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 626.169428] env[67899]: DEBUG nova.virt.hardware [None req-4613260e-d1e4-4cb4-8d4e-d3da4a59a597 tempest-ServersTestMultiNic-1718995190 tempest-ServersTestMultiNic-1718995190-project-member] Image limits 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 626.169852] env[67899]: DEBUG nova.virt.hardware [None req-4613260e-d1e4-4cb4-8d4e-d3da4a59a597 tempest-ServersTestMultiNic-1718995190 tempest-ServersTestMultiNic-1718995190-project-member] Flavor pref 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 626.169852] env[67899]: 
DEBUG nova.virt.hardware [None req-4613260e-d1e4-4cb4-8d4e-d3da4a59a597 tempest-ServersTestMultiNic-1718995190 tempest-ServersTestMultiNic-1718995190-project-member] Image pref 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 626.169852] env[67899]: DEBUG nova.virt.hardware [None req-4613260e-d1e4-4cb4-8d4e-d3da4a59a597 tempest-ServersTestMultiNic-1718995190 tempest-ServersTestMultiNic-1718995190-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 626.170309] env[67899]: DEBUG nova.virt.hardware [None req-4613260e-d1e4-4cb4-8d4e-d3da4a59a597 tempest-ServersTestMultiNic-1718995190 tempest-ServersTestMultiNic-1718995190-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 626.170309] env[67899]: DEBUG nova.virt.hardware [None req-4613260e-d1e4-4cb4-8d4e-d3da4a59a597 tempest-ServersTestMultiNic-1718995190 tempest-ServersTestMultiNic-1718995190-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67899) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 626.170464] env[67899]: DEBUG nova.virt.hardware [None req-4613260e-d1e4-4cb4-8d4e-d3da4a59a597 tempest-ServersTestMultiNic-1718995190 tempest-ServersTestMultiNic-1718995190-project-member] Got 1 possible topologies {{(pid=67899) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 626.170570] env[67899]: DEBUG nova.virt.hardware [None req-4613260e-d1e4-4cb4-8d4e-d3da4a59a597 tempest-ServersTestMultiNic-1718995190 tempest-ServersTestMultiNic-1718995190-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 626.170739] env[67899]: DEBUG nova.virt.hardware [None req-4613260e-d1e4-4cb4-8d4e-d3da4a59a597 tempest-ServersTestMultiNic-1718995190 tempest-ServersTestMultiNic-1718995190-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 626.172237] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-710e5b2e-dcda-4580-9196-f9bfb207d02c {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 626.181732] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e354ad7d-5540-4dc8-aff5-a9e7f94cfbb5 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 626.255594] env[67899]: DEBUG nova.policy [None req-4613260e-d1e4-4cb4-8d4e-d3da4a59a597 tempest-ServersTestMultiNic-1718995190 tempest-ServersTestMultiNic-1718995190-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '01878dd5bd734b4ab4dfc6eb6b19dbc7', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '970e538ebd0844ddaace0fc9e294f283', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 
'service_project_domain_id': None, 'service_roles': []} {{(pid=67899) authorize /opt/stack/nova/nova/policy.py:203}} [ 626.629900] env[67899]: DEBUG nova.compute.manager [req-9b82a891-b802-45b4-9b3e-53e5777f8062 req-c1565dca-99e6-4cf3-86b9-27804f96a0c7 service nova] [instance: 84cbacaa-08d2-4297-8777-150f433e4c04] Received event network-vif-plugged-45ead234-af7b-4897-824f-9fdc82e9c69e {{(pid=67899) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 626.634729] env[67899]: DEBUG oslo_concurrency.lockutils [req-9b82a891-b802-45b4-9b3e-53e5777f8062 req-c1565dca-99e6-4cf3-86b9-27804f96a0c7 service nova] Acquiring lock "84cbacaa-08d2-4297-8777-150f433e4c04-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 626.634729] env[67899]: DEBUG oslo_concurrency.lockutils [req-9b82a891-b802-45b4-9b3e-53e5777f8062 req-c1565dca-99e6-4cf3-86b9-27804f96a0c7 service nova] Lock "84cbacaa-08d2-4297-8777-150f433e4c04-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.003s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 626.634729] env[67899]: DEBUG oslo_concurrency.lockutils [req-9b82a891-b802-45b4-9b3e-53e5777f8062 req-c1565dca-99e6-4cf3-86b9-27804f96a0c7 service nova] Lock "84cbacaa-08d2-4297-8777-150f433e4c04-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 626.634729] env[67899]: DEBUG nova.compute.manager [req-9b82a891-b802-45b4-9b3e-53e5777f8062 req-c1565dca-99e6-4cf3-86b9-27804f96a0c7 service nova] [instance: 84cbacaa-08d2-4297-8777-150f433e4c04] No waiting events found dispatching network-vif-plugged-45ead234-af7b-4897-824f-9fdc82e9c69e {{(pid=67899) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 626.635793] env[67899]: WARNING nova.compute.manager [req-9b82a891-b802-45b4-9b3e-53e5777f8062 req-c1565dca-99e6-4cf3-86b9-27804f96a0c7 service nova] [instance: 84cbacaa-08d2-4297-8777-150f433e4c04] Received unexpected event network-vif-plugged-45ead234-af7b-4897-824f-9fdc82e9c69e for instance with vm_state building and task_state spawning. 
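(Annotation, not part of the captured log: the 'Acquiring lock ... / acquired ... :: waited / "released" ... :: held' DEBUG lines above are emitted by oslo.concurrency's lockutils wrapper around a named critical section. A minimal Python sketch of that pattern, assuming only the public lockutils API; the lock name and function body are placeholders, not Nova's actual event-dispatch code.)

from oslo_concurrency import lockutils

# Hypothetical lock name; the log above shows Nova using the pattern
# "<instance-uuid>-events" so event dispatch is serialized per instance
# without blocking work on other instances.
@lockutils.synchronized('example-instance-uuid-events')
def _pop_event():
    # Runs inside the critical section. The lockutils wrapper logs the
    # "Acquiring lock ...", 'acquired ... :: waited Ns' and
    # '"released" ... :: held Ns' DEBUG lines around this call.
    pass

_pop_event()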
[ 626.979352] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3d107220-4fe1-4bd4-b86d-3bd578f6f6b3 tempest-VolumesAssistedSnapshotsTest-398519601 tempest-VolumesAssistedSnapshotsTest-398519601-project-member] Acquiring lock "9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 626.979691] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3d107220-4fe1-4bd4-b86d-3bd578f6f6b3 tempest-VolumesAssistedSnapshotsTest-398519601 tempest-VolumesAssistedSnapshotsTest-398519601-project-member] Lock "9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 626.998198] env[67899]: DEBUG nova.compute.manager [None req-3d107220-4fe1-4bd4-b86d-3bd578f6f6b3 tempest-VolumesAssistedSnapshotsTest-398519601 tempest-VolumesAssistedSnapshotsTest-398519601-project-member] [instance: 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda] Starting instance... {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 627.045541] env[67899]: DEBUG nova.network.neutron [None req-a5a53ea7-13aa-44a2-a12d-ac8f3fe9dd29 tempest-ImagesNegativeTestJSON-1293308713 tempest-ImagesNegativeTestJSON-1293308713-project-member] [instance: 84cbacaa-08d2-4297-8777-150f433e4c04] Updating instance_info_cache with network_info: [{"id": "45ead234-af7b-4897-824f-9fdc82e9c69e", "address": "fa:16:3e:66:01:45", "network": {"id": "84e921e8-f0fc-4f86-9f9d-605c67ed429f", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-317458798-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "973f7550e4e84a51b7bde37eadcb6d38", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "52f465cb-7418-4172-bd7d-aec00abeb692", "external-id": "nsx-vlan-transportzone-895", "segmentation_id": 895, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap45ead234-af", "ovs_interfaceid": "45ead234-af7b-4897-824f-9fdc82e9c69e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 627.069495] env[67899]: DEBUG oslo_concurrency.lockutils [None req-a5a53ea7-13aa-44a2-a12d-ac8f3fe9dd29 tempest-ImagesNegativeTestJSON-1293308713 tempest-ImagesNegativeTestJSON-1293308713-project-member] Releasing lock "refresh_cache-84cbacaa-08d2-4297-8777-150f433e4c04" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 627.069841] env[67899]: DEBUG nova.compute.manager [None req-a5a53ea7-13aa-44a2-a12d-ac8f3fe9dd29 tempest-ImagesNegativeTestJSON-1293308713 tempest-ImagesNegativeTestJSON-1293308713-project-member] [instance: 84cbacaa-08d2-4297-8777-150f433e4c04] Instance 
network_info: |[{"id": "45ead234-af7b-4897-824f-9fdc82e9c69e", "address": "fa:16:3e:66:01:45", "network": {"id": "84e921e8-f0fc-4f86-9f9d-605c67ed429f", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-317458798-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "973f7550e4e84a51b7bde37eadcb6d38", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "52f465cb-7418-4172-bd7d-aec00abeb692", "external-id": "nsx-vlan-transportzone-895", "segmentation_id": 895, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap45ead234-af", "ovs_interfaceid": "45ead234-af7b-4897-824f-9fdc82e9c69e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67899) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 627.070658] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-a5a53ea7-13aa-44a2-a12d-ac8f3fe9dd29 tempest-ImagesNegativeTestJSON-1293308713 tempest-ImagesNegativeTestJSON-1293308713-project-member] [instance: 84cbacaa-08d2-4297-8777-150f433e4c04] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:66:01:45', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '52f465cb-7418-4172-bd7d-aec00abeb692', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '45ead234-af7b-4897-824f-9fdc82e9c69e', 'vif_model': 'vmxnet3'}] {{(pid=67899) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 627.086781] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [None req-a5a53ea7-13aa-44a2-a12d-ac8f3fe9dd29 tempest-ImagesNegativeTestJSON-1293308713 tempest-ImagesNegativeTestJSON-1293308713-project-member] Creating folder: Project (973f7550e4e84a51b7bde37eadcb6d38). Parent ref: group-v692900. {{(pid=67899) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 627.087832] env[67899]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-4bb61c04-d097-440e-b184-4896aa8fbe60 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 627.106963] env[67899]: INFO nova.virt.vmwareapi.vm_util [None req-a5a53ea7-13aa-44a2-a12d-ac8f3fe9dd29 tempest-ImagesNegativeTestJSON-1293308713 tempest-ImagesNegativeTestJSON-1293308713-project-member] Created folder: Project (973f7550e4e84a51b7bde37eadcb6d38) in parent group-v692900. [ 627.108204] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [None req-a5a53ea7-13aa-44a2-a12d-ac8f3fe9dd29 tempest-ImagesNegativeTestJSON-1293308713 tempest-ImagesNegativeTestJSON-1293308713-project-member] Creating folder: Instances. Parent ref: group-v692919. 
{{(pid=67899) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 627.108828] env[67899]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-75d9c7d4-eb7d-441e-8dc5-849e31b5b510 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 627.116105] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3d107220-4fe1-4bd4-b86d-3bd578f6f6b3 tempest-VolumesAssistedSnapshotsTest-398519601 tempest-VolumesAssistedSnapshotsTest-398519601-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 627.116381] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3d107220-4fe1-4bd4-b86d-3bd578f6f6b3 tempest-VolumesAssistedSnapshotsTest-398519601 tempest-VolumesAssistedSnapshotsTest-398519601-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 627.118053] env[67899]: INFO nova.compute.claims [None req-3d107220-4fe1-4bd4-b86d-3bd578f6f6b3 tempest-VolumesAssistedSnapshotsTest-398519601 tempest-VolumesAssistedSnapshotsTest-398519601-project-member] [instance: 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 627.122295] env[67899]: INFO nova.virt.vmwareapi.vm_util [None req-a5a53ea7-13aa-44a2-a12d-ac8f3fe9dd29 tempest-ImagesNegativeTestJSON-1293308713 tempest-ImagesNegativeTestJSON-1293308713-project-member] Created folder: Instances in parent group-v692919. [ 627.122857] env[67899]: DEBUG oslo.service.loopingcall [None req-a5a53ea7-13aa-44a2-a12d-ac8f3fe9dd29 tempest-ImagesNegativeTestJSON-1293308713 tempest-ImagesNegativeTestJSON-1293308713-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67899) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 627.122857] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 84cbacaa-08d2-4297-8777-150f433e4c04] Creating VM on the ESX host {{(pid=67899) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 627.122973] env[67899]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-328bb688-d883-4a6e-9ba4-e371b63e1b91 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 627.146023] env[67899]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 627.146023] env[67899]: value = "task-3467833" [ 627.146023] env[67899]: _type = "Task" [ 627.146023] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 627.153246] env[67899]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467833, 'name': CreateVM_Task} progress is 0%. 
{{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 627.399115] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e73e4ab4-e079-47a0-a5e0-8d12548cb3a1 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 627.411754] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1bde737e-ff3e-45b1-8d04-59cbbf5ec8f9 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 627.451015] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-884f987b-5e03-4b75-bba5-7e187cc32fc4 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 627.459893] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7fa5b2fb-2654-44aa-87df-89c655dcd732 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 627.475249] env[67899]: DEBUG nova.compute.provider_tree [None req-3d107220-4fe1-4bd4-b86d-3bd578f6f6b3 tempest-VolumesAssistedSnapshotsTest-398519601 tempest-VolumesAssistedSnapshotsTest-398519601-project-member] Inventory has not changed in ProviderTree for provider: fffa0b42-f65d-4394-a98c-0df038b9ed4b {{(pid=67899) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 627.485931] env[67899]: DEBUG nova.scheduler.client.report [None req-3d107220-4fe1-4bd4-b86d-3bd578f6f6b3 tempest-VolumesAssistedSnapshotsTest-398519601 tempest-VolumesAssistedSnapshotsTest-398519601-project-member] Inventory has not changed for provider fffa0b42-f65d-4394-a98c-0df038b9ed4b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67899) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 627.490205] env[67899]: DEBUG nova.network.neutron [None req-4613260e-d1e4-4cb4-8d4e-d3da4a59a597 tempest-ServersTestMultiNic-1718995190 tempest-ServersTestMultiNic-1718995190-project-member] [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] Successfully created port: da099c05-3a93-40aa-80ed-e692d53dc2ad {{(pid=67899) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 627.506374] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3d107220-4fe1-4bd4-b86d-3bd578f6f6b3 tempest-VolumesAssistedSnapshotsTest-398519601 tempest-VolumesAssistedSnapshotsTest-398519601-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.390s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 627.506881] env[67899]: DEBUG nova.compute.manager [None req-3d107220-4fe1-4bd4-b86d-3bd578f6f6b3 tempest-VolumesAssistedSnapshotsTest-398519601 tempest-VolumesAssistedSnapshotsTest-398519601-project-member] [instance: 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda] Start building networks asynchronously for instance. 
{{(pid=67899) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 627.583717] env[67899]: DEBUG nova.compute.utils [None req-3d107220-4fe1-4bd4-b86d-3bd578f6f6b3 tempest-VolumesAssistedSnapshotsTest-398519601 tempest-VolumesAssistedSnapshotsTest-398519601-project-member] Using /dev/sd instead of None {{(pid=67899) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 627.589591] env[67899]: DEBUG nova.compute.manager [None req-3d107220-4fe1-4bd4-b86d-3bd578f6f6b3 tempest-VolumesAssistedSnapshotsTest-398519601 tempest-VolumesAssistedSnapshotsTest-398519601-project-member] [instance: 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda] Allocating IP information in the background. {{(pid=67899) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 627.589591] env[67899]: DEBUG nova.network.neutron [None req-3d107220-4fe1-4bd4-b86d-3bd578f6f6b3 tempest-VolumesAssistedSnapshotsTest-398519601 tempest-VolumesAssistedSnapshotsTest-398519601-project-member] [instance: 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda] allocate_for_instance() {{(pid=67899) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 627.598779] env[67899]: DEBUG nova.compute.manager [None req-3d107220-4fe1-4bd4-b86d-3bd578f6f6b3 tempest-VolumesAssistedSnapshotsTest-398519601 tempest-VolumesAssistedSnapshotsTest-398519601-project-member] [instance: 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda] Start building block device mappings for instance. {{(pid=67899) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 627.659940] env[67899]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467833, 'name': CreateVM_Task, 'duration_secs': 0.381855} completed successfully. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 627.660835] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 84cbacaa-08d2-4297-8777-150f433e4c04] Created VM on the ESX host {{(pid=67899) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 627.665501] env[67899]: DEBUG oslo_concurrency.lockutils [None req-a5a53ea7-13aa-44a2-a12d-ac8f3fe9dd29 tempest-ImagesNegativeTestJSON-1293308713 tempest-ImagesNegativeTestJSON-1293308713-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 627.665667] env[67899]: DEBUG oslo_concurrency.lockutils [None req-a5a53ea7-13aa-44a2-a12d-ac8f3fe9dd29 tempest-ImagesNegativeTestJSON-1293308713 tempest-ImagesNegativeTestJSON-1293308713-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 627.667311] env[67899]: DEBUG oslo_concurrency.lockutils [None req-a5a53ea7-13aa-44a2-a12d-ac8f3fe9dd29 tempest-ImagesNegativeTestJSON-1293308713 tempest-ImagesNegativeTestJSON-1293308713-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 627.669164] env[67899]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-8d5ac7f2-46a6-45d7-b7d5-828b47cde447 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 627.677331] env[67899]: DEBUG 
oslo_vmware.api [None req-a5a53ea7-13aa-44a2-a12d-ac8f3fe9dd29 tempest-ImagesNegativeTestJSON-1293308713 tempest-ImagesNegativeTestJSON-1293308713-project-member] Waiting for the task: (returnval){ [ 627.677331] env[67899]: value = "session[5298859c-589e-10f3-4f70-bf47a7ea371b]52e0018e-12f6-09b6-79cb-6a2ba354655a" [ 627.677331] env[67899]: _type = "Task" [ 627.677331] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 627.690690] env[67899]: DEBUG oslo_concurrency.lockutils [None req-a5a53ea7-13aa-44a2-a12d-ac8f3fe9dd29 tempest-ImagesNegativeTestJSON-1293308713 tempest-ImagesNegativeTestJSON-1293308713-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 627.690942] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-a5a53ea7-13aa-44a2-a12d-ac8f3fe9dd29 tempest-ImagesNegativeTestJSON-1293308713 tempest-ImagesNegativeTestJSON-1293308713-project-member] [instance: 84cbacaa-08d2-4297-8777-150f433e4c04] Processing image c655a05a-4a40-4b3f-b609-3ba8116ad90f {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 627.691152] env[67899]: DEBUG oslo_concurrency.lockutils [None req-a5a53ea7-13aa-44a2-a12d-ac8f3fe9dd29 tempest-ImagesNegativeTestJSON-1293308713 tempest-ImagesNegativeTestJSON-1293308713-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 627.713807] env[67899]: DEBUG nova.compute.manager [None req-3d107220-4fe1-4bd4-b86d-3bd578f6f6b3 tempest-VolumesAssistedSnapshotsTest-398519601 tempest-VolumesAssistedSnapshotsTest-398519601-project-member] [instance: 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda] Start spawning the instance on the hypervisor. 
{{(pid=67899) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 627.748270] env[67899]: DEBUG nova.virt.hardware [None req-3d107220-4fe1-4bd4-b86d-3bd578f6f6b3 tempest-VolumesAssistedSnapshotsTest-398519601 tempest-VolumesAssistedSnapshotsTest-398519601-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T22:07:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T22:07:13Z,direct_url=,disk_format='vmdk',id=c655a05a-4a40-4b3f-b609-3ba8116ad90f,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c1aaa2970e964d7b86557399120d12c1',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T22:07:14Z,virtual_size=,visibility=), allow threads: False {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 627.748270] env[67899]: DEBUG nova.virt.hardware [None req-3d107220-4fe1-4bd4-b86d-3bd578f6f6b3 tempest-VolumesAssistedSnapshotsTest-398519601 tempest-VolumesAssistedSnapshotsTest-398519601-project-member] Flavor limits 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 627.748270] env[67899]: DEBUG nova.virt.hardware [None req-3d107220-4fe1-4bd4-b86d-3bd578f6f6b3 tempest-VolumesAssistedSnapshotsTest-398519601 tempest-VolumesAssistedSnapshotsTest-398519601-project-member] Image limits 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 627.748579] env[67899]: DEBUG nova.virt.hardware [None req-3d107220-4fe1-4bd4-b86d-3bd578f6f6b3 tempest-VolumesAssistedSnapshotsTest-398519601 tempest-VolumesAssistedSnapshotsTest-398519601-project-member] Flavor pref 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 627.748579] env[67899]: DEBUG nova.virt.hardware [None req-3d107220-4fe1-4bd4-b86d-3bd578f6f6b3 tempest-VolumesAssistedSnapshotsTest-398519601 tempest-VolumesAssistedSnapshotsTest-398519601-project-member] Image pref 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 627.748579] env[67899]: DEBUG nova.virt.hardware [None req-3d107220-4fe1-4bd4-b86d-3bd578f6f6b3 tempest-VolumesAssistedSnapshotsTest-398519601 tempest-VolumesAssistedSnapshotsTest-398519601-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 627.748579] env[67899]: DEBUG nova.virt.hardware [None req-3d107220-4fe1-4bd4-b86d-3bd578f6f6b3 tempest-VolumesAssistedSnapshotsTest-398519601 tempest-VolumesAssistedSnapshotsTest-398519601-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 627.748579] env[67899]: DEBUG nova.virt.hardware [None req-3d107220-4fe1-4bd4-b86d-3bd578f6f6b3 tempest-VolumesAssistedSnapshotsTest-398519601 tempest-VolumesAssistedSnapshotsTest-398519601-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67899) _get_possible_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:471}} [ 627.748744] env[67899]: DEBUG nova.virt.hardware [None req-3d107220-4fe1-4bd4-b86d-3bd578f6f6b3 tempest-VolumesAssistedSnapshotsTest-398519601 tempest-VolumesAssistedSnapshotsTest-398519601-project-member] Got 1 possible topologies {{(pid=67899) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 627.748744] env[67899]: DEBUG nova.virt.hardware [None req-3d107220-4fe1-4bd4-b86d-3bd578f6f6b3 tempest-VolumesAssistedSnapshotsTest-398519601 tempest-VolumesAssistedSnapshotsTest-398519601-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 627.748744] env[67899]: DEBUG nova.virt.hardware [None req-3d107220-4fe1-4bd4-b86d-3bd578f6f6b3 tempest-VolumesAssistedSnapshotsTest-398519601 tempest-VolumesAssistedSnapshotsTest-398519601-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 627.749716] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e01c2d5e-a0c0-4cd5-9652-a6ce754dfeeb {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 627.759762] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ed89103b-6670-415d-a0aa-b32874b48fd0 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 627.800851] env[67899]: DEBUG nova.policy [None req-3d107220-4fe1-4bd4-b86d-3bd578f6f6b3 tempest-VolumesAssistedSnapshotsTest-398519601 tempest-VolumesAssistedSnapshotsTest-398519601-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '99659fcfaa4b41afb80defbf9b92b66e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '2745a593bf56467f906afcddcfa13182', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67899) authorize /opt/stack/nova/nova/policy.py:203}} [ 628.690973] env[67899]: DEBUG oslo_concurrency.lockutils [None req-1a665adc-0197-431c-b7d3-cbe5f770e3d4 tempest-FloatingIPsAssociationTestJSON-1366579974 tempest-FloatingIPsAssociationTestJSON-1366579974-project-member] Acquiring lock "913c5652-c8af-41a8-94f1-c0eba08aacdd" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 628.691665] env[67899]: DEBUG oslo_concurrency.lockutils [None req-1a665adc-0197-431c-b7d3-cbe5f770e3d4 tempest-FloatingIPsAssociationTestJSON-1366579974 tempest-FloatingIPsAssociationTestJSON-1366579974-project-member] Lock "913c5652-c8af-41a8-94f1-c0eba08aacdd" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 628.707740] env[67899]: DEBUG nova.compute.manager [None req-1a665adc-0197-431c-b7d3-cbe5f770e3d4 tempest-FloatingIPsAssociationTestJSON-1366579974 
tempest-FloatingIPsAssociationTestJSON-1366579974-project-member] [instance: 913c5652-c8af-41a8-94f1-c0eba08aacdd] Starting instance... {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 628.796457] env[67899]: DEBUG oslo_concurrency.lockutils [None req-1a665adc-0197-431c-b7d3-cbe5f770e3d4 tempest-FloatingIPsAssociationTestJSON-1366579974 tempest-FloatingIPsAssociationTestJSON-1366579974-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 628.796726] env[67899]: DEBUG oslo_concurrency.lockutils [None req-1a665adc-0197-431c-b7d3-cbe5f770e3d4 tempest-FloatingIPsAssociationTestJSON-1366579974 tempest-FloatingIPsAssociationTestJSON-1366579974-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 628.799582] env[67899]: INFO nova.compute.claims [None req-1a665adc-0197-431c-b7d3-cbe5f770e3d4 tempest-FloatingIPsAssociationTestJSON-1366579974 tempest-FloatingIPsAssociationTestJSON-1366579974-project-member] [instance: 913c5652-c8af-41a8-94f1-c0eba08aacdd] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 628.962193] env[67899]: DEBUG nova.network.neutron [None req-4613260e-d1e4-4cb4-8d4e-d3da4a59a597 tempest-ServersTestMultiNic-1718995190 tempest-ServersTestMultiNic-1718995190-project-member] [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] Successfully created port: 033bfeb8-6332-46bd-92b0-3503feb9f14a {{(pid=67899) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 629.158867] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d2dc4714-0cc2-4e79-9ed9-5824e1e46d53 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 629.168205] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7bb1ad95-b9f4-4f17-adbe-644a7d8ac6e2 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 629.203863] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4912ddb1-1d78-4096-ba5e-385348f77119 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 629.212449] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-21c588e0-d293-4fc0-8eca-55ee737cef4e {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 629.227515] env[67899]: DEBUG nova.compute.provider_tree [None req-1a665adc-0197-431c-b7d3-cbe5f770e3d4 tempest-FloatingIPsAssociationTestJSON-1366579974 tempest-FloatingIPsAssociationTestJSON-1366579974-project-member] Inventory has not changed in ProviderTree for provider: fffa0b42-f65d-4394-a98c-0df038b9ed4b {{(pid=67899) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 629.250480] env[67899]: DEBUG nova.scheduler.client.report [None req-1a665adc-0197-431c-b7d3-cbe5f770e3d4 tempest-FloatingIPsAssociationTestJSON-1366579974 tempest-FloatingIPsAssociationTestJSON-1366579974-project-member] Inventory has not 
changed for provider fffa0b42-f65d-4394-a98c-0df038b9ed4b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67899) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 629.275397] env[67899]: DEBUG oslo_concurrency.lockutils [None req-1a665adc-0197-431c-b7d3-cbe5f770e3d4 tempest-FloatingIPsAssociationTestJSON-1366579974 tempest-FloatingIPsAssociationTestJSON-1366579974-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.476s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 629.275397] env[67899]: DEBUG nova.compute.manager [None req-1a665adc-0197-431c-b7d3-cbe5f770e3d4 tempest-FloatingIPsAssociationTestJSON-1366579974 tempest-FloatingIPsAssociationTestJSON-1366579974-project-member] [instance: 913c5652-c8af-41a8-94f1-c0eba08aacdd] Start building networks asynchronously for instance. {{(pid=67899) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 629.327311] env[67899]: DEBUG nova.compute.utils [None req-1a665adc-0197-431c-b7d3-cbe5f770e3d4 tempest-FloatingIPsAssociationTestJSON-1366579974 tempest-FloatingIPsAssociationTestJSON-1366579974-project-member] Using /dev/sd instead of None {{(pid=67899) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 629.329280] env[67899]: DEBUG nova.compute.manager [None req-1a665adc-0197-431c-b7d3-cbe5f770e3d4 tempest-FloatingIPsAssociationTestJSON-1366579974 tempest-FloatingIPsAssociationTestJSON-1366579974-project-member] [instance: 913c5652-c8af-41a8-94f1-c0eba08aacdd] Allocating IP information in the background. {{(pid=67899) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 629.329467] env[67899]: DEBUG nova.network.neutron [None req-1a665adc-0197-431c-b7d3-cbe5f770e3d4 tempest-FloatingIPsAssociationTestJSON-1366579974 tempest-FloatingIPsAssociationTestJSON-1366579974-project-member] [instance: 913c5652-c8af-41a8-94f1-c0eba08aacdd] allocate_for_instance() {{(pid=67899) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 629.342073] env[67899]: DEBUG nova.compute.manager [None req-1a665adc-0197-431c-b7d3-cbe5f770e3d4 tempest-FloatingIPsAssociationTestJSON-1366579974 tempest-FloatingIPsAssociationTestJSON-1366579974-project-member] [instance: 913c5652-c8af-41a8-94f1-c0eba08aacdd] Start building block device mappings for instance. 
{{(pid=67899) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 629.424952] env[67899]: DEBUG nova.policy [None req-1a665adc-0197-431c-b7d3-cbe5f770e3d4 tempest-FloatingIPsAssociationTestJSON-1366579974 tempest-FloatingIPsAssociationTestJSON-1366579974-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'b0ab829aa21b44b58158f09b15884294', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '42c5eb67c36c48e88b2e47dcaacd4608', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67899) authorize /opt/stack/nova/nova/policy.py:203}} [ 629.431406] env[67899]: DEBUG nova.compute.manager [None req-1a665adc-0197-431c-b7d3-cbe5f770e3d4 tempest-FloatingIPsAssociationTestJSON-1366579974 tempest-FloatingIPsAssociationTestJSON-1366579974-project-member] [instance: 913c5652-c8af-41a8-94f1-c0eba08aacdd] Start spawning the instance on the hypervisor. {{(pid=67899) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 629.463323] env[67899]: DEBUG nova.virt.hardware [None req-1a665adc-0197-431c-b7d3-cbe5f770e3d4 tempest-FloatingIPsAssociationTestJSON-1366579974 tempest-FloatingIPsAssociationTestJSON-1366579974-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T22:07:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T22:07:13Z,direct_url=,disk_format='vmdk',id=c655a05a-4a40-4b3f-b609-3ba8116ad90f,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c1aaa2970e964d7b86557399120d12c1',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T22:07:14Z,virtual_size=,visibility=), allow threads: False {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 629.463572] env[67899]: DEBUG nova.virt.hardware [None req-1a665adc-0197-431c-b7d3-cbe5f770e3d4 tempest-FloatingIPsAssociationTestJSON-1366579974 tempest-FloatingIPsAssociationTestJSON-1366579974-project-member] Flavor limits 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 629.463731] env[67899]: DEBUG nova.virt.hardware [None req-1a665adc-0197-431c-b7d3-cbe5f770e3d4 tempest-FloatingIPsAssociationTestJSON-1366579974 tempest-FloatingIPsAssociationTestJSON-1366579974-project-member] Image limits 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 629.463914] env[67899]: DEBUG nova.virt.hardware [None req-1a665adc-0197-431c-b7d3-cbe5f770e3d4 tempest-FloatingIPsAssociationTestJSON-1366579974 tempest-FloatingIPsAssociationTestJSON-1366579974-project-member] Flavor pref 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 629.464072] env[67899]: DEBUG nova.virt.hardware [None req-1a665adc-0197-431c-b7d3-cbe5f770e3d4 tempest-FloatingIPsAssociationTestJSON-1366579974 tempest-FloatingIPsAssociationTestJSON-1366579974-project-member] Image pref 0:0:0 {{(pid=67899) 
get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 629.464221] env[67899]: DEBUG nova.virt.hardware [None req-1a665adc-0197-431c-b7d3-cbe5f770e3d4 tempest-FloatingIPsAssociationTestJSON-1366579974 tempest-FloatingIPsAssociationTestJSON-1366579974-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 629.464539] env[67899]: DEBUG nova.virt.hardware [None req-1a665adc-0197-431c-b7d3-cbe5f770e3d4 tempest-FloatingIPsAssociationTestJSON-1366579974 tempest-FloatingIPsAssociationTestJSON-1366579974-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 629.464729] env[67899]: DEBUG nova.virt.hardware [None req-1a665adc-0197-431c-b7d3-cbe5f770e3d4 tempest-FloatingIPsAssociationTestJSON-1366579974 tempest-FloatingIPsAssociationTestJSON-1366579974-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67899) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 629.464904] env[67899]: DEBUG nova.virt.hardware [None req-1a665adc-0197-431c-b7d3-cbe5f770e3d4 tempest-FloatingIPsAssociationTestJSON-1366579974 tempest-FloatingIPsAssociationTestJSON-1366579974-project-member] Got 1 possible topologies {{(pid=67899) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 629.465080] env[67899]: DEBUG nova.virt.hardware [None req-1a665adc-0197-431c-b7d3-cbe5f770e3d4 tempest-FloatingIPsAssociationTestJSON-1366579974 tempest-FloatingIPsAssociationTestJSON-1366579974-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 629.465258] env[67899]: DEBUG nova.virt.hardware [None req-1a665adc-0197-431c-b7d3-cbe5f770e3d4 tempest-FloatingIPsAssociationTestJSON-1366579974 tempest-FloatingIPsAssociationTestJSON-1366579974-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 629.466652] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-889d61aa-27d7-4b05-9d01-92a9f7c78f8a {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 629.475247] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4bcf06d7-3086-4531-995e-345700bd97e7 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 629.482018] env[67899]: DEBUG nova.network.neutron [None req-3d107220-4fe1-4bd4-b86d-3bd578f6f6b3 tempest-VolumesAssistedSnapshotsTest-398519601 tempest-VolumesAssistedSnapshotsTest-398519601-project-member] [instance: 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda] Successfully created port: 584372af-c42e-4b13-8701-adfb729629f4 {{(pid=67899) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 630.011181] env[67899]: DEBUG oslo_concurrency.lockutils [None req-c90bdb57-6703-4d5e-a52f-aee9c0fb7dc8 tempest-FloatingIPsAssociationNegativeTestJSON-531317357 tempest-FloatingIPsAssociationNegativeTestJSON-531317357-project-member] Acquiring lock "793d6f98-ed1b-4a78-bcd5-cb796441d64b" by 
"nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 630.013921] env[67899]: DEBUG oslo_concurrency.lockutils [None req-c90bdb57-6703-4d5e-a52f-aee9c0fb7dc8 tempest-FloatingIPsAssociationNegativeTestJSON-531317357 tempest-FloatingIPsAssociationNegativeTestJSON-531317357-project-member] Lock "793d6f98-ed1b-4a78-bcd5-cb796441d64b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 630.141794] env[67899]: DEBUG nova.network.neutron [None req-1a665adc-0197-431c-b7d3-cbe5f770e3d4 tempest-FloatingIPsAssociationTestJSON-1366579974 tempest-FloatingIPsAssociationTestJSON-1366579974-project-member] [instance: 913c5652-c8af-41a8-94f1-c0eba08aacdd] Successfully created port: fb5fa11f-22c8-4109-9547-f4aa282cd244 {{(pid=67899) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 630.616273] env[67899]: DEBUG nova.network.neutron [None req-4613260e-d1e4-4cb4-8d4e-d3da4a59a597 tempest-ServersTestMultiNic-1718995190 tempest-ServersTestMultiNic-1718995190-project-member] [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] Successfully created port: 25f4cc80-6ee0-47e9-be13-34d172ff4aaf {{(pid=67899) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 631.613998] env[67899]: DEBUG nova.network.neutron [None req-1a665adc-0197-431c-b7d3-cbe5f770e3d4 tempest-FloatingIPsAssociationTestJSON-1366579974 tempest-FloatingIPsAssociationTestJSON-1366579974-project-member] [instance: 913c5652-c8af-41a8-94f1-c0eba08aacdd] Successfully updated port: fb5fa11f-22c8-4109-9547-f4aa282cd244 {{(pid=67899) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 631.633274] env[67899]: DEBUG oslo_concurrency.lockutils [None req-1a665adc-0197-431c-b7d3-cbe5f770e3d4 tempest-FloatingIPsAssociationTestJSON-1366579974 tempest-FloatingIPsAssociationTestJSON-1366579974-project-member] Acquiring lock "refresh_cache-913c5652-c8af-41a8-94f1-c0eba08aacdd" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 631.633439] env[67899]: DEBUG oslo_concurrency.lockutils [None req-1a665adc-0197-431c-b7d3-cbe5f770e3d4 tempest-FloatingIPsAssociationTestJSON-1366579974 tempest-FloatingIPsAssociationTestJSON-1366579974-project-member] Acquired lock "refresh_cache-913c5652-c8af-41a8-94f1-c0eba08aacdd" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 631.633618] env[67899]: DEBUG nova.network.neutron [None req-1a665adc-0197-431c-b7d3-cbe5f770e3d4 tempest-FloatingIPsAssociationTestJSON-1366579974 tempest-FloatingIPsAssociationTestJSON-1366579974-project-member] [instance: 913c5652-c8af-41a8-94f1-c0eba08aacdd] Building network info cache for instance {{(pid=67899) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 631.684181] env[67899]: DEBUG nova.network.neutron [None req-1a665adc-0197-431c-b7d3-cbe5f770e3d4 tempest-FloatingIPsAssociationTestJSON-1366579974 tempest-FloatingIPsAssociationTestJSON-1366579974-project-member] [instance: 913c5652-c8af-41a8-94f1-c0eba08aacdd] Instance cache missing network info. 
{{(pid=67899) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 631.955099] env[67899]: DEBUG nova.compute.manager [req-a8a46724-0413-4eea-adb1-beca1c91b3d3 req-e634dd8d-9872-43d7-8ba8-e7d38076a35a service nova] [instance: 84cbacaa-08d2-4297-8777-150f433e4c04] Received event network-changed-45ead234-af7b-4897-824f-9fdc82e9c69e {{(pid=67899) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 631.955304] env[67899]: DEBUG nova.compute.manager [req-a8a46724-0413-4eea-adb1-beca1c91b3d3 req-e634dd8d-9872-43d7-8ba8-e7d38076a35a service nova] [instance: 84cbacaa-08d2-4297-8777-150f433e4c04] Refreshing instance network info cache due to event network-changed-45ead234-af7b-4897-824f-9fdc82e9c69e. {{(pid=67899) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 631.955528] env[67899]: DEBUG oslo_concurrency.lockutils [req-a8a46724-0413-4eea-adb1-beca1c91b3d3 req-e634dd8d-9872-43d7-8ba8-e7d38076a35a service nova] Acquiring lock "refresh_cache-84cbacaa-08d2-4297-8777-150f433e4c04" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 631.955675] env[67899]: DEBUG oslo_concurrency.lockutils [req-a8a46724-0413-4eea-adb1-beca1c91b3d3 req-e634dd8d-9872-43d7-8ba8-e7d38076a35a service nova] Acquired lock "refresh_cache-84cbacaa-08d2-4297-8777-150f433e4c04" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 631.955865] env[67899]: DEBUG nova.network.neutron [req-a8a46724-0413-4eea-adb1-beca1c91b3d3 req-e634dd8d-9872-43d7-8ba8-e7d38076a35a service nova] [instance: 84cbacaa-08d2-4297-8777-150f433e4c04] Refreshing network info cache for port 45ead234-af7b-4897-824f-9fdc82e9c69e {{(pid=67899) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 632.184989] env[67899]: DEBUG nova.network.neutron [None req-1a665adc-0197-431c-b7d3-cbe5f770e3d4 tempest-FloatingIPsAssociationTestJSON-1366579974 tempest-FloatingIPsAssociationTestJSON-1366579974-project-member] [instance: 913c5652-c8af-41a8-94f1-c0eba08aacdd] Updating instance_info_cache with network_info: [{"id": "fb5fa11f-22c8-4109-9547-f4aa282cd244", "address": "fa:16:3e:cf:1d:49", "network": {"id": "c0f1032d-cf3e-48f6-bb99-f5f9859bf677", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-887782125-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "42c5eb67c36c48e88b2e47dcaacd4608", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ad4fcde7-8926-402a-a9b7-4878d2bc1cf6", "external-id": "nsx-vlan-transportzone-840", "segmentation_id": 840, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapfb5fa11f-22", "ovs_interfaceid": "fb5fa11f-22c8-4109-9547-f4aa282cd244", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 632.197187] env[67899]: DEBUG oslo_concurrency.lockutils [None req-1a665adc-0197-431c-b7d3-cbe5f770e3d4 
tempest-FloatingIPsAssociationTestJSON-1366579974 tempest-FloatingIPsAssociationTestJSON-1366579974-project-member] Releasing lock "refresh_cache-913c5652-c8af-41a8-94f1-c0eba08aacdd" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 632.197590] env[67899]: DEBUG nova.compute.manager [None req-1a665adc-0197-431c-b7d3-cbe5f770e3d4 tempest-FloatingIPsAssociationTestJSON-1366579974 tempest-FloatingIPsAssociationTestJSON-1366579974-project-member] [instance: 913c5652-c8af-41a8-94f1-c0eba08aacdd] Instance network_info: |[{"id": "fb5fa11f-22c8-4109-9547-f4aa282cd244", "address": "fa:16:3e:cf:1d:49", "network": {"id": "c0f1032d-cf3e-48f6-bb99-f5f9859bf677", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-887782125-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "42c5eb67c36c48e88b2e47dcaacd4608", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ad4fcde7-8926-402a-a9b7-4878d2bc1cf6", "external-id": "nsx-vlan-transportzone-840", "segmentation_id": 840, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapfb5fa11f-22", "ovs_interfaceid": "fb5fa11f-22c8-4109-9547-f4aa282cd244", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67899) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 632.198358] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-1a665adc-0197-431c-b7d3-cbe5f770e3d4 tempest-FloatingIPsAssociationTestJSON-1366579974 tempest-FloatingIPsAssociationTestJSON-1366579974-project-member] [instance: 913c5652-c8af-41a8-94f1-c0eba08aacdd] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:cf:1d:49', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'ad4fcde7-8926-402a-a9b7-4878d2bc1cf6', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'fb5fa11f-22c8-4109-9547-f4aa282cd244', 'vif_model': 'vmxnet3'}] {{(pid=67899) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 632.207484] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [None req-1a665adc-0197-431c-b7d3-cbe5f770e3d4 tempest-FloatingIPsAssociationTestJSON-1366579974 tempest-FloatingIPsAssociationTestJSON-1366579974-project-member] Creating folder: Project (42c5eb67c36c48e88b2e47dcaacd4608). Parent ref: group-v692900. {{(pid=67899) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 632.208168] env[67899]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-5992a240-3337-48e4-b6cb-191fe9b29cec {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 632.227610] env[67899]: INFO nova.virt.vmwareapi.vm_util [None req-1a665adc-0197-431c-b7d3-cbe5f770e3d4 tempest-FloatingIPsAssociationTestJSON-1366579974 tempest-FloatingIPsAssociationTestJSON-1366579974-project-member] Created folder: Project (42c5eb67c36c48e88b2e47dcaacd4608) in parent group-v692900. 
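The entries around this point trace the driver's create-folder / create-VM sequence: Folder.CreateFolder is invoked, then a CreateVM_Task is started and polled to completion under an oslo.service loopingcall (the wait_for_task / _poll_task paths and the "progress is N%." lines). A minimal sketch of that poll-and-wait pattern, assuming a hypothetical get_task_info() callable that returns a dict with "id", "state", and optional "progress"/"result"/"error" keys (the real code goes through oslo.vmware's VMwareAPISession rather than this helper):

    import time

    class TaskFailed(Exception):
        """Raised when the polled task ends in an error state."""

    def wait_for_task(get_task_info, poll_interval=0.5):
        # Poll until the task leaves its transient states, echoing the
        # "progress is N%." / "completed successfully." lines seen above.
        while True:
            info = get_task_info()  # hypothetical helper: one server round-trip
            state = info["state"]
            if state in ("queued", "running"):
                print(f"Task: {info['id']} progress is {info.get('progress', 0)}%.")
                time.sleep(poll_interval)
            elif state == "success":
                print(f"Task: {info['id']} completed successfully.")
                return info.get("result")
            else:
                raise TaskFailed(info.get("error", f"task ended in state {state}"))

    # e.g. wait_for_task(lambda: {"id": "task-3467841", "state": "success"})

Fixed-interval polling like this matches what the log shows: the loopingcall wrapper simply waits for nova.virt.vmwareapi.vm_util.create_vm to return while oslo_vmware.api re-reads task state and logs each progress step.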
[ 632.228250] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [None req-1a665adc-0197-431c-b7d3-cbe5f770e3d4 tempest-FloatingIPsAssociationTestJSON-1366579974 tempest-FloatingIPsAssociationTestJSON-1366579974-project-member] Creating folder: Instances. Parent ref: group-v692925. {{(pid=67899) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 632.228250] env[67899]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-04753374-d757-421d-8b74-b12ccc754039 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 632.240468] env[67899]: INFO nova.virt.vmwareapi.vm_util [None req-1a665adc-0197-431c-b7d3-cbe5f770e3d4 tempest-FloatingIPsAssociationTestJSON-1366579974 tempest-FloatingIPsAssociationTestJSON-1366579974-project-member] Created folder: Instances in parent group-v692925. [ 632.241896] env[67899]: DEBUG oslo.service.loopingcall [None req-1a665adc-0197-431c-b7d3-cbe5f770e3d4 tempest-FloatingIPsAssociationTestJSON-1366579974 tempest-FloatingIPsAssociationTestJSON-1366579974-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67899) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 632.241896] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 913c5652-c8af-41a8-94f1-c0eba08aacdd] Creating VM on the ESX host {{(pid=67899) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 632.241896] env[67899]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-cf625676-ae19-4367-9513-0f6ca3b30a62 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 632.267208] env[67899]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 632.267208] env[67899]: value = "task-3467841" [ 632.267208] env[67899]: _type = "Task" [ 632.267208] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 632.275553] env[67899]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467841, 'name': CreateVM_Task} progress is 0%. 
{{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 632.571770] env[67899]: DEBUG nova.network.neutron [None req-3d107220-4fe1-4bd4-b86d-3bd578f6f6b3 tempest-VolumesAssistedSnapshotsTest-398519601 tempest-VolumesAssistedSnapshotsTest-398519601-project-member] [instance: 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda] Successfully updated port: 584372af-c42e-4b13-8701-adfb729629f4 {{(pid=67899) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 632.584803] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3d107220-4fe1-4bd4-b86d-3bd578f6f6b3 tempest-VolumesAssistedSnapshotsTest-398519601 tempest-VolumesAssistedSnapshotsTest-398519601-project-member] Acquiring lock "refresh_cache-9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 632.584982] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3d107220-4fe1-4bd4-b86d-3bd578f6f6b3 tempest-VolumesAssistedSnapshotsTest-398519601 tempest-VolumesAssistedSnapshotsTest-398519601-project-member] Acquired lock "refresh_cache-9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 632.585152] env[67899]: DEBUG nova.network.neutron [None req-3d107220-4fe1-4bd4-b86d-3bd578f6f6b3 tempest-VolumesAssistedSnapshotsTest-398519601 tempest-VolumesAssistedSnapshotsTest-398519601-project-member] [instance: 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda] Building network info cache for instance {{(pid=67899) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 632.739785] env[67899]: DEBUG nova.network.neutron [None req-3d107220-4fe1-4bd4-b86d-3bd578f6f6b3 tempest-VolumesAssistedSnapshotsTest-398519601 tempest-VolumesAssistedSnapshotsTest-398519601-project-member] [instance: 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda] Instance cache missing network info. {{(pid=67899) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 632.782869] env[67899]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467841, 'name': CreateVM_Task} progress is 25%. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 632.901121] env[67899]: DEBUG nova.network.neutron [req-a8a46724-0413-4eea-adb1-beca1c91b3d3 req-e634dd8d-9872-43d7-8ba8-e7d38076a35a service nova] [instance: 84cbacaa-08d2-4297-8777-150f433e4c04] Updated VIF entry in instance network info cache for port 45ead234-af7b-4897-824f-9fdc82e9c69e. 
{{(pid=67899) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 632.901121] env[67899]: DEBUG nova.network.neutron [req-a8a46724-0413-4eea-adb1-beca1c91b3d3 req-e634dd8d-9872-43d7-8ba8-e7d38076a35a service nova] [instance: 84cbacaa-08d2-4297-8777-150f433e4c04] Updating instance_info_cache with network_info: [{"id": "45ead234-af7b-4897-824f-9fdc82e9c69e", "address": "fa:16:3e:66:01:45", "network": {"id": "84e921e8-f0fc-4f86-9f9d-605c67ed429f", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-317458798-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "973f7550e4e84a51b7bde37eadcb6d38", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "52f465cb-7418-4172-bd7d-aec00abeb692", "external-id": "nsx-vlan-transportzone-895", "segmentation_id": 895, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap45ead234-af", "ovs_interfaceid": "45ead234-af7b-4897-824f-9fdc82e9c69e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 632.919504] env[67899]: DEBUG oslo_concurrency.lockutils [req-a8a46724-0413-4eea-adb1-beca1c91b3d3 req-e634dd8d-9872-43d7-8ba8-e7d38076a35a service nova] Releasing lock "refresh_cache-84cbacaa-08d2-4297-8777-150f433e4c04" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 633.280683] env[67899]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467841, 'name': CreateVM_Task, 'duration_secs': 0.657183} completed successfully. 
{{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 633.280925] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 913c5652-c8af-41a8-94f1-c0eba08aacdd] Created VM on the ESX host {{(pid=67899) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 633.282190] env[67899]: DEBUG oslo_concurrency.lockutils [None req-1a665adc-0197-431c-b7d3-cbe5f770e3d4 tempest-FloatingIPsAssociationTestJSON-1366579974 tempest-FloatingIPsAssociationTestJSON-1366579974-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 633.282465] env[67899]: DEBUG oslo_concurrency.lockutils [None req-1a665adc-0197-431c-b7d3-cbe5f770e3d4 tempest-FloatingIPsAssociationTestJSON-1366579974 tempest-FloatingIPsAssociationTestJSON-1366579974-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 633.282848] env[67899]: DEBUG oslo_concurrency.lockutils [None req-1a665adc-0197-431c-b7d3-cbe5f770e3d4 tempest-FloatingIPsAssociationTestJSON-1366579974 tempest-FloatingIPsAssociationTestJSON-1366579974-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 633.283774] env[67899]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-a0c95f7c-6b91-4f8e-99f2-8d921221fbe8 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 633.288444] env[67899]: DEBUG oslo_vmware.api [None req-1a665adc-0197-431c-b7d3-cbe5f770e3d4 tempest-FloatingIPsAssociationTestJSON-1366579974 tempest-FloatingIPsAssociationTestJSON-1366579974-project-member] Waiting for the task: (returnval){ [ 633.288444] env[67899]: value = "session[5298859c-589e-10f3-4f70-bf47a7ea371b]523d4cc8-0054-72ab-197e-4db17781758e" [ 633.288444] env[67899]: _type = "Task" [ 633.288444] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 633.299791] env[67899]: DEBUG oslo_vmware.api [None req-1a665adc-0197-431c-b7d3-cbe5f770e3d4 tempest-FloatingIPsAssociationTestJSON-1366579974 tempest-FloatingIPsAssociationTestJSON-1366579974-project-member] Task: {'id': session[5298859c-589e-10f3-4f70-bf47a7ea371b]523d4cc8-0054-72ab-197e-4db17781758e, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 633.490129] env[67899]: DEBUG nova.network.neutron [None req-3d107220-4fe1-4bd4-b86d-3bd578f6f6b3 tempest-VolumesAssistedSnapshotsTest-398519601 tempest-VolumesAssistedSnapshotsTest-398519601-project-member] [instance: 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda] Updating instance_info_cache with network_info: [{"id": "584372af-c42e-4b13-8701-adfb729629f4", "address": "fa:16:3e:3c:8e:e4", "network": {"id": "247f01d1-789b-4922-9985-f1372d6ba427", "bridge": "br-int", "label": "tempest-VolumesAssistedSnapshotsTest-2109521751-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "2745a593bf56467f906afcddcfa13182", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "78340140-126f-4ef8-a340-debaa64da3e5", "external-id": "nsx-vlan-transportzone-648", "segmentation_id": 648, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap584372af-c4", "ovs_interfaceid": "584372af-c42e-4b13-8701-adfb729629f4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 633.510579] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3d107220-4fe1-4bd4-b86d-3bd578f6f6b3 tempest-VolumesAssistedSnapshotsTest-398519601 tempest-VolumesAssistedSnapshotsTest-398519601-project-member] Releasing lock "refresh_cache-9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 633.510889] env[67899]: DEBUG nova.compute.manager [None req-3d107220-4fe1-4bd4-b86d-3bd578f6f6b3 tempest-VolumesAssistedSnapshotsTest-398519601 tempest-VolumesAssistedSnapshotsTest-398519601-project-member] [instance: 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda] Instance network_info: |[{"id": "584372af-c42e-4b13-8701-adfb729629f4", "address": "fa:16:3e:3c:8e:e4", "network": {"id": "247f01d1-789b-4922-9985-f1372d6ba427", "bridge": "br-int", "label": "tempest-VolumesAssistedSnapshotsTest-2109521751-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "2745a593bf56467f906afcddcfa13182", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "78340140-126f-4ef8-a340-debaa64da3e5", "external-id": "nsx-vlan-transportzone-648", "segmentation_id": 648, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap584372af-c4", "ovs_interfaceid": "584372af-c42e-4b13-8701-adfb729629f4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| 
{{(pid=67899) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 633.511290] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-3d107220-4fe1-4bd4-b86d-3bd578f6f6b3 tempest-VolumesAssistedSnapshotsTest-398519601 tempest-VolumesAssistedSnapshotsTest-398519601-project-member] [instance: 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:3c:8e:e4', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '78340140-126f-4ef8-a340-debaa64da3e5', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '584372af-c42e-4b13-8701-adfb729629f4', 'vif_model': 'vmxnet3'}] {{(pid=67899) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 633.519265] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [None req-3d107220-4fe1-4bd4-b86d-3bd578f6f6b3 tempest-VolumesAssistedSnapshotsTest-398519601 tempest-VolumesAssistedSnapshotsTest-398519601-project-member] Creating folder: Project (2745a593bf56467f906afcddcfa13182). Parent ref: group-v692900. {{(pid=67899) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 633.519829] env[67899]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-719cdca9-d1f2-4e6c-bd69-0f42d15e27e7 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 633.530759] env[67899]: INFO nova.virt.vmwareapi.vm_util [None req-3d107220-4fe1-4bd4-b86d-3bd578f6f6b3 tempest-VolumesAssistedSnapshotsTest-398519601 tempest-VolumesAssistedSnapshotsTest-398519601-project-member] Created folder: Project (2745a593bf56467f906afcddcfa13182) in parent group-v692900. [ 633.530887] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [None req-3d107220-4fe1-4bd4-b86d-3bd578f6f6b3 tempest-VolumesAssistedSnapshotsTest-398519601 tempest-VolumesAssistedSnapshotsTest-398519601-project-member] Creating folder: Instances. Parent ref: group-v692928. {{(pid=67899) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 633.531081] env[67899]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-b02845f9-4786-4f84-b6e3-b08392d60b78 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 633.543064] env[67899]: INFO nova.virt.vmwareapi.vm_util [None req-3d107220-4fe1-4bd4-b86d-3bd578f6f6b3 tempest-VolumesAssistedSnapshotsTest-398519601 tempest-VolumesAssistedSnapshotsTest-398519601-project-member] Created folder: Instances in parent group-v692928. [ 633.543311] env[67899]: DEBUG oslo.service.loopingcall [None req-3d107220-4fe1-4bd4-b86d-3bd578f6f6b3 tempest-VolumesAssistedSnapshotsTest-398519601 tempest-VolumesAssistedSnapshotsTest-398519601-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=67899) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 633.543492] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda] Creating VM on the ESX host {{(pid=67899) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 633.543692] env[67899]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-a8943570-5cdd-44bf-82ab-149708e74279 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 633.562592] env[67899]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 633.562592] env[67899]: value = "task-3467844" [ 633.562592] env[67899]: _type = "Task" [ 633.562592] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 633.570185] env[67899]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467844, 'name': CreateVM_Task} progress is 0%. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 633.753195] env[67899]: DEBUG nova.network.neutron [None req-4613260e-d1e4-4cb4-8d4e-d3da4a59a597 tempest-ServersTestMultiNic-1718995190 tempest-ServersTestMultiNic-1718995190-project-member] [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] Successfully updated port: da099c05-3a93-40aa-80ed-e692d53dc2ad {{(pid=67899) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 633.801514] env[67899]: DEBUG oslo_concurrency.lockutils [None req-1a665adc-0197-431c-b7d3-cbe5f770e3d4 tempest-FloatingIPsAssociationTestJSON-1366579974 tempest-FloatingIPsAssociationTestJSON-1366579974-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 633.801784] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-1a665adc-0197-431c-b7d3-cbe5f770e3d4 tempest-FloatingIPsAssociationTestJSON-1366579974 tempest-FloatingIPsAssociationTestJSON-1366579974-project-member] [instance: 913c5652-c8af-41a8-94f1-c0eba08aacdd] Processing image c655a05a-4a40-4b3f-b609-3ba8116ad90f {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 633.801994] env[67899]: DEBUG oslo_concurrency.lockutils [None req-1a665adc-0197-431c-b7d3-cbe5f770e3d4 tempest-FloatingIPsAssociationTestJSON-1366579974 tempest-FloatingIPsAssociationTestJSON-1366579974-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 633.875433] env[67899]: DEBUG nova.compute.manager [req-9eea5f1e-47d1-475b-9609-1c743436f92a req-cf4015bc-b6b5-48c2-984f-0a6b627215c2 service nova] [instance: 913c5652-c8af-41a8-94f1-c0eba08aacdd] Received event network-vif-plugged-fb5fa11f-22c8-4109-9547-f4aa282cd244 {{(pid=67899) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 633.876227] env[67899]: DEBUG oslo_concurrency.lockutils [req-9eea5f1e-47d1-475b-9609-1c743436f92a req-cf4015bc-b6b5-48c2-984f-0a6b627215c2 service nova] Acquiring lock "913c5652-c8af-41a8-94f1-c0eba08aacdd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 
633.876227] env[67899]: DEBUG oslo_concurrency.lockutils [req-9eea5f1e-47d1-475b-9609-1c743436f92a req-cf4015bc-b6b5-48c2-984f-0a6b627215c2 service nova] Lock "913c5652-c8af-41a8-94f1-c0eba08aacdd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 633.876463] env[67899]: DEBUG oslo_concurrency.lockutils [req-9eea5f1e-47d1-475b-9609-1c743436f92a req-cf4015bc-b6b5-48c2-984f-0a6b627215c2 service nova] Lock "913c5652-c8af-41a8-94f1-c0eba08aacdd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 633.876688] env[67899]: DEBUG nova.compute.manager [req-9eea5f1e-47d1-475b-9609-1c743436f92a req-cf4015bc-b6b5-48c2-984f-0a6b627215c2 service nova] [instance: 913c5652-c8af-41a8-94f1-c0eba08aacdd] No waiting events found dispatching network-vif-plugged-fb5fa11f-22c8-4109-9547-f4aa282cd244 {{(pid=67899) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 633.876888] env[67899]: WARNING nova.compute.manager [req-9eea5f1e-47d1-475b-9609-1c743436f92a req-cf4015bc-b6b5-48c2-984f-0a6b627215c2 service nova] [instance: 913c5652-c8af-41a8-94f1-c0eba08aacdd] Received unexpected event network-vif-plugged-fb5fa11f-22c8-4109-9547-f4aa282cd244 for instance with vm_state building and task_state spawning. [ 634.073462] env[67899]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467844, 'name': CreateVM_Task, 'duration_secs': 0.365657} completed successfully. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 634.073741] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda] Created VM on the ESX host {{(pid=67899) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 634.074496] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3d107220-4fe1-4bd4-b86d-3bd578f6f6b3 tempest-VolumesAssistedSnapshotsTest-398519601 tempest-VolumesAssistedSnapshotsTest-398519601-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 634.075670] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3d107220-4fe1-4bd4-b86d-3bd578f6f6b3 tempest-VolumesAssistedSnapshotsTest-398519601 tempest-VolumesAssistedSnapshotsTest-398519601-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 634.075670] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3d107220-4fe1-4bd4-b86d-3bd578f6f6b3 tempest-VolumesAssistedSnapshotsTest-398519601 tempest-VolumesAssistedSnapshotsTest-398519601-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 634.075670] env[67899]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-6389573f-1821-4f7e-b341-dbbfe0c23294 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 634.082785] 
env[67899]: DEBUG oslo_vmware.api [None req-3d107220-4fe1-4bd4-b86d-3bd578f6f6b3 tempest-VolumesAssistedSnapshotsTest-398519601 tempest-VolumesAssistedSnapshotsTest-398519601-project-member] Waiting for the task: (returnval){ [ 634.082785] env[67899]: value = "session[5298859c-589e-10f3-4f70-bf47a7ea371b]521d9446-62da-9ae2-3010-7519e18db75c" [ 634.082785] env[67899]: _type = "Task" [ 634.082785] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 634.091711] env[67899]: DEBUG oslo_vmware.api [None req-3d107220-4fe1-4bd4-b86d-3bd578f6f6b3 tempest-VolumesAssistedSnapshotsTest-398519601 tempest-VolumesAssistedSnapshotsTest-398519601-project-member] Task: {'id': session[5298859c-589e-10f3-4f70-bf47a7ea371b]521d9446-62da-9ae2-3010-7519e18db75c, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 634.601699] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3d107220-4fe1-4bd4-b86d-3bd578f6f6b3 tempest-VolumesAssistedSnapshotsTest-398519601 tempest-VolumesAssistedSnapshotsTest-398519601-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 634.601699] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-3d107220-4fe1-4bd4-b86d-3bd578f6f6b3 tempest-VolumesAssistedSnapshotsTest-398519601 tempest-VolumesAssistedSnapshotsTest-398519601-project-member] [instance: 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda] Processing image c655a05a-4a40-4b3f-b609-3ba8116ad90f {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 634.601699] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3d107220-4fe1-4bd4-b86d-3bd578f6f6b3 tempest-VolumesAssistedSnapshotsTest-398519601 tempest-VolumesAssistedSnapshotsTest-398519601-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 635.150598] env[67899]: DEBUG nova.network.neutron [None req-4613260e-d1e4-4cb4-8d4e-d3da4a59a597 tempest-ServersTestMultiNic-1718995190 tempest-ServersTestMultiNic-1718995190-project-member] [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] Successfully updated port: 033bfeb8-6332-46bd-92b0-3503feb9f14a {{(pid=67899) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 635.911748] env[67899]: DEBUG nova.compute.manager [req-0a801d9d-9c96-40bd-ab1a-c15f0a88e760 req-0abf6ac8-46e2-4153-ab13-710fdb4dafe3 service nova] [instance: 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda] Received event network-vif-plugged-584372af-c42e-4b13-8701-adfb729629f4 {{(pid=67899) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 635.911970] env[67899]: DEBUG oslo_concurrency.lockutils [req-0a801d9d-9c96-40bd-ab1a-c15f0a88e760 req-0abf6ac8-46e2-4153-ab13-710fdb4dafe3 service nova] Acquiring lock "9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 635.912266] env[67899]: DEBUG oslo_concurrency.lockutils [req-0a801d9d-9c96-40bd-ab1a-c15f0a88e760 req-0abf6ac8-46e2-4153-ab13-710fdb4dafe3 service nova] Lock 
"9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 635.912477] env[67899]: DEBUG oslo_concurrency.lockutils [req-0a801d9d-9c96-40bd-ab1a-c15f0a88e760 req-0abf6ac8-46e2-4153-ab13-710fdb4dafe3 service nova] Lock "9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 635.914493] env[67899]: DEBUG nova.compute.manager [req-0a801d9d-9c96-40bd-ab1a-c15f0a88e760 req-0abf6ac8-46e2-4153-ab13-710fdb4dafe3 service nova] [instance: 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda] No waiting events found dispatching network-vif-plugged-584372af-c42e-4b13-8701-adfb729629f4 {{(pid=67899) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 635.914493] env[67899]: WARNING nova.compute.manager [req-0a801d9d-9c96-40bd-ab1a-c15f0a88e760 req-0abf6ac8-46e2-4153-ab13-710fdb4dafe3 service nova] [instance: 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda] Received unexpected event network-vif-plugged-584372af-c42e-4b13-8701-adfb729629f4 for instance with vm_state building and task_state spawning. [ 635.914788] env[67899]: DEBUG nova.compute.manager [req-0a801d9d-9c96-40bd-ab1a-c15f0a88e760 req-0abf6ac8-46e2-4153-ab13-710fdb4dafe3 service nova] [instance: 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda] Received event network-changed-584372af-c42e-4b13-8701-adfb729629f4 {{(pid=67899) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 635.915061] env[67899]: DEBUG nova.compute.manager [req-0a801d9d-9c96-40bd-ab1a-c15f0a88e760 req-0abf6ac8-46e2-4153-ab13-710fdb4dafe3 service nova] [instance: 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda] Refreshing instance network info cache due to event network-changed-584372af-c42e-4b13-8701-adfb729629f4. {{(pid=67899) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 635.917669] env[67899]: DEBUG oslo_concurrency.lockutils [req-0a801d9d-9c96-40bd-ab1a-c15f0a88e760 req-0abf6ac8-46e2-4153-ab13-710fdb4dafe3 service nova] Acquiring lock "refresh_cache-9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 635.917669] env[67899]: DEBUG oslo_concurrency.lockutils [req-0a801d9d-9c96-40bd-ab1a-c15f0a88e760 req-0abf6ac8-46e2-4153-ab13-710fdb4dafe3 service nova] Acquired lock "refresh_cache-9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 635.917669] env[67899]: DEBUG nova.network.neutron [req-0a801d9d-9c96-40bd-ab1a-c15f0a88e760 req-0abf6ac8-46e2-4153-ab13-710fdb4dafe3 service nova] [instance: 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda] Refreshing network info cache for port 584372af-c42e-4b13-8701-adfb729629f4 {{(pid=67899) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 636.721321] env[67899]: DEBUG nova.network.neutron [req-0a801d9d-9c96-40bd-ab1a-c15f0a88e760 req-0abf6ac8-46e2-4153-ab13-710fdb4dafe3 service nova] [instance: 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda] Updated VIF entry in instance network info cache for port 584372af-c42e-4b13-8701-adfb729629f4. 
{{(pid=67899) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 636.721321] env[67899]: DEBUG nova.network.neutron [req-0a801d9d-9c96-40bd-ab1a-c15f0a88e760 req-0abf6ac8-46e2-4153-ab13-710fdb4dafe3 service nova] [instance: 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda] Updating instance_info_cache with network_info: [{"id": "584372af-c42e-4b13-8701-adfb729629f4", "address": "fa:16:3e:3c:8e:e4", "network": {"id": "247f01d1-789b-4922-9985-f1372d6ba427", "bridge": "br-int", "label": "tempest-VolumesAssistedSnapshotsTest-2109521751-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "2745a593bf56467f906afcddcfa13182", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "78340140-126f-4ef8-a340-debaa64da3e5", "external-id": "nsx-vlan-transportzone-648", "segmentation_id": 648, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap584372af-c4", "ovs_interfaceid": "584372af-c42e-4b13-8701-adfb729629f4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 636.735075] env[67899]: DEBUG oslo_concurrency.lockutils [req-0a801d9d-9c96-40bd-ab1a-c15f0a88e760 req-0abf6ac8-46e2-4153-ab13-710fdb4dafe3 service nova] Releasing lock "refresh_cache-9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 636.735329] env[67899]: DEBUG nova.compute.manager [req-0a801d9d-9c96-40bd-ab1a-c15f0a88e760 req-0abf6ac8-46e2-4153-ab13-710fdb4dafe3 service nova] [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] Received event network-vif-plugged-da099c05-3a93-40aa-80ed-e692d53dc2ad {{(pid=67899) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 636.735515] env[67899]: DEBUG oslo_concurrency.lockutils [req-0a801d9d-9c96-40bd-ab1a-c15f0a88e760 req-0abf6ac8-46e2-4153-ab13-710fdb4dafe3 service nova] Acquiring lock "c29ae4c5-cc93-480c-8d60-96f6acba4346-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 636.735757] env[67899]: DEBUG oslo_concurrency.lockutils [req-0a801d9d-9c96-40bd-ab1a-c15f0a88e760 req-0abf6ac8-46e2-4153-ab13-710fdb4dafe3 service nova] Lock "c29ae4c5-cc93-480c-8d60-96f6acba4346-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 636.735882] env[67899]: DEBUG oslo_concurrency.lockutils [req-0a801d9d-9c96-40bd-ab1a-c15f0a88e760 req-0abf6ac8-46e2-4153-ab13-710fdb4dafe3 service nova] Lock "c29ae4c5-cc93-480c-8d60-96f6acba4346-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 636.736945] env[67899]: DEBUG 
nova.compute.manager [req-0a801d9d-9c96-40bd-ab1a-c15f0a88e760 req-0abf6ac8-46e2-4153-ab13-710fdb4dafe3 service nova] [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] No waiting events found dispatching network-vif-plugged-da099c05-3a93-40aa-80ed-e692d53dc2ad {{(pid=67899) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 636.737226] env[67899]: WARNING nova.compute.manager [req-0a801d9d-9c96-40bd-ab1a-c15f0a88e760 req-0abf6ac8-46e2-4153-ab13-710fdb4dafe3 service nova] [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] Received unexpected event network-vif-plugged-da099c05-3a93-40aa-80ed-e692d53dc2ad for instance with vm_state building and task_state spawning. [ 636.737418] env[67899]: DEBUG nova.compute.manager [req-0a801d9d-9c96-40bd-ab1a-c15f0a88e760 req-0abf6ac8-46e2-4153-ab13-710fdb4dafe3 service nova] [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] Received event network-changed-da099c05-3a93-40aa-80ed-e692d53dc2ad {{(pid=67899) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 636.737579] env[67899]: DEBUG nova.compute.manager [req-0a801d9d-9c96-40bd-ab1a-c15f0a88e760 req-0abf6ac8-46e2-4153-ab13-710fdb4dafe3 service nova] [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] Refreshing instance network info cache due to event network-changed-da099c05-3a93-40aa-80ed-e692d53dc2ad. {{(pid=67899) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 636.737774] env[67899]: DEBUG oslo_concurrency.lockutils [req-0a801d9d-9c96-40bd-ab1a-c15f0a88e760 req-0abf6ac8-46e2-4153-ab13-710fdb4dafe3 service nova] Acquiring lock "refresh_cache-c29ae4c5-cc93-480c-8d60-96f6acba4346" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 636.737907] env[67899]: DEBUG oslo_concurrency.lockutils [req-0a801d9d-9c96-40bd-ab1a-c15f0a88e760 req-0abf6ac8-46e2-4153-ab13-710fdb4dafe3 service nova] Acquired lock "refresh_cache-c29ae4c5-cc93-480c-8d60-96f6acba4346" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 636.738196] env[67899]: DEBUG nova.network.neutron [req-0a801d9d-9c96-40bd-ab1a-c15f0a88e760 req-0abf6ac8-46e2-4153-ab13-710fdb4dafe3 service nova] [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] Refreshing network info cache for port da099c05-3a93-40aa-80ed-e692d53dc2ad {{(pid=67899) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 636.831362] env[67899]: DEBUG nova.network.neutron [req-0a801d9d-9c96-40bd-ab1a-c15f0a88e760 req-0abf6ac8-46e2-4153-ab13-710fdb4dafe3 service nova] [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] Instance cache missing network info. 
{{(pid=67899) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 637.199885] env[67899]: DEBUG nova.network.neutron [None req-4613260e-d1e4-4cb4-8d4e-d3da4a59a597 tempest-ServersTestMultiNic-1718995190 tempest-ServersTestMultiNic-1718995190-project-member] [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] Successfully updated port: 25f4cc80-6ee0-47e9-be13-34d172ff4aaf {{(pid=67899) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 637.221212] env[67899]: DEBUG oslo_concurrency.lockutils [None req-4613260e-d1e4-4cb4-8d4e-d3da4a59a597 tempest-ServersTestMultiNic-1718995190 tempest-ServersTestMultiNic-1718995190-project-member] Acquiring lock "refresh_cache-c29ae4c5-cc93-480c-8d60-96f6acba4346" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 637.221212] env[67899]: DEBUG nova.network.neutron [req-0a801d9d-9c96-40bd-ab1a-c15f0a88e760 req-0abf6ac8-46e2-4153-ab13-710fdb4dafe3 service nova] [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] Updating instance_info_cache with network_info: [] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 637.237823] env[67899]: DEBUG oslo_concurrency.lockutils [req-0a801d9d-9c96-40bd-ab1a-c15f0a88e760 req-0abf6ac8-46e2-4153-ab13-710fdb4dafe3 service nova] Releasing lock "refresh_cache-c29ae4c5-cc93-480c-8d60-96f6acba4346" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 637.238310] env[67899]: DEBUG oslo_concurrency.lockutils [None req-4613260e-d1e4-4cb4-8d4e-d3da4a59a597 tempest-ServersTestMultiNic-1718995190 tempest-ServersTestMultiNic-1718995190-project-member] Acquired lock "refresh_cache-c29ae4c5-cc93-480c-8d60-96f6acba4346" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 637.238470] env[67899]: DEBUG nova.network.neutron [None req-4613260e-d1e4-4cb4-8d4e-d3da4a59a597 tempest-ServersTestMultiNic-1718995190 tempest-ServersTestMultiNic-1718995190-project-member] [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] Building network info cache for instance {{(pid=67899) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 637.291509] env[67899]: DEBUG nova.network.neutron [None req-4613260e-d1e4-4cb4-8d4e-d3da4a59a597 tempest-ServersTestMultiNic-1718995190 tempest-ServersTestMultiNic-1718995190-project-member] [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] Instance cache missing network info. 
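When the cache comes back empty ("Instance cache missing network info", then "Updating instance_info_cache with network_info: []"), each refreshed port is merged back in keyed by its id, which is why later lines report either an updated VIF entry or a fully rebuilt list. A standalone sketch of that merge, under the assumption that entries are matched purely on port id (update_vif_entry is an illustrative helper, not a nova function):

    def update_vif_entry(cached_network_info, vif):
        """Replace the entry with the same port id, or append a new one."""
        for i, entry in enumerate(cached_network_info):
            if entry["id"] == vif["id"]:
                cached_network_info[i] = vif
                return "updated"
        cached_network_info.append(vif)
        return "appended"

    cache = []  # the empty cache logged above
    print(update_vif_entry(cache, {"id": "da099c05-3a93-40aa-80ed-e692d53dc2ad"}))  # appended
    print(update_vif_entry(cache, {"id": "da099c05-3a93-40aa-80ed-e692d53dc2ad"}))  # updated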
{{(pid=67899) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 638.508920] env[67899]: DEBUG nova.network.neutron [None req-4613260e-d1e4-4cb4-8d4e-d3da4a59a597 tempest-ServersTestMultiNic-1718995190 tempest-ServersTestMultiNic-1718995190-project-member] [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] Updating instance_info_cache with network_info: [{"id": "da099c05-3a93-40aa-80ed-e692d53dc2ad", "address": "fa:16:3e:72:c6:64", "network": {"id": "3d0b159d-b6d4-4427-8d13-dbc71daacf5f", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-768326734", "subnets": [{"cidr": "192.168.128.0/24", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.182", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "970e538ebd0844ddaace0fc9e294f283", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f267bcdd-0daa-4337-9709-5fc060c267d8", "external-id": "nsx-vlan-transportzone-308", "segmentation_id": 308, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapda099c05-3a", "ovs_interfaceid": "da099c05-3a93-40aa-80ed-e692d53dc2ad", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "033bfeb8-6332-46bd-92b0-3503feb9f14a", "address": "fa:16:3e:25:3e:62", "network": {"id": "cf075da8-22c9-42d7-b142-7e6b85552316", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-242810987", "subnets": [{"cidr": "192.168.129.0/24", "dns": [], "gateway": {"address": "192.168.129.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.129.68", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.129.2"}}], "meta": {"injected": false, "tenant_id": "970e538ebd0844ddaace0fc9e294f283", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f6ca3b2e-69a5-4cea-96a7-eaad5ec5fd9b", "external-id": "nsx-vlan-transportzone-989", "segmentation_id": 989, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap033bfeb8-63", "ovs_interfaceid": "033bfeb8-6332-46bd-92b0-3503feb9f14a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "25f4cc80-6ee0-47e9-be13-34d172ff4aaf", "address": "fa:16:3e:00:61:68", "network": {"id": "3d0b159d-b6d4-4427-8d13-dbc71daacf5f", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-768326734", "subnets": [{"cidr": "192.168.128.0/24", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.54", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "970e538ebd0844ddaace0fc9e294f283", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f267bcdd-0daa-4337-9709-5fc060c267d8", "external-id": 
"nsx-vlan-transportzone-308", "segmentation_id": 308, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap25f4cc80-6e", "ovs_interfaceid": "25f4cc80-6ee0-47e9-be13-34d172ff4aaf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 638.521110] env[67899]: DEBUG nova.compute.manager [req-30ddc230-9b46-4100-b685-c1a4f860d4a8 req-1790801b-d61b-45ae-81ab-6a727fc37863 service nova] [instance: 913c5652-c8af-41a8-94f1-c0eba08aacdd] Received event network-changed-fb5fa11f-22c8-4109-9547-f4aa282cd244 {{(pid=67899) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 638.521110] env[67899]: DEBUG nova.compute.manager [req-30ddc230-9b46-4100-b685-c1a4f860d4a8 req-1790801b-d61b-45ae-81ab-6a727fc37863 service nova] [instance: 913c5652-c8af-41a8-94f1-c0eba08aacdd] Refreshing instance network info cache due to event network-changed-fb5fa11f-22c8-4109-9547-f4aa282cd244. {{(pid=67899) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 638.521110] env[67899]: DEBUG oslo_concurrency.lockutils [req-30ddc230-9b46-4100-b685-c1a4f860d4a8 req-1790801b-d61b-45ae-81ab-6a727fc37863 service nova] Acquiring lock "refresh_cache-913c5652-c8af-41a8-94f1-c0eba08aacdd" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 638.521110] env[67899]: DEBUG oslo_concurrency.lockutils [req-30ddc230-9b46-4100-b685-c1a4f860d4a8 req-1790801b-d61b-45ae-81ab-6a727fc37863 service nova] Acquired lock "refresh_cache-913c5652-c8af-41a8-94f1-c0eba08aacdd" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 638.521110] env[67899]: DEBUG nova.network.neutron [req-30ddc230-9b46-4100-b685-c1a4f860d4a8 req-1790801b-d61b-45ae-81ab-6a727fc37863 service nova] [instance: 913c5652-c8af-41a8-94f1-c0eba08aacdd] Refreshing network info cache for port fb5fa11f-22c8-4109-9547-f4aa282cd244 {{(pid=67899) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 638.524026] env[67899]: DEBUG oslo_concurrency.lockutils [None req-4613260e-d1e4-4cb4-8d4e-d3da4a59a597 tempest-ServersTestMultiNic-1718995190 tempest-ServersTestMultiNic-1718995190-project-member] Releasing lock "refresh_cache-c29ae4c5-cc93-480c-8d60-96f6acba4346" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 638.524579] env[67899]: DEBUG nova.compute.manager [None req-4613260e-d1e4-4cb4-8d4e-d3da4a59a597 tempest-ServersTestMultiNic-1718995190 tempest-ServersTestMultiNic-1718995190-project-member] [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] Instance network_info: |[{"id": "da099c05-3a93-40aa-80ed-e692d53dc2ad", "address": "fa:16:3e:72:c6:64", "network": {"id": "3d0b159d-b6d4-4427-8d13-dbc71daacf5f", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-768326734", "subnets": [{"cidr": "192.168.128.0/24", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.182", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "970e538ebd0844ddaace0fc9e294f283", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": 
{"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f267bcdd-0daa-4337-9709-5fc060c267d8", "external-id": "nsx-vlan-transportzone-308", "segmentation_id": 308, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapda099c05-3a", "ovs_interfaceid": "da099c05-3a93-40aa-80ed-e692d53dc2ad", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "033bfeb8-6332-46bd-92b0-3503feb9f14a", "address": "fa:16:3e:25:3e:62", "network": {"id": "cf075da8-22c9-42d7-b142-7e6b85552316", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-242810987", "subnets": [{"cidr": "192.168.129.0/24", "dns": [], "gateway": {"address": "192.168.129.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.129.68", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.129.2"}}], "meta": {"injected": false, "tenant_id": "970e538ebd0844ddaace0fc9e294f283", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f6ca3b2e-69a5-4cea-96a7-eaad5ec5fd9b", "external-id": "nsx-vlan-transportzone-989", "segmentation_id": 989, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap033bfeb8-63", "ovs_interfaceid": "033bfeb8-6332-46bd-92b0-3503feb9f14a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "25f4cc80-6ee0-47e9-be13-34d172ff4aaf", "address": "fa:16:3e:00:61:68", "network": {"id": "3d0b159d-b6d4-4427-8d13-dbc71daacf5f", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-768326734", "subnets": [{"cidr": "192.168.128.0/24", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.54", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "970e538ebd0844ddaace0fc9e294f283", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f267bcdd-0daa-4337-9709-5fc060c267d8", "external-id": "nsx-vlan-transportzone-308", "segmentation_id": 308, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap25f4cc80-6e", "ovs_interfaceid": "25f4cc80-6ee0-47e9-be13-34d172ff4aaf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67899) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 638.525533] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-4613260e-d1e4-4cb4-8d4e-d3da4a59a597 tempest-ServersTestMultiNic-1718995190 tempest-ServersTestMultiNic-1718995190-project-member] [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:72:c6:64', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'f267bcdd-0daa-4337-9709-5fc060c267d8', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'da099c05-3a93-40aa-80ed-e692d53dc2ad', 'vif_model': 'vmxnet3'}, {'network_name': 'br-int', 'mac_address': 'fa:16:3e:25:3e:62', 'network_ref': {'type': 
'OpaqueNetwork', 'network-id': 'f6ca3b2e-69a5-4cea-96a7-eaad5ec5fd9b', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '033bfeb8-6332-46bd-92b0-3503feb9f14a', 'vif_model': 'vmxnet3'}, {'network_name': 'br-int', 'mac_address': 'fa:16:3e:00:61:68', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'f267bcdd-0daa-4337-9709-5fc060c267d8', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '25f4cc80-6ee0-47e9-be13-34d172ff4aaf', 'vif_model': 'vmxnet3'}] {{(pid=67899) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 638.537931] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [None req-4613260e-d1e4-4cb4-8d4e-d3da4a59a597 tempest-ServersTestMultiNic-1718995190 tempest-ServersTestMultiNic-1718995190-project-member] Creating folder: Project (970e538ebd0844ddaace0fc9e294f283). Parent ref: group-v692900. {{(pid=67899) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 638.539443] env[67899]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-828a2338-3abe-4014-9635-8ba0aee6d21a {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 638.552314] env[67899]: INFO nova.virt.vmwareapi.vm_util [None req-4613260e-d1e4-4cb4-8d4e-d3da4a59a597 tempest-ServersTestMultiNic-1718995190 tempest-ServersTestMultiNic-1718995190-project-member] Created folder: Project (970e538ebd0844ddaace0fc9e294f283) in parent group-v692900. [ 638.552706] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [None req-4613260e-d1e4-4cb4-8d4e-d3da4a59a597 tempest-ServersTestMultiNic-1718995190 tempest-ServersTestMultiNic-1718995190-project-member] Creating folder: Instances. Parent ref: group-v692931. {{(pid=67899) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 638.554015] env[67899]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-1fa17434-a2be-4195-8199-dc940e74670a {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 638.563603] env[67899]: INFO nova.virt.vmwareapi.vm_util [None req-4613260e-d1e4-4cb4-8d4e-d3da4a59a597 tempest-ServersTestMultiNic-1718995190 tempest-ServersTestMultiNic-1718995190-project-member] Created folder: Instances in parent group-v692931. [ 638.564117] env[67899]: DEBUG oslo.service.loopingcall [None req-4613260e-d1e4-4cb4-8d4e-d3da4a59a597 tempest-ServersTestMultiNic-1718995190 tempest-ServersTestMultiNic-1718995190-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67899) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 638.564484] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] Creating VM on the ESX host {{(pid=67899) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 638.564900] env[67899]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-b55771bb-6c94-4a80-a2de-fdb9bdae64f2 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 638.600723] env[67899]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 638.600723] env[67899]: value = "task-3467849" [ 638.600723] env[67899]: _type = "Task" [ 638.600723] env[67899]: } to complete. 
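The "Waiting for the task ... to complete" block above, followed by "progress is 0%" and eventually "completed successfully" with a recorded duration_secs, is a plain poll loop over the vCenter task state. A self-contained sketch of that shape (poll_progress is a stand-in for the real status query; this is not the oslo_vmware.api implementation):

    import time

    def wait_for_task(task_id, poll_progress, interval=0.5, timeout=300.0):
        start = time.monotonic()
        while True:
            state, progress = poll_progress(task_id)     # e.g. ("running", 0)
            if state == "success":
                return {"id": task_id, "duration_secs": time.monotonic() - start}
            if state == "error":
                raise RuntimeError(f"Task {task_id} failed")
            if time.monotonic() - start > timeout:
                raise TimeoutError(f"Task {task_id} stuck at {progress}% after {timeout}s")
            time.sleep(interval)

    states = iter([("running", 0), ("running", 50), ("success", 100)])
    print(wait_for_task("task-3467849", lambda _tid: next(states), interval=0.01))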
{{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 638.608525] env[67899]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467849, 'name': CreateVM_Task} progress is 0%. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 639.116612] env[67899]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467849, 'name': CreateVM_Task, 'duration_secs': 0.444238} completed successfully. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 639.116612] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] Created VM on the ESX host {{(pid=67899) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 639.116612] env[67899]: DEBUG oslo_concurrency.lockutils [None req-4613260e-d1e4-4cb4-8d4e-d3da4a59a597 tempest-ServersTestMultiNic-1718995190 tempest-ServersTestMultiNic-1718995190-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 639.116612] env[67899]: DEBUG oslo_concurrency.lockutils [None req-4613260e-d1e4-4cb4-8d4e-d3da4a59a597 tempest-ServersTestMultiNic-1718995190 tempest-ServersTestMultiNic-1718995190-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 639.116612] env[67899]: DEBUG oslo_concurrency.lockutils [None req-4613260e-d1e4-4cb4-8d4e-d3da4a59a597 tempest-ServersTestMultiNic-1718995190 tempest-ServersTestMultiNic-1718995190-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 639.116612] env[67899]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-1921cc18-6686-4e57-9383-4d56c7a5090d {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 639.123415] env[67899]: DEBUG oslo_vmware.api [None req-4613260e-d1e4-4cb4-8d4e-d3da4a59a597 tempest-ServersTestMultiNic-1718995190 tempest-ServersTestMultiNic-1718995190-project-member] Waiting for the task: (returnval){ [ 639.123415] env[67899]: value = "session[5298859c-589e-10f3-4f70-bf47a7ea371b]52d56120-1bea-3379-790c-4d9d658c9a83" [ 639.123415] env[67899]: _type = "Task" [ 639.123415] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 639.133408] env[67899]: DEBUG oslo_vmware.api [None req-4613260e-d1e4-4cb4-8d4e-d3da4a59a597 tempest-ServersTestMultiNic-1718995190 tempest-ServersTestMultiNic-1718995190-project-member] Task: {'id': session[5298859c-589e-10f3-4f70-bf47a7ea371b]52d56120-1bea-3379-790c-4d9d658c9a83, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 639.288071] env[67899]: DEBUG nova.network.neutron [req-30ddc230-9b46-4100-b685-c1a4f860d4a8 req-1790801b-d61b-45ae-81ab-6a727fc37863 service nova] [instance: 913c5652-c8af-41a8-94f1-c0eba08aacdd] Updated VIF entry in instance network info cache for port fb5fa11f-22c8-4109-9547-f4aa282cd244. 
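The image-cache sequence above -- acquire the "[datastore1] devstack-image-cache_base/<image>" lock plus an external semaphore, run SearchDatastore_Task, release, then take a finer-grained lock on the <image>.vmdk -- bounds how many workers probe or fetch the same cached image at once. A sketch with stdlib primitives (the names and the concurrency limit below are illustrative, not oslo_concurrency's):

    import threading

    image_lock = threading.Lock()                      # "[datastore1] devstack-image-cache_base/<id>"
    search_semaphore = threading.BoundedSemaphore(4)   # illustrative cap on concurrent searches

    def fetch_image_if_missing(search_datastore, fetch_image):
        with image_lock, search_semaphore:             # lock, then external semaphore
            if search_datastore():                     # SearchDatastore_Task analogue
                return "cache hit"
        # released before the slow fetch; the separate "<image>.vmdk" lock seen
        # in the log would guard the actual copy
        fetch_image()
        return "fetched"

    print(fetch_image_if_missing(lambda: True, lambda: None))  # cache hit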
{{(pid=67899) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 639.288352] env[67899]: DEBUG nova.network.neutron [req-30ddc230-9b46-4100-b685-c1a4f860d4a8 req-1790801b-d61b-45ae-81ab-6a727fc37863 service nova] [instance: 913c5652-c8af-41a8-94f1-c0eba08aacdd] Updating instance_info_cache with network_info: [{"id": "fb5fa11f-22c8-4109-9547-f4aa282cd244", "address": "fa:16:3e:cf:1d:49", "network": {"id": "c0f1032d-cf3e-48f6-bb99-f5f9859bf677", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-887782125-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "42c5eb67c36c48e88b2e47dcaacd4608", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ad4fcde7-8926-402a-a9b7-4878d2bc1cf6", "external-id": "nsx-vlan-transportzone-840", "segmentation_id": 840, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapfb5fa11f-22", "ovs_interfaceid": "fb5fa11f-22c8-4109-9547-f4aa282cd244", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 639.304302] env[67899]: DEBUG oslo_concurrency.lockutils [req-30ddc230-9b46-4100-b685-c1a4f860d4a8 req-1790801b-d61b-45ae-81ab-6a727fc37863 service nova] Releasing lock "refresh_cache-913c5652-c8af-41a8-94f1-c0eba08aacdd" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 639.641564] env[67899]: DEBUG oslo_concurrency.lockutils [None req-4613260e-d1e4-4cb4-8d4e-d3da4a59a597 tempest-ServersTestMultiNic-1718995190 tempest-ServersTestMultiNic-1718995190-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 639.643030] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-4613260e-d1e4-4cb4-8d4e-d3da4a59a597 tempest-ServersTestMultiNic-1718995190 tempest-ServersTestMultiNic-1718995190-project-member] [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] Processing image c655a05a-4a40-4b3f-b609-3ba8116ad90f {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 639.643030] env[67899]: DEBUG oslo_concurrency.lockutils [None req-4613260e-d1e4-4cb4-8d4e-d3da4a59a597 tempest-ServersTestMultiNic-1718995190 tempest-ServersTestMultiNic-1718995190-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 640.015574] env[67899]: DEBUG nova.compute.manager [req-9e01bee4-d693-4288-95a5-e1239232c1ad req-9a340863-1b99-495d-9e8c-4a0c8a110fd6 service nova] [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] Received event network-vif-plugged-033bfeb8-6332-46bd-92b0-3503feb9f14a {{(pid=67899) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 640.015971] 
env[67899]: DEBUG oslo_concurrency.lockutils [req-9e01bee4-d693-4288-95a5-e1239232c1ad req-9a340863-1b99-495d-9e8c-4a0c8a110fd6 service nova] Acquiring lock "c29ae4c5-cc93-480c-8d60-96f6acba4346-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 640.016736] env[67899]: DEBUG oslo_concurrency.lockutils [req-9e01bee4-d693-4288-95a5-e1239232c1ad req-9a340863-1b99-495d-9e8c-4a0c8a110fd6 service nova] Lock "c29ae4c5-cc93-480c-8d60-96f6acba4346-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 640.016736] env[67899]: DEBUG oslo_concurrency.lockutils [req-9e01bee4-d693-4288-95a5-e1239232c1ad req-9a340863-1b99-495d-9e8c-4a0c8a110fd6 service nova] Lock "c29ae4c5-cc93-480c-8d60-96f6acba4346-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 640.016950] env[67899]: DEBUG nova.compute.manager [req-9e01bee4-d693-4288-95a5-e1239232c1ad req-9a340863-1b99-495d-9e8c-4a0c8a110fd6 service nova] [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] No waiting events found dispatching network-vif-plugged-033bfeb8-6332-46bd-92b0-3503feb9f14a {{(pid=67899) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 640.017249] env[67899]: WARNING nova.compute.manager [req-9e01bee4-d693-4288-95a5-e1239232c1ad req-9a340863-1b99-495d-9e8c-4a0c8a110fd6 service nova] [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] Received unexpected event network-vif-plugged-033bfeb8-6332-46bd-92b0-3503feb9f14a for instance with vm_state building and task_state spawning. [ 640.017550] env[67899]: DEBUG nova.compute.manager [req-9e01bee4-d693-4288-95a5-e1239232c1ad req-9a340863-1b99-495d-9e8c-4a0c8a110fd6 service nova] [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] Received event network-changed-033bfeb8-6332-46bd-92b0-3503feb9f14a {{(pid=67899) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 640.017782] env[67899]: DEBUG nova.compute.manager [req-9e01bee4-d693-4288-95a5-e1239232c1ad req-9a340863-1b99-495d-9e8c-4a0c8a110fd6 service nova] [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] Refreshing instance network info cache due to event network-changed-033bfeb8-6332-46bd-92b0-3503feb9f14a. 
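Every network-changed event above ends in the same triple -- Acquiring / Acquired / Releasing lock "refresh_cache-<uuid>" -- so concurrent events for one instance rebuild its cache serially. A minimal sketch of that guarded section, assuming plain threading primitives (refresh_cache here is an invented helper, not the lockutils API):

    import threading
    from contextlib import contextmanager

    _cache_locks: dict[str, threading.Lock] = {}
    _registry_lock = threading.Lock()

    @contextmanager
    def refresh_cache(instance_uuid):
        with _registry_lock:  # protect the lock registry itself
            lock = _cache_locks.setdefault(f"refresh_cache-{instance_uuid}",
                                           threading.Lock())
        lock.acquire()        # "Acquired lock refresh_cache-..."
        try:
            yield
        finally:
            lock.release()    # "Releasing lock refresh_cache-..."

    with refresh_cache("c29ae4c5-cc93-480c-8d60-96f6acba4346"):
        pass  # rebuild this instance's network info cache here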
{{(pid=67899) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 640.018093] env[67899]: DEBUG oslo_concurrency.lockutils [req-9e01bee4-d693-4288-95a5-e1239232c1ad req-9a340863-1b99-495d-9e8c-4a0c8a110fd6 service nova] Acquiring lock "refresh_cache-c29ae4c5-cc93-480c-8d60-96f6acba4346" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 640.018420] env[67899]: DEBUG oslo_concurrency.lockutils [req-9e01bee4-d693-4288-95a5-e1239232c1ad req-9a340863-1b99-495d-9e8c-4a0c8a110fd6 service nova] Acquired lock "refresh_cache-c29ae4c5-cc93-480c-8d60-96f6acba4346" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 640.018589] env[67899]: DEBUG nova.network.neutron [req-9e01bee4-d693-4288-95a5-e1239232c1ad req-9a340863-1b99-495d-9e8c-4a0c8a110fd6 service nova] [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] Refreshing network info cache for port 033bfeb8-6332-46bd-92b0-3503feb9f14a {{(pid=67899) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 640.306908] env[67899]: DEBUG oslo_concurrency.lockutils [None req-6aafabe7-8144-45c2-a7ff-cc06a3819f06 tempest-ServersAdmin275Test-1560987778 tempest-ServersAdmin275Test-1560987778-project-member] Acquiring lock "e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 640.307088] env[67899]: DEBUG oslo_concurrency.lockutils [None req-6aafabe7-8144-45c2-a7ff-cc06a3819f06 tempest-ServersAdmin275Test-1560987778 tempest-ServersAdmin275Test-1560987778-project-member] Lock "e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 640.936869] env[67899]: DEBUG nova.network.neutron [req-9e01bee4-d693-4288-95a5-e1239232c1ad req-9a340863-1b99-495d-9e8c-4a0c8a110fd6 service nova] [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] Updated VIF entry in instance network info cache for port 033bfeb8-6332-46bd-92b0-3503feb9f14a. 
{{(pid=67899) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 640.936869] env[67899]: DEBUG nova.network.neutron [req-9e01bee4-d693-4288-95a5-e1239232c1ad req-9a340863-1b99-495d-9e8c-4a0c8a110fd6 service nova] [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] Updating instance_info_cache with network_info: [{"id": "da099c05-3a93-40aa-80ed-e692d53dc2ad", "address": "fa:16:3e:72:c6:64", "network": {"id": "3d0b159d-b6d4-4427-8d13-dbc71daacf5f", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-768326734", "subnets": [{"cidr": "192.168.128.0/24", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.182", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "970e538ebd0844ddaace0fc9e294f283", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f267bcdd-0daa-4337-9709-5fc060c267d8", "external-id": "nsx-vlan-transportzone-308", "segmentation_id": 308, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapda099c05-3a", "ovs_interfaceid": "da099c05-3a93-40aa-80ed-e692d53dc2ad", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "033bfeb8-6332-46bd-92b0-3503feb9f14a", "address": "fa:16:3e:25:3e:62", "network": {"id": "cf075da8-22c9-42d7-b142-7e6b85552316", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-242810987", "subnets": [{"cidr": "192.168.129.0/24", "dns": [], "gateway": {"address": "192.168.129.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.129.68", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.129.2"}}], "meta": {"injected": false, "tenant_id": "970e538ebd0844ddaace0fc9e294f283", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f6ca3b2e-69a5-4cea-96a7-eaad5ec5fd9b", "external-id": "nsx-vlan-transportzone-989", "segmentation_id": 989, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap033bfeb8-63", "ovs_interfaceid": "033bfeb8-6332-46bd-92b0-3503feb9f14a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "25f4cc80-6ee0-47e9-be13-34d172ff4aaf", "address": "fa:16:3e:00:61:68", "network": {"id": "3d0b159d-b6d4-4427-8d13-dbc71daacf5f", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-768326734", "subnets": [{"cidr": "192.168.128.0/24", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.54", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "970e538ebd0844ddaace0fc9e294f283", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f267bcdd-0daa-4337-9709-5fc060c267d8", "external-id": "nsx-vlan-transportzone-308", 
"segmentation_id": 308, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap25f4cc80-6e", "ovs_interfaceid": "25f4cc80-6ee0-47e9-be13-34d172ff4aaf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 640.953558] env[67899]: DEBUG oslo_concurrency.lockutils [req-9e01bee4-d693-4288-95a5-e1239232c1ad req-9a340863-1b99-495d-9e8c-4a0c8a110fd6 service nova] Releasing lock "refresh_cache-c29ae4c5-cc93-480c-8d60-96f6acba4346" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 640.953857] env[67899]: DEBUG nova.compute.manager [req-9e01bee4-d693-4288-95a5-e1239232c1ad req-9a340863-1b99-495d-9e8c-4a0c8a110fd6 service nova] [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] Received event network-vif-plugged-25f4cc80-6ee0-47e9-be13-34d172ff4aaf {{(pid=67899) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 640.954457] env[67899]: DEBUG oslo_concurrency.lockutils [req-9e01bee4-d693-4288-95a5-e1239232c1ad req-9a340863-1b99-495d-9e8c-4a0c8a110fd6 service nova] Acquiring lock "c29ae4c5-cc93-480c-8d60-96f6acba4346-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 640.955825] env[67899]: DEBUG oslo_concurrency.lockutils [req-9e01bee4-d693-4288-95a5-e1239232c1ad req-9a340863-1b99-495d-9e8c-4a0c8a110fd6 service nova] Lock "c29ae4c5-cc93-480c-8d60-96f6acba4346-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 640.956106] env[67899]: DEBUG oslo_concurrency.lockutils [req-9e01bee4-d693-4288-95a5-e1239232c1ad req-9a340863-1b99-495d-9e8c-4a0c8a110fd6 service nova] Lock "c29ae4c5-cc93-480c-8d60-96f6acba4346-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 640.956579] env[67899]: DEBUG nova.compute.manager [req-9e01bee4-d693-4288-95a5-e1239232c1ad req-9a340863-1b99-495d-9e8c-4a0c8a110fd6 service nova] [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] No waiting events found dispatching network-vif-plugged-25f4cc80-6ee0-47e9-be13-34d172ff4aaf {{(pid=67899) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 640.956812] env[67899]: WARNING nova.compute.manager [req-9e01bee4-d693-4288-95a5-e1239232c1ad req-9a340863-1b99-495d-9e8c-4a0c8a110fd6 service nova] [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] Received unexpected event network-vif-plugged-25f4cc80-6ee0-47e9-be13-34d172ff4aaf for instance with vm_state building and task_state spawning. 
[ 640.957383] env[67899]: DEBUG nova.compute.manager [req-9e01bee4-d693-4288-95a5-e1239232c1ad req-9a340863-1b99-495d-9e8c-4a0c8a110fd6 service nova] [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] Received event network-changed-25f4cc80-6ee0-47e9-be13-34d172ff4aaf {{(pid=67899) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 640.960156] env[67899]: DEBUG nova.compute.manager [req-9e01bee4-d693-4288-95a5-e1239232c1ad req-9a340863-1b99-495d-9e8c-4a0c8a110fd6 service nova] [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] Refreshing instance network info cache due to event network-changed-25f4cc80-6ee0-47e9-be13-34d172ff4aaf. {{(pid=67899) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 640.960156] env[67899]: DEBUG oslo_concurrency.lockutils [req-9e01bee4-d693-4288-95a5-e1239232c1ad req-9a340863-1b99-495d-9e8c-4a0c8a110fd6 service nova] Acquiring lock "refresh_cache-c29ae4c5-cc93-480c-8d60-96f6acba4346" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 640.960156] env[67899]: DEBUG oslo_concurrency.lockutils [req-9e01bee4-d693-4288-95a5-e1239232c1ad req-9a340863-1b99-495d-9e8c-4a0c8a110fd6 service nova] Acquired lock "refresh_cache-c29ae4c5-cc93-480c-8d60-96f6acba4346" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 640.960156] env[67899]: DEBUG nova.network.neutron [req-9e01bee4-d693-4288-95a5-e1239232c1ad req-9a340863-1b99-495d-9e8c-4a0c8a110fd6 service nova] [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] Refreshing network info cache for port 25f4cc80-6ee0-47e9-be13-34d172ff4aaf {{(pid=67899) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 641.477428] env[67899]: DEBUG nova.network.neutron [req-9e01bee4-d693-4288-95a5-e1239232c1ad req-9a340863-1b99-495d-9e8c-4a0c8a110fd6 service nova] [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] Updated VIF entry in instance network info cache for port 25f4cc80-6ee0-47e9-be13-34d172ff4aaf. 
{{(pid=67899) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 641.477428] env[67899]: DEBUG nova.network.neutron [req-9e01bee4-d693-4288-95a5-e1239232c1ad req-9a340863-1b99-495d-9e8c-4a0c8a110fd6 service nova] [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] Updating instance_info_cache with network_info: [{"id": "da099c05-3a93-40aa-80ed-e692d53dc2ad", "address": "fa:16:3e:72:c6:64", "network": {"id": "3d0b159d-b6d4-4427-8d13-dbc71daacf5f", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-768326734", "subnets": [{"cidr": "192.168.128.0/24", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.182", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "970e538ebd0844ddaace0fc9e294f283", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f267bcdd-0daa-4337-9709-5fc060c267d8", "external-id": "nsx-vlan-transportzone-308", "segmentation_id": 308, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapda099c05-3a", "ovs_interfaceid": "da099c05-3a93-40aa-80ed-e692d53dc2ad", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "033bfeb8-6332-46bd-92b0-3503feb9f14a", "address": "fa:16:3e:25:3e:62", "network": {"id": "cf075da8-22c9-42d7-b142-7e6b85552316", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-242810987", "subnets": [{"cidr": "192.168.129.0/24", "dns": [], "gateway": {"address": "192.168.129.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.129.68", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.129.2"}}], "meta": {"injected": false, "tenant_id": "970e538ebd0844ddaace0fc9e294f283", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f6ca3b2e-69a5-4cea-96a7-eaad5ec5fd9b", "external-id": "nsx-vlan-transportzone-989", "segmentation_id": 989, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap033bfeb8-63", "ovs_interfaceid": "033bfeb8-6332-46bd-92b0-3503feb9f14a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "25f4cc80-6ee0-47e9-be13-34d172ff4aaf", "address": "fa:16:3e:00:61:68", "network": {"id": "3d0b159d-b6d4-4427-8d13-dbc71daacf5f", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-768326734", "subnets": [{"cidr": "192.168.128.0/24", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.54", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "970e538ebd0844ddaace0fc9e294f283", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f267bcdd-0daa-4337-9709-5fc060c267d8", "external-id": "nsx-vlan-transportzone-308", 
"segmentation_id": 308, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap25f4cc80-6e", "ovs_interfaceid": "25f4cc80-6ee0-47e9-be13-34d172ff4aaf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 641.491744] env[67899]: DEBUG oslo_concurrency.lockutils [req-9e01bee4-d693-4288-95a5-e1239232c1ad req-9a340863-1b99-495d-9e8c-4a0c8a110fd6 service nova] Releasing lock "refresh_cache-c29ae4c5-cc93-480c-8d60-96f6acba4346" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 645.557438] env[67899]: DEBUG oslo_concurrency.lockutils [None req-e5513ee2-6d9e-4ef6-b3a5-69d04367ec0c tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Acquiring lock "4458efe7-18d4-4cfb-b131-e09d36124d68" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 645.557438] env[67899]: DEBUG oslo_concurrency.lockutils [None req-e5513ee2-6d9e-4ef6-b3a5-69d04367ec0c tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Lock "4458efe7-18d4-4cfb-b131-e09d36124d68" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 646.359650] env[67899]: DEBUG oslo_concurrency.lockutils [None req-b344734b-8a62-4885-9e7d-af38d446ab1b tempest-ServersWithSpecificFlavorTestJSON-1910448084 tempest-ServersWithSpecificFlavorTestJSON-1910448084-project-member] Acquiring lock "35b19ccb-4996-47a7-b1a7-6ffc9dd867ee" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 646.360470] env[67899]: DEBUG oslo_concurrency.lockutils [None req-b344734b-8a62-4885-9e7d-af38d446ab1b tempest-ServersWithSpecificFlavorTestJSON-1910448084 tempest-ServersWithSpecificFlavorTestJSON-1910448084-project-member] Lock "35b19ccb-4996-47a7-b1a7-6ffc9dd867ee" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 649.503583] env[67899]: DEBUG oslo_concurrency.lockutils [None req-095428a0-9722-49dd-850d-4963abda9ddc tempest-ServerDiagnosticsTest-1813722271 tempest-ServerDiagnosticsTest-1813722271-project-member] Acquiring lock "55dfe829-2e96-40d7-bef8-8e7556cbdab3" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 649.503907] env[67899]: DEBUG oslo_concurrency.lockutils [None req-095428a0-9722-49dd-850d-4963abda9ddc tempest-ServerDiagnosticsTest-1813722271 tempest-ServerDiagnosticsTest-1813722271-project-member] Lock "55dfe829-2e96-40d7-bef8-8e7556cbdab3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67899) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 650.263896] env[67899]: DEBUG oslo_concurrency.lockutils [None req-0bc4c174-e0ab-457c-892d-224b06a89f6b tempest-ServersTestBootFromVolume-1946479315 tempest-ServersTestBootFromVolume-1946479315-project-member] Acquiring lock "f935bfef-3ca7-41fc-89be-c5c4e070a401" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 650.264431] env[67899]: DEBUG oslo_concurrency.lockutils [None req-0bc4c174-e0ab-457c-892d-224b06a89f6b tempest-ServersTestBootFromVolume-1946479315 tempest-ServersTestBootFromVolume-1946479315-project-member] Lock "f935bfef-3ca7-41fc-89be-c5c4e070a401" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 650.275827] env[67899]: DEBUG oslo_concurrency.lockutils [None req-2686c5c7-1784-4988-a00b-1b0654ffb429 tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] Acquiring lock "66d3ec66-244d-4ffa-bd6f-7067f8955e67" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 650.276426] env[67899]: DEBUG oslo_concurrency.lockutils [None req-2686c5c7-1784-4988-a00b-1b0654ffb429 tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] Lock "66d3ec66-244d-4ffa-bd6f-7067f8955e67" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 652.043561] env[67899]: DEBUG oslo_concurrency.lockutils [None req-6c2c8409-20d5-4bd8-9447-eb20abb6685f tempest-ServersV294TestFqdnHostnames-1781220341 tempest-ServersV294TestFqdnHostnames-1781220341-project-member] Acquiring lock "bbed830d-aa53-4ea4-93f8-d4b198a333cd" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 652.044299] env[67899]: DEBUG oslo_concurrency.lockutils [None req-6c2c8409-20d5-4bd8-9447-eb20abb6685f tempest-ServersV294TestFqdnHostnames-1781220341 tempest-ServersV294TestFqdnHostnames-1781220341-project-member] Lock "bbed830d-aa53-4ea4-93f8-d4b198a333cd" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 655.791263] env[67899]: DEBUG oslo_concurrency.lockutils [None req-88fcedb6-0a11-495c-89af-f775406e92ad tempest-ServerShowV254Test-945843174 tempest-ServerShowV254Test-945843174-project-member] Acquiring lock "786676eb-ac36-48f6-874a-ab1ca15f2a9a" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 655.795075] env[67899]: DEBUG oslo_concurrency.lockutils [None req-88fcedb6-0a11-495c-89af-f775406e92ad 
tempest-ServerShowV254Test-945843174 tempest-ServerShowV254Test-945843174-project-member] Lock "786676eb-ac36-48f6-874a-ab1ca15f2a9a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 656.512782] env[67899]: DEBUG oslo_concurrency.lockutils [None req-84b4cd7b-730e-4d10-a439-50bc81630706 tempest-ServersTestManualDisk-499941049 tempest-ServersTestManualDisk-499941049-project-member] Acquiring lock "4cd5b80b-d1f4-4142-83fb-235523464667" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 656.513130] env[67899]: DEBUG oslo_concurrency.lockutils [None req-84b4cd7b-730e-4d10-a439-50bc81630706 tempest-ServersTestManualDisk-499941049 tempest-ServersTestManualDisk-499941049-project-member] Lock "4cd5b80b-d1f4-4142-83fb-235523464667" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 657.815943] env[67899]: DEBUG oslo_concurrency.lockutils [None req-97a39c26-6bb7-4c5d-8d9f-30d1a9b2c784 tempest-ServerDiagnosticsV248Test-725047975 tempest-ServerDiagnosticsV248Test-725047975-project-member] Acquiring lock "1589188b-8540-4afd-8050-ab47633593c0" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 657.816266] env[67899]: DEBUG oslo_concurrency.lockutils [None req-97a39c26-6bb7-4c5d-8d9f-30d1a9b2c784 tempest-ServerDiagnosticsV248Test-725047975 tempest-ServerDiagnosticsV248Test-725047975-project-member] Lock "1589188b-8540-4afd-8050-ab47633593c0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 661.265686] env[67899]: WARNING oslo_vmware.rw_handles [None req-4103a3f2-1648-4fa7-ad2b-b1440fc415e3 tempest-InstanceActionsTestJSON-1459258803 tempest-InstanceActionsTestJSON-1459258803-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 661.265686] env[67899]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 661.265686] env[67899]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 661.265686] env[67899]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 661.265686] env[67899]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 661.265686] env[67899]: ERROR oslo_vmware.rw_handles response.begin() [ 661.265686] env[67899]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 661.265686] env[67899]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 661.265686] env[67899]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 661.265686] env[67899]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed 
connection without" [ 661.265686] env[67899]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 661.265686] env[67899]: ERROR oslo_vmware.rw_handles [ 661.266327] env[67899]: DEBUG nova.virt.vmwareapi.images [None req-4103a3f2-1648-4fa7-ad2b-b1440fc415e3 tempest-InstanceActionsTestJSON-1459258803 tempest-InstanceActionsTestJSON-1459258803-project-member] [instance: 6b16f08c-a470-4a9b-8096-05cec2e960cf] Downloaded image file data c655a05a-4a40-4b3f-b609-3ba8116ad90f to vmware_temp/26a609a8-404f-4ca4-8454-a3606d468c8b/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk on the data store datastore1 {{(pid=67899) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 661.267480] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-4103a3f2-1648-4fa7-ad2b-b1440fc415e3 tempest-InstanceActionsTestJSON-1459258803 tempest-InstanceActionsTestJSON-1459258803-project-member] [instance: 6b16f08c-a470-4a9b-8096-05cec2e960cf] Caching image {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 661.267717] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [None req-4103a3f2-1648-4fa7-ad2b-b1440fc415e3 tempest-InstanceActionsTestJSON-1459258803 tempest-InstanceActionsTestJSON-1459258803-project-member] Copying Virtual Disk [datastore1] vmware_temp/26a609a8-404f-4ca4-8454-a3606d468c8b/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk to [datastore1] vmware_temp/26a609a8-404f-4ca4-8454-a3606d468c8b/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk {{(pid=67899) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 661.267998] env[67899]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-b043b300-2609-4331-9739-565faaa6417d {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 661.280412] env[67899]: DEBUG oslo_vmware.api [None req-4103a3f2-1648-4fa7-ad2b-b1440fc415e3 tempest-InstanceActionsTestJSON-1459258803 tempest-InstanceActionsTestJSON-1459258803-project-member] Waiting for the task: (returnval){ [ 661.280412] env[67899]: value = "task-3467854" [ 661.280412] env[67899]: _type = "Task" [ 661.280412] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 661.292025] env[67899]: DEBUG oslo_vmware.api [None req-4103a3f2-1648-4fa7-ad2b-b1440fc415e3 tempest-InstanceActionsTestJSON-1459258803 tempest-InstanceActionsTestJSON-1459258803-project-member] Task: {'id': task-3467854, 'name': CopyVirtualDisk_Task} progress is 0%. 
{{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 661.402119] env[67899]: DEBUG oslo_concurrency.lockutils [None req-8927c821-393f-439d-a9bd-800d953f5117 tempest-ServerGroupTestJSON-2086428554 tempest-ServerGroupTestJSON-2086428554-project-member] Acquiring lock "de1d5572-b82d-4edc-9c7e-a7e26c45a090" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 661.402119] env[67899]: DEBUG oslo_concurrency.lockutils [None req-8927c821-393f-439d-a9bd-800d953f5117 tempest-ServerGroupTestJSON-2086428554 tempest-ServerGroupTestJSON-2086428554-project-member] Lock "de1d5572-b82d-4edc-9c7e-a7e26c45a090" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 661.790381] env[67899]: DEBUG oslo_vmware.exceptions [None req-4103a3f2-1648-4fa7-ad2b-b1440fc415e3 tempest-InstanceActionsTestJSON-1459258803 tempest-InstanceActionsTestJSON-1459258803-project-member] Fault InvalidArgument not matched. {{(pid=67899) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 661.790381] env[67899]: DEBUG oslo_concurrency.lockutils [None req-4103a3f2-1648-4fa7-ad2b-b1440fc415e3 tempest-InstanceActionsTestJSON-1459258803 tempest-InstanceActionsTestJSON-1459258803-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 661.794134] env[67899]: ERROR nova.compute.manager [None req-4103a3f2-1648-4fa7-ad2b-b1440fc415e3 tempest-InstanceActionsTestJSON-1459258803 tempest-InstanceActionsTestJSON-1459258803-project-member] [instance: 6b16f08c-a470-4a9b-8096-05cec2e960cf] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 661.794134] env[67899]: Faults: ['InvalidArgument'] [ 661.794134] env[67899]: ERROR nova.compute.manager [instance: 6b16f08c-a470-4a9b-8096-05cec2e960cf] Traceback (most recent call last): [ 661.794134] env[67899]: ERROR nova.compute.manager [instance: 6b16f08c-a470-4a9b-8096-05cec2e960cf] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 661.794134] env[67899]: ERROR nova.compute.manager [instance: 6b16f08c-a470-4a9b-8096-05cec2e960cf] yield resources [ 661.794134] env[67899]: ERROR nova.compute.manager [instance: 6b16f08c-a470-4a9b-8096-05cec2e960cf] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 661.794134] env[67899]: ERROR nova.compute.manager [instance: 6b16f08c-a470-4a9b-8096-05cec2e960cf] self.driver.spawn(context, instance, image_meta, [ 661.794134] env[67899]: ERROR nova.compute.manager [instance: 6b16f08c-a470-4a9b-8096-05cec2e960cf] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 661.794134] env[67899]: ERROR nova.compute.manager [instance: 6b16f08c-a470-4a9b-8096-05cec2e960cf] self._vmops.spawn(context, instance, image_meta, injected_files, [ 661.794134] env[67899]: ERROR nova.compute.manager [instance: 6b16f08c-a470-4a9b-8096-05cec2e960cf] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 
661.794134] env[67899]: ERROR nova.compute.manager [instance: 6b16f08c-a470-4a9b-8096-05cec2e960cf] self._fetch_image_if_missing(context, vi) [ 661.794134] env[67899]: ERROR nova.compute.manager [instance: 6b16f08c-a470-4a9b-8096-05cec2e960cf] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 661.794134] env[67899]: ERROR nova.compute.manager [instance: 6b16f08c-a470-4a9b-8096-05cec2e960cf] image_cache(vi, tmp_image_ds_loc) [ 661.794134] env[67899]: ERROR nova.compute.manager [instance: 6b16f08c-a470-4a9b-8096-05cec2e960cf] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 661.794134] env[67899]: ERROR nova.compute.manager [instance: 6b16f08c-a470-4a9b-8096-05cec2e960cf] vm_util.copy_virtual_disk( [ 661.794134] env[67899]: ERROR nova.compute.manager [instance: 6b16f08c-a470-4a9b-8096-05cec2e960cf] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 661.794134] env[67899]: ERROR nova.compute.manager [instance: 6b16f08c-a470-4a9b-8096-05cec2e960cf] session._wait_for_task(vmdk_copy_task) [ 661.794134] env[67899]: ERROR nova.compute.manager [instance: 6b16f08c-a470-4a9b-8096-05cec2e960cf] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 661.794134] env[67899]: ERROR nova.compute.manager [instance: 6b16f08c-a470-4a9b-8096-05cec2e960cf] return self.wait_for_task(task_ref) [ 661.794134] env[67899]: ERROR nova.compute.manager [instance: 6b16f08c-a470-4a9b-8096-05cec2e960cf] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 661.794134] env[67899]: ERROR nova.compute.manager [instance: 6b16f08c-a470-4a9b-8096-05cec2e960cf] return evt.wait() [ 661.794134] env[67899]: ERROR nova.compute.manager [instance: 6b16f08c-a470-4a9b-8096-05cec2e960cf] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 661.794134] env[67899]: ERROR nova.compute.manager [instance: 6b16f08c-a470-4a9b-8096-05cec2e960cf] result = hub.switch() [ 661.794134] env[67899]: ERROR nova.compute.manager [instance: 6b16f08c-a470-4a9b-8096-05cec2e960cf] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 661.794134] env[67899]: ERROR nova.compute.manager [instance: 6b16f08c-a470-4a9b-8096-05cec2e960cf] return self.greenlet.switch() [ 661.794134] env[67899]: ERROR nova.compute.manager [instance: 6b16f08c-a470-4a9b-8096-05cec2e960cf] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 661.794134] env[67899]: ERROR nova.compute.manager [instance: 6b16f08c-a470-4a9b-8096-05cec2e960cf] self.f(*self.args, **self.kw) [ 661.794134] env[67899]: ERROR nova.compute.manager [instance: 6b16f08c-a470-4a9b-8096-05cec2e960cf] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 661.794134] env[67899]: ERROR nova.compute.manager [instance: 6b16f08c-a470-4a9b-8096-05cec2e960cf] raise exceptions.translate_fault(task_info.error) [ 661.794134] env[67899]: ERROR nova.compute.manager [instance: 6b16f08c-a470-4a9b-8096-05cec2e960cf] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 661.794134] env[67899]: ERROR nova.compute.manager [instance: 6b16f08c-a470-4a9b-8096-05cec2e960cf] Faults: ['InvalidArgument'] [ 661.794134] env[67899]: ERROR nova.compute.manager [instance: 6b16f08c-a470-4a9b-8096-05cec2e960cf] [ 
661.795177] env[67899]: INFO nova.compute.manager [None req-4103a3f2-1648-4fa7-ad2b-b1440fc415e3 tempest-InstanceActionsTestJSON-1459258803 tempest-InstanceActionsTestJSON-1459258803-project-member] [instance: 6b16f08c-a470-4a9b-8096-05cec2e960cf] Terminating instance [ 661.796546] env[67899]: DEBUG oslo_concurrency.lockutils [None req-93795dd8-69ab-4791-b400-f6f8ae0cd063 tempest-ServerExternalEventsTest-552313903 tempest-ServerExternalEventsTest-552313903-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 661.797422] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-93795dd8-69ab-4791-b400-f6f8ae0cd063 tempest-ServerExternalEventsTest-552313903 tempest-ServerExternalEventsTest-552313903-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 661.797743] env[67899]: DEBUG nova.compute.manager [None req-4103a3f2-1648-4fa7-ad2b-b1440fc415e3 tempest-InstanceActionsTestJSON-1459258803 tempest-InstanceActionsTestJSON-1459258803-project-member] [instance: 6b16f08c-a470-4a9b-8096-05cec2e960cf] Start destroying the instance on the hypervisor. {{(pid=67899) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 661.797883] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-4103a3f2-1648-4fa7-ad2b-b1440fc415e3 tempest-InstanceActionsTestJSON-1459258803 tempest-InstanceActionsTestJSON-1459258803-project-member] [instance: 6b16f08c-a470-4a9b-8096-05cec2e960cf] Destroying instance {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 661.798188] env[67899]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-1328715d-1838-4e10-b40d-0f50482ab251 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 661.803158] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-07a9e2d1-edc8-488a-af99-9beae5fbc76d {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 661.813311] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-4103a3f2-1648-4fa7-ad2b-b1440fc415e3 tempest-InstanceActionsTestJSON-1459258803 tempest-InstanceActionsTestJSON-1459258803-project-member] [instance: 6b16f08c-a470-4a9b-8096-05cec2e960cf] Unregistering the VM {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 661.814224] env[67899]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-f3ee22dd-5f36-4021-a92c-249a54a11a44 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 661.816966] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-93795dd8-69ab-4791-b400-f6f8ae0cd063 tempest-ServerExternalEventsTest-552313903 tempest-ServerExternalEventsTest-552313903-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 661.816966] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-93795dd8-69ab-4791-b400-f6f8ae0cd063 tempest-ServerExternalEventsTest-552313903 tempest-ServerExternalEventsTest-552313903-project-member] Folder [datastore1] devstack-image-cache_base 
created. {{(pid=67899) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 661.818113] env[67899]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-fa1a7f08-17d6-4ea0-a8d4-ab654b9ec3e6 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 661.822837] env[67899]: DEBUG oslo_vmware.api [None req-93795dd8-69ab-4791-b400-f6f8ae0cd063 tempest-ServerExternalEventsTest-552313903 tempest-ServerExternalEventsTest-552313903-project-member] Waiting for the task: (returnval){ [ 661.822837] env[67899]: value = "session[5298859c-589e-10f3-4f70-bf47a7ea371b]520ff064-8fad-0197-954f-b02a51fbbd35" [ 661.822837] env[67899]: _type = "Task" [ 661.822837] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 661.833534] env[67899]: DEBUG oslo_vmware.api [None req-93795dd8-69ab-4791-b400-f6f8ae0cd063 tempest-ServerExternalEventsTest-552313903 tempest-ServerExternalEventsTest-552313903-project-member] Task: {'id': session[5298859c-589e-10f3-4f70-bf47a7ea371b]520ff064-8fad-0197-954f-b02a51fbbd35, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 661.888979] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-4103a3f2-1648-4fa7-ad2b-b1440fc415e3 tempest-InstanceActionsTestJSON-1459258803 tempest-InstanceActionsTestJSON-1459258803-project-member] [instance: 6b16f08c-a470-4a9b-8096-05cec2e960cf] Unregistered the VM {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 661.889115] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-4103a3f2-1648-4fa7-ad2b-b1440fc415e3 tempest-InstanceActionsTestJSON-1459258803 tempest-InstanceActionsTestJSON-1459258803-project-member] [instance: 6b16f08c-a470-4a9b-8096-05cec2e960cf] Deleting contents of the VM from datastore datastore1 {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 661.889966] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-4103a3f2-1648-4fa7-ad2b-b1440fc415e3 tempest-InstanceActionsTestJSON-1459258803 tempest-InstanceActionsTestJSON-1459258803-project-member] Deleting the datastore file [datastore1] 6b16f08c-a470-4a9b-8096-05cec2e960cf {{(pid=67899) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 661.889966] env[67899]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-ed61a4da-afa4-4db9-9a26-c9f1e98df0b2 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 661.898275] env[67899]: DEBUG oslo_vmware.api [None req-4103a3f2-1648-4fa7-ad2b-b1440fc415e3 tempest-InstanceActionsTestJSON-1459258803 tempest-InstanceActionsTestJSON-1459258803-project-member] Waiting for the task: (returnval){ [ 661.898275] env[67899]: value = "task-3467856" [ 661.898275] env[67899]: _type = "Task" [ 661.898275] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 661.909692] env[67899]: DEBUG oslo_vmware.api [None req-4103a3f2-1648-4fa7-ad2b-b1440fc415e3 tempest-InstanceActionsTestJSON-1459258803 tempest-InstanceActionsTestJSON-1459258803-project-member] Task: {'id': task-3467856, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 662.334189] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-93795dd8-69ab-4791-b400-f6f8ae0cd063 tempest-ServerExternalEventsTest-552313903 tempest-ServerExternalEventsTest-552313903-project-member] [instance: 267a1016-410e-4097-9523-6fcafc5f4eb0] Preparing fetch location {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 662.334480] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-93795dd8-69ab-4791-b400-f6f8ae0cd063 tempest-ServerExternalEventsTest-552313903 tempest-ServerExternalEventsTest-552313903-project-member] Creating directory with path [datastore1] vmware_temp/aa6f6def-71c0-40a6-9bc8-adc600b40073/c655a05a-4a40-4b3f-b609-3ba8116ad90f {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 662.334763] env[67899]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-e840d000-98c3-4b2e-a68a-600448953520 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 662.350441] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-93795dd8-69ab-4791-b400-f6f8ae0cd063 tempest-ServerExternalEventsTest-552313903 tempest-ServerExternalEventsTest-552313903-project-member] Created directory with path [datastore1] vmware_temp/aa6f6def-71c0-40a6-9bc8-adc600b40073/c655a05a-4a40-4b3f-b609-3ba8116ad90f {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 662.351146] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-93795dd8-69ab-4791-b400-f6f8ae0cd063 tempest-ServerExternalEventsTest-552313903 tempest-ServerExternalEventsTest-552313903-project-member] [instance: 267a1016-410e-4097-9523-6fcafc5f4eb0] Fetch image to [datastore1] vmware_temp/aa6f6def-71c0-40a6-9bc8-adc600b40073/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 662.351374] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-93795dd8-69ab-4791-b400-f6f8ae0cd063 tempest-ServerExternalEventsTest-552313903 tempest-ServerExternalEventsTest-552313903-project-member] [instance: 267a1016-410e-4097-9523-6fcafc5f4eb0] Downloading image file data c655a05a-4a40-4b3f-b609-3ba8116ad90f to [datastore1] vmware_temp/aa6f6def-71c0-40a6-9bc8-adc600b40073/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk on the data store datastore1 {{(pid=67899) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 662.352770] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-209654b6-4453-4229-82b3-93c22187f028 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 662.361573] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7aef9b30-515b-4d16-b3aa-47294a5ee39f {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 662.372328] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a0a1cbc1-5be7-4131-9282-e67ba119eba5 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 662.414199] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-f6b39ba2-aa11-4eca-b014-dd5f6a035450 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 662.421469] env[67899]: DEBUG oslo_vmware.api [None req-4103a3f2-1648-4fa7-ad2b-b1440fc415e3 tempest-InstanceActionsTestJSON-1459258803 tempest-InstanceActionsTestJSON-1459258803-project-member] Task: {'id': task-3467856, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.074501} completed successfully. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 662.423732] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-4103a3f2-1648-4fa7-ad2b-b1440fc415e3 tempest-InstanceActionsTestJSON-1459258803 tempest-InstanceActionsTestJSON-1459258803-project-member] Deleted the datastore file {{(pid=67899) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 662.423732] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-4103a3f2-1648-4fa7-ad2b-b1440fc415e3 tempest-InstanceActionsTestJSON-1459258803 tempest-InstanceActionsTestJSON-1459258803-project-member] [instance: 6b16f08c-a470-4a9b-8096-05cec2e960cf] Deleted contents of the VM from datastore datastore1 {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 662.423732] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-4103a3f2-1648-4fa7-ad2b-b1440fc415e3 tempest-InstanceActionsTestJSON-1459258803 tempest-InstanceActionsTestJSON-1459258803-project-member] [instance: 6b16f08c-a470-4a9b-8096-05cec2e960cf] Instance destroyed {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 662.424124] env[67899]: INFO nova.compute.manager [None req-4103a3f2-1648-4fa7-ad2b-b1440fc415e3 tempest-InstanceActionsTestJSON-1459258803 tempest-InstanceActionsTestJSON-1459258803-project-member] [instance: 6b16f08c-a470-4a9b-8096-05cec2e960cf] Took 0.63 seconds to destroy the instance on the hypervisor. 
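Annotation: the records above trace the VMware driver's failure-and-cleanup path for instance 6b16f08c: the CopyVirtualDisk_Task raised VimFaultException ("A specified parameter was not correct: fileType"), after which the VM was unregistered and its datastore directory removed via FileManager.DeleteDatastoreFile_Task. Every long-running vCenter call here is driven through the same "Waiting for the task ... to complete" / "progress is N%" loop in oslo_vmware.api. A minimal sketch of that polling pattern using oslo.vmware directly; the host, credentials, and datastore paths below are placeholders, not values from this log:

    # Sketch of the wait_for_task pattern behind the "Waiting for the task"
    # and "progress is N%" records above. Connection details are placeholders.
    from oslo_vmware import api

    session = api.VMwareAPISession(
        'vcenter.example.test',      # placeholder vCenter host
        'user', 'secret',            # placeholder credentials
        api_retry_count=10,
        task_poll_interval=0.5)      # interval between _poll_task DEBUG lines

    # *_Task methods return a Task managed-object reference; wait_for_task()
    # polls it in a looping call until it succeeds, or raises the translated
    # fault (the VimFaultException seen above) when the task state is 'error'.
    task = session.invoke_api(
        session.vim, 'CopyVirtualDisk_Task',
        session.vim.service_content.virtualDiskManager,
        sourceName='[datastore1] vmware_temp/tmp-sparse.vmdk',  # placeholder
        destName='[datastore1] vmware_temp/dest.vmdk')          # placeholder
    task_info = session.wait_for_task(task)
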
[ 662.426390] env[67899]: DEBUG nova.compute.claims [None req-4103a3f2-1648-4fa7-ad2b-b1440fc415e3 tempest-InstanceActionsTestJSON-1459258803 tempest-InstanceActionsTestJSON-1459258803-project-member] [instance: 6b16f08c-a470-4a9b-8096-05cec2e960cf] Aborting claim: {{(pid=67899) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 662.426605] env[67899]: DEBUG oslo_concurrency.lockutils [None req-4103a3f2-1648-4fa7-ad2b-b1440fc415e3 tempest-InstanceActionsTestJSON-1459258803 tempest-InstanceActionsTestJSON-1459258803-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 662.426824] env[67899]: DEBUG oslo_concurrency.lockutils [None req-4103a3f2-1648-4fa7-ad2b-b1440fc415e3 tempest-InstanceActionsTestJSON-1459258803 tempest-InstanceActionsTestJSON-1459258803-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 662.431117] env[67899]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-8809100c-94d2-47df-bbf3-549b77bdeaa4 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 662.463127] env[67899]: DEBUG nova.virt.vmwareapi.images [None req-93795dd8-69ab-4791-b400-f6f8ae0cd063 tempest-ServerExternalEventsTest-552313903 tempest-ServerExternalEventsTest-552313903-project-member] [instance: 267a1016-410e-4097-9523-6fcafc5f4eb0] Downloading image file data c655a05a-4a40-4b3f-b609-3ba8116ad90f to the data store datastore1 {{(pid=67899) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 662.551772] env[67899]: DEBUG oslo_vmware.rw_handles [None req-93795dd8-69ab-4791-b400-f6f8ae0cd063 tempest-ServerExternalEventsTest-552313903 tempest-ServerExternalEventsTest-552313903-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/aa6f6def-71c0-40a6-9bc8-adc600b40073/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67899) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 662.621127] env[67899]: DEBUG oslo_vmware.rw_handles [None req-93795dd8-69ab-4791-b400-f6f8ae0cd063 tempest-ServerExternalEventsTest-552313903 tempest-ServerExternalEventsTest-552313903-project-member] Completed reading data from the image iterator. {{(pid=67899) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 662.621127] env[67899]: DEBUG oslo_vmware.rw_handles [None req-93795dd8-69ab-4791-b400-f6f8ae0cd063 tempest-ServerExternalEventsTest-552313903 tempest-ServerExternalEventsTest-552313903-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/aa6f6def-71c0-40a6-9bc8-adc600b40073/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67899) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 662.914155] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bed3ad9e-8045-4731-b7eb-f02ac0dfc60a {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 662.921713] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3a780062-33ab-4832-a122-480519023ba1 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 662.957593] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7f9c3b7c-ef70-4009-b134-174cd1c5bf9d {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 662.966754] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-987ece8d-3a9f-464c-aaf0-2058e06d1c60 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 662.979983] env[67899]: DEBUG nova.compute.provider_tree [None req-4103a3f2-1648-4fa7-ad2b-b1440fc415e3 tempest-InstanceActionsTestJSON-1459258803 tempest-InstanceActionsTestJSON-1459258803-project-member] Inventory has not changed in ProviderTree for provider: fffa0b42-f65d-4394-a98c-0df038b9ed4b {{(pid=67899) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 662.991632] env[67899]: DEBUG nova.scheduler.client.report [None req-4103a3f2-1648-4fa7-ad2b-b1440fc415e3 tempest-InstanceActionsTestJSON-1459258803 tempest-InstanceActionsTestJSON-1459258803-project-member] Inventory has not changed for provider fffa0b42-f65d-4394-a98c-0df038b9ed4b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67899) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 663.015035] env[67899]: DEBUG oslo_concurrency.lockutils [None req-4103a3f2-1648-4fa7-ad2b-b1440fc415e3 tempest-InstanceActionsTestJSON-1459258803 tempest-InstanceActionsTestJSON-1459258803-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.587s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 663.015035] env[67899]: ERROR nova.compute.manager [None req-4103a3f2-1648-4fa7-ad2b-b1440fc415e3 tempest-InstanceActionsTestJSON-1459258803 tempest-InstanceActionsTestJSON-1459258803-project-member] [instance: 6b16f08c-a470-4a9b-8096-05cec2e960cf] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 663.015035] env[67899]: Faults: ['InvalidArgument'] [ 663.015035] env[67899]: ERROR nova.compute.manager [instance: 6b16f08c-a470-4a9b-8096-05cec2e960cf] Traceback (most recent call last): [ 663.015035] env[67899]: ERROR nova.compute.manager [instance: 6b16f08c-a470-4a9b-8096-05cec2e960cf] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 663.015035] env[67899]: ERROR 
nova.compute.manager [instance: 6b16f08c-a470-4a9b-8096-05cec2e960cf] self.driver.spawn(context, instance, image_meta, [ 663.015035] env[67899]: ERROR nova.compute.manager [instance: 6b16f08c-a470-4a9b-8096-05cec2e960cf] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 663.015035] env[67899]: ERROR nova.compute.manager [instance: 6b16f08c-a470-4a9b-8096-05cec2e960cf] self._vmops.spawn(context, instance, image_meta, injected_files, [ 663.015035] env[67899]: ERROR nova.compute.manager [instance: 6b16f08c-a470-4a9b-8096-05cec2e960cf] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 663.015035] env[67899]: ERROR nova.compute.manager [instance: 6b16f08c-a470-4a9b-8096-05cec2e960cf] self._fetch_image_if_missing(context, vi) [ 663.015035] env[67899]: ERROR nova.compute.manager [instance: 6b16f08c-a470-4a9b-8096-05cec2e960cf] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 663.015035] env[67899]: ERROR nova.compute.manager [instance: 6b16f08c-a470-4a9b-8096-05cec2e960cf] image_cache(vi, tmp_image_ds_loc) [ 663.015035] env[67899]: ERROR nova.compute.manager [instance: 6b16f08c-a470-4a9b-8096-05cec2e960cf] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 663.015035] env[67899]: ERROR nova.compute.manager [instance: 6b16f08c-a470-4a9b-8096-05cec2e960cf] vm_util.copy_virtual_disk( [ 663.015035] env[67899]: ERROR nova.compute.manager [instance: 6b16f08c-a470-4a9b-8096-05cec2e960cf] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 663.015035] env[67899]: ERROR nova.compute.manager [instance: 6b16f08c-a470-4a9b-8096-05cec2e960cf] session._wait_for_task(vmdk_copy_task) [ 663.015035] env[67899]: ERROR nova.compute.manager [instance: 6b16f08c-a470-4a9b-8096-05cec2e960cf] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 663.015035] env[67899]: ERROR nova.compute.manager [instance: 6b16f08c-a470-4a9b-8096-05cec2e960cf] return self.wait_for_task(task_ref) [ 663.015035] env[67899]: ERROR nova.compute.manager [instance: 6b16f08c-a470-4a9b-8096-05cec2e960cf] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 663.015035] env[67899]: ERROR nova.compute.manager [instance: 6b16f08c-a470-4a9b-8096-05cec2e960cf] return evt.wait() [ 663.015035] env[67899]: ERROR nova.compute.manager [instance: 6b16f08c-a470-4a9b-8096-05cec2e960cf] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 663.015035] env[67899]: ERROR nova.compute.manager [instance: 6b16f08c-a470-4a9b-8096-05cec2e960cf] result = hub.switch() [ 663.015035] env[67899]: ERROR nova.compute.manager [instance: 6b16f08c-a470-4a9b-8096-05cec2e960cf] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 663.015035] env[67899]: ERROR nova.compute.manager [instance: 6b16f08c-a470-4a9b-8096-05cec2e960cf] return self.greenlet.switch() [ 663.015035] env[67899]: ERROR nova.compute.manager [instance: 6b16f08c-a470-4a9b-8096-05cec2e960cf] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 663.015035] env[67899]: ERROR nova.compute.manager [instance: 6b16f08c-a470-4a9b-8096-05cec2e960cf] self.f(*self.args, **self.kw) [ 663.015035] env[67899]: ERROR nova.compute.manager [instance: 6b16f08c-a470-4a9b-8096-05cec2e960cf] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 663.015035] env[67899]: ERROR nova.compute.manager [instance: 6b16f08c-a470-4a9b-8096-05cec2e960cf] raise exceptions.translate_fault(task_info.error) [ 663.015035] env[67899]: ERROR nova.compute.manager [instance: 6b16f08c-a470-4a9b-8096-05cec2e960cf] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 663.015035] env[67899]: ERROR nova.compute.manager [instance: 6b16f08c-a470-4a9b-8096-05cec2e960cf] Faults: ['InvalidArgument'] [ 663.015035] env[67899]: ERROR nova.compute.manager [instance: 6b16f08c-a470-4a9b-8096-05cec2e960cf] [ 663.016492] env[67899]: DEBUG nova.compute.utils [None req-4103a3f2-1648-4fa7-ad2b-b1440fc415e3 tempest-InstanceActionsTestJSON-1459258803 tempest-InstanceActionsTestJSON-1459258803-project-member] [instance: 6b16f08c-a470-4a9b-8096-05cec2e960cf] VimFaultException {{(pid=67899) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 663.018313] env[67899]: DEBUG nova.compute.manager [None req-4103a3f2-1648-4fa7-ad2b-b1440fc415e3 tempest-InstanceActionsTestJSON-1459258803 tempest-InstanceActionsTestJSON-1459258803-project-member] [instance: 6b16f08c-a470-4a9b-8096-05cec2e960cf] Build of instance 6b16f08c-a470-4a9b-8096-05cec2e960cf was re-scheduled: A specified parameter was not correct: fileType [ 663.018313] env[67899]: Faults: ['InvalidArgument'] {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 663.019164] env[67899]: DEBUG nova.compute.manager [None req-4103a3f2-1648-4fa7-ad2b-b1440fc415e3 tempest-InstanceActionsTestJSON-1459258803 tempest-InstanceActionsTestJSON-1459258803-project-member] [instance: 6b16f08c-a470-4a9b-8096-05cec2e960cf] Unplugging VIFs for instance {{(pid=67899) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 663.019347] env[67899]: DEBUG nova.compute.manager [None req-4103a3f2-1648-4fa7-ad2b-b1440fc415e3 tempest-InstanceActionsTestJSON-1459258803 tempest-InstanceActionsTestJSON-1459258803-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67899) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 663.019518] env[67899]: DEBUG nova.compute.manager [None req-4103a3f2-1648-4fa7-ad2b-b1440fc415e3 tempest-InstanceActionsTestJSON-1459258803 tempest-InstanceActionsTestJSON-1459258803-project-member] [instance: 6b16f08c-a470-4a9b-8096-05cec2e960cf] Deallocating network for instance {{(pid=67899) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 663.019769] env[67899]: DEBUG nova.network.neutron [None req-4103a3f2-1648-4fa7-ad2b-b1440fc415e3 tempest-InstanceActionsTestJSON-1459258803 tempest-InstanceActionsTestJSON-1459258803-project-member] [instance: 6b16f08c-a470-4a9b-8096-05cec2e960cf] deallocate_for_instance() {{(pid=67899) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 664.146218] env[67899]: DEBUG nova.network.neutron [None req-4103a3f2-1648-4fa7-ad2b-b1440fc415e3 tempest-InstanceActionsTestJSON-1459258803 tempest-InstanceActionsTestJSON-1459258803-project-member] [instance: 6b16f08c-a470-4a9b-8096-05cec2e960cf] Updating instance_info_cache with network_info: [] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 664.164516] env[67899]: INFO nova.compute.manager [None req-4103a3f2-1648-4fa7-ad2b-b1440fc415e3 tempest-InstanceActionsTestJSON-1459258803 tempest-InstanceActionsTestJSON-1459258803-project-member] [instance: 6b16f08c-a470-4a9b-8096-05cec2e960cf] Took 1.14 seconds to deallocate network for instance. [ 664.349671] env[67899]: INFO nova.scheduler.client.report [None req-4103a3f2-1648-4fa7-ad2b-b1440fc415e3 tempest-InstanceActionsTestJSON-1459258803 tempest-InstanceActionsTestJSON-1459258803-project-member] Deleted allocations for instance 6b16f08c-a470-4a9b-8096-05cec2e960cf [ 664.381218] env[67899]: DEBUG oslo_concurrency.lockutils [None req-4103a3f2-1648-4fa7-ad2b-b1440fc415e3 tempest-InstanceActionsTestJSON-1459258803 tempest-InstanceActionsTestJSON-1459258803-project-member] Lock "6b16f08c-a470-4a9b-8096-05cec2e960cf" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 63.440s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 664.416183] env[67899]: DEBUG nova.compute.manager [None req-c90bdb57-6703-4d5e-a52f-aee9c0fb7dc8 tempest-FloatingIPsAssociationNegativeTestJSON-531317357 tempest-FloatingIPsAssociationNegativeTestJSON-531317357-project-member] [instance: 793d6f98-ed1b-4a78-bcd5-cb796441d64b] Starting instance... 
{{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 664.487834] env[67899]: DEBUG oslo_concurrency.lockutils [None req-c90bdb57-6703-4d5e-a52f-aee9c0fb7dc8 tempest-FloatingIPsAssociationNegativeTestJSON-531317357 tempest-FloatingIPsAssociationNegativeTestJSON-531317357-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 664.490469] env[67899]: DEBUG oslo_concurrency.lockutils [None req-c90bdb57-6703-4d5e-a52f-aee9c0fb7dc8 tempest-FloatingIPsAssociationNegativeTestJSON-531317357 tempest-FloatingIPsAssociationNegativeTestJSON-531317357-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.002s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 664.497654] env[67899]: INFO nova.compute.claims [None req-c90bdb57-6703-4d5e-a52f-aee9c0fb7dc8 tempest-FloatingIPsAssociationNegativeTestJSON-531317357 tempest-FloatingIPsAssociationNegativeTestJSON-531317357-project-member] [instance: 793d6f98-ed1b-4a78-bcd5-cb796441d64b] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 664.975928] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-af7d4198-dd1f-4a78-ad63-924926c0b0a4 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 664.984019] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a3743e9b-fdd8-43c6-9838-874f95a94848 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 665.017853] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-abf01659-0beb-4af9-9a9a-b253fcf5715f {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 665.026376] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0705c4b5-2e3b-4861-9e59-d1a847bcff13 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 665.041519] env[67899]: DEBUG nova.compute.provider_tree [None req-c90bdb57-6703-4d5e-a52f-aee9c0fb7dc8 tempest-FloatingIPsAssociationNegativeTestJSON-531317357 tempest-FloatingIPsAssociationNegativeTestJSON-531317357-project-member] Inventory has not changed in ProviderTree for provider: fffa0b42-f65d-4394-a98c-0df038b9ed4b {{(pid=67899) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 665.053978] env[67899]: DEBUG nova.scheduler.client.report [None req-c90bdb57-6703-4d5e-a52f-aee9c0fb7dc8 tempest-FloatingIPsAssociationNegativeTestJSON-531317357 tempest-FloatingIPsAssociationNegativeTestJSON-531317357-project-member] Inventory has not changed for provider fffa0b42-f65d-4394-a98c-0df038b9ed4b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67899) 
set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 665.080554] env[67899]: DEBUG oslo_concurrency.lockutils [None req-c90bdb57-6703-4d5e-a52f-aee9c0fb7dc8 tempest-FloatingIPsAssociationNegativeTestJSON-531317357 tempest-FloatingIPsAssociationNegativeTestJSON-531317357-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.591s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 665.081188] env[67899]: DEBUG nova.compute.manager [None req-c90bdb57-6703-4d5e-a52f-aee9c0fb7dc8 tempest-FloatingIPsAssociationNegativeTestJSON-531317357 tempest-FloatingIPsAssociationNegativeTestJSON-531317357-project-member] [instance: 793d6f98-ed1b-4a78-bcd5-cb796441d64b] Start building networks asynchronously for instance. {{(pid=67899) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 665.134503] env[67899]: DEBUG nova.compute.utils [None req-c90bdb57-6703-4d5e-a52f-aee9c0fb7dc8 tempest-FloatingIPsAssociationNegativeTestJSON-531317357 tempest-FloatingIPsAssociationNegativeTestJSON-531317357-project-member] Using /dev/sd instead of None {{(pid=67899) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 665.136048] env[67899]: DEBUG nova.compute.manager [None req-c90bdb57-6703-4d5e-a52f-aee9c0fb7dc8 tempest-FloatingIPsAssociationNegativeTestJSON-531317357 tempest-FloatingIPsAssociationNegativeTestJSON-531317357-project-member] [instance: 793d6f98-ed1b-4a78-bcd5-cb796441d64b] Allocating IP information in the background. {{(pid=67899) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 665.136320] env[67899]: DEBUG nova.network.neutron [None req-c90bdb57-6703-4d5e-a52f-aee9c0fb7dc8 tempest-FloatingIPsAssociationNegativeTestJSON-531317357 tempest-FloatingIPsAssociationNegativeTestJSON-531317357-project-member] [instance: 793d6f98-ed1b-4a78-bcd5-cb796441d64b] allocate_for_instance() {{(pid=67899) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 665.150370] env[67899]: DEBUG nova.compute.manager [None req-c90bdb57-6703-4d5e-a52f-aee9c0fb7dc8 tempest-FloatingIPsAssociationNegativeTestJSON-531317357 tempest-FloatingIPsAssociationNegativeTestJSON-531317357-project-member] [instance: 793d6f98-ed1b-4a78-bcd5-cb796441d64b] Start building block device mappings for instance. {{(pid=67899) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 665.247938] env[67899]: DEBUG nova.compute.manager [None req-c90bdb57-6703-4d5e-a52f-aee9c0fb7dc8 tempest-FloatingIPsAssociationNegativeTestJSON-531317357 tempest-FloatingIPsAssociationNegativeTestJSON-531317357-project-member] [instance: 793d6f98-ed1b-4a78-bcd5-cb796441d64b] Start spawning the instance on the hypervisor. 
{{(pid=67899) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 665.280126] env[67899]: DEBUG nova.virt.hardware [None req-c90bdb57-6703-4d5e-a52f-aee9c0fb7dc8 tempest-FloatingIPsAssociationNegativeTestJSON-531317357 tempest-FloatingIPsAssociationNegativeTestJSON-531317357-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T22:07:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T22:07:13Z,direct_url=,disk_format='vmdk',id=c655a05a-4a40-4b3f-b609-3ba8116ad90f,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c1aaa2970e964d7b86557399120d12c1',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T22:07:14Z,virtual_size=,visibility=), allow threads: False {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 665.280456] env[67899]: DEBUG nova.virt.hardware [None req-c90bdb57-6703-4d5e-a52f-aee9c0fb7dc8 tempest-FloatingIPsAssociationNegativeTestJSON-531317357 tempest-FloatingIPsAssociationNegativeTestJSON-531317357-project-member] Flavor limits 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 665.280524] env[67899]: DEBUG nova.virt.hardware [None req-c90bdb57-6703-4d5e-a52f-aee9c0fb7dc8 tempest-FloatingIPsAssociationNegativeTestJSON-531317357 tempest-FloatingIPsAssociationNegativeTestJSON-531317357-project-member] Image limits 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 665.280691] env[67899]: DEBUG nova.virt.hardware [None req-c90bdb57-6703-4d5e-a52f-aee9c0fb7dc8 tempest-FloatingIPsAssociationNegativeTestJSON-531317357 tempest-FloatingIPsAssociationNegativeTestJSON-531317357-project-member] Flavor pref 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 665.282687] env[67899]: DEBUG nova.virt.hardware [None req-c90bdb57-6703-4d5e-a52f-aee9c0fb7dc8 tempest-FloatingIPsAssociationNegativeTestJSON-531317357 tempest-FloatingIPsAssociationNegativeTestJSON-531317357-project-member] Image pref 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 665.282687] env[67899]: DEBUG nova.virt.hardware [None req-c90bdb57-6703-4d5e-a52f-aee9c0fb7dc8 tempest-FloatingIPsAssociationNegativeTestJSON-531317357 tempest-FloatingIPsAssociationNegativeTestJSON-531317357-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 665.282687] env[67899]: DEBUG nova.virt.hardware [None req-c90bdb57-6703-4d5e-a52f-aee9c0fb7dc8 tempest-FloatingIPsAssociationNegativeTestJSON-531317357 tempest-FloatingIPsAssociationNegativeTestJSON-531317357-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 665.282687] env[67899]: DEBUG nova.virt.hardware [None req-c90bdb57-6703-4d5e-a52f-aee9c0fb7dc8 tempest-FloatingIPsAssociationNegativeTestJSON-531317357 
tempest-FloatingIPsAssociationNegativeTestJSON-531317357-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67899) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 665.282687] env[67899]: DEBUG nova.virt.hardware [None req-c90bdb57-6703-4d5e-a52f-aee9c0fb7dc8 tempest-FloatingIPsAssociationNegativeTestJSON-531317357 tempest-FloatingIPsAssociationNegativeTestJSON-531317357-project-member] Got 1 possible topologies {{(pid=67899) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 665.282687] env[67899]: DEBUG nova.virt.hardware [None req-c90bdb57-6703-4d5e-a52f-aee9c0fb7dc8 tempest-FloatingIPsAssociationNegativeTestJSON-531317357 tempest-FloatingIPsAssociationNegativeTestJSON-531317357-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 665.282687] env[67899]: DEBUG nova.virt.hardware [None req-c90bdb57-6703-4d5e-a52f-aee9c0fb7dc8 tempest-FloatingIPsAssociationNegativeTestJSON-531317357 tempest-FloatingIPsAssociationNegativeTestJSON-531317357-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 665.285892] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e0f6e7d7-03a1-44de-bfe2-2c3e6e89107d {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 665.295860] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0d46d669-e8a0-4ea1-bcdb-7fa012578ce9 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 665.363024] env[67899]: DEBUG nova.policy [None req-c90bdb57-6703-4d5e-a52f-aee9c0fb7dc8 tempest-FloatingIPsAssociationNegativeTestJSON-531317357 tempest-FloatingIPsAssociationNegativeTestJSON-531317357-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'b1903c386de3419da0a7aa573bf42381', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ae094799919d4505afa74f9c03ccdf2e', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67899) authorize /opt/stack/nova/nova/policy.py:203}} [ 666.625209] env[67899]: DEBUG nova.network.neutron [None req-c90bdb57-6703-4d5e-a52f-aee9c0fb7dc8 tempest-FloatingIPsAssociationNegativeTestJSON-531317357 tempest-FloatingIPsAssociationNegativeTestJSON-531317357-project-member] [instance: 793d6f98-ed1b-4a78-bcd5-cb796441d64b] Successfully created port: e8095249-5a8b-4955-ac2a-539c23cc9367 {{(pid=67899) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 667.119243] env[67899]: DEBUG oslo_concurrency.lockutils [None req-7b088029-0146-4c76-a894-7f015ede2f7e tempest-ServersTestFqdnHostnames-61628670 tempest-ServersTestFqdnHostnames-61628670-project-member] Acquiring lock "8d2a9e20-82d3-44cf-a725-59804debe1cc" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 667.119485] env[67899]: DEBUG 
oslo_concurrency.lockutils [None req-7b088029-0146-4c76-a894-7f015ede2f7e tempest-ServersTestFqdnHostnames-61628670 tempest-ServersTestFqdnHostnames-61628670-project-member] Lock "8d2a9e20-82d3-44cf-a725-59804debe1cc" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 669.055030] env[67899]: DEBUG nova.network.neutron [None req-c90bdb57-6703-4d5e-a52f-aee9c0fb7dc8 tempest-FloatingIPsAssociationNegativeTestJSON-531317357 tempest-FloatingIPsAssociationNegativeTestJSON-531317357-project-member] [instance: 793d6f98-ed1b-4a78-bcd5-cb796441d64b] Successfully updated port: e8095249-5a8b-4955-ac2a-539c23cc9367 {{(pid=67899) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 669.088013] env[67899]: DEBUG oslo_concurrency.lockutils [None req-c90bdb57-6703-4d5e-a52f-aee9c0fb7dc8 tempest-FloatingIPsAssociationNegativeTestJSON-531317357 tempest-FloatingIPsAssociationNegativeTestJSON-531317357-project-member] Acquiring lock "refresh_cache-793d6f98-ed1b-4a78-bcd5-cb796441d64b" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 669.088013] env[67899]: DEBUG oslo_concurrency.lockutils [None req-c90bdb57-6703-4d5e-a52f-aee9c0fb7dc8 tempest-FloatingIPsAssociationNegativeTestJSON-531317357 tempest-FloatingIPsAssociationNegativeTestJSON-531317357-project-member] Acquired lock "refresh_cache-793d6f98-ed1b-4a78-bcd5-cb796441d64b" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 669.088112] env[67899]: DEBUG nova.network.neutron [None req-c90bdb57-6703-4d5e-a52f-aee9c0fb7dc8 tempest-FloatingIPsAssociationNegativeTestJSON-531317357 tempest-FloatingIPsAssociationNegativeTestJSON-531317357-project-member] [instance: 793d6f98-ed1b-4a78-bcd5-cb796441d64b] Building network info cache for instance {{(pid=67899) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 669.205541] env[67899]: DEBUG nova.network.neutron [None req-c90bdb57-6703-4d5e-a52f-aee9c0fb7dc8 tempest-FloatingIPsAssociationNegativeTestJSON-531317357 tempest-FloatingIPsAssociationNegativeTestJSON-531317357-project-member] [instance: 793d6f98-ed1b-4a78-bcd5-cb796441d64b] Instance cache missing network info. 
{{(pid=67899) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 669.390407] env[67899]: DEBUG oslo_concurrency.lockutils [None req-903b8c8a-e8bb-4076-a4c9-50cf6e2f6cbb tempest-ServerAddressesTestJSON-560381368 tempest-ServerAddressesTestJSON-560381368-project-member] Acquiring lock "e42425fa-6c50-4e76-842b-0bfcccb011c0" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 669.390637] env[67899]: DEBUG oslo_concurrency.lockutils [None req-903b8c8a-e8bb-4076-a4c9-50cf6e2f6cbb tempest-ServerAddressesTestJSON-560381368 tempest-ServerAddressesTestJSON-560381368-project-member] Lock "e42425fa-6c50-4e76-842b-0bfcccb011c0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 669.913443] env[67899]: DEBUG nova.network.neutron [None req-c90bdb57-6703-4d5e-a52f-aee9c0fb7dc8 tempest-FloatingIPsAssociationNegativeTestJSON-531317357 tempest-FloatingIPsAssociationNegativeTestJSON-531317357-project-member] [instance: 793d6f98-ed1b-4a78-bcd5-cb796441d64b] Updating instance_info_cache with network_info: [{"id": "e8095249-5a8b-4955-ac2a-539c23cc9367", "address": "fa:16:3e:6e:54:ed", "network": {"id": "179390d5-bb51-4331-82e4-249b59ee9d89", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-265560211-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "ae094799919d4505afa74f9c03ccdf2e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f6fb0104-186b-4288-b87e-634893f46f01", "external-id": "nsx-vlan-transportzone-73", "segmentation_id": 73, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape8095249-5a", "ovs_interfaceid": "e8095249-5a8b-4955-ac2a-539c23cc9367", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 669.928527] env[67899]: DEBUG oslo_concurrency.lockutils [None req-c90bdb57-6703-4d5e-a52f-aee9c0fb7dc8 tempest-FloatingIPsAssociationNegativeTestJSON-531317357 tempest-FloatingIPsAssociationNegativeTestJSON-531317357-project-member] Releasing lock "refresh_cache-793d6f98-ed1b-4a78-bcd5-cb796441d64b" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 669.928858] env[67899]: DEBUG nova.compute.manager [None req-c90bdb57-6703-4d5e-a52f-aee9c0fb7dc8 tempest-FloatingIPsAssociationNegativeTestJSON-531317357 tempest-FloatingIPsAssociationNegativeTestJSON-531317357-project-member] [instance: 793d6f98-ed1b-4a78-bcd5-cb796441d64b] Instance network_info: |[{"id": "e8095249-5a8b-4955-ac2a-539c23cc9367", "address": "fa:16:3e:6e:54:ed", "network": {"id": "179390d5-bb51-4331-82e4-249b59ee9d89", "bridge": "br-int", "label": 
"tempest-FloatingIPsAssociationNegativeTestJSON-265560211-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "ae094799919d4505afa74f9c03ccdf2e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f6fb0104-186b-4288-b87e-634893f46f01", "external-id": "nsx-vlan-transportzone-73", "segmentation_id": 73, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape8095249-5a", "ovs_interfaceid": "e8095249-5a8b-4955-ac2a-539c23cc9367", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67899) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 669.929292] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-c90bdb57-6703-4d5e-a52f-aee9c0fb7dc8 tempest-FloatingIPsAssociationNegativeTestJSON-531317357 tempest-FloatingIPsAssociationNegativeTestJSON-531317357-project-member] [instance: 793d6f98-ed1b-4a78-bcd5-cb796441d64b] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:6e:54:ed', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'f6fb0104-186b-4288-b87e-634893f46f01', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'e8095249-5a8b-4955-ac2a-539c23cc9367', 'vif_model': 'vmxnet3'}] {{(pid=67899) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 669.945577] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [None req-c90bdb57-6703-4d5e-a52f-aee9c0fb7dc8 tempest-FloatingIPsAssociationNegativeTestJSON-531317357 tempest-FloatingIPsAssociationNegativeTestJSON-531317357-project-member] Creating folder: Project (ae094799919d4505afa74f9c03ccdf2e). Parent ref: group-v692900. {{(pid=67899) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 669.946844] env[67899]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-f7749d4b-1dd6-4c5e-a602-77ed87b42dca {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 669.959421] env[67899]: INFO nova.virt.vmwareapi.vm_util [None req-c90bdb57-6703-4d5e-a52f-aee9c0fb7dc8 tempest-FloatingIPsAssociationNegativeTestJSON-531317357 tempest-FloatingIPsAssociationNegativeTestJSON-531317357-project-member] Created folder: Project (ae094799919d4505afa74f9c03ccdf2e) in parent group-v692900. [ 669.959623] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [None req-c90bdb57-6703-4d5e-a52f-aee9c0fb7dc8 tempest-FloatingIPsAssociationNegativeTestJSON-531317357 tempest-FloatingIPsAssociationNegativeTestJSON-531317357-project-member] Creating folder: Instances. Parent ref: group-v692935. 
{{(pid=67899) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 669.959953] env[67899]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-96295bd3-cb7c-4bea-9347-514198a83767 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 669.969717] env[67899]: INFO nova.virt.vmwareapi.vm_util [None req-c90bdb57-6703-4d5e-a52f-aee9c0fb7dc8 tempest-FloatingIPsAssociationNegativeTestJSON-531317357 tempest-FloatingIPsAssociationNegativeTestJSON-531317357-project-member] Created folder: Instances in parent group-v692935. [ 669.969988] env[67899]: DEBUG oslo.service.loopingcall [None req-c90bdb57-6703-4d5e-a52f-aee9c0fb7dc8 tempest-FloatingIPsAssociationNegativeTestJSON-531317357 tempest-FloatingIPsAssociationNegativeTestJSON-531317357-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67899) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 669.970207] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 793d6f98-ed1b-4a78-bcd5-cb796441d64b] Creating VM on the ESX host {{(pid=67899) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 669.970418] env[67899]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-259709a7-6da4-4132-8968-0aba7d574117 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 669.991928] env[67899]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 669.991928] env[67899]: value = "task-3467859" [ 669.991928] env[67899]: _type = "Task" [ 669.991928] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 670.002437] env[67899]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467859, 'name': CreateVM_Task} progress is 0%. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 670.488256] env[67899]: DEBUG oslo_concurrency.lockutils [None req-06ca25cb-cd45-4102-b595-3289090f9f6e tempest-VolumesAdminNegativeTest-1718927231 tempest-VolumesAdminNegativeTest-1718927231-project-member] Acquiring lock "96d79732-9076-4715-aa1e-60001ffb17fb" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 670.488256] env[67899]: DEBUG oslo_concurrency.lockutils [None req-06ca25cb-cd45-4102-b595-3289090f9f6e tempest-VolumesAdminNegativeTest-1718927231 tempest-VolumesAdminNegativeTest-1718927231-project-member] Lock "96d79732-9076-4715-aa1e-60001ffb17fb" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 670.503995] env[67899]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467859, 'name': CreateVM_Task, 'duration_secs': 0.319403} completed successfully. 
{{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 670.503995] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 793d6f98-ed1b-4a78-bcd5-cb796441d64b] Created VM on the ESX host {{(pid=67899) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 670.503995] env[67899]: DEBUG oslo_concurrency.lockutils [None req-c90bdb57-6703-4d5e-a52f-aee9c0fb7dc8 tempest-FloatingIPsAssociationNegativeTestJSON-531317357 tempest-FloatingIPsAssociationNegativeTestJSON-531317357-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 670.503995] env[67899]: DEBUG oslo_concurrency.lockutils [None req-c90bdb57-6703-4d5e-a52f-aee9c0fb7dc8 tempest-FloatingIPsAssociationNegativeTestJSON-531317357 tempest-FloatingIPsAssociationNegativeTestJSON-531317357-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 670.503995] env[67899]: DEBUG oslo_concurrency.lockutils [None req-c90bdb57-6703-4d5e-a52f-aee9c0fb7dc8 tempest-FloatingIPsAssociationNegativeTestJSON-531317357 tempest-FloatingIPsAssociationNegativeTestJSON-531317357-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 670.503995] env[67899]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-14211918-ecf9-4dcb-8030-aeda0c9f77f9 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 670.509099] env[67899]: DEBUG oslo_vmware.api [None req-c90bdb57-6703-4d5e-a52f-aee9c0fb7dc8 tempest-FloatingIPsAssociationNegativeTestJSON-531317357 tempest-FloatingIPsAssociationNegativeTestJSON-531317357-project-member] Waiting for the task: (returnval){ [ 670.509099] env[67899]: value = "session[5298859c-589e-10f3-4f70-bf47a7ea371b]52c8a3a8-28bd-2d13-2d7a-b3a71d22e075" [ 670.509099] env[67899]: _type = "Task" [ 670.509099] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 670.517914] env[67899]: DEBUG oslo_vmware.api [None req-c90bdb57-6703-4d5e-a52f-aee9c0fb7dc8 tempest-FloatingIPsAssociationNegativeTestJSON-531317357 tempest-FloatingIPsAssociationNegativeTestJSON-531317357-project-member] Task: {'id': session[5298859c-589e-10f3-4f70-bf47a7ea371b]52c8a3a8-28bd-2d13-2d7a-b3a71d22e075, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 670.575871] env[67899]: DEBUG nova.compute.manager [req-f9b6819f-e023-4fc5-ab63-434a9b254eb0 req-69d3fd48-86d5-4993-949d-08bba70dd1b9 service nova] [instance: 793d6f98-ed1b-4a78-bcd5-cb796441d64b] Received event network-vif-plugged-e8095249-5a8b-4955-ac2a-539c23cc9367 {{(pid=67899) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 670.576104] env[67899]: DEBUG oslo_concurrency.lockutils [req-f9b6819f-e023-4fc5-ab63-434a9b254eb0 req-69d3fd48-86d5-4993-949d-08bba70dd1b9 service nova] Acquiring lock "793d6f98-ed1b-4a78-bcd5-cb796441d64b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 670.576325] env[67899]: DEBUG oslo_concurrency.lockutils [req-f9b6819f-e023-4fc5-ab63-434a9b254eb0 req-69d3fd48-86d5-4993-949d-08bba70dd1b9 service nova] Lock "793d6f98-ed1b-4a78-bcd5-cb796441d64b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 670.576478] env[67899]: DEBUG oslo_concurrency.lockutils [req-f9b6819f-e023-4fc5-ab63-434a9b254eb0 req-69d3fd48-86d5-4993-949d-08bba70dd1b9 service nova] Lock "793d6f98-ed1b-4a78-bcd5-cb796441d64b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 670.576638] env[67899]: DEBUG nova.compute.manager [req-f9b6819f-e023-4fc5-ab63-434a9b254eb0 req-69d3fd48-86d5-4993-949d-08bba70dd1b9 service nova] [instance: 793d6f98-ed1b-4a78-bcd5-cb796441d64b] No waiting events found dispatching network-vif-plugged-e8095249-5a8b-4955-ac2a-539c23cc9367 {{(pid=67899) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 670.576797] env[67899]: WARNING nova.compute.manager [req-f9b6819f-e023-4fc5-ab63-434a9b254eb0 req-69d3fd48-86d5-4993-949d-08bba70dd1b9 service nova] [instance: 793d6f98-ed1b-4a78-bcd5-cb796441d64b] Received unexpected event network-vif-plugged-e8095249-5a8b-4955-ac2a-539c23cc9367 for instance with vm_state building and task_state spawning. 
[ 671.022698] env[67899]: DEBUG oslo_concurrency.lockutils [None req-c90bdb57-6703-4d5e-a52f-aee9c0fb7dc8 tempest-FloatingIPsAssociationNegativeTestJSON-531317357 tempest-FloatingIPsAssociationNegativeTestJSON-531317357-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 671.022954] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-c90bdb57-6703-4d5e-a52f-aee9c0fb7dc8 tempest-FloatingIPsAssociationNegativeTestJSON-531317357 tempest-FloatingIPsAssociationNegativeTestJSON-531317357-project-member] [instance: 793d6f98-ed1b-4a78-bcd5-cb796441d64b] Processing image c655a05a-4a40-4b3f-b609-3ba8116ad90f {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 671.023489] env[67899]: DEBUG oslo_concurrency.lockutils [None req-c90bdb57-6703-4d5e-a52f-aee9c0fb7dc8 tempest-FloatingIPsAssociationNegativeTestJSON-531317357 tempest-FloatingIPsAssociationNegativeTestJSON-531317357-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 671.860993] env[67899]: DEBUG oslo_concurrency.lockutils [None req-4f014c71-1629-4992-b9cc-369400efe2b9 tempest-AttachVolumeNegativeTest-1960469577 tempest-AttachVolumeNegativeTest-1960469577-project-member] Acquiring lock "f7888060-430b-4b16-b9ca-059020615dee" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 671.861546] env[67899]: DEBUG oslo_concurrency.lockutils [None req-4f014c71-1629-4992-b9cc-369400efe2b9 tempest-AttachVolumeNegativeTest-1960469577 tempest-AttachVolumeNegativeTest-1960469577-project-member] Lock "f7888060-430b-4b16-b9ca-059020615dee" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 672.495642] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 672.496600] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 672.533456] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 672.533456] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Starting heal instance info cache {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 672.533456] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Rebuilding the 
list of instances to heal {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 672.560138] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 91d5024f-9eac-4a56-b08f-c0f6a7eda775] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 672.562269] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 267a1016-410e-4097-9523-6fcafc5f4eb0] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 672.562614] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 7c092e03-5f1d-4e4a-98ca-8e6abddb3cdb] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 672.562888] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 195a4a1e-3da7-4a69-a679-869346368195] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 672.563131] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 7a19bcfd-5544-4688-8edb-e12c567979ae] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 672.563358] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 84cbacaa-08d2-4297-8777-150f433e4c04] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 672.563595] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 672.565215] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 672.565417] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 913c5652-c8af-41a8-94f1-c0eba08aacdd] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 672.565644] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 793d6f98-ed1b-4a78-bcd5-cb796441d64b] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 672.565774] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Didn't find any instances for network info cache update. 
{{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 672.566780] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager.update_available_resource {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 672.583673] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 672.583912] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 672.584547] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 672.584547] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67899) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 672.585378] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-76dfdc51-bf79-42f0-af2c-559f7d4399b9 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 672.594989] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-89a1c13b-2c7e-4765-b962-6639e7e3b9ba {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 672.611724] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cb97e810-c184-4791-b24f-be1ff88f87eb {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 672.623199] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c2d32443-9ee0-463c-bee2-4c7e89b5990b {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 672.653510] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180935MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=67899) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 672.653840] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 672.654573] 
env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 672.750678] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 91d5024f-9eac-4a56-b08f-c0f6a7eda775 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 672.750793] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 7c092e03-5f1d-4e4a-98ca-8e6abddb3cdb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 672.750901] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 267a1016-410e-4097-9523-6fcafc5f4eb0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 672.751053] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 195a4a1e-3da7-4a69-a679-869346368195 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 672.751180] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 7a19bcfd-5544-4688-8edb-e12c567979ae actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 672.751299] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 84cbacaa-08d2-4297-8777-150f433e4c04 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 672.751415] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance c29ae4c5-cc93-480c-8d60-96f6acba4346 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 672.751529] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 672.751644] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 913c5652-c8af-41a8-94f1-c0eba08aacdd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 672.751756] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 793d6f98-ed1b-4a78-bcd5-cb796441d64b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 672.783300] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 672.814866] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 4458efe7-18d4-4cfb-b131-e09d36124d68 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 672.826692] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 672.842851] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 55dfe829-2e96-40d7-bef8-8e7556cbdab3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 672.859106] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance f935bfef-3ca7-41fc-89be-c5c4e070a401 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 672.871366] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 66d3ec66-244d-4ffa-bd6f-7067f8955e67 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 672.883756] env[67899]: DEBUG oslo_concurrency.lockutils [None req-35796b21-4216-449c-a9a9-9fe6ecab3f1b tempest-ServerPasswordTestJSON-1822365130 tempest-ServerPasswordTestJSON-1822365130-project-member] Acquiring lock "1505bcf5-f622-40ee-93c2-8dabf1dce8cb" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 672.883984] env[67899]: DEBUG oslo_concurrency.lockutils [None req-35796b21-4216-449c-a9a9-9fe6ecab3f1b tempest-ServerPasswordTestJSON-1822365130 tempest-ServerPasswordTestJSON-1822365130-project-member] Lock "1505bcf5-f622-40ee-93c2-8dabf1dce8cb" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 672.884496] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance bbed830d-aa53-4ea4-93f8-d4b198a333cd has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 672.897811] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 786676eb-ac36-48f6-874a-ab1ca15f2a9a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 672.909991] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 4cd5b80b-d1f4-4142-83fb-235523464667 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 672.924023] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 1589188b-8540-4afd-8050-ab47633593c0 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 672.938960] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance de1d5572-b82d-4edc-9c7e-a7e26c45a090 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 672.951530] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 8d2a9e20-82d3-44cf-a725-59804debe1cc has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 672.966017] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance e42425fa-6c50-4e76-842b-0bfcccb011c0 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 672.984377] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 96d79732-9076-4715-aa1e-60001ffb17fb has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 673.005055] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance f7888060-430b-4b16-b9ca-059020615dee has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 673.024814] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 1505bcf5-f622-40ee-93c2-8dabf1dce8cb has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 673.024814] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67899) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 673.024814] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67899) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 673.458088] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c66bb2b4-8a3f-47c2-ae02-9e5b1dff00ff {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 673.467471] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-987d7c55-c62c-4b4b-9b7c-73b03ab1ab56 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 673.503357] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e6f90de9-2ed8-4966-898f-e4f05b530e5f {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 673.511234] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1d188bf8-c6fa-4cbb-8d8f-27880873ae58 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 673.525300] env[67899]: DEBUG nova.compute.provider_tree [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Inventory has not changed in ProviderTree for provider: fffa0b42-f65d-4394-a98c-0df038b9ed4b {{(pid=67899) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 673.536979] env[67899]: DEBUG nova.scheduler.client.report [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Inventory has not changed for provider fffa0b42-f65d-4394-a98c-0df038b9ed4b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67899) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 673.566362] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67899) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 673.566667] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.913s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 673.976708] env[67899]: DEBUG nova.compute.manager 
[req-320e1f9f-7491-4202-b5b8-fbfa9869e6ce req-01a04abd-5b18-4f97-a383-14d1325af875 service nova] [instance: 793d6f98-ed1b-4a78-bcd5-cb796441d64b] Received event network-changed-e8095249-5a8b-4955-ac2a-539c23cc9367 {{(pid=67899) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 673.978894] env[67899]: DEBUG nova.compute.manager [req-320e1f9f-7491-4202-b5b8-fbfa9869e6ce req-01a04abd-5b18-4f97-a383-14d1325af875 service nova] [instance: 793d6f98-ed1b-4a78-bcd5-cb796441d64b] Refreshing instance network info cache due to event network-changed-e8095249-5a8b-4955-ac2a-539c23cc9367. {{(pid=67899) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 673.978894] env[67899]: DEBUG oslo_concurrency.lockutils [req-320e1f9f-7491-4202-b5b8-fbfa9869e6ce req-01a04abd-5b18-4f97-a383-14d1325af875 service nova] Acquiring lock "refresh_cache-793d6f98-ed1b-4a78-bcd5-cb796441d64b" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 673.978894] env[67899]: DEBUG oslo_concurrency.lockutils [req-320e1f9f-7491-4202-b5b8-fbfa9869e6ce req-01a04abd-5b18-4f97-a383-14d1325af875 service nova] Acquired lock "refresh_cache-793d6f98-ed1b-4a78-bcd5-cb796441d64b" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 673.978894] env[67899]: DEBUG nova.network.neutron [req-320e1f9f-7491-4202-b5b8-fbfa9869e6ce req-01a04abd-5b18-4f97-a383-14d1325af875 service nova] [instance: 793d6f98-ed1b-4a78-bcd5-cb796441d64b] Refreshing network info cache for port e8095249-5a8b-4955-ac2a-539c23cc9367 {{(pid=67899) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 673.997058] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 673.997058] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 673.997058] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 673.997058] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 673.997058] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 673.997058] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 673.997058] env[67899]: DEBUG nova.compute.manager [None 
req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=67899) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 674.569627] env[67899]: DEBUG nova.network.neutron [req-320e1f9f-7491-4202-b5b8-fbfa9869e6ce req-01a04abd-5b18-4f97-a383-14d1325af875 service nova] [instance: 793d6f98-ed1b-4a78-bcd5-cb796441d64b] Updated VIF entry in instance network info cache for port e8095249-5a8b-4955-ac2a-539c23cc9367. {{(pid=67899) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 674.570130] env[67899]: DEBUG nova.network.neutron [req-320e1f9f-7491-4202-b5b8-fbfa9869e6ce req-01a04abd-5b18-4f97-a383-14d1325af875 service nova] [instance: 793d6f98-ed1b-4a78-bcd5-cb796441d64b] Updating instance_info_cache with network_info: [{"id": "e8095249-5a8b-4955-ac2a-539c23cc9367", "address": "fa:16:3e:6e:54:ed", "network": {"id": "179390d5-bb51-4331-82e4-249b59ee9d89", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-265560211-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "ae094799919d4505afa74f9c03ccdf2e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f6fb0104-186b-4288-b87e-634893f46f01", "external-id": "nsx-vlan-transportzone-73", "segmentation_id": 73, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape8095249-5a", "ovs_interfaceid": "e8095249-5a8b-4955-ac2a-539c23cc9367", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 674.590341] env[67899]: DEBUG oslo_concurrency.lockutils [req-320e1f9f-7491-4202-b5b8-fbfa9869e6ce req-01a04abd-5b18-4f97-a383-14d1325af875 service nova] Releasing lock "refresh_cache-793d6f98-ed1b-4a78-bcd5-cb796441d64b" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 679.596623] env[67899]: DEBUG oslo_concurrency.lockutils [None req-387e0548-0c20-4821-a34f-501df96dea85 tempest-AttachInterfacesV270Test-551286907 tempest-AttachInterfacesV270Test-551286907-project-member] Acquiring lock "4c281caa-f99d-40d5-b004-13e7856a29f5" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 679.596881] env[67899]: DEBUG oslo_concurrency.lockutils [None req-387e0548-0c20-4821-a34f-501df96dea85 tempest-AttachInterfacesV270Test-551286907 tempest-AttachInterfacesV270Test-551286907-project-member] Lock "4c281caa-f99d-40d5-b004-13e7856a29f5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 681.622097] env[67899]: DEBUG oslo_concurrency.lockutils [None req-60abf12b-7e5a-42dd-96b7-3336f10ab46a tempest-InstanceActionsNegativeTestJSON-2021516092 
tempest-InstanceActionsNegativeTestJSON-2021516092-project-member] Acquiring lock "aa6229be-c18c-4cf9-99a1-ca546b30d797" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 681.622097] env[67899]: DEBUG oslo_concurrency.lockutils [None req-60abf12b-7e5a-42dd-96b7-3336f10ab46a tempest-InstanceActionsNegativeTestJSON-2021516092 tempest-InstanceActionsNegativeTestJSON-2021516092-project-member] Lock "aa6229be-c18c-4cf9-99a1-ca546b30d797" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 681.624735] env[67899]: DEBUG oslo_concurrency.lockutils [None req-828045de-c2cc-4771-be1a-f63e2fc9d20a tempest-ServersAdminTestJSON-374353603 tempest-ServersAdminTestJSON-374353603-project-member] Acquiring lock "862297c3-0b85-43eb-b364-303bb0c0b077" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 681.624959] env[67899]: DEBUG oslo_concurrency.lockutils [None req-828045de-c2cc-4771-be1a-f63e2fc9d20a tempest-ServersAdminTestJSON-374353603 tempest-ServersAdminTestJSON-374353603-project-member] Lock "862297c3-0b85-43eb-b364-303bb0c0b077" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 682.970650] env[67899]: DEBUG oslo_concurrency.lockutils [None req-ec5b86f6-cc1c-4c25-84be-c14c32124342 tempest-ServersAdminTestJSON-374353603 tempest-ServersAdminTestJSON-374353603-project-member] Acquiring lock "641b8e97-b9e6-4ef0-a819-42d3a29429de" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 682.970949] env[67899]: DEBUG oslo_concurrency.lockutils [None req-ec5b86f6-cc1c-4c25-84be-c14c32124342 tempest-ServersAdminTestJSON-374353603 tempest-ServersAdminTestJSON-374353603-project-member] Lock "641b8e97-b9e6-4ef0-a819-42d3a29429de" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 691.820368] env[67899]: DEBUG oslo_concurrency.lockutils [None req-59a8bff3-3a22-4f4e-b4f3-5e8968fb244c tempest-ServerRescueNegativeTestJSON-1208678464 tempest-ServerRescueNegativeTestJSON-1208678464-project-member] Acquiring lock "d0ceaa4e-9c87-48de-bcc2-8bb537827c0a" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 691.820368] env[67899]: DEBUG oslo_concurrency.lockutils [None req-59a8bff3-3a22-4f4e-b4f3-5e8968fb244c tempest-ServerRescueNegativeTestJSON-1208678464 tempest-ServerRescueNegativeTestJSON-1208678464-project-member] Lock "d0ceaa4e-9c87-48de-bcc2-8bb537827c0a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 
0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 692.769409] env[67899]: DEBUG oslo_concurrency.lockutils [None req-80443c34-53f4-4de0-af11-620e89f407a2 tempest-ServerRescueNegativeTestJSON-1208678464 tempest-ServerRescueNegativeTestJSON-1208678464-project-member] Acquiring lock "9842d097-f4f2-4f60-aea0-08896a47ff53" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 692.769628] env[67899]: DEBUG oslo_concurrency.lockutils [None req-80443c34-53f4-4de0-af11-620e89f407a2 tempest-ServerRescueNegativeTestJSON-1208678464 tempest-ServerRescueNegativeTestJSON-1208678464-project-member] Lock "9842d097-f4f2-4f60-aea0-08896a47ff53" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 711.122072] env[67899]: WARNING oslo_vmware.rw_handles [None req-93795dd8-69ab-4791-b400-f6f8ae0cd063 tempest-ServerExternalEventsTest-552313903 tempest-ServerExternalEventsTest-552313903-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 711.122072] env[67899]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 711.122072] env[67899]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 711.122072] env[67899]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 711.122072] env[67899]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 711.122072] env[67899]: ERROR oslo_vmware.rw_handles response.begin() [ 711.122072] env[67899]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 711.122072] env[67899]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 711.122072] env[67899]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 711.122072] env[67899]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 711.122072] env[67899]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 711.122072] env[67899]: ERROR oslo_vmware.rw_handles [ 711.122072] env[67899]: DEBUG nova.virt.vmwareapi.images [None req-93795dd8-69ab-4791-b400-f6f8ae0cd063 tempest-ServerExternalEventsTest-552313903 tempest-ServerExternalEventsTest-552313903-project-member] [instance: 267a1016-410e-4097-9523-6fcafc5f4eb0] Downloaded image file data c655a05a-4a40-4b3f-b609-3ba8116ad90f to vmware_temp/aa6f6def-71c0-40a6-9bc8-adc600b40073/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk on the data store datastore1 {{(pid=67899) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 711.123580] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-93795dd8-69ab-4791-b400-f6f8ae0cd063 tempest-ServerExternalEventsTest-552313903 tempest-ServerExternalEventsTest-552313903-project-member] [instance: 267a1016-410e-4097-9523-6fcafc5f4eb0] Caching image {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 711.123807] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [None 
req-93795dd8-69ab-4791-b400-f6f8ae0cd063 tempest-ServerExternalEventsTest-552313903 tempest-ServerExternalEventsTest-552313903-project-member] Copying Virtual Disk [datastore1] vmware_temp/aa6f6def-71c0-40a6-9bc8-adc600b40073/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk to [datastore1] vmware_temp/aa6f6def-71c0-40a6-9bc8-adc600b40073/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk {{(pid=67899) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 711.124108] env[67899]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-02db87d0-6e5e-4162-b9d1-b828675bc1fc {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 711.131579] env[67899]: DEBUG oslo_vmware.api [None req-93795dd8-69ab-4791-b400-f6f8ae0cd063 tempest-ServerExternalEventsTest-552313903 tempest-ServerExternalEventsTest-552313903-project-member] Waiting for the task: (returnval){ [ 711.131579] env[67899]: value = "task-3467860" [ 711.131579] env[67899]: _type = "Task" [ 711.131579] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 711.139723] env[67899]: DEBUG oslo_vmware.api [None req-93795dd8-69ab-4791-b400-f6f8ae0cd063 tempest-ServerExternalEventsTest-552313903 tempest-ServerExternalEventsTest-552313903-project-member] Task: {'id': task-3467860, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 711.641749] env[67899]: DEBUG oslo_vmware.exceptions [None req-93795dd8-69ab-4791-b400-f6f8ae0cd063 tempest-ServerExternalEventsTest-552313903 tempest-ServerExternalEventsTest-552313903-project-member] Fault InvalidArgument not matched. 
{{(pid=67899) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 711.642052] env[67899]: DEBUG oslo_concurrency.lockutils [None req-93795dd8-69ab-4791-b400-f6f8ae0cd063 tempest-ServerExternalEventsTest-552313903 tempest-ServerExternalEventsTest-552313903-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 711.642771] env[67899]: ERROR nova.compute.manager [None req-93795dd8-69ab-4791-b400-f6f8ae0cd063 tempest-ServerExternalEventsTest-552313903 tempest-ServerExternalEventsTest-552313903-project-member] [instance: 267a1016-410e-4097-9523-6fcafc5f4eb0] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 711.642771] env[67899]: Faults: ['InvalidArgument'] [ 711.642771] env[67899]: ERROR nova.compute.manager [instance: 267a1016-410e-4097-9523-6fcafc5f4eb0] Traceback (most recent call last): [ 711.642771] env[67899]: ERROR nova.compute.manager [instance: 267a1016-410e-4097-9523-6fcafc5f4eb0] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 711.642771] env[67899]: ERROR nova.compute.manager [instance: 267a1016-410e-4097-9523-6fcafc5f4eb0] yield resources [ 711.642771] env[67899]: ERROR nova.compute.manager [instance: 267a1016-410e-4097-9523-6fcafc5f4eb0] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 711.642771] env[67899]: ERROR nova.compute.manager [instance: 267a1016-410e-4097-9523-6fcafc5f4eb0] self.driver.spawn(context, instance, image_meta, [ 711.642771] env[67899]: ERROR nova.compute.manager [instance: 267a1016-410e-4097-9523-6fcafc5f4eb0] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 711.642771] env[67899]: ERROR nova.compute.manager [instance: 267a1016-410e-4097-9523-6fcafc5f4eb0] self._vmops.spawn(context, instance, image_meta, injected_files, [ 711.642771] env[67899]: ERROR nova.compute.manager [instance: 267a1016-410e-4097-9523-6fcafc5f4eb0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 711.642771] env[67899]: ERROR nova.compute.manager [instance: 267a1016-410e-4097-9523-6fcafc5f4eb0] self._fetch_image_if_missing(context, vi) [ 711.642771] env[67899]: ERROR nova.compute.manager [instance: 267a1016-410e-4097-9523-6fcafc5f4eb0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 711.642771] env[67899]: ERROR nova.compute.manager [instance: 267a1016-410e-4097-9523-6fcafc5f4eb0] image_cache(vi, tmp_image_ds_loc) [ 711.642771] env[67899]: ERROR nova.compute.manager [instance: 267a1016-410e-4097-9523-6fcafc5f4eb0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 711.642771] env[67899]: ERROR nova.compute.manager [instance: 267a1016-410e-4097-9523-6fcafc5f4eb0] vm_util.copy_virtual_disk( [ 711.642771] env[67899]: ERROR nova.compute.manager [instance: 267a1016-410e-4097-9523-6fcafc5f4eb0] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 711.642771] env[67899]: ERROR nova.compute.manager [instance: 267a1016-410e-4097-9523-6fcafc5f4eb0] session._wait_for_task(vmdk_copy_task) [ 711.642771] env[67899]: ERROR nova.compute.manager [instance: 267a1016-410e-4097-9523-6fcafc5f4eb0] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 
157, in _wait_for_task [ 711.642771] env[67899]: ERROR nova.compute.manager [instance: 267a1016-410e-4097-9523-6fcafc5f4eb0] return self.wait_for_task(task_ref) [ 711.642771] env[67899]: ERROR nova.compute.manager [instance: 267a1016-410e-4097-9523-6fcafc5f4eb0] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 711.642771] env[67899]: ERROR nova.compute.manager [instance: 267a1016-410e-4097-9523-6fcafc5f4eb0] return evt.wait() [ 711.642771] env[67899]: ERROR nova.compute.manager [instance: 267a1016-410e-4097-9523-6fcafc5f4eb0] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 711.642771] env[67899]: ERROR nova.compute.manager [instance: 267a1016-410e-4097-9523-6fcafc5f4eb0] result = hub.switch() [ 711.642771] env[67899]: ERROR nova.compute.manager [instance: 267a1016-410e-4097-9523-6fcafc5f4eb0] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 711.642771] env[67899]: ERROR nova.compute.manager [instance: 267a1016-410e-4097-9523-6fcafc5f4eb0] return self.greenlet.switch() [ 711.642771] env[67899]: ERROR nova.compute.manager [instance: 267a1016-410e-4097-9523-6fcafc5f4eb0] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 711.642771] env[67899]: ERROR nova.compute.manager [instance: 267a1016-410e-4097-9523-6fcafc5f4eb0] self.f(*self.args, **self.kw) [ 711.642771] env[67899]: ERROR nova.compute.manager [instance: 267a1016-410e-4097-9523-6fcafc5f4eb0] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 711.642771] env[67899]: ERROR nova.compute.manager [instance: 267a1016-410e-4097-9523-6fcafc5f4eb0] raise exceptions.translate_fault(task_info.error) [ 711.642771] env[67899]: ERROR nova.compute.manager [instance: 267a1016-410e-4097-9523-6fcafc5f4eb0] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 711.642771] env[67899]: ERROR nova.compute.manager [instance: 267a1016-410e-4097-9523-6fcafc5f4eb0] Faults: ['InvalidArgument'] [ 711.642771] env[67899]: ERROR nova.compute.manager [instance: 267a1016-410e-4097-9523-6fcafc5f4eb0] [ 711.643682] env[67899]: INFO nova.compute.manager [None req-93795dd8-69ab-4791-b400-f6f8ae0cd063 tempest-ServerExternalEventsTest-552313903 tempest-ServerExternalEventsTest-552313903-project-member] [instance: 267a1016-410e-4097-9523-6fcafc5f4eb0] Terminating instance [ 711.645898] env[67899]: DEBUG nova.compute.manager [None req-93795dd8-69ab-4791-b400-f6f8ae0cd063 tempest-ServerExternalEventsTest-552313903 tempest-ServerExternalEventsTest-552313903-project-member] [instance: 267a1016-410e-4097-9523-6fcafc5f4eb0] Start destroying the instance on the hypervisor. 
{{(pid=67899) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 711.646145] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-93795dd8-69ab-4791-b400-f6f8ae0cd063 tempest-ServerExternalEventsTest-552313903 tempest-ServerExternalEventsTest-552313903-project-member] [instance: 267a1016-410e-4097-9523-6fcafc5f4eb0] Destroying instance {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 711.646473] env[67899]: DEBUG oslo_concurrency.lockutils [None req-f50326ac-209f-4dee-b08b-0e1e2789d90c tempest-ServerDiagnosticsNegativeTest-286286993 tempest-ServerDiagnosticsNegativeTest-286286993-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 711.646714] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-f50326ac-209f-4dee-b08b-0e1e2789d90c tempest-ServerDiagnosticsNegativeTest-286286993 tempest-ServerDiagnosticsNegativeTest-286286993-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 711.647507] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-92c3d9f0-ef29-4b6c-b4c6-678e5e9863a1 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 711.650353] env[67899]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-b3910d40-cbe9-40e7-a787-672ae2cd6c14 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 711.656823] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-93795dd8-69ab-4791-b400-f6f8ae0cd063 tempest-ServerExternalEventsTest-552313903 tempest-ServerExternalEventsTest-552313903-project-member] [instance: 267a1016-410e-4097-9523-6fcafc5f4eb0] Unregistering the VM {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 711.657180] env[67899]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-859da0dc-0915-463e-9303-3cf4a5ebc169 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 711.659538] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-f50326ac-209f-4dee-b08b-0e1e2789d90c tempest-ServerDiagnosticsNegativeTest-286286993 tempest-ServerDiagnosticsNegativeTest-286286993-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 711.659714] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-f50326ac-209f-4dee-b08b-0e1e2789d90c tempest-ServerDiagnosticsNegativeTest-286286993 tempest-ServerDiagnosticsNegativeTest-286286993-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=67899) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 711.660694] env[67899]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-1027b70d-c909-4614-af55-5540eea76fed {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 711.665726] env[67899]: DEBUG oslo_vmware.api [None req-f50326ac-209f-4dee-b08b-0e1e2789d90c tempest-ServerDiagnosticsNegativeTest-286286993 tempest-ServerDiagnosticsNegativeTest-286286993-project-member] Waiting for the task: (returnval){ [ 711.665726] env[67899]: value = "session[5298859c-589e-10f3-4f70-bf47a7ea371b]52bf6f99-faa4-d12b-18f8-07e23b2ea46a" [ 711.665726] env[67899]: _type = "Task" [ 711.665726] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 711.673663] env[67899]: DEBUG oslo_vmware.api [None req-f50326ac-209f-4dee-b08b-0e1e2789d90c tempest-ServerDiagnosticsNegativeTest-286286993 tempest-ServerDiagnosticsNegativeTest-286286993-project-member] Task: {'id': session[5298859c-589e-10f3-4f70-bf47a7ea371b]52bf6f99-faa4-d12b-18f8-07e23b2ea46a, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 711.724145] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-93795dd8-69ab-4791-b400-f6f8ae0cd063 tempest-ServerExternalEventsTest-552313903 tempest-ServerExternalEventsTest-552313903-project-member] [instance: 267a1016-410e-4097-9523-6fcafc5f4eb0] Unregistered the VM {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 711.724407] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-93795dd8-69ab-4791-b400-f6f8ae0cd063 tempest-ServerExternalEventsTest-552313903 tempest-ServerExternalEventsTest-552313903-project-member] [instance: 267a1016-410e-4097-9523-6fcafc5f4eb0] Deleting contents of the VM from datastore datastore1 {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 711.724591] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-93795dd8-69ab-4791-b400-f6f8ae0cd063 tempest-ServerExternalEventsTest-552313903 tempest-ServerExternalEventsTest-552313903-project-member] Deleting the datastore file [datastore1] 267a1016-410e-4097-9523-6fcafc5f4eb0 {{(pid=67899) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 711.724850] env[67899]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-feea5519-022b-4b7c-a06a-de9061619b36 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 711.731362] env[67899]: DEBUG oslo_vmware.api [None req-93795dd8-69ab-4791-b400-f6f8ae0cd063 tempest-ServerExternalEventsTest-552313903 tempest-ServerExternalEventsTest-552313903-project-member] Waiting for the task: (returnval){ [ 711.731362] env[67899]: value = "task-3467862" [ 711.731362] env[67899]: _type = "Task" [ 711.731362] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 711.739498] env[67899]: DEBUG oslo_vmware.api [None req-93795dd8-69ab-4791-b400-f6f8ae0cd063 tempest-ServerExternalEventsTest-552313903 tempest-ServerExternalEventsTest-552313903-project-member] Task: {'id': task-3467862, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 712.176068] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-f50326ac-209f-4dee-b08b-0e1e2789d90c tempest-ServerDiagnosticsNegativeTest-286286993 tempest-ServerDiagnosticsNegativeTest-286286993-project-member] [instance: 7c092e03-5f1d-4e4a-98ca-8e6abddb3cdb] Preparing fetch location {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 712.176348] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-f50326ac-209f-4dee-b08b-0e1e2789d90c tempest-ServerDiagnosticsNegativeTest-286286993 tempest-ServerDiagnosticsNegativeTest-286286993-project-member] Creating directory with path [datastore1] vmware_temp/2946308a-d2d4-4a7c-aae5-cbaaa9e22d45/c655a05a-4a40-4b3f-b609-3ba8116ad90f {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 712.176570] env[67899]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-19fd61f3-7fa5-4809-b080-fc6a651c761b {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 712.188822] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-f50326ac-209f-4dee-b08b-0e1e2789d90c tempest-ServerDiagnosticsNegativeTest-286286993 tempest-ServerDiagnosticsNegativeTest-286286993-project-member] Created directory with path [datastore1] vmware_temp/2946308a-d2d4-4a7c-aae5-cbaaa9e22d45/c655a05a-4a40-4b3f-b609-3ba8116ad90f {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 712.189033] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-f50326ac-209f-4dee-b08b-0e1e2789d90c tempest-ServerDiagnosticsNegativeTest-286286993 tempest-ServerDiagnosticsNegativeTest-286286993-project-member] [instance: 7c092e03-5f1d-4e4a-98ca-8e6abddb3cdb] Fetch image to [datastore1] vmware_temp/2946308a-d2d4-4a7c-aae5-cbaaa9e22d45/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 712.189219] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-f50326ac-209f-4dee-b08b-0e1e2789d90c tempest-ServerDiagnosticsNegativeTest-286286993 tempest-ServerDiagnosticsNegativeTest-286286993-project-member] [instance: 7c092e03-5f1d-4e4a-98ca-8e6abddb3cdb] Downloading image file data c655a05a-4a40-4b3f-b609-3ba8116ad90f to [datastore1] vmware_temp/2946308a-d2d4-4a7c-aae5-cbaaa9e22d45/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk on the data store datastore1 {{(pid=67899) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 712.189970] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-03010ff8-a3e4-4f09-a48c-7e54d066f047 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 712.197019] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a50e0eea-8419-4eb2-bf85-2fe6182cdd77 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 712.205352] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4b93b236-5040-4bf5-bfa8-a8dcd5585495 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 712.238335] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-1c9f427a-a527-4c19-8ab6-449b62e69308 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 712.246971] env[67899]: DEBUG oslo_vmware.api [None req-93795dd8-69ab-4791-b400-f6f8ae0cd063 tempest-ServerExternalEventsTest-552313903 tempest-ServerExternalEventsTest-552313903-project-member] Task: {'id': task-3467862, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.075684} completed successfully. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 712.247291] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-93795dd8-69ab-4791-b400-f6f8ae0cd063 tempest-ServerExternalEventsTest-552313903 tempest-ServerExternalEventsTest-552313903-project-member] Deleted the datastore file {{(pid=67899) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 712.247477] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-93795dd8-69ab-4791-b400-f6f8ae0cd063 tempest-ServerExternalEventsTest-552313903 tempest-ServerExternalEventsTest-552313903-project-member] [instance: 267a1016-410e-4097-9523-6fcafc5f4eb0] Deleted contents of the VM from datastore datastore1 {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 712.247643] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-93795dd8-69ab-4791-b400-f6f8ae0cd063 tempest-ServerExternalEventsTest-552313903 tempest-ServerExternalEventsTest-552313903-project-member] [instance: 267a1016-410e-4097-9523-6fcafc5f4eb0] Instance destroyed {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 712.247811] env[67899]: INFO nova.compute.manager [None req-93795dd8-69ab-4791-b400-f6f8ae0cd063 tempest-ServerExternalEventsTest-552313903 tempest-ServerExternalEventsTest-552313903-project-member] [instance: 267a1016-410e-4097-9523-6fcafc5f4eb0] Took 0.60 seconds to destroy the instance on the hypervisor. 
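The failed spawn above follows the usual oslo.vmware pattern: nova's vm_util.copy_virtual_disk issues CopyVirtualDisk_Task through the session, blocks in wait_for_task() while _poll_task logs the "progress is 0%" lines, and the vCenter fault surfaces as VimFaultException with fault_list == ['InvalidArgument']. A minimal sketch of that call pattern, assuming an already-connected oslo_vmware.api.VMwareAPISession named `session` and placeholder arguments (`disk_mgr`, `dc_ref`, the datastore paths) — it is illustrative, not the Nova code itself:

from oslo_vmware import exceptions as vexc

def copy_sparse_image(session, disk_mgr, dc_ref, src_path, dst_path):
    # Ask vCenter to copy the disk server-side; the call returns a Task
    # managed-object reference (e.g. task-3467860 above), not a result.
    task = session.invoke_api(session.vim, 'CopyVirtualDisk_Task', disk_mgr,
                              sourceName=src_path, sourceDatacenter=dc_ref,
                              destName=dst_path)
    try:
        # wait_for_task() polls the task state in a loop; once the server
        # sets task_info.error, the translated fault is raised here.
        return session.wait_for_task(task)
    except vexc.VimFaultException as exc:
        # fault_list carries the raw vSphere fault names; in the log above it
        # is ['InvalidArgument'] for "A specified parameter was not correct:
        # fileType".
        print('CopyVirtualDisk_Task failed, faults=%s' % exc.fault_list)
        raise

The traceback above shows exactly this shape: vm_util.py:1423 waits on the copy task, api.py:448 raises exceptions.translate_fault(task_info.error), and the compute manager aborts the claim and re-schedules the build.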
[ 712.249845] env[67899]: DEBUG nova.compute.claims [None req-93795dd8-69ab-4791-b400-f6f8ae0cd063 tempest-ServerExternalEventsTest-552313903 tempest-ServerExternalEventsTest-552313903-project-member] [instance: 267a1016-410e-4097-9523-6fcafc5f4eb0] Aborting claim: {{(pid=67899) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 712.250020] env[67899]: DEBUG oslo_concurrency.lockutils [None req-93795dd8-69ab-4791-b400-f6f8ae0cd063 tempest-ServerExternalEventsTest-552313903 tempest-ServerExternalEventsTest-552313903-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 712.250235] env[67899]: DEBUG oslo_concurrency.lockutils [None req-93795dd8-69ab-4791-b400-f6f8ae0cd063 tempest-ServerExternalEventsTest-552313903 tempest-ServerExternalEventsTest-552313903-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 712.253497] env[67899]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-afb9c392-35ac-4358-a7bc-62287198eb9f {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 712.276888] env[67899]: DEBUG nova.virt.vmwareapi.images [None req-f50326ac-209f-4dee-b08b-0e1e2789d90c tempest-ServerDiagnosticsNegativeTest-286286993 tempest-ServerDiagnosticsNegativeTest-286286993-project-member] [instance: 7c092e03-5f1d-4e4a-98ca-8e6abddb3cdb] Downloading image file data c655a05a-4a40-4b3f-b609-3ba8116ad90f to the data store datastore1 {{(pid=67899) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 712.338323] env[67899]: DEBUG oslo_vmware.rw_handles [None req-f50326ac-209f-4dee-b08b-0e1e2789d90c tempest-ServerDiagnosticsNegativeTest-286286993 tempest-ServerDiagnosticsNegativeTest-286286993-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/2946308a-d2d4-4a7c-aae5-cbaaa9e22d45/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67899) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 712.399451] env[67899]: DEBUG oslo_vmware.rw_handles [None req-f50326ac-209f-4dee-b08b-0e1e2789d90c tempest-ServerDiagnosticsNegativeTest-286286993 tempest-ServerDiagnosticsNegativeTest-286286993-project-member] Completed reading data from the image iterator. {{(pid=67899) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 712.399451] env[67899]: DEBUG oslo_vmware.rw_handles [None req-f50326ac-209f-4dee-b08b-0e1e2789d90c tempest-ServerDiagnosticsNegativeTest-286286993 tempest-ServerDiagnosticsNegativeTest-286286993-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/2946308a-d2d4-4a7c-aae5-cbaaa9e22d45/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67899) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 712.710068] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-73cbd45d-bad9-4052-8d64-8083bdad0224 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 712.720819] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e34d6500-5f14-408a-835e-b12b34fb375e {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 712.752234] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1ac6acd9-49ce-4d8a-aa81-13c1e946fab3 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 712.759476] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9a94caac-6d5a-4046-a223-6991577d52b6 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 712.772550] env[67899]: DEBUG nova.compute.provider_tree [None req-93795dd8-69ab-4791-b400-f6f8ae0cd063 tempest-ServerExternalEventsTest-552313903 tempest-ServerExternalEventsTest-552313903-project-member] Inventory has not changed in ProviderTree for provider: fffa0b42-f65d-4394-a98c-0df038b9ed4b {{(pid=67899) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 712.783008] env[67899]: DEBUG nova.scheduler.client.report [None req-93795dd8-69ab-4791-b400-f6f8ae0cd063 tempest-ServerExternalEventsTest-552313903 tempest-ServerExternalEventsTest-552313903-project-member] Inventory has not changed for provider fffa0b42-f65d-4394-a98c-0df038b9ed4b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67899) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 712.797320] env[67899]: DEBUG oslo_concurrency.lockutils [None req-93795dd8-69ab-4791-b400-f6f8ae0cd063 tempest-ServerExternalEventsTest-552313903 tempest-ServerExternalEventsTest-552313903-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.547s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 712.797738] env[67899]: ERROR nova.compute.manager [None req-93795dd8-69ab-4791-b400-f6f8ae0cd063 tempest-ServerExternalEventsTest-552313903 tempest-ServerExternalEventsTest-552313903-project-member] [instance: 267a1016-410e-4097-9523-6fcafc5f4eb0] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 712.797738] env[67899]: Faults: ['InvalidArgument'] [ 712.797738] env[67899]: ERROR nova.compute.manager [instance: 267a1016-410e-4097-9523-6fcafc5f4eb0] Traceback (most recent call last): [ 712.797738] env[67899]: ERROR nova.compute.manager [instance: 267a1016-410e-4097-9523-6fcafc5f4eb0] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 712.797738] env[67899]: ERROR 
nova.compute.manager [instance: 267a1016-410e-4097-9523-6fcafc5f4eb0] self.driver.spawn(context, instance, image_meta, [ 712.797738] env[67899]: ERROR nova.compute.manager [instance: 267a1016-410e-4097-9523-6fcafc5f4eb0] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 712.797738] env[67899]: ERROR nova.compute.manager [instance: 267a1016-410e-4097-9523-6fcafc5f4eb0] self._vmops.spawn(context, instance, image_meta, injected_files, [ 712.797738] env[67899]: ERROR nova.compute.manager [instance: 267a1016-410e-4097-9523-6fcafc5f4eb0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 712.797738] env[67899]: ERROR nova.compute.manager [instance: 267a1016-410e-4097-9523-6fcafc5f4eb0] self._fetch_image_if_missing(context, vi) [ 712.797738] env[67899]: ERROR nova.compute.manager [instance: 267a1016-410e-4097-9523-6fcafc5f4eb0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 712.797738] env[67899]: ERROR nova.compute.manager [instance: 267a1016-410e-4097-9523-6fcafc5f4eb0] image_cache(vi, tmp_image_ds_loc) [ 712.797738] env[67899]: ERROR nova.compute.manager [instance: 267a1016-410e-4097-9523-6fcafc5f4eb0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 712.797738] env[67899]: ERROR nova.compute.manager [instance: 267a1016-410e-4097-9523-6fcafc5f4eb0] vm_util.copy_virtual_disk( [ 712.797738] env[67899]: ERROR nova.compute.manager [instance: 267a1016-410e-4097-9523-6fcafc5f4eb0] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 712.797738] env[67899]: ERROR nova.compute.manager [instance: 267a1016-410e-4097-9523-6fcafc5f4eb0] session._wait_for_task(vmdk_copy_task) [ 712.797738] env[67899]: ERROR nova.compute.manager [instance: 267a1016-410e-4097-9523-6fcafc5f4eb0] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 712.797738] env[67899]: ERROR nova.compute.manager [instance: 267a1016-410e-4097-9523-6fcafc5f4eb0] return self.wait_for_task(task_ref) [ 712.797738] env[67899]: ERROR nova.compute.manager [instance: 267a1016-410e-4097-9523-6fcafc5f4eb0] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 712.797738] env[67899]: ERROR nova.compute.manager [instance: 267a1016-410e-4097-9523-6fcafc5f4eb0] return evt.wait() [ 712.797738] env[67899]: ERROR nova.compute.manager [instance: 267a1016-410e-4097-9523-6fcafc5f4eb0] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 712.797738] env[67899]: ERROR nova.compute.manager [instance: 267a1016-410e-4097-9523-6fcafc5f4eb0] result = hub.switch() [ 712.797738] env[67899]: ERROR nova.compute.manager [instance: 267a1016-410e-4097-9523-6fcafc5f4eb0] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 712.797738] env[67899]: ERROR nova.compute.manager [instance: 267a1016-410e-4097-9523-6fcafc5f4eb0] return self.greenlet.switch() [ 712.797738] env[67899]: ERROR nova.compute.manager [instance: 267a1016-410e-4097-9523-6fcafc5f4eb0] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 712.797738] env[67899]: ERROR nova.compute.manager [instance: 267a1016-410e-4097-9523-6fcafc5f4eb0] self.f(*self.args, **self.kw) [ 712.797738] env[67899]: ERROR nova.compute.manager [instance: 267a1016-410e-4097-9523-6fcafc5f4eb0] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 712.797738] env[67899]: ERROR nova.compute.manager [instance: 267a1016-410e-4097-9523-6fcafc5f4eb0] raise exceptions.translate_fault(task_info.error) [ 712.797738] env[67899]: ERROR nova.compute.manager [instance: 267a1016-410e-4097-9523-6fcafc5f4eb0] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 712.797738] env[67899]: ERROR nova.compute.manager [instance: 267a1016-410e-4097-9523-6fcafc5f4eb0] Faults: ['InvalidArgument'] [ 712.797738] env[67899]: ERROR nova.compute.manager [instance: 267a1016-410e-4097-9523-6fcafc5f4eb0] [ 712.799022] env[67899]: DEBUG nova.compute.utils [None req-93795dd8-69ab-4791-b400-f6f8ae0cd063 tempest-ServerExternalEventsTest-552313903 tempest-ServerExternalEventsTest-552313903-project-member] [instance: 267a1016-410e-4097-9523-6fcafc5f4eb0] VimFaultException {{(pid=67899) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 712.800343] env[67899]: DEBUG nova.compute.manager [None req-93795dd8-69ab-4791-b400-f6f8ae0cd063 tempest-ServerExternalEventsTest-552313903 tempest-ServerExternalEventsTest-552313903-project-member] [instance: 267a1016-410e-4097-9523-6fcafc5f4eb0] Build of instance 267a1016-410e-4097-9523-6fcafc5f4eb0 was re-scheduled: A specified parameter was not correct: fileType [ 712.800343] env[67899]: Faults: ['InvalidArgument'] {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 712.800722] env[67899]: DEBUG nova.compute.manager [None req-93795dd8-69ab-4791-b400-f6f8ae0cd063 tempest-ServerExternalEventsTest-552313903 tempest-ServerExternalEventsTest-552313903-project-member] [instance: 267a1016-410e-4097-9523-6fcafc5f4eb0] Unplugging VIFs for instance {{(pid=67899) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 712.800919] env[67899]: DEBUG nova.compute.manager [None req-93795dd8-69ab-4791-b400-f6f8ae0cd063 tempest-ServerExternalEventsTest-552313903 tempest-ServerExternalEventsTest-552313903-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67899) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 712.801056] env[67899]: DEBUG nova.compute.manager [None req-93795dd8-69ab-4791-b400-f6f8ae0cd063 tempest-ServerExternalEventsTest-552313903 tempest-ServerExternalEventsTest-552313903-project-member] [instance: 267a1016-410e-4097-9523-6fcafc5f4eb0] Deallocating network for instance {{(pid=67899) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 712.801280] env[67899]: DEBUG nova.network.neutron [None req-93795dd8-69ab-4791-b400-f6f8ae0cd063 tempest-ServerExternalEventsTest-552313903 tempest-ServerExternalEventsTest-552313903-project-member] [instance: 267a1016-410e-4097-9523-6fcafc5f4eb0] deallocate_for_instance() {{(pid=67899) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 713.148743] env[67899]: DEBUG nova.network.neutron [None req-93795dd8-69ab-4791-b400-f6f8ae0cd063 tempest-ServerExternalEventsTest-552313903 tempest-ServerExternalEventsTest-552313903-project-member] [instance: 267a1016-410e-4097-9523-6fcafc5f4eb0] Updating instance_info_cache with network_info: [] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 713.162425] env[67899]: INFO nova.compute.manager [None req-93795dd8-69ab-4791-b400-f6f8ae0cd063 tempest-ServerExternalEventsTest-552313903 tempest-ServerExternalEventsTest-552313903-project-member] [instance: 267a1016-410e-4097-9523-6fcafc5f4eb0] Took 0.36 seconds to deallocate network for instance. [ 713.270498] env[67899]: INFO nova.scheduler.client.report [None req-93795dd8-69ab-4791-b400-f6f8ae0cd063 tempest-ServerExternalEventsTest-552313903 tempest-ServerExternalEventsTest-552313903-project-member] Deleted allocations for instance 267a1016-410e-4097-9523-6fcafc5f4eb0 [ 713.298154] env[67899]: DEBUG oslo_concurrency.lockutils [None req-93795dd8-69ab-4791-b400-f6f8ae0cd063 tempest-ServerExternalEventsTest-552313903 tempest-ServerExternalEventsTest-552313903-project-member] Lock "267a1016-410e-4097-9523-6fcafc5f4eb0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 111.619s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 713.324986] env[67899]: DEBUG nova.compute.manager [None req-6aafabe7-8144-45c2-a7ff-cc06a3819f06 tempest-ServersAdmin275Test-1560987778 tempest-ServersAdmin275Test-1560987778-project-member] [instance: e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf] Starting instance... 
{{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 713.377671] env[67899]: DEBUG oslo_concurrency.lockutils [None req-6aafabe7-8144-45c2-a7ff-cc06a3819f06 tempest-ServersAdmin275Test-1560987778 tempest-ServersAdmin275Test-1560987778-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 713.377839] env[67899]: DEBUG oslo_concurrency.lockutils [None req-6aafabe7-8144-45c2-a7ff-cc06a3819f06 tempest-ServersAdmin275Test-1560987778 tempest-ServersAdmin275Test-1560987778-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 713.379692] env[67899]: INFO nova.compute.claims [None req-6aafabe7-8144-45c2-a7ff-cc06a3819f06 tempest-ServersAdmin275Test-1560987778 tempest-ServersAdmin275Test-1560987778-project-member] [instance: e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 713.786724] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-df863479-bdd0-413e-80f0-6581b65825a9 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 713.794550] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d5f4e97e-58f7-4352-9507-d38d27bd2ef3 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 713.828466] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-78a5366b-77bc-459c-a1ed-766a594a866c {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 713.836274] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7279cea6-505e-4c34-98ab-4789b97d8651 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 713.849325] env[67899]: DEBUG nova.compute.provider_tree [None req-6aafabe7-8144-45c2-a7ff-cc06a3819f06 tempest-ServersAdmin275Test-1560987778 tempest-ServersAdmin275Test-1560987778-project-member] Inventory has not changed in ProviderTree for provider: fffa0b42-f65d-4394-a98c-0df038b9ed4b {{(pid=67899) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 713.857562] env[67899]: DEBUG nova.scheduler.client.report [None req-6aafabe7-8144-45c2-a7ff-cc06a3819f06 tempest-ServersAdmin275Test-1560987778 tempest-ServersAdmin275Test-1560987778-project-member] Inventory has not changed for provider fffa0b42-f65d-4394-a98c-0df038b9ed4b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67899) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 713.871721] env[67899]: DEBUG oslo_concurrency.lockutils [None req-6aafabe7-8144-45c2-a7ff-cc06a3819f06 
tempest-ServersAdmin275Test-1560987778 tempest-ServersAdmin275Test-1560987778-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.494s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 713.872238] env[67899]: DEBUG nova.compute.manager [None req-6aafabe7-8144-45c2-a7ff-cc06a3819f06 tempest-ServersAdmin275Test-1560987778 tempest-ServersAdmin275Test-1560987778-project-member] [instance: e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf] Start building networks asynchronously for instance. {{(pid=67899) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 713.908134] env[67899]: DEBUG nova.compute.utils [None req-6aafabe7-8144-45c2-a7ff-cc06a3819f06 tempest-ServersAdmin275Test-1560987778 tempest-ServersAdmin275Test-1560987778-project-member] Using /dev/sd instead of None {{(pid=67899) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 713.909997] env[67899]: DEBUG nova.compute.manager [None req-6aafabe7-8144-45c2-a7ff-cc06a3819f06 tempest-ServersAdmin275Test-1560987778 tempest-ServersAdmin275Test-1560987778-project-member] [instance: e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf] Not allocating networking since 'none' was specified. {{(pid=67899) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 713.918145] env[67899]: DEBUG nova.compute.manager [None req-6aafabe7-8144-45c2-a7ff-cc06a3819f06 tempest-ServersAdmin275Test-1560987778 tempest-ServersAdmin275Test-1560987778-project-member] [instance: e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf] Start building block device mappings for instance. {{(pid=67899) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 713.984107] env[67899]: DEBUG nova.compute.manager [None req-6aafabe7-8144-45c2-a7ff-cc06a3819f06 tempest-ServersAdmin275Test-1560987778 tempest-ServersAdmin275Test-1560987778-project-member] [instance: e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf] Start spawning the instance on the hypervisor. 
{{(pid=67899) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 714.011971] env[67899]: DEBUG nova.virt.hardware [None req-6aafabe7-8144-45c2-a7ff-cc06a3819f06 tempest-ServersAdmin275Test-1560987778 tempest-ServersAdmin275Test-1560987778-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T22:07:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T22:07:13Z,direct_url=,disk_format='vmdk',id=c655a05a-4a40-4b3f-b609-3ba8116ad90f,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c1aaa2970e964d7b86557399120d12c1',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T22:07:14Z,virtual_size=,visibility=), allow threads: False {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 714.012237] env[67899]: DEBUG nova.virt.hardware [None req-6aafabe7-8144-45c2-a7ff-cc06a3819f06 tempest-ServersAdmin275Test-1560987778 tempest-ServersAdmin275Test-1560987778-project-member] Flavor limits 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 714.012396] env[67899]: DEBUG nova.virt.hardware [None req-6aafabe7-8144-45c2-a7ff-cc06a3819f06 tempest-ServersAdmin275Test-1560987778 tempest-ServersAdmin275Test-1560987778-project-member] Image limits 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 714.012577] env[67899]: DEBUG nova.virt.hardware [None req-6aafabe7-8144-45c2-a7ff-cc06a3819f06 tempest-ServersAdmin275Test-1560987778 tempest-ServersAdmin275Test-1560987778-project-member] Flavor pref 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 714.012719] env[67899]: DEBUG nova.virt.hardware [None req-6aafabe7-8144-45c2-a7ff-cc06a3819f06 tempest-ServersAdmin275Test-1560987778 tempest-ServersAdmin275Test-1560987778-project-member] Image pref 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 714.012862] env[67899]: DEBUG nova.virt.hardware [None req-6aafabe7-8144-45c2-a7ff-cc06a3819f06 tempest-ServersAdmin275Test-1560987778 tempest-ServersAdmin275Test-1560987778-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 714.013076] env[67899]: DEBUG nova.virt.hardware [None req-6aafabe7-8144-45c2-a7ff-cc06a3819f06 tempest-ServersAdmin275Test-1560987778 tempest-ServersAdmin275Test-1560987778-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 714.013236] env[67899]: DEBUG nova.virt.hardware [None req-6aafabe7-8144-45c2-a7ff-cc06a3819f06 tempest-ServersAdmin275Test-1560987778 tempest-ServersAdmin275Test-1560987778-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67899) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 714.013402] env[67899]: DEBUG nova.virt.hardware [None req-6aafabe7-8144-45c2-a7ff-cc06a3819f06 
tempest-ServersAdmin275Test-1560987778 tempest-ServersAdmin275Test-1560987778-project-member] Got 1 possible topologies {{(pid=67899) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 714.013565] env[67899]: DEBUG nova.virt.hardware [None req-6aafabe7-8144-45c2-a7ff-cc06a3819f06 tempest-ServersAdmin275Test-1560987778 tempest-ServersAdmin275Test-1560987778-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 714.013737] env[67899]: DEBUG nova.virt.hardware [None req-6aafabe7-8144-45c2-a7ff-cc06a3819f06 tempest-ServersAdmin275Test-1560987778 tempest-ServersAdmin275Test-1560987778-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 714.015112] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-806b8ed1-3fc2-4b23-a3d8-3830cbfdf362 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 714.022503] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2a4eaac3-92a5-4816-abc0-84fac851d486 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 714.035879] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-6aafabe7-8144-45c2-a7ff-cc06a3819f06 tempest-ServersAdmin275Test-1560987778 tempest-ServersAdmin275Test-1560987778-project-member] [instance: e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf] Instance VIF info [] {{(pid=67899) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 714.041423] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [None req-6aafabe7-8144-45c2-a7ff-cc06a3819f06 tempest-ServersAdmin275Test-1560987778 tempest-ServersAdmin275Test-1560987778-project-member] Creating folder: Project (d002832800ac49acb37cadbb4d56acdf). Parent ref: group-v692900. {{(pid=67899) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 714.041672] env[67899]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-c25b01a0-dc81-4229-96e7-4d0337fc633f {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 714.050865] env[67899]: INFO nova.virt.vmwareapi.vm_util [None req-6aafabe7-8144-45c2-a7ff-cc06a3819f06 tempest-ServersAdmin275Test-1560987778 tempest-ServersAdmin275Test-1560987778-project-member] Created folder: Project (d002832800ac49acb37cadbb4d56acdf) in parent group-v692900. [ 714.051061] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [None req-6aafabe7-8144-45c2-a7ff-cc06a3819f06 tempest-ServersAdmin275Test-1560987778 tempest-ServersAdmin275Test-1560987778-project-member] Creating folder: Instances. Parent ref: group-v692938. 
{{(pid=67899) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 714.051303] env[67899]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-5332f317-f927-435f-b9e6-ed915ac9c0da {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 714.060594] env[67899]: INFO nova.virt.vmwareapi.vm_util [None req-6aafabe7-8144-45c2-a7ff-cc06a3819f06 tempest-ServersAdmin275Test-1560987778 tempest-ServersAdmin275Test-1560987778-project-member] Created folder: Instances in parent group-v692938. [ 714.060818] env[67899]: DEBUG oslo.service.loopingcall [None req-6aafabe7-8144-45c2-a7ff-cc06a3819f06 tempest-ServersAdmin275Test-1560987778 tempest-ServersAdmin275Test-1560987778-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67899) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 714.060995] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf] Creating VM on the ESX host {{(pid=67899) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 714.061226] env[67899]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-f29806f9-380c-45bf-ba49-48a3e4c0d457 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 714.077645] env[67899]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 714.077645] env[67899]: value = "task-3467865" [ 714.077645] env[67899]: _type = "Task" [ 714.077645] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 714.084534] env[67899]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467865, 'name': CreateVM_Task} progress is 0%. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 714.587723] env[67899]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467865, 'name': CreateVM_Task, 'duration_secs': 0.259077} completed successfully. 
{{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 714.588052] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf] Created VM on the ESX host {{(pid=67899) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 714.588311] env[67899]: DEBUG oslo_concurrency.lockutils [None req-6aafabe7-8144-45c2-a7ff-cc06a3819f06 tempest-ServersAdmin275Test-1560987778 tempest-ServersAdmin275Test-1560987778-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 714.588469] env[67899]: DEBUG oslo_concurrency.lockutils [None req-6aafabe7-8144-45c2-a7ff-cc06a3819f06 tempest-ServersAdmin275Test-1560987778 tempest-ServersAdmin275Test-1560987778-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 714.588767] env[67899]: DEBUG oslo_concurrency.lockutils [None req-6aafabe7-8144-45c2-a7ff-cc06a3819f06 tempest-ServersAdmin275Test-1560987778 tempest-ServersAdmin275Test-1560987778-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 714.589010] env[67899]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-d7b5c029-7363-4cb0-9e4c-f26d74b6d72c {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 714.593544] env[67899]: DEBUG oslo_vmware.api [None req-6aafabe7-8144-45c2-a7ff-cc06a3819f06 tempest-ServersAdmin275Test-1560987778 tempest-ServersAdmin275Test-1560987778-project-member] Waiting for the task: (returnval){ [ 714.593544] env[67899]: value = "session[5298859c-589e-10f3-4f70-bf47a7ea371b]5278f3e0-362b-5079-b153-eedb2ad98f76" [ 714.593544] env[67899]: _type = "Task" [ 714.593544] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 714.602183] env[67899]: DEBUG oslo_vmware.api [None req-6aafabe7-8144-45c2-a7ff-cc06a3819f06 tempest-ServersAdmin275Test-1560987778 tempest-ServersAdmin275Test-1560987778-project-member] Task: {'id': session[5298859c-589e-10f3-4f70-bf47a7ea371b]5278f3e0-362b-5079-b153-eedb2ad98f76, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 715.103805] env[67899]: DEBUG oslo_concurrency.lockutils [None req-6aafabe7-8144-45c2-a7ff-cc06a3819f06 tempest-ServersAdmin275Test-1560987778 tempest-ServersAdmin275Test-1560987778-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 715.104079] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-6aafabe7-8144-45c2-a7ff-cc06a3819f06 tempest-ServersAdmin275Test-1560987778 tempest-ServersAdmin275Test-1560987778-project-member] [instance: e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf] Processing image c655a05a-4a40-4b3f-b609-3ba8116ad90f {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 715.104290] env[67899]: DEBUG oslo_concurrency.lockutils [None req-6aafabe7-8144-45c2-a7ff-cc06a3819f06 tempest-ServersAdmin275Test-1560987778 tempest-ServersAdmin275Test-1560987778-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 718.792646] env[67899]: DEBUG oslo_concurrency.lockutils [None req-e7f4b054-51ad-4989-bc09-ccbf6811411e tempest-ServerShowV257Test-1183244838 tempest-ServerShowV257Test-1183244838-project-member] Acquiring lock "bb97988e-9f7f-4e4f-9904-fc560d0912ee" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 718.793174] env[67899]: DEBUG oslo_concurrency.lockutils [None req-e7f4b054-51ad-4989-bc09-ccbf6811411e tempest-ServerShowV257Test-1183244838 tempest-ServerShowV257Test-1183244838-project-member] Lock "bb97988e-9f7f-4e4f-9904-fc560d0912ee" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 731.998491] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 731.998491] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Starting heal instance info cache {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 731.998491] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Rebuilding the list of instances to heal {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 732.019026] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 91d5024f-9eac-4a56-b08f-c0f6a7eda775] Skipping network cache update for instance because it is Building. 
{{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 732.019026] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 7c092e03-5f1d-4e4a-98ca-8e6abddb3cdb] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 732.019724] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 195a4a1e-3da7-4a69-a679-869346368195] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 732.019724] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 7a19bcfd-5544-4688-8edb-e12c567979ae] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 732.019724] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 84cbacaa-08d2-4297-8777-150f433e4c04] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 732.019875] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 732.019933] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 732.020067] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 913c5652-c8af-41a8-94f1-c0eba08aacdd] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 732.020189] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 793d6f98-ed1b-4a78-bcd5-cb796441d64b] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 732.020307] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 732.020427] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Didn't find any instances for network info cache update. 
{{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 732.020926] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager.update_available_resource {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 732.032970] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 732.033192] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 732.033358] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 732.033514] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67899) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 732.034610] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5f5df47d-988d-4d98-b15c-fd542d317c48 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 732.043512] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cc3b1a94-715f-4f2d-9786-bedd428f7590 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 732.057424] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eeecb579-6883-4fed-bf26-45567f224c6b {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 732.063642] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8b240954-cdb2-48aa-ab95-ca80cf945064 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 732.093670] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180929MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=67899) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 732.093801] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 732.093968] 
env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 732.169934] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 91d5024f-9eac-4a56-b08f-c0f6a7eda775 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 732.170168] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 7c092e03-5f1d-4e4a-98ca-8e6abddb3cdb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 732.170340] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 195a4a1e-3da7-4a69-a679-869346368195 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 732.170503] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 7a19bcfd-5544-4688-8edb-e12c567979ae actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 732.170812] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 84cbacaa-08d2-4297-8777-150f433e4c04 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 732.170812] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance c29ae4c5-cc93-480c-8d60-96f6acba4346 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 732.170911] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 732.171012] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 913c5652-c8af-41a8-94f1-c0eba08aacdd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 732.171130] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 793d6f98-ed1b-4a78-bcd5-cb796441d64b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 732.171243] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 732.185253] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 4458efe7-18d4-4cfb-b131-e09d36124d68 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 732.196335] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 732.206648] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 55dfe829-2e96-40d7-bef8-8e7556cbdab3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 732.216618] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance f935bfef-3ca7-41fc-89be-c5c4e070a401 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 732.225899] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 66d3ec66-244d-4ffa-bd6f-7067f8955e67 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 732.236957] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance bbed830d-aa53-4ea4-93f8-d4b198a333cd has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 732.246716] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 786676eb-ac36-48f6-874a-ab1ca15f2a9a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 732.256374] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 4cd5b80b-d1f4-4142-83fb-235523464667 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 732.265526] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 1589188b-8540-4afd-8050-ab47633593c0 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 732.274642] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance de1d5572-b82d-4edc-9c7e-a7e26c45a090 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 732.283400] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 8d2a9e20-82d3-44cf-a725-59804debe1cc has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 732.293734] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance e42425fa-6c50-4e76-842b-0bfcccb011c0 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 732.301803] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 96d79732-9076-4715-aa1e-60001ffb17fb has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 732.310434] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance f7888060-430b-4b16-b9ca-059020615dee has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 732.318950] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 1505bcf5-f622-40ee-93c2-8dabf1dce8cb has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 732.328675] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 4c281caa-f99d-40d5-b004-13e7856a29f5 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 732.338341] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance aa6229be-c18c-4cf9-99a1-ca546b30d797 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 732.348630] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 862297c3-0b85-43eb-b364-303bb0c0b077 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 732.357722] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 641b8e97-b9e6-4ef0-a819-42d3a29429de has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
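[editor's note] Each "Skipping heal of allocation" line above means the scheduler has already written an allocation to placement for an instance that has not started spawning, so the tracker leaves it alone. Those allocations can be inspected directly through placement's REST API; a hedged sketch, where the endpoint and token are illustrative placeholders (real deployments discover placement via the keystone catalog):

    import requests

    PLACEMENT = "http://placement.example/placement"  # hypothetical endpoint
    TOKEN = "gAAAA..."                                # hypothetical token

    def get_allocations(consumer_uuid):
        # GET /allocations/{consumer} returns per-provider resources,
        # e.g. {"<rp_uuid>": {"resources": {"DISK_GB": 1,
        #       "MEMORY_MB": 128, "VCPU": 1}}} as logged above.
        r = requests.get(
            f"{PLACEMENT}/allocations/{consumer_uuid}",
            headers={"X-Auth-Token": TOKEN,
                     "OpenStack-API-Version": "placement 1.28"},
        )
        r.raise_for_status()
        return r.json()["allocations"]
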
{{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 732.366545] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance d0ceaa4e-9c87-48de-bcc2-8bb537827c0a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 732.376169] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 9842d097-f4f2-4f60-aea0-08896a47ff53 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 732.385923] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance bb97988e-9f7f-4e4f-9904-fc560d0912ee has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 732.385923] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67899) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 732.386097] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67899) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 732.766020] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-87bd196e-5ef6-469a-8844-e50df8cf7341 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 732.774145] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-186e4f71-7765-44c0-b369-892ee0353821 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 732.805552] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-280e1551-fa9c-4adb-81b8-87d5efb9d498 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 732.813429] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1c6be0d2-360b-4973-ac5d-8e1b13ade0b8 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 732.827101] env[67899]: DEBUG nova.compute.provider_tree [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Inventory has not changed in ProviderTree for provider: 
fffa0b42-f65d-4394-a98c-0df038b9ed4b {{(pid=67899) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 732.836490] env[67899]: DEBUG nova.scheduler.client.report [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Inventory has not changed for provider fffa0b42-f65d-4394-a98c-0df038b9ed4b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67899) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 732.852712] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67899) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 732.852906] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.759s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 734.828566] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 734.828831] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 734.829110] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 734.829407] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 734.996479] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 734.996479] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 734.996595] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 734.996730] env[67899]: DEBUG 
nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=67899) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 761.134897] env[67899]: WARNING oslo_vmware.rw_handles [None req-f50326ac-209f-4dee-b08b-0e1e2789d90c tempest-ServerDiagnosticsNegativeTest-286286993 tempest-ServerDiagnosticsNegativeTest-286286993-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 761.134897] env[67899]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 761.134897] env[67899]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 761.134897] env[67899]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 761.134897] env[67899]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 761.134897] env[67899]: ERROR oslo_vmware.rw_handles response.begin() [ 761.134897] env[67899]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 761.134897] env[67899]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 761.134897] env[67899]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 761.134897] env[67899]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 761.134897] env[67899]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 761.134897] env[67899]: ERROR oslo_vmware.rw_handles [ 761.135386] env[67899]: DEBUG nova.virt.vmwareapi.images [None req-f50326ac-209f-4dee-b08b-0e1e2789d90c tempest-ServerDiagnosticsNegativeTest-286286993 tempest-ServerDiagnosticsNegativeTest-286286993-project-member] [instance: 7c092e03-5f1d-4e4a-98ca-8e6abddb3cdb] Downloaded image file data c655a05a-4a40-4b3f-b609-3ba8116ad90f to vmware_temp/2946308a-d2d4-4a7c-aae5-cbaaa9e22d45/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk on the data store datastore1 {{(pid=67899) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 761.137675] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-f50326ac-209f-4dee-b08b-0e1e2789d90c tempest-ServerDiagnosticsNegativeTest-286286993 tempest-ServerDiagnosticsNegativeTest-286286993-project-member] [instance: 7c092e03-5f1d-4e4a-98ca-8e6abddb3cdb] Caching image {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 761.138066] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [None req-f50326ac-209f-4dee-b08b-0e1e2789d90c tempest-ServerDiagnosticsNegativeTest-286286993 tempest-ServerDiagnosticsNegativeTest-286286993-project-member] Copying Virtual Disk [datastore1] vmware_temp/2946308a-d2d4-4a7c-aae5-cbaaa9e22d45/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk to [datastore1] vmware_temp/2946308a-d2d4-4a7c-aae5-cbaaa9e22d45/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk {{(pid=67899) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 761.138374] env[67899]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-e3b4478b-f37a-4d01-9d30-d176f3bb19b1 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 761.147767] env[67899]: DEBUG oslo_vmware.api [None 
req-f50326ac-209f-4dee-b08b-0e1e2789d90c tempest-ServerDiagnosticsNegativeTest-286286993 tempest-ServerDiagnosticsNegativeTest-286286993-project-member] Waiting for the task: (returnval){ [ 761.147767] env[67899]: value = "task-3467866" [ 761.147767] env[67899]: _type = "Task" [ 761.147767] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 761.158488] env[67899]: DEBUG oslo_vmware.api [None req-f50326ac-209f-4dee-b08b-0e1e2789d90c tempest-ServerDiagnosticsNegativeTest-286286993 tempest-ServerDiagnosticsNegativeTest-286286993-project-member] Task: {'id': task-3467866, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 761.659832] env[67899]: DEBUG oslo_vmware.exceptions [None req-f50326ac-209f-4dee-b08b-0e1e2789d90c tempest-ServerDiagnosticsNegativeTest-286286993 tempest-ServerDiagnosticsNegativeTest-286286993-project-member] Fault InvalidArgument not matched. {{(pid=67899) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 761.660149] env[67899]: DEBUG oslo_concurrency.lockutils [None req-f50326ac-209f-4dee-b08b-0e1e2789d90c tempest-ServerDiagnosticsNegativeTest-286286993 tempest-ServerDiagnosticsNegativeTest-286286993-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 761.660719] env[67899]: ERROR nova.compute.manager [None req-f50326ac-209f-4dee-b08b-0e1e2789d90c tempest-ServerDiagnosticsNegativeTest-286286993 tempest-ServerDiagnosticsNegativeTest-286286993-project-member] [instance: 7c092e03-5f1d-4e4a-98ca-8e6abddb3cdb] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 761.660719] env[67899]: Faults: ['InvalidArgument'] [ 761.660719] env[67899]: ERROR nova.compute.manager [instance: 7c092e03-5f1d-4e4a-98ca-8e6abddb3cdb] Traceback (most recent call last): [ 761.660719] env[67899]: ERROR nova.compute.manager [instance: 7c092e03-5f1d-4e4a-98ca-8e6abddb3cdb] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 761.660719] env[67899]: ERROR nova.compute.manager [instance: 7c092e03-5f1d-4e4a-98ca-8e6abddb3cdb] yield resources [ 761.660719] env[67899]: ERROR nova.compute.manager [instance: 7c092e03-5f1d-4e4a-98ca-8e6abddb3cdb] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 761.660719] env[67899]: ERROR nova.compute.manager [instance: 7c092e03-5f1d-4e4a-98ca-8e6abddb3cdb] self.driver.spawn(context, instance, image_meta, [ 761.660719] env[67899]: ERROR nova.compute.manager [instance: 7c092e03-5f1d-4e4a-98ca-8e6abddb3cdb] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 761.660719] env[67899]: ERROR nova.compute.manager [instance: 7c092e03-5f1d-4e4a-98ca-8e6abddb3cdb] self._vmops.spawn(context, instance, image_meta, injected_files, [ 761.660719] env[67899]: ERROR nova.compute.manager [instance: 7c092e03-5f1d-4e4a-98ca-8e6abddb3cdb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 761.660719] env[67899]: ERROR nova.compute.manager [instance: 7c092e03-5f1d-4e4a-98ca-8e6abddb3cdb] self._fetch_image_if_missing(context, vi) [ 761.660719] env[67899]: ERROR nova.compute.manager [instance: 
7c092e03-5f1d-4e4a-98ca-8e6abddb3cdb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 761.660719] env[67899]: ERROR nova.compute.manager [instance: 7c092e03-5f1d-4e4a-98ca-8e6abddb3cdb] image_cache(vi, tmp_image_ds_loc) [ 761.660719] env[67899]: ERROR nova.compute.manager [instance: 7c092e03-5f1d-4e4a-98ca-8e6abddb3cdb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 761.660719] env[67899]: ERROR nova.compute.manager [instance: 7c092e03-5f1d-4e4a-98ca-8e6abddb3cdb] vm_util.copy_virtual_disk( [ 761.660719] env[67899]: ERROR nova.compute.manager [instance: 7c092e03-5f1d-4e4a-98ca-8e6abddb3cdb] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 761.660719] env[67899]: ERROR nova.compute.manager [instance: 7c092e03-5f1d-4e4a-98ca-8e6abddb3cdb] session._wait_for_task(vmdk_copy_task) [ 761.660719] env[67899]: ERROR nova.compute.manager [instance: 7c092e03-5f1d-4e4a-98ca-8e6abddb3cdb] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 761.660719] env[67899]: ERROR nova.compute.manager [instance: 7c092e03-5f1d-4e4a-98ca-8e6abddb3cdb] return self.wait_for_task(task_ref) [ 761.660719] env[67899]: ERROR nova.compute.manager [instance: 7c092e03-5f1d-4e4a-98ca-8e6abddb3cdb] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 761.660719] env[67899]: ERROR nova.compute.manager [instance: 7c092e03-5f1d-4e4a-98ca-8e6abddb3cdb] return evt.wait() [ 761.660719] env[67899]: ERROR nova.compute.manager [instance: 7c092e03-5f1d-4e4a-98ca-8e6abddb3cdb] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 761.660719] env[67899]: ERROR nova.compute.manager [instance: 7c092e03-5f1d-4e4a-98ca-8e6abddb3cdb] result = hub.switch() [ 761.660719] env[67899]: ERROR nova.compute.manager [instance: 7c092e03-5f1d-4e4a-98ca-8e6abddb3cdb] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 761.660719] env[67899]: ERROR nova.compute.manager [instance: 7c092e03-5f1d-4e4a-98ca-8e6abddb3cdb] return self.greenlet.switch() [ 761.660719] env[67899]: ERROR nova.compute.manager [instance: 7c092e03-5f1d-4e4a-98ca-8e6abddb3cdb] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 761.660719] env[67899]: ERROR nova.compute.manager [instance: 7c092e03-5f1d-4e4a-98ca-8e6abddb3cdb] self.f(*self.args, **self.kw) [ 761.660719] env[67899]: ERROR nova.compute.manager [instance: 7c092e03-5f1d-4e4a-98ca-8e6abddb3cdb] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 761.660719] env[67899]: ERROR nova.compute.manager [instance: 7c092e03-5f1d-4e4a-98ca-8e6abddb3cdb] raise exceptions.translate_fault(task_info.error) [ 761.660719] env[67899]: ERROR nova.compute.manager [instance: 7c092e03-5f1d-4e4a-98ca-8e6abddb3cdb] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 761.660719] env[67899]: ERROR nova.compute.manager [instance: 7c092e03-5f1d-4e4a-98ca-8e6abddb3cdb] Faults: ['InvalidArgument'] [ 761.660719] env[67899]: ERROR nova.compute.manager [instance: 7c092e03-5f1d-4e4a-98ca-8e6abddb3cdb] [ 761.661717] env[67899]: INFO nova.compute.manager [None req-f50326ac-209f-4dee-b08b-0e1e2789d90c tempest-ServerDiagnosticsNegativeTest-286286993 tempest-ServerDiagnosticsNegativeTest-286286993-project-member] 
[instance: 7c092e03-5f1d-4e4a-98ca-8e6abddb3cdb] Terminating instance [ 761.663031] env[67899]: DEBUG oslo_concurrency.lockutils [None req-a1baea38-4329-40f0-8c18-ae18eafc1b0b tempest-ServersAdminNegativeTestJSON-448291398 tempest-ServersAdminNegativeTestJSON-448291398-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 761.663031] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-a1baea38-4329-40f0-8c18-ae18eafc1b0b tempest-ServersAdminNegativeTestJSON-448291398 tempest-ServersAdminNegativeTestJSON-448291398-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 761.663158] env[67899]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-661e6581-586e-4df4-9d8c-1799ebe13bd0 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 761.666594] env[67899]: DEBUG nova.compute.manager [None req-f50326ac-209f-4dee-b08b-0e1e2789d90c tempest-ServerDiagnosticsNegativeTest-286286993 tempest-ServerDiagnosticsNegativeTest-286286993-project-member] [instance: 7c092e03-5f1d-4e4a-98ca-8e6abddb3cdb] Start destroying the instance on the hypervisor. {{(pid=67899) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 761.666808] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-f50326ac-209f-4dee-b08b-0e1e2789d90c tempest-ServerDiagnosticsNegativeTest-286286993 tempest-ServerDiagnosticsNegativeTest-286286993-project-member] [instance: 7c092e03-5f1d-4e4a-98ca-8e6abddb3cdb] Destroying instance {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 761.667569] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-079ddaf1-bcef-4d41-8094-b857fa86e79b {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 761.672175] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-a1baea38-4329-40f0-8c18-ae18eafc1b0b tempest-ServersAdminNegativeTestJSON-448291398 tempest-ServersAdminNegativeTestJSON-448291398-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 761.672175] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-a1baea38-4329-40f0-8c18-ae18eafc1b0b tempest-ServersAdminNegativeTestJSON-448291398 tempest-ServersAdminNegativeTestJSON-448291398-project-member] Folder [datastore1] devstack-image-cache_base created. 
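[editor's note] The "[datastore1] devstack-image-cache_base" strings above use vSphere's "[datastore] relative/path" notation, and oslo.vmware ships a small helper for composing them rather than concatenating strings. A short sketch using the image UUID from this run:

    from oslo_vmware.objects.datastore import DatastorePath

    image_id = "c655a05a-4a40-4b3f-b609-3ba8116ad90f"
    path = DatastorePath("datastore1", "devstack-image-cache_base",
                         image_id, image_id + ".vmdk")
    # str(path) ->
    # "[datastore1] devstack-image-cache_base/c655a05a-.../c655a05a-....vmdk"
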
{{(pid=67899) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 761.672658] env[67899]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-870f307f-95f5-490c-aa29-3da7b248a4ce {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 761.676609] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-f50326ac-209f-4dee-b08b-0e1e2789d90c tempest-ServerDiagnosticsNegativeTest-286286993 tempest-ServerDiagnosticsNegativeTest-286286993-project-member] [instance: 7c092e03-5f1d-4e4a-98ca-8e6abddb3cdb] Unregistering the VM {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 761.677128] env[67899]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-b019d28c-2c3c-4394-9d25-881f6b47cc65 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 761.679457] env[67899]: DEBUG oslo_vmware.api [None req-a1baea38-4329-40f0-8c18-ae18eafc1b0b tempest-ServersAdminNegativeTestJSON-448291398 tempest-ServersAdminNegativeTestJSON-448291398-project-member] Waiting for the task: (returnval){ [ 761.679457] env[67899]: value = "session[5298859c-589e-10f3-4f70-bf47a7ea371b]52fa5512-5a19-cf0a-ce7e-47db58828912" [ 761.679457] env[67899]: _type = "Task" [ 761.679457] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 761.693117] env[67899]: DEBUG oslo_vmware.api [None req-a1baea38-4329-40f0-8c18-ae18eafc1b0b tempest-ServersAdminNegativeTestJSON-448291398 tempest-ServersAdminNegativeTestJSON-448291398-project-member] Task: {'id': session[5298859c-589e-10f3-4f70-bf47a7ea371b]52fa5512-5a19-cf0a-ce7e-47db58828912, 'name': SearchDatastore_Task} progress is 0%. 
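[editor's note] The "Task: {...} progress is 0%" lines are emitted by oslo.vmware's task poller: wait_for_task() loops on the TaskInfo state until it leaves queued/running, then returns the result or raises the fault translated from the error. A simplified sketch of that loop, not the library's exact code (the real version runs under an eventlet FixedIntervalLoopingCall):

    import time

    def wait_for_task(get_task_info, interval=0.5):
        # `get_task_info` stands in for the PropertyCollector read of
        # the Task managed object's `info` property.
        while True:
            info = get_task_info()
            if info.state in ("queued", "running"):
                time.sleep(interval)
                continue
            if info.state == "success":
                return info.result
            # oslo.vmware raises a fault class translated from info.error
            raise RuntimeError(info.error)
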
{{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 762.190466] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-a1baea38-4329-40f0-8c18-ae18eafc1b0b tempest-ServersAdminNegativeTestJSON-448291398 tempest-ServersAdminNegativeTestJSON-448291398-project-member] [instance: 91d5024f-9eac-4a56-b08f-c0f6a7eda775] Preparing fetch location {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 762.190728] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-a1baea38-4329-40f0-8c18-ae18eafc1b0b tempest-ServersAdminNegativeTestJSON-448291398 tempest-ServersAdminNegativeTestJSON-448291398-project-member] Creating directory with path [datastore1] vmware_temp/97d47aa5-cb34-4f15-83e4-b89c33f8fb07/c655a05a-4a40-4b3f-b609-3ba8116ad90f {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 762.190955] env[67899]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-eefcbb5b-30c6-4c08-8cd0-9e5fcbc758b2 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 762.211367] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-a1baea38-4329-40f0-8c18-ae18eafc1b0b tempest-ServersAdminNegativeTestJSON-448291398 tempest-ServersAdminNegativeTestJSON-448291398-project-member] Created directory with path [datastore1] vmware_temp/97d47aa5-cb34-4f15-83e4-b89c33f8fb07/c655a05a-4a40-4b3f-b609-3ba8116ad90f {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 762.211562] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-a1baea38-4329-40f0-8c18-ae18eafc1b0b tempest-ServersAdminNegativeTestJSON-448291398 tempest-ServersAdminNegativeTestJSON-448291398-project-member] [instance: 91d5024f-9eac-4a56-b08f-c0f6a7eda775] Fetch image to [datastore1] vmware_temp/97d47aa5-cb34-4f15-83e4-b89c33f8fb07/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 762.211758] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-a1baea38-4329-40f0-8c18-ae18eafc1b0b tempest-ServersAdminNegativeTestJSON-448291398 tempest-ServersAdminNegativeTestJSON-448291398-project-member] [instance: 91d5024f-9eac-4a56-b08f-c0f6a7eda775] Downloading image file data c655a05a-4a40-4b3f-b609-3ba8116ad90f to [datastore1] vmware_temp/97d47aa5-cb34-4f15-83e4-b89c33f8fb07/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk on the data store datastore1 {{(pid=67899) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 762.212526] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f5584ba2-9ce1-4dca-94f7-68435326172d {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 762.219400] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-becbc960-c5c9-424a-bf9f-0f4e54cc32b1 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 762.230134] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e91e3ebc-a481-40a4-a092-fdf1ed2c5099 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 762.234918] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-f50326ac-209f-4dee-b08b-0e1e2789d90c 
tempest-ServerDiagnosticsNegativeTest-286286993 tempest-ServerDiagnosticsNegativeTest-286286993-project-member] [instance: 7c092e03-5f1d-4e4a-98ca-8e6abddb3cdb] Unregistered the VM {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 762.235256] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-f50326ac-209f-4dee-b08b-0e1e2789d90c tempest-ServerDiagnosticsNegativeTest-286286993 tempest-ServerDiagnosticsNegativeTest-286286993-project-member] [instance: 7c092e03-5f1d-4e4a-98ca-8e6abddb3cdb] Deleting contents of the VM from datastore datastore1 {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 762.235537] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-f50326ac-209f-4dee-b08b-0e1e2789d90c tempest-ServerDiagnosticsNegativeTest-286286993 tempest-ServerDiagnosticsNegativeTest-286286993-project-member] Deleting the datastore file [datastore1] 7c092e03-5f1d-4e4a-98ca-8e6abddb3cdb {{(pid=67899) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 762.235852] env[67899]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-e254de21-ae9c-45dd-9a0b-d54022008340 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 762.264121] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b4658a1d-ed49-4fc1-82d1-02622618fe6b {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 762.267231] env[67899]: DEBUG oslo_vmware.api [None req-f50326ac-209f-4dee-b08b-0e1e2789d90c tempest-ServerDiagnosticsNegativeTest-286286993 tempest-ServerDiagnosticsNegativeTest-286286993-project-member] Waiting for the task: (returnval){ [ 762.267231] env[67899]: value = "task-3467868" [ 762.267231] env[67899]: _type = "Task" [ 762.267231] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 762.272668] env[67899]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-cfdd7900-270f-4ba2-88ec-2970ced41301 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 762.277949] env[67899]: DEBUG oslo_vmware.api [None req-f50326ac-209f-4dee-b08b-0e1e2789d90c tempest-ServerDiagnosticsNegativeTest-286286993 tempest-ServerDiagnosticsNegativeTest-286286993-project-member] Task: {'id': task-3467868, 'name': DeleteDatastoreFile_Task} progress is 0%. 
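[editor's note] The DeleteDatastoreFile_Task above is driven by nova.virt.vmwareapi.ds_util.file_delete, which goes through vCenter's FileManager via oslo.vmware. A hedged sketch of that call shape; obtaining `session` and the Datacenter moref `dc_ref` is omitted:

    # `session` is an oslo_vmware.api.VMwareAPISession; `ds_path` is a
    # "[datastore1] ..." path; `dc_ref` a Datacenter managed object ref.
    def file_delete(session, ds_path, dc_ref):
        file_manager = session.vim.service_content.fileManager
        task = session.invoke_api(session.vim,
                                  "DeleteDatastoreFile_Task",
                                  file_manager,
                                  name=str(ds_path),
                                  datacenter=dc_ref)
        session.wait_for_task(task)  # produces the poll lines above
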
{{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 762.304051] env[67899]: DEBUG nova.virt.vmwareapi.images [None req-a1baea38-4329-40f0-8c18-ae18eafc1b0b tempest-ServersAdminNegativeTestJSON-448291398 tempest-ServersAdminNegativeTestJSON-448291398-project-member] [instance: 91d5024f-9eac-4a56-b08f-c0f6a7eda775] Downloading image file data c655a05a-4a40-4b3f-b609-3ba8116ad90f to the data store datastore1 {{(pid=67899) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 762.359329] env[67899]: DEBUG oslo_vmware.rw_handles [None req-a1baea38-4329-40f0-8c18-ae18eafc1b0b tempest-ServersAdminNegativeTestJSON-448291398 tempest-ServersAdminNegativeTestJSON-448291398-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/97d47aa5-cb34-4f15-83e4-b89c33f8fb07/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67899) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 762.418606] env[67899]: DEBUG oslo_vmware.rw_handles [None req-a1baea38-4329-40f0-8c18-ae18eafc1b0b tempest-ServersAdminNegativeTestJSON-448291398 tempest-ServersAdminNegativeTestJSON-448291398-project-member] Completed reading data from the image iterator. {{(pid=67899) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 762.418606] env[67899]: DEBUG oslo_vmware.rw_handles [None req-a1baea38-4329-40f0-8c18-ae18eafc1b0b tempest-ServersAdminNegativeTestJSON-448291398 tempest-ServersAdminNegativeTestJSON-448291398-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/97d47aa5-cb34-4f15-83e4-b89c33f8fb07/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67899) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 762.777393] env[67899]: DEBUG oslo_vmware.api [None req-f50326ac-209f-4dee-b08b-0e1e2789d90c tempest-ServerDiagnosticsNegativeTest-286286993 tempest-ServerDiagnosticsNegativeTest-286286993-project-member] Task: {'id': task-3467868, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.086629} completed successfully. 
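[editor's note] The FileWriteHandle lines above show the actual image transfer: a single HTTP PUT of 21318656 bytes to the datastore's /folder endpoint. A minimal requests-based sketch of the same transfer; in nova the connection is authenticated with a vCenter session ticket/cookie, so `cookies` and the verify flag here are placeholders:

    import requests

    def upload_to_datastore(url, data, cookies=None):
        # `data` may be bytes or a file-like object; requests fills in
        # Content-Length. `url` is the .../folder/...?dcPath=...&dsName=...
        # form seen above.
        resp = requests.put(url, data=data, cookies=cookies,
                            verify=False)  # lab setup; verify certs in prod
        resp.raise_for_status()
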
{{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 762.777650] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-f50326ac-209f-4dee-b08b-0e1e2789d90c tempest-ServerDiagnosticsNegativeTest-286286993 tempest-ServerDiagnosticsNegativeTest-286286993-project-member] Deleted the datastore file {{(pid=67899) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 762.777835] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-f50326ac-209f-4dee-b08b-0e1e2789d90c tempest-ServerDiagnosticsNegativeTest-286286993 tempest-ServerDiagnosticsNegativeTest-286286993-project-member] [instance: 7c092e03-5f1d-4e4a-98ca-8e6abddb3cdb] Deleted contents of the VM from datastore datastore1 {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 762.778009] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-f50326ac-209f-4dee-b08b-0e1e2789d90c tempest-ServerDiagnosticsNegativeTest-286286993 tempest-ServerDiagnosticsNegativeTest-286286993-project-member] [instance: 7c092e03-5f1d-4e4a-98ca-8e6abddb3cdb] Instance destroyed {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 762.778269] env[67899]: INFO nova.compute.manager [None req-f50326ac-209f-4dee-b08b-0e1e2789d90c tempest-ServerDiagnosticsNegativeTest-286286993 tempest-ServerDiagnosticsNegativeTest-286286993-project-member] [instance: 7c092e03-5f1d-4e4a-98ca-8e6abddb3cdb] Took 1.11 seconds to destroy the instance on the hypervisor. [ 762.780516] env[67899]: DEBUG nova.compute.claims [None req-f50326ac-209f-4dee-b08b-0e1e2789d90c tempest-ServerDiagnosticsNegativeTest-286286993 tempest-ServerDiagnosticsNegativeTest-286286993-project-member] [instance: 7c092e03-5f1d-4e4a-98ca-8e6abddb3cdb] Aborting claim: {{(pid=67899) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 762.780689] env[67899]: DEBUG oslo_concurrency.lockutils [None req-f50326ac-209f-4dee-b08b-0e1e2789d90c tempest-ServerDiagnosticsNegativeTest-286286993 tempest-ServerDiagnosticsNegativeTest-286286993-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 762.780902] env[67899]: DEBUG oslo_concurrency.lockutils [None req-f50326ac-209f-4dee-b08b-0e1e2789d90c tempest-ServerDiagnosticsNegativeTest-286286993 tempest-ServerDiagnosticsNegativeTest-286286993-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 763.202159] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-81bf4584-7e62-4736-8153-45c6da2101f7 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 763.210702] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-24a51fcc-1779-4adf-8b19-8c70606319d9 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 763.239779] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-32492e36-0b2d-4095-b0b4-35e146e6bd5b {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 
763.246960] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c56a7790-5e42-4e3a-849b-6b6fb006392c {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 763.259903] env[67899]: DEBUG nova.compute.provider_tree [None req-f50326ac-209f-4dee-b08b-0e1e2789d90c tempest-ServerDiagnosticsNegativeTest-286286993 tempest-ServerDiagnosticsNegativeTest-286286993-project-member] Inventory has not changed in ProviderTree for provider: fffa0b42-f65d-4394-a98c-0df038b9ed4b {{(pid=67899) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 763.268373] env[67899]: DEBUG nova.scheduler.client.report [None req-f50326ac-209f-4dee-b08b-0e1e2789d90c tempest-ServerDiagnosticsNegativeTest-286286993 tempest-ServerDiagnosticsNegativeTest-286286993-project-member] Inventory has not changed for provider fffa0b42-f65d-4394-a98c-0df038b9ed4b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67899) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 763.284251] env[67899]: DEBUG oslo_concurrency.lockutils [None req-f50326ac-209f-4dee-b08b-0e1e2789d90c tempest-ServerDiagnosticsNegativeTest-286286993 tempest-ServerDiagnosticsNegativeTest-286286993-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.503s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 763.286333] env[67899]: ERROR nova.compute.manager [None req-f50326ac-209f-4dee-b08b-0e1e2789d90c tempest-ServerDiagnosticsNegativeTest-286286993 tempest-ServerDiagnosticsNegativeTest-286286993-project-member] [instance: 7c092e03-5f1d-4e4a-98ca-8e6abddb3cdb] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 763.286333] env[67899]: Faults: ['InvalidArgument'] [ 763.286333] env[67899]: ERROR nova.compute.manager [instance: 7c092e03-5f1d-4e4a-98ca-8e6abddb3cdb] Traceback (most recent call last): [ 763.286333] env[67899]: ERROR nova.compute.manager [instance: 7c092e03-5f1d-4e4a-98ca-8e6abddb3cdb] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 763.286333] env[67899]: ERROR nova.compute.manager [instance: 7c092e03-5f1d-4e4a-98ca-8e6abddb3cdb] self.driver.spawn(context, instance, image_meta, [ 763.286333] env[67899]: ERROR nova.compute.manager [instance: 7c092e03-5f1d-4e4a-98ca-8e6abddb3cdb] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 763.286333] env[67899]: ERROR nova.compute.manager [instance: 7c092e03-5f1d-4e4a-98ca-8e6abddb3cdb] self._vmops.spawn(context, instance, image_meta, injected_files, [ 763.286333] env[67899]: ERROR nova.compute.manager [instance: 7c092e03-5f1d-4e4a-98ca-8e6abddb3cdb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 763.286333] env[67899]: ERROR nova.compute.manager [instance: 7c092e03-5f1d-4e4a-98ca-8e6abddb3cdb] self._fetch_image_if_missing(context, vi) [ 763.286333] env[67899]: ERROR nova.compute.manager [instance: 7c092e03-5f1d-4e4a-98ca-8e6abddb3cdb] 
File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 763.286333] env[67899]: ERROR nova.compute.manager [instance: 7c092e03-5f1d-4e4a-98ca-8e6abddb3cdb] image_cache(vi, tmp_image_ds_loc) [ 763.286333] env[67899]: ERROR nova.compute.manager [instance: 7c092e03-5f1d-4e4a-98ca-8e6abddb3cdb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 763.286333] env[67899]: ERROR nova.compute.manager [instance: 7c092e03-5f1d-4e4a-98ca-8e6abddb3cdb] vm_util.copy_virtual_disk( [ 763.286333] env[67899]: ERROR nova.compute.manager [instance: 7c092e03-5f1d-4e4a-98ca-8e6abddb3cdb] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 763.286333] env[67899]: ERROR nova.compute.manager [instance: 7c092e03-5f1d-4e4a-98ca-8e6abddb3cdb] session._wait_for_task(vmdk_copy_task) [ 763.286333] env[67899]: ERROR nova.compute.manager [instance: 7c092e03-5f1d-4e4a-98ca-8e6abddb3cdb] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 763.286333] env[67899]: ERROR nova.compute.manager [instance: 7c092e03-5f1d-4e4a-98ca-8e6abddb3cdb] return self.wait_for_task(task_ref) [ 763.286333] env[67899]: ERROR nova.compute.manager [instance: 7c092e03-5f1d-4e4a-98ca-8e6abddb3cdb] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 763.286333] env[67899]: ERROR nova.compute.manager [instance: 7c092e03-5f1d-4e4a-98ca-8e6abddb3cdb] return evt.wait() [ 763.286333] env[67899]: ERROR nova.compute.manager [instance: 7c092e03-5f1d-4e4a-98ca-8e6abddb3cdb] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 763.286333] env[67899]: ERROR nova.compute.manager [instance: 7c092e03-5f1d-4e4a-98ca-8e6abddb3cdb] result = hub.switch() [ 763.286333] env[67899]: ERROR nova.compute.manager [instance: 7c092e03-5f1d-4e4a-98ca-8e6abddb3cdb] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 763.286333] env[67899]: ERROR nova.compute.manager [instance: 7c092e03-5f1d-4e4a-98ca-8e6abddb3cdb] return self.greenlet.switch() [ 763.286333] env[67899]: ERROR nova.compute.manager [instance: 7c092e03-5f1d-4e4a-98ca-8e6abddb3cdb] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 763.286333] env[67899]: ERROR nova.compute.manager [instance: 7c092e03-5f1d-4e4a-98ca-8e6abddb3cdb] self.f(*self.args, **self.kw) [ 763.286333] env[67899]: ERROR nova.compute.manager [instance: 7c092e03-5f1d-4e4a-98ca-8e6abddb3cdb] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 763.286333] env[67899]: ERROR nova.compute.manager [instance: 7c092e03-5f1d-4e4a-98ca-8e6abddb3cdb] raise exceptions.translate_fault(task_info.error) [ 763.286333] env[67899]: ERROR nova.compute.manager [instance: 7c092e03-5f1d-4e4a-98ca-8e6abddb3cdb] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 763.286333] env[67899]: ERROR nova.compute.manager [instance: 7c092e03-5f1d-4e4a-98ca-8e6abddb3cdb] Faults: ['InvalidArgument'] [ 763.286333] env[67899]: ERROR nova.compute.manager [instance: 7c092e03-5f1d-4e4a-98ca-8e6abddb3cdb] [ 763.286333] env[67899]: DEBUG nova.compute.utils [None req-f50326ac-209f-4dee-b08b-0e1e2789d90c tempest-ServerDiagnosticsNegativeTest-286286993 tempest-ServerDiagnosticsNegativeTest-286286993-project-member] [instance: 
7c092e03-5f1d-4e4a-98ca-8e6abddb3cdb] VimFaultException {{(pid=67899) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 763.287468] env[67899]: DEBUG nova.compute.manager [None req-f50326ac-209f-4dee-b08b-0e1e2789d90c tempest-ServerDiagnosticsNegativeTest-286286993 tempest-ServerDiagnosticsNegativeTest-286286993-project-member] [instance: 7c092e03-5f1d-4e4a-98ca-8e6abddb3cdb] Build of instance 7c092e03-5f1d-4e4a-98ca-8e6abddb3cdb was re-scheduled: A specified parameter was not correct: fileType [ 763.287468] env[67899]: Faults: ['InvalidArgument'] {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 763.287468] env[67899]: DEBUG nova.compute.manager [None req-f50326ac-209f-4dee-b08b-0e1e2789d90c tempest-ServerDiagnosticsNegativeTest-286286993 tempest-ServerDiagnosticsNegativeTest-286286993-project-member] [instance: 7c092e03-5f1d-4e4a-98ca-8e6abddb3cdb] Unplugging VIFs for instance {{(pid=67899) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 763.287468] env[67899]: DEBUG nova.compute.manager [None req-f50326ac-209f-4dee-b08b-0e1e2789d90c tempest-ServerDiagnosticsNegativeTest-286286993 tempest-ServerDiagnosticsNegativeTest-286286993-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=67899) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 763.287657] env[67899]: DEBUG nova.compute.manager [None req-f50326ac-209f-4dee-b08b-0e1e2789d90c tempest-ServerDiagnosticsNegativeTest-286286993 tempest-ServerDiagnosticsNegativeTest-286286993-project-member] [instance: 7c092e03-5f1d-4e4a-98ca-8e6abddb3cdb] Deallocating network for instance {{(pid=67899) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 763.287792] env[67899]: DEBUG nova.network.neutron [None req-f50326ac-209f-4dee-b08b-0e1e2789d90c tempest-ServerDiagnosticsNegativeTest-286286993 tempest-ServerDiagnosticsNegativeTest-286286993-project-member] [instance: 7c092e03-5f1d-4e4a-98ca-8e6abddb3cdb] deallocate_for_instance() {{(pid=67899) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 763.676793] env[67899]: DEBUG nova.network.neutron [None req-f50326ac-209f-4dee-b08b-0e1e2789d90c tempest-ServerDiagnosticsNegativeTest-286286993 tempest-ServerDiagnosticsNegativeTest-286286993-project-member] [instance: 7c092e03-5f1d-4e4a-98ca-8e6abddb3cdb] Updating instance_info_cache with network_info: [] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 763.688642] env[67899]: INFO nova.compute.manager [None req-f50326ac-209f-4dee-b08b-0e1e2789d90c tempest-ServerDiagnosticsNegativeTest-286286993 tempest-ServerDiagnosticsNegativeTest-286286993-project-member] [instance: 7c092e03-5f1d-4e4a-98ca-8e6abddb3cdb] Took 0.40 seconds to deallocate network for instance. 
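[editor's note] The spawn failure above surfaces as oslo_vmware.exceptions.VimFaultException, whose fault_list carries the raw vCenter fault names ('InvalidArgument' here); the earlier "Fault InvalidArgument not matched" DEBUG line records the library failing to map that name to a more specific exception class, leaving callers to match on the list themselves. A sketch, where `do_copy` stands in for the CopyVirtualDisk_Task invocation:

    from oslo_vmware import exceptions as vexc

    def copy_with_diagnostics(do_copy):
        try:
            do_copy()
        except vexc.VimFaultException as e:
            if "InvalidArgument" in (e.fault_list or []):
                # matches "A specified parameter was not correct: fileType"
                print("invalid CopyVirtualDisk spec:", e)
            raise
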
[ 763.797389] env[67899]: INFO nova.scheduler.client.report [None req-f50326ac-209f-4dee-b08b-0e1e2789d90c tempest-ServerDiagnosticsNegativeTest-286286993 tempest-ServerDiagnosticsNegativeTest-286286993-project-member] Deleted allocations for instance 7c092e03-5f1d-4e4a-98ca-8e6abddb3cdb [ 763.818774] env[67899]: DEBUG oslo_concurrency.lockutils [None req-f50326ac-209f-4dee-b08b-0e1e2789d90c tempest-ServerDiagnosticsNegativeTest-286286993 tempest-ServerDiagnosticsNegativeTest-286286993-project-member] Lock "7c092e03-5f1d-4e4a-98ca-8e6abddb3cdb" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 161.879s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 763.833695] env[67899]: DEBUG nova.compute.manager [None req-e5513ee2-6d9e-4ef6-b3a5-69d04367ec0c tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: 4458efe7-18d4-4cfb-b131-e09d36124d68] Starting instance... {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 763.889849] env[67899]: DEBUG oslo_concurrency.lockutils [None req-e5513ee2-6d9e-4ef6-b3a5-69d04367ec0c tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 763.890126] env[67899]: DEBUG oslo_concurrency.lockutils [None req-e5513ee2-6d9e-4ef6-b3a5-69d04367ec0c tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 763.891788] env[67899]: INFO nova.compute.claims [None req-e5513ee2-6d9e-4ef6-b3a5-69d04367ec0c tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: 4458efe7-18d4-4cfb-b131-e09d36124d68] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 764.323743] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-56e4a54f-b8d5-4260-bc1f-90b5e7a8e2e8 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 764.331426] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f53739d4-669e-4356-a9bd-a713c2bdb55d {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 764.360578] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f32be07b-2b24-4b71-a43a-892b3eea270d {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 764.367850] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-487010b9-ba51-4fb2-86e9-b9aeb581e549 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 764.380735] env[67899]: DEBUG nova.compute.provider_tree [None req-e5513ee2-6d9e-4ef6-b3a5-69d04367ec0c tempest-DeleteServersTestJSON-151727962 
tempest-DeleteServersTestJSON-151727962-project-member] Inventory has not changed in ProviderTree for provider: fffa0b42-f65d-4394-a98c-0df038b9ed4b {{(pid=67899) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 764.389269] env[67899]: DEBUG nova.scheduler.client.report [None req-e5513ee2-6d9e-4ef6-b3a5-69d04367ec0c tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Inventory has not changed for provider fffa0b42-f65d-4394-a98c-0df038b9ed4b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67899) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 764.405389] env[67899]: DEBUG oslo_concurrency.lockutils [None req-e5513ee2-6d9e-4ef6-b3a5-69d04367ec0c tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.515s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 764.405783] env[67899]: DEBUG nova.compute.manager [None req-e5513ee2-6d9e-4ef6-b3a5-69d04367ec0c tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: 4458efe7-18d4-4cfb-b131-e09d36124d68] Start building networks asynchronously for instance. {{(pid=67899) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 764.438113] env[67899]: DEBUG nova.compute.utils [None req-e5513ee2-6d9e-4ef6-b3a5-69d04367ec0c tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Using /dev/sd instead of None {{(pid=67899) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 764.440167] env[67899]: DEBUG nova.compute.manager [None req-e5513ee2-6d9e-4ef6-b3a5-69d04367ec0c tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: 4458efe7-18d4-4cfb-b131-e09d36124d68] Allocating IP information in the background. {{(pid=67899) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 764.440167] env[67899]: DEBUG nova.network.neutron [None req-e5513ee2-6d9e-4ef6-b3a5-69d04367ec0c tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: 4458efe7-18d4-4cfb-b131-e09d36124d68] allocate_for_instance() {{(pid=67899) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 764.449955] env[67899]: DEBUG nova.compute.manager [None req-e5513ee2-6d9e-4ef6-b3a5-69d04367ec0c tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: 4458efe7-18d4-4cfb-b131-e09d36124d68] Start building block device mappings for instance. 
{{(pid=67899) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 764.514490] env[67899]: DEBUG nova.policy [None req-e5513ee2-6d9e-4ef6-b3a5-69d04367ec0c tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a5206226ca404a07b10db199a6436504', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'bdf895619b34412fb20488318e170d23', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67899) authorize /opt/stack/nova/nova/policy.py:203}} [ 764.517577] env[67899]: DEBUG nova.compute.manager [None req-e5513ee2-6d9e-4ef6-b3a5-69d04367ec0c tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: 4458efe7-18d4-4cfb-b131-e09d36124d68] Start spawning the instance on the hypervisor. {{(pid=67899) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 764.544901] env[67899]: DEBUG nova.virt.hardware [None req-e5513ee2-6d9e-4ef6-b3a5-69d04367ec0c tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T22:07:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T22:07:13Z,direct_url=,disk_format='vmdk',id=c655a05a-4a40-4b3f-b609-3ba8116ad90f,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c1aaa2970e964d7b86557399120d12c1',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T22:07:14Z,virtual_size=,visibility=), allow threads: False {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 764.545159] env[67899]: DEBUG nova.virt.hardware [None req-e5513ee2-6d9e-4ef6-b3a5-69d04367ec0c tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Flavor limits 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 764.545312] env[67899]: DEBUG nova.virt.hardware [None req-e5513ee2-6d9e-4ef6-b3a5-69d04367ec0c tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Image limits 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 764.545501] env[67899]: DEBUG nova.virt.hardware [None req-e5513ee2-6d9e-4ef6-b3a5-69d04367ec0c tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Flavor pref 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 764.545647] env[67899]: DEBUG nova.virt.hardware [None req-e5513ee2-6d9e-4ef6-b3a5-69d04367ec0c tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Image pref 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 764.545910] env[67899]: DEBUG nova.virt.hardware [None 
req-e5513ee2-6d9e-4ef6-b3a5-69d04367ec0c tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 764.546102] env[67899]: DEBUG nova.virt.hardware [None req-e5513ee2-6d9e-4ef6-b3a5-69d04367ec0c tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 764.546279] env[67899]: DEBUG nova.virt.hardware [None req-e5513ee2-6d9e-4ef6-b3a5-69d04367ec0c tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67899) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 764.546462] env[67899]: DEBUG nova.virt.hardware [None req-e5513ee2-6d9e-4ef6-b3a5-69d04367ec0c tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Got 1 possible topologies {{(pid=67899) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 764.546620] env[67899]: DEBUG nova.virt.hardware [None req-e5513ee2-6d9e-4ef6-b3a5-69d04367ec0c tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 764.546787] env[67899]: DEBUG nova.virt.hardware [None req-e5513ee2-6d9e-4ef6-b3a5-69d04367ec0c tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 764.547921] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-887b0208-5b56-48cd-a615-a46ee9bfc6e5 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 764.556460] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2423a2f2-33b7-4074-b395-3621474ae55e {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 764.943743] env[67899]: DEBUG nova.network.neutron [None req-e5513ee2-6d9e-4ef6-b3a5-69d04367ec0c tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: 4458efe7-18d4-4cfb-b131-e09d36124d68] Successfully created port: 5a982426-e61b-4c12-b3e9-09765f167e97 {{(pid=67899) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 766.010312] env[67899]: DEBUG nova.network.neutron [None req-e5513ee2-6d9e-4ef6-b3a5-69d04367ec0c tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: 4458efe7-18d4-4cfb-b131-e09d36124d68] Successfully updated port: 5a982426-e61b-4c12-b3e9-09765f167e97 {{(pid=67899) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 766.044585] env[67899]: DEBUG oslo_concurrency.lockutils [None req-e5513ee2-6d9e-4ef6-b3a5-69d04367ec0c tempest-DeleteServersTestJSON-151727962 
tempest-DeleteServersTestJSON-151727962-project-member] Acquiring lock "refresh_cache-4458efe7-18d4-4cfb-b131-e09d36124d68" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 766.044744] env[67899]: DEBUG oslo_concurrency.lockutils [None req-e5513ee2-6d9e-4ef6-b3a5-69d04367ec0c tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Acquired lock "refresh_cache-4458efe7-18d4-4cfb-b131-e09d36124d68" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 766.044898] env[67899]: DEBUG nova.network.neutron [None req-e5513ee2-6d9e-4ef6-b3a5-69d04367ec0c tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: 4458efe7-18d4-4cfb-b131-e09d36124d68] Building network info cache for instance {{(pid=67899) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 766.100380] env[67899]: DEBUG nova.network.neutron [None req-e5513ee2-6d9e-4ef6-b3a5-69d04367ec0c tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: 4458efe7-18d4-4cfb-b131-e09d36124d68] Instance cache missing network info. {{(pid=67899) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 766.249736] env[67899]: DEBUG nova.compute.manager [req-f031d487-8b6b-473a-a94b-c10d371702f0 req-7f3fc851-5053-4183-a715-79f81072194c service nova] [instance: 4458efe7-18d4-4cfb-b131-e09d36124d68] Received event network-vif-plugged-5a982426-e61b-4c12-b3e9-09765f167e97 {{(pid=67899) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 766.250029] env[67899]: DEBUG oslo_concurrency.lockutils [req-f031d487-8b6b-473a-a94b-c10d371702f0 req-7f3fc851-5053-4183-a715-79f81072194c service nova] Acquiring lock "4458efe7-18d4-4cfb-b131-e09d36124d68-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 766.250638] env[67899]: DEBUG oslo_concurrency.lockutils [req-f031d487-8b6b-473a-a94b-c10d371702f0 req-7f3fc851-5053-4183-a715-79f81072194c service nova] Lock "4458efe7-18d4-4cfb-b131-e09d36124d68-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 766.250773] env[67899]: DEBUG oslo_concurrency.lockutils [req-f031d487-8b6b-473a-a94b-c10d371702f0 req-7f3fc851-5053-4183-a715-79f81072194c service nova] Lock "4458efe7-18d4-4cfb-b131-e09d36124d68-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 766.250942] env[67899]: DEBUG nova.compute.manager [req-f031d487-8b6b-473a-a94b-c10d371702f0 req-7f3fc851-5053-4183-a715-79f81072194c service nova] [instance: 4458efe7-18d4-4cfb-b131-e09d36124d68] No waiting events found dispatching network-vif-plugged-5a982426-e61b-4c12-b3e9-09765f167e97 {{(pid=67899) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 766.251102] env[67899]: WARNING nova.compute.manager [req-f031d487-8b6b-473a-a94b-c10d371702f0 req-7f3fc851-5053-4183-a715-79f81072194c service nova] [instance: 4458efe7-18d4-4cfb-b131-e09d36124d68] Received unexpected event
network-vif-plugged-5a982426-e61b-4c12-b3e9-09765f167e97 for instance with vm_state building and task_state spawning. [ 766.344264] env[67899]: DEBUG nova.network.neutron [None req-e5513ee2-6d9e-4ef6-b3a5-69d04367ec0c tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: 4458efe7-18d4-4cfb-b131-e09d36124d68] Updating instance_info_cache with network_info: [{"id": "5a982426-e61b-4c12-b3e9-09765f167e97", "address": "fa:16:3e:de:8c:89", "network": {"id": "8255316b-777c-4721-b67b-7f6fdf4bd351", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-2146531806-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "bdf895619b34412fb20488318e170d23", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "357d2811-e990-4985-9f9e-b158d10d3699", "external-id": "nsx-vlan-transportzone-641", "segmentation_id": 641, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap5a982426-e6", "ovs_interfaceid": "5a982426-e61b-4c12-b3e9-09765f167e97", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 766.360765] env[67899]: DEBUG oslo_concurrency.lockutils [None req-e5513ee2-6d9e-4ef6-b3a5-69d04367ec0c tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Releasing lock "refresh_cache-4458efe7-18d4-4cfb-b131-e09d36124d68" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 766.361087] env[67899]: DEBUG nova.compute.manager [None req-e5513ee2-6d9e-4ef6-b3a5-69d04367ec0c tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: 4458efe7-18d4-4cfb-b131-e09d36124d68] Instance network_info: |[{"id": "5a982426-e61b-4c12-b3e9-09765f167e97", "address": "fa:16:3e:de:8c:89", "network": {"id": "8255316b-777c-4721-b67b-7f6fdf4bd351", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-2146531806-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "bdf895619b34412fb20488318e170d23", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "357d2811-e990-4985-9f9e-b158d10d3699", "external-id": "nsx-vlan-transportzone-641", "segmentation_id": 641, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap5a982426-e6", "ovs_interfaceid": "5a982426-e61b-4c12-b3e9-09765f167e97", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67899) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1971}} [ 766.361491] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-e5513ee2-6d9e-4ef6-b3a5-69d04367ec0c tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: 4458efe7-18d4-4cfb-b131-e09d36124d68] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:de:8c:89', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '357d2811-e990-4985-9f9e-b158d10d3699', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '5a982426-e61b-4c12-b3e9-09765f167e97', 'vif_model': 'vmxnet3'}] {{(pid=67899) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 766.370048] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [None req-e5513ee2-6d9e-4ef6-b3a5-69d04367ec0c tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Creating folder: Project (bdf895619b34412fb20488318e170d23). Parent ref: group-v692900. {{(pid=67899) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 766.371008] env[67899]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-d790b2df-b0f7-401a-973f-57a256e50bba {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 766.383767] env[67899]: INFO nova.virt.vmwareapi.vm_util [None req-e5513ee2-6d9e-4ef6-b3a5-69d04367ec0c tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Created folder: Project (bdf895619b34412fb20488318e170d23) in parent group-v692900. [ 766.384038] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [None req-e5513ee2-6d9e-4ef6-b3a5-69d04367ec0c tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Creating folder: Instances. Parent ref: group-v692941. {{(pid=67899) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 766.384296] env[67899]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-ebf2fbce-fdc5-478f-86d2-6aaa7cdde039 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 766.395469] env[67899]: INFO nova.virt.vmwareapi.vm_util [None req-e5513ee2-6d9e-4ef6-b3a5-69d04367ec0c tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Created folder: Instances in parent group-v692941. [ 766.395710] env[67899]: DEBUG oslo.service.loopingcall [None req-e5513ee2-6d9e-4ef6-b3a5-69d04367ec0c tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67899) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 766.395908] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 4458efe7-18d4-4cfb-b131-e09d36124d68] Creating VM on the ESX host {{(pid=67899) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 766.396129] env[67899]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-ad9fab27-015c-48cd-bad1-ccd21c965d12 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 766.418388] env[67899]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 766.418388] env[67899]: value = "task-3467871" [ 766.418388] env[67899]: _type = "Task" [ 766.418388] env[67899]: } to complete. 
{{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 766.426299] env[67899]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467871, 'name': CreateVM_Task} progress is 0%. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 766.930812] env[67899]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467871, 'name': CreateVM_Task, 'duration_secs': 0.312728} completed successfully. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 766.930985] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 4458efe7-18d4-4cfb-b131-e09d36124d68] Created VM on the ESX host {{(pid=67899) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 766.931673] env[67899]: DEBUG oslo_concurrency.lockutils [None req-e5513ee2-6d9e-4ef6-b3a5-69d04367ec0c tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 766.931846] env[67899]: DEBUG oslo_concurrency.lockutils [None req-e5513ee2-6d9e-4ef6-b3a5-69d04367ec0c tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 766.932169] env[67899]: DEBUG oslo_concurrency.lockutils [None req-e5513ee2-6d9e-4ef6-b3a5-69d04367ec0c tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 766.932613] env[67899]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-8c60fb40-62c6-4ab9-a29e-93c781dc9a73 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 766.937202] env[67899]: DEBUG oslo_vmware.api [None req-e5513ee2-6d9e-4ef6-b3a5-69d04367ec0c tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Waiting for the task: (returnval){ [ 766.937202] env[67899]: value = "session[5298859c-589e-10f3-4f70-bf47a7ea371b]52abb147-ede0-513c-3d1f-a4b5b49939cf" [ 766.937202] env[67899]: _type = "Task" [ 766.937202] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 766.950824] env[67899]: DEBUG oslo_vmware.api [None req-e5513ee2-6d9e-4ef6-b3a5-69d04367ec0c tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Task: {'id': session[5298859c-589e-10f3-4f70-bf47a7ea371b]52abb147-ede0-513c-3d1f-a4b5b49939cf, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 767.448165] env[67899]: DEBUG oslo_concurrency.lockutils [None req-e5513ee2-6d9e-4ef6-b3a5-69d04367ec0c tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 767.448455] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-e5513ee2-6d9e-4ef6-b3a5-69d04367ec0c tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: 4458efe7-18d4-4cfb-b131-e09d36124d68] Processing image c655a05a-4a40-4b3f-b609-3ba8116ad90f {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 767.448651] env[67899]: DEBUG oslo_concurrency.lockutils [None req-e5513ee2-6d9e-4ef6-b3a5-69d04367ec0c tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 768.454770] env[67899]: DEBUG nova.compute.manager [req-c4850137-f721-40a8-bd0e-08b52fd02ea7 req-1be4b94d-b490-4d4c-8298-f2240ae2f3bb service nova] [instance: 4458efe7-18d4-4cfb-b131-e09d36124d68] Received event network-changed-5a982426-e61b-4c12-b3e9-09765f167e97 {{(pid=67899) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 768.455031] env[67899]: DEBUG nova.compute.manager [req-c4850137-f721-40a8-bd0e-08b52fd02ea7 req-1be4b94d-b490-4d4c-8298-f2240ae2f3bb service nova] [instance: 4458efe7-18d4-4cfb-b131-e09d36124d68] Refreshing instance network info cache due to event network-changed-5a982426-e61b-4c12-b3e9-09765f167e97. {{(pid=67899) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 768.455182] env[67899]: DEBUG oslo_concurrency.lockutils [req-c4850137-f721-40a8-bd0e-08b52fd02ea7 req-1be4b94d-b490-4d4c-8298-f2240ae2f3bb service nova] Acquiring lock "refresh_cache-4458efe7-18d4-4cfb-b131-e09d36124d68" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 768.455322] env[67899]: DEBUG oslo_concurrency.lockutils [req-c4850137-f721-40a8-bd0e-08b52fd02ea7 req-1be4b94d-b490-4d4c-8298-f2240ae2f3bb service nova] Acquired lock "refresh_cache-4458efe7-18d4-4cfb-b131-e09d36124d68" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 768.455484] env[67899]: DEBUG nova.network.neutron [req-c4850137-f721-40a8-bd0e-08b52fd02ea7 req-1be4b94d-b490-4d4c-8298-f2240ae2f3bb service nova] [instance: 4458efe7-18d4-4cfb-b131-e09d36124d68] Refreshing network info cache for port 5a982426-e61b-4c12-b3e9-09765f167e97 {{(pid=67899) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 768.843064] env[67899]: DEBUG nova.network.neutron [req-c4850137-f721-40a8-bd0e-08b52fd02ea7 req-1be4b94d-b490-4d4c-8298-f2240ae2f3bb service nova] [instance: 4458efe7-18d4-4cfb-b131-e09d36124d68] Updated VIF entry in instance network info cache for port 5a982426-e61b-4c12-b3e9-09765f167e97. 
{{(pid=67899) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 768.843813] env[67899]: DEBUG nova.network.neutron [req-c4850137-f721-40a8-bd0e-08b52fd02ea7 req-1be4b94d-b490-4d4c-8298-f2240ae2f3bb service nova] [instance: 4458efe7-18d4-4cfb-b131-e09d36124d68] Updating instance_info_cache with network_info: [{"id": "5a982426-e61b-4c12-b3e9-09765f167e97", "address": "fa:16:3e:de:8c:89", "network": {"id": "8255316b-777c-4721-b67b-7f6fdf4bd351", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-2146531806-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "bdf895619b34412fb20488318e170d23", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "357d2811-e990-4985-9f9e-b158d10d3699", "external-id": "nsx-vlan-transportzone-641", "segmentation_id": 641, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap5a982426-e6", "ovs_interfaceid": "5a982426-e61b-4c12-b3e9-09765f167e97", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 768.869639] env[67899]: DEBUG oslo_concurrency.lockutils [req-c4850137-f721-40a8-bd0e-08b52fd02ea7 req-1be4b94d-b490-4d4c-8298-f2240ae2f3bb service nova] Releasing lock "refresh_cache-4458efe7-18d4-4cfb-b131-e09d36124d68" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 769.937232] env[67899]: DEBUG oslo_concurrency.lockutils [None req-9f2dbd61-5a36-44ea-b13c-793c018e3d97 tempest-ServerMetadataNegativeTestJSON-1266469450 tempest-ServerMetadataNegativeTestJSON-1266469450-project-member] Acquiring lock "37ab08db-50ab-4c30-9e18-05007c5d1c27" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 769.937535] env[67899]: DEBUG oslo_concurrency.lockutils [None req-9f2dbd61-5a36-44ea-b13c-793c018e3d97 tempest-ServerMetadataNegativeTestJSON-1266469450 tempest-ServerMetadataNegativeTestJSON-1266469450-project-member] Lock "37ab08db-50ab-4c30-9e18-05007c5d1c27" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 791.995623] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 792.019368] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager.update_available_resource {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 792.031815] env[67899]: DEBUG
oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 792.032224] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 792.032571] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 792.033514] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67899) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 792.034207] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c76c6333-5467-4f96-a767-6d195022389d {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 792.042741] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-11ecbac9-5149-4a73-bd08-819578999fdd {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 792.056353] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bb8f9004-7c86-4570-8d35-7e089c4e3311 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 792.062675] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b8cc80a4-82ec-4744-8c9d-7b4c925bb964 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 792.093253] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180946MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=67899) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 792.093403] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 792.093594] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 792.165323] env[67899]: DEBUG nova.compute.resource_tracker [None 
req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 91d5024f-9eac-4a56-b08f-c0f6a7eda775 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 792.165504] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 195a4a1e-3da7-4a69-a679-869346368195 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 792.165650] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 7a19bcfd-5544-4688-8edb-e12c567979ae actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 792.165781] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 84cbacaa-08d2-4297-8777-150f433e4c04 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 792.165902] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance c29ae4c5-cc93-480c-8d60-96f6acba4346 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 792.166033] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 792.166154] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 913c5652-c8af-41a8-94f1-c0eba08aacdd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 792.166271] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 793d6f98-ed1b-4a78-bcd5-cb796441d64b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 792.166385] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 792.166521] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 4458efe7-18d4-4cfb-b131-e09d36124d68 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 792.177291] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 792.188464] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 55dfe829-2e96-40d7-bef8-8e7556cbdab3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 792.198515] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance f935bfef-3ca7-41fc-89be-c5c4e070a401 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 792.208974] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 66d3ec66-244d-4ffa-bd6f-7067f8955e67 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 792.218425] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance bbed830d-aa53-4ea4-93f8-d4b198a333cd has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 792.228785] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 786676eb-ac36-48f6-874a-ab1ca15f2a9a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 792.243048] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 4cd5b80b-d1f4-4142-83fb-235523464667 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 792.252434] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 1589188b-8540-4afd-8050-ab47633593c0 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 792.262272] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance de1d5572-b82d-4edc-9c7e-a7e26c45a090 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 792.271926] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 8d2a9e20-82d3-44cf-a725-59804debe1cc has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 792.282312] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance e42425fa-6c50-4e76-842b-0bfcccb011c0 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 792.293466] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 96d79732-9076-4715-aa1e-60001ffb17fb has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 792.303535] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance f7888060-430b-4b16-b9ca-059020615dee has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 792.313360] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 1505bcf5-f622-40ee-93c2-8dabf1dce8cb has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 792.324087] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 4c281caa-f99d-40d5-b004-13e7856a29f5 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 792.335365] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance aa6229be-c18c-4cf9-99a1-ca546b30d797 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 792.346947] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 862297c3-0b85-43eb-b364-303bb0c0b077 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 792.356527] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 641b8e97-b9e6-4ef0-a819-42d3a29429de has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 792.366294] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance d0ceaa4e-9c87-48de-bcc2-8bb537827c0a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 792.378335] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 9842d097-f4f2-4f60-aea0-08896a47ff53 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 792.388944] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance bb97988e-9f7f-4e4f-9904-fc560d0912ee has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 792.399241] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 37ab08db-50ab-4c30-9e18-05007c5d1c27 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 792.399527] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67899) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 792.399701] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67899) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 792.763057] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9cbc3027-38b1-4541-a5d5-2ece728c9165 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 792.770716] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a3818886-b634-45cd-8428-0c697e6532d7 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 792.800584] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eb6d84d4-1066-4ebe-b260-2cd7b994fd1a {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 792.807396] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-140aedba-f4ec-4f1b-9f17-08b0e1b02106 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 792.820358] env[67899]: DEBUG nova.compute.provider_tree [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Inventory has not changed in ProviderTree for provider: fffa0b42-f65d-4394-a98c-0df038b9ed4b {{(pid=67899) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 792.829645] env[67899]: DEBUG nova.scheduler.client.report [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Inventory has not changed for provider fffa0b42-f65d-4394-a98c-0df038b9ed4b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 
'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67899) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 792.843841] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67899) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 792.844044] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.750s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 794.822620] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 794.822881] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Starting heal instance info cache {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 794.823522] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Rebuilding the list of instances to heal {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 794.844029] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 91d5024f-9eac-4a56-b08f-c0f6a7eda775] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 794.844029] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 195a4a1e-3da7-4a69-a679-869346368195] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 794.844215] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 7a19bcfd-5544-4688-8edb-e12c567979ae] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 794.844292] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 84cbacaa-08d2-4297-8777-150f433e4c04] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 794.844404] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 794.844526] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda] Skipping network cache update for instance because it is Building. 
{{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 794.844639] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 913c5652-c8af-41a8-94f1-c0eba08aacdd] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 794.844767] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 793d6f98-ed1b-4a78-bcd5-cb796441d64b] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 794.844872] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 794.844990] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 4458efe7-18d4-4cfb-b131-e09d36124d68] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 794.845122] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Didn't find any instances for network info cache update. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 794.996265] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 794.996399] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 794.996563] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 795.992150] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 795.995750] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 795.995936] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 795.996996] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] CONF.reclaim_instance_interval <= 0, skipping... 
[ 796.996573] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 799.354605] env[67899]: DEBUG oslo_concurrency.lockutils [None req-1ad1fbbd-a859-4399-8171-00eb39c3bced tempest-ServersAdminNegativeTestJSON-448291398 tempest-ServersAdminNegativeTestJSON-448291398-project-member] Acquiring lock "91d5024f-9eac-4a56-b08f-c0f6a7eda775" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 802.308168] env[67899]: DEBUG oslo_concurrency.lockutils [None req-ed903cc6-3f3e-4b61-9c3e-0ca7ed507b17 tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Acquiring lock "195a4a1e-3da7-4a69-a679-869346368195" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 808.618900] env[67899]: DEBUG oslo_concurrency.lockutils [None req-9db54cf0-1508-430f-b1cd-e962a2f376c7 tempest-AttachInterfacesUnderV243Test-1421382693 tempest-AttachInterfacesUnderV243Test-1421382693-project-member] Acquiring lock "7a19bcfd-5544-4688-8edb-e12c567979ae" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 808.940947] env[67899]: WARNING oslo_vmware.rw_handles [None req-a1baea38-4329-40f0-8c18-ae18eafc1b0b tempest-ServersAdminNegativeTestJSON-448291398 tempest-ServersAdminNegativeTestJSON-448291398-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response
[ 808.940947] env[67899]: ERROR oslo_vmware.rw_handles Traceback (most recent call last):
[ 808.940947] env[67899]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close
[ 808.940947] env[67899]: ERROR oslo_vmware.rw_handles self._conn.getresponse()
[ 808.940947] env[67899]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse
[ 808.940947] env[67899]: ERROR oslo_vmware.rw_handles response.begin()
[ 808.940947] env[67899]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin
[ 808.940947] env[67899]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status()
[ 808.940947] env[67899]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status
[ 808.940947] env[67899]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without"
[ 808.940947] env[67899]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response
[ 808.940947] env[67899]: ERROR oslo_vmware.rw_handles
[ 808.941388] env[67899]: DEBUG nova.virt.vmwareapi.images [None req-a1baea38-4329-40f0-8c18-ae18eafc1b0b tempest-ServersAdminNegativeTestJSON-448291398 tempest-ServersAdminNegativeTestJSON-448291398-project-member] [instance: 91d5024f-9eac-4a56-b08f-c0f6a7eda775] Downloaded image file data c655a05a-4a40-4b3f-b609-3ba8116ad90f to vmware_temp/97d47aa5-cb34-4f15-83e4-b89c33f8fb07/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk on the data store datastore1 {{(pid=67899) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}}
[ 808.942987] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-a1baea38-4329-40f0-8c18-ae18eafc1b0b tempest-ServersAdminNegativeTestJSON-448291398 tempest-ServersAdminNegativeTestJSON-448291398-project-member] [instance: 91d5024f-9eac-4a56-b08f-c0f6a7eda775] Caching image {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}}
[ 808.943258] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [None req-a1baea38-4329-40f0-8c18-ae18eafc1b0b tempest-ServersAdminNegativeTestJSON-448291398 tempest-ServersAdminNegativeTestJSON-448291398-project-member] Copying Virtual Disk [datastore1] vmware_temp/97d47aa5-cb34-4f15-83e4-b89c33f8fb07/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk to [datastore1] vmware_temp/97d47aa5-cb34-4f15-83e4-b89c33f8fb07/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk {{(pid=67899) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}}
[ 808.943543] env[67899]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-185f886d-84ea-4897-bf45-9c7cb49e8fa3 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 808.952145] env[67899]: DEBUG oslo_vmware.api [None req-a1baea38-4329-40f0-8c18-ae18eafc1b0b tempest-ServersAdminNegativeTestJSON-448291398 tempest-ServersAdminNegativeTestJSON-448291398-project-member] Waiting for the task: (returnval){
[ 808.952145] env[67899]: value = "task-3467872"
[ 808.952145] env[67899]: _type = "Task"
[ 808.952145] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 808.960843] env[67899]: DEBUG oslo_vmware.api [None req-a1baea38-4329-40f0-8c18-ae18eafc1b0b tempest-ServersAdminNegativeTestJSON-448291398 tempest-ServersAdminNegativeTestJSON-448291398-project-member] Task: {'id': task-3467872, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 809.462613] env[67899]: DEBUG oslo_vmware.exceptions [None req-a1baea38-4329-40f0-8c18-ae18eafc1b0b tempest-ServersAdminNegativeTestJSON-448291398 tempest-ServersAdminNegativeTestJSON-448291398-project-member] Fault InvalidArgument not matched. {{(pid=67899) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}}
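The CopyVirtualDisk_Task invocation and the subsequent "progress is 0%" polling above follow oslo.vmware's invoke-then-wait pattern. A rough sketch of that pattern (not Nova's actual vm_util code; copy_vmdk, dc_ref and the argument shapes are simplifications, with argument names following the vSphere API):

    def copy_vmdk(session, dc_ref, src_path, dst_path):
        # session is an oslo_vmware.api.VMwareAPISession. invoke_api issues
        # the SOAP call; wait_for_task polls the returned task, emitting the
        # "Task: {...} progress is N%." DEBUG lines seen above.
        disk_mgr = session.vim.service_content.virtualDiskManager
        task = session.invoke_api(
            session.vim, 'CopyVirtualDisk_Task', disk_mgr,
            sourceName=src_path, sourceDatacenter=dc_ref,
            destName=dst_path, destDatacenter=dc_ref)
        return session.wait_for_task(task)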
[ 809.462915] env[67899]: DEBUG oslo_concurrency.lockutils [None req-a1baea38-4329-40f0-8c18-ae18eafc1b0b tempest-ServersAdminNegativeTestJSON-448291398 tempest-ServersAdminNegativeTestJSON-448291398-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 809.463494] env[67899]: ERROR nova.compute.manager [None req-a1baea38-4329-40f0-8c18-ae18eafc1b0b tempest-ServersAdminNegativeTestJSON-448291398 tempest-ServersAdminNegativeTestJSON-448291398-project-member] [instance: 91d5024f-9eac-4a56-b08f-c0f6a7eda775] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 809.463494] env[67899]: Faults: ['InvalidArgument']
[ 809.463494] env[67899]: ERROR nova.compute.manager [instance: 91d5024f-9eac-4a56-b08f-c0f6a7eda775] Traceback (most recent call last):
[ 809.463494] env[67899]: ERROR nova.compute.manager [instance: 91d5024f-9eac-4a56-b08f-c0f6a7eda775] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources
[ 809.463494] env[67899]: ERROR nova.compute.manager [instance: 91d5024f-9eac-4a56-b08f-c0f6a7eda775] yield resources
[ 809.463494] env[67899]: ERROR nova.compute.manager [instance: 91d5024f-9eac-4a56-b08f-c0f6a7eda775] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 809.463494] env[67899]: ERROR nova.compute.manager [instance: 91d5024f-9eac-4a56-b08f-c0f6a7eda775] self.driver.spawn(context, instance, image_meta,
[ 809.463494] env[67899]: ERROR nova.compute.manager [instance: 91d5024f-9eac-4a56-b08f-c0f6a7eda775] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 809.463494] env[67899]: ERROR nova.compute.manager [instance: 91d5024f-9eac-4a56-b08f-c0f6a7eda775] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 809.463494] env[67899]: ERROR nova.compute.manager [instance: 91d5024f-9eac-4a56-b08f-c0f6a7eda775] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 809.463494] env[67899]: ERROR nova.compute.manager [instance: 91d5024f-9eac-4a56-b08f-c0f6a7eda775] self._fetch_image_if_missing(context, vi)
[ 809.463494] env[67899]: ERROR nova.compute.manager [instance: 91d5024f-9eac-4a56-b08f-c0f6a7eda775] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 809.463494] env[67899]: ERROR nova.compute.manager [instance: 91d5024f-9eac-4a56-b08f-c0f6a7eda775] image_cache(vi, tmp_image_ds_loc)
[ 809.463494] env[67899]: ERROR nova.compute.manager [instance: 91d5024f-9eac-4a56-b08f-c0f6a7eda775] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 809.463494] env[67899]: ERROR nova.compute.manager [instance: 91d5024f-9eac-4a56-b08f-c0f6a7eda775] vm_util.copy_virtual_disk(
[ 809.463494] env[67899]: ERROR nova.compute.manager [instance: 91d5024f-9eac-4a56-b08f-c0f6a7eda775] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 809.463494] env[67899]: ERROR nova.compute.manager [instance: 91d5024f-9eac-4a56-b08f-c0f6a7eda775] session._wait_for_task(vmdk_copy_task)
[ 809.463494] env[67899]: ERROR nova.compute.manager [instance: 91d5024f-9eac-4a56-b08f-c0f6a7eda775] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 809.463494] env[67899]: ERROR nova.compute.manager [instance: 91d5024f-9eac-4a56-b08f-c0f6a7eda775] return self.wait_for_task(task_ref)
[ 809.463494] env[67899]: ERROR nova.compute.manager [instance: 91d5024f-9eac-4a56-b08f-c0f6a7eda775] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 809.463494] env[67899]: ERROR nova.compute.manager [instance: 91d5024f-9eac-4a56-b08f-c0f6a7eda775] return evt.wait()
[ 809.463494] env[67899]: ERROR nova.compute.manager [instance: 91d5024f-9eac-4a56-b08f-c0f6a7eda775] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 809.463494] env[67899]: ERROR nova.compute.manager [instance: 91d5024f-9eac-4a56-b08f-c0f6a7eda775] result = hub.switch()
[ 809.463494] env[67899]: ERROR nova.compute.manager [instance: 91d5024f-9eac-4a56-b08f-c0f6a7eda775] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 809.463494] env[67899]: ERROR nova.compute.manager [instance: 91d5024f-9eac-4a56-b08f-c0f6a7eda775] return self.greenlet.switch()
[ 809.463494] env[67899]: ERROR nova.compute.manager [instance: 91d5024f-9eac-4a56-b08f-c0f6a7eda775] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 809.463494] env[67899]: ERROR nova.compute.manager [instance: 91d5024f-9eac-4a56-b08f-c0f6a7eda775] self.f(*self.args, **self.kw)
[ 809.463494] env[67899]: ERROR nova.compute.manager [instance: 91d5024f-9eac-4a56-b08f-c0f6a7eda775] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 809.463494] env[67899]: ERROR nova.compute.manager [instance: 91d5024f-9eac-4a56-b08f-c0f6a7eda775] raise exceptions.translate_fault(task_info.error)
[ 809.463494] env[67899]: ERROR nova.compute.manager [instance: 91d5024f-9eac-4a56-b08f-c0f6a7eda775] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 809.463494] env[67899]: ERROR nova.compute.manager [instance: 91d5024f-9eac-4a56-b08f-c0f6a7eda775] Faults: ['InvalidArgument']
[ 809.463494] env[67899]: ERROR nova.compute.manager [instance: 91d5024f-9eac-4a56-b08f-c0f6a7eda775]
[ 809.464993] env[67899]: INFO nova.compute.manager [None req-a1baea38-4329-40f0-8c18-ae18eafc1b0b tempest-ServersAdminNegativeTestJSON-448291398 tempest-ServersAdminNegativeTestJSON-448291398-project-member] [instance: 91d5024f-9eac-4a56-b08f-c0f6a7eda775] Terminating instance
[ 809.465367] env[67899]: DEBUG oslo_concurrency.lockutils [None req-4e823511-7455-43ab-874e-9acc64f4c5fe tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 809.465625] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-4e823511-7455-43ab-874e-9acc64f4c5fe tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 809.466197] env[67899]: DEBUG nova.compute.manager [None req-a1baea38-4329-40f0-8c18-ae18eafc1b0b tempest-ServersAdminNegativeTestJSON-448291398 tempest-ServersAdminNegativeTestJSON-448291398-project-member] [instance: 91d5024f-9eac-4a56-b08f-c0f6a7eda775] Start destroying the instance on the hypervisor. {{(pid=67899) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}}
[ 809.466388] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-a1baea38-4329-40f0-8c18-ae18eafc1b0b tempest-ServersAdminNegativeTestJSON-448291398 tempest-ServersAdminNegativeTestJSON-448291398-project-member] [instance: 91d5024f-9eac-4a56-b08f-c0f6a7eda775] Destroying instance {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 809.466610] env[67899]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-cf282d87-7abc-438b-b088-8a8cefde5014 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 809.468984] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d6efebc4-8c8c-4625-b5a0-4a17223ce827 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 809.475725] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-a1baea38-4329-40f0-8c18-ae18eafc1b0b tempest-ServersAdminNegativeTestJSON-448291398 tempest-ServersAdminNegativeTestJSON-448291398-project-member] [instance: 91d5024f-9eac-4a56-b08f-c0f6a7eda775] Unregistering the VM {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}}
[ 809.475938] env[67899]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-78445a52-0937-4794-981a-6d84671c932b {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 809.478139] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-4e823511-7455-43ab-874e-9acc64f4c5fe tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 809.478309] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-4e823511-7455-43ab-874e-9acc64f4c5fe tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67899) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}}
[ 809.479270] env[67899]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-1b7b84ce-2278-42ca-8c10-c9e799f60f0e {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 809.484454] env[67899]: DEBUG oslo_vmware.api [None req-4e823511-7455-43ab-874e-9acc64f4c5fe tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Waiting for the task: (returnval){
[ 809.484454] env[67899]: value = "session[5298859c-589e-10f3-4f70-bf47a7ea371b]52c212ea-b82f-c55b-3934-4effa365f603"
[ 809.484454] env[67899]: _type = "Task"
[ 809.484454] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 809.491947] env[67899]: DEBUG oslo_vmware.api [None req-4e823511-7455-43ab-874e-9acc64f4c5fe tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Task: {'id': session[5298859c-589e-10f3-4f70-bf47a7ea371b]52c212ea-b82f-c55b-3934-4effa365f603, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 809.541725] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-a1baea38-4329-40f0-8c18-ae18eafc1b0b tempest-ServersAdminNegativeTestJSON-448291398 tempest-ServersAdminNegativeTestJSON-448291398-project-member] [instance: 91d5024f-9eac-4a56-b08f-c0f6a7eda775] Unregistered the VM {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}}
[ 809.541946] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-a1baea38-4329-40f0-8c18-ae18eafc1b0b tempest-ServersAdminNegativeTestJSON-448291398 tempest-ServersAdminNegativeTestJSON-448291398-project-member] [instance: 91d5024f-9eac-4a56-b08f-c0f6a7eda775] Deleting contents of the VM from datastore datastore1 {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}}
[ 809.542125] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-a1baea38-4329-40f0-8c18-ae18eafc1b0b tempest-ServersAdminNegativeTestJSON-448291398 tempest-ServersAdminNegativeTestJSON-448291398-project-member] Deleting the datastore file [datastore1] 91d5024f-9eac-4a56-b08f-c0f6a7eda775 {{(pid=67899) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}}
[ 809.542391] env[67899]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-90ee50f3-181b-4a45-a48b-a671cdf16399 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 809.548256] env[67899]: DEBUG oslo_vmware.api [None req-a1baea38-4329-40f0-8c18-ae18eafc1b0b tempest-ServersAdminNegativeTestJSON-448291398 tempest-ServersAdminNegativeTestJSON-448291398-project-member] Waiting for the task: (returnval){
[ 809.548256] env[67899]: value = "task-3467874"
[ 809.548256] env[67899]: _type = "Task"
[ 809.548256] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 809.555690] env[67899]: DEBUG oslo_vmware.api [None req-a1baea38-4329-40f0-8c18-ae18eafc1b0b tempest-ServersAdminNegativeTestJSON-448291398 tempest-ServersAdminNegativeTestJSON-448291398-project-member] Task: {'id': task-3467874, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 809.994607] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-4e823511-7455-43ab-874e-9acc64f4c5fe tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 195a4a1e-3da7-4a69-a679-869346368195] Preparing fetch location {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}}
[ 809.994869] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-4e823511-7455-43ab-874e-9acc64f4c5fe tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Creating directory with path [datastore1] vmware_temp/39119767-0f2e-4dbc-9d62-614a98d8011f/c655a05a-4a40-4b3f-b609-3ba8116ad90f {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 809.995113] env[67899]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-9ce55eda-a2cf-4009-9405-a0496ccdc5bd {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 810.006799] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-4e823511-7455-43ab-874e-9acc64f4c5fe tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Created directory with path [datastore1] vmware_temp/39119767-0f2e-4dbc-9d62-614a98d8011f/c655a05a-4a40-4b3f-b609-3ba8116ad90f {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 810.006987] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-4e823511-7455-43ab-874e-9acc64f4c5fe tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 195a4a1e-3da7-4a69-a679-869346368195] Fetch image to [datastore1] vmware_temp/39119767-0f2e-4dbc-9d62-614a98d8011f/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}}
[ 810.007186] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-4e823511-7455-43ab-874e-9acc64f4c5fe tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 195a4a1e-3da7-4a69-a679-869346368195] Downloading image file data c655a05a-4a40-4b3f-b609-3ba8116ad90f to [datastore1] vmware_temp/39119767-0f2e-4dbc-9d62-614a98d8011f/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk on the data store datastore1 {{(pid=67899) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}}
[ 810.007906] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4d06b929-49fc-4a19-a1b8-ac5077823713 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 810.014736] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7e316b09-39f1-488d-9f01-434593d46b09 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 810.024029] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-84c267b9-85ca-413b-a6d4-aeaf4df0911f {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 810.056728] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-70a3cc16-213b-43e4-9777-5a9df34cd0bd {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
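The "[datastore1] vmware_temp/..." strings throughout these entries are datastore paths. Assuming oslo.vmware's DatastorePath helper (which builds and parses this notation), a quick illustration with the path from the log:

    from oslo_vmware.objects.datastore import DatastorePath

    tmp = DatastorePath('datastore1',
                        'vmware_temp/39119767-0f2e-4dbc-9d62-614a98d8011f'
                        '/c655a05a-4a40-4b3f-b609-3ba8116ad90f',
                        'tmp-sparse.vmdk')
    # str(tmp) renders the bracketed form used in the MakeDirectory and
    # CopyVirtualDisk calls above, e.g. "[datastore1] vmware_temp/.../tmp-sparse.vmdk"
    print(str(tmp))
    print(tmp.parent)  # the directory portion, as passed to MakeDirectory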
[ 810.063363] env[67899]: DEBUG oslo_vmware.api [None req-a1baea38-4329-40f0-8c18-ae18eafc1b0b tempest-ServersAdminNegativeTestJSON-448291398 tempest-ServersAdminNegativeTestJSON-448291398-project-member] Task: {'id': task-3467874, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.079867} completed successfully. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 810.064704] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-a1baea38-4329-40f0-8c18-ae18eafc1b0b tempest-ServersAdminNegativeTestJSON-448291398 tempest-ServersAdminNegativeTestJSON-448291398-project-member] Deleted the datastore file {{(pid=67899) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}}
[ 810.064898] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-a1baea38-4329-40f0-8c18-ae18eafc1b0b tempest-ServersAdminNegativeTestJSON-448291398 tempest-ServersAdminNegativeTestJSON-448291398-project-member] [instance: 91d5024f-9eac-4a56-b08f-c0f6a7eda775] Deleted contents of the VM from datastore datastore1 {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}}
[ 810.065079] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-a1baea38-4329-40f0-8c18-ae18eafc1b0b tempest-ServersAdminNegativeTestJSON-448291398 tempest-ServersAdminNegativeTestJSON-448291398-project-member] [instance: 91d5024f-9eac-4a56-b08f-c0f6a7eda775] Instance destroyed {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 810.065254] env[67899]: INFO nova.compute.manager [None req-a1baea38-4329-40f0-8c18-ae18eafc1b0b tempest-ServersAdminNegativeTestJSON-448291398 tempest-ServersAdminNegativeTestJSON-448291398-project-member] [instance: 91d5024f-9eac-4a56-b08f-c0f6a7eda775] Took 0.60 seconds to destroy the instance on the hypervisor.
[ 810.066936] env[67899]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-a0f6a577-984c-499c-a969-e4373d245ef7 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 810.068981] env[67899]: DEBUG nova.compute.claims [None req-a1baea38-4329-40f0-8c18-ae18eafc1b0b tempest-ServersAdminNegativeTestJSON-448291398 tempest-ServersAdminNegativeTestJSON-448291398-project-member] [instance: 91d5024f-9eac-4a56-b08f-c0f6a7eda775] Aborting claim: {{(pid=67899) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 810.069164] env[67899]: DEBUG oslo_concurrency.lockutils [None req-a1baea38-4329-40f0-8c18-ae18eafc1b0b tempest-ServersAdminNegativeTestJSON-448291398 tempest-ServersAdminNegativeTestJSON-448291398-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 810.069371] env[67899]: DEBUG oslo_concurrency.lockutils [None req-a1baea38-4329-40f0-8c18-ae18eafc1b0b tempest-ServersAdminNegativeTestJSON-448291398 tempest-ServersAdminNegativeTestJSON-448291398-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 810.090290] env[67899]: DEBUG nova.virt.vmwareapi.images [None req-4e823511-7455-43ab-874e-9acc64f4c5fe tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 195a4a1e-3da7-4a69-a679-869346368195] Downloading image file data c655a05a-4a40-4b3f-b609-3ba8116ad90f to the data store datastore1 {{(pid=67899) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}}
[ 810.150783] env[67899]: DEBUG oslo_vmware.rw_handles [None req-4e823511-7455-43ab-874e-9acc64f4c5fe tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/39119767-0f2e-4dbc-9d62-614a98d8011f/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67899) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}}
[ 810.209791] env[67899]: DEBUG oslo_vmware.rw_handles [None req-4e823511-7455-43ab-874e-9acc64f4c5fe tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Completed reading data from the image iterator. {{(pid=67899) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}}
[ 810.210019] env[67899]: DEBUG oslo_vmware.rw_handles [None req-4e823511-7455-43ab-874e-9acc64f4c5fe tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/39119767-0f2e-4dbc-9d62-614a98d8011f/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67899) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}}
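The "Creating HTTP connection to write to file" entry shows the raw datastore upload URL: the image bytes are pushed over HTTP to the ESX host's /folder endpoint. A simplified, requests-based equivalent of that transfer (illustrative only; the real code path is oslo_vmware.rw_handles with vSphere session cookies and chunked writes, and upload_vmdk here is a hypothetical helper):

    import requests

    def upload_vmdk(host, ds_path, local_file, cookies, size):
        # Mirrors the URL pattern in the log:
        #   https://<host>:443/folder/<path>?dcPath=ha-datacenter&dsName=datastore1
        url = f'https://{host}:443/folder/{ds_path}'
        params = {'dcPath': 'ha-datacenter', 'dsName': 'datastore1'}
        with open(local_file, 'rb') as f:
            resp = requests.put(url, params=params, data=f, cookies=cookies,
                                headers={'Content-Length': str(size)},
                                verify=False)  # lab setup; verify certs in production
        resp.raise_for_status()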
[ 810.539226] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-02a8222a-1340-404d-9afd-a8206287721f {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 810.546873] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-67add9d7-910c-42ba-856e-3875bbbe8e3f {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 810.576689] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c1aeedc2-ef9c-45fd-b01e-e866a8b0eef4 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 810.583903] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4ab7684a-ba21-4cf9-a3c8-d5ea5d00c46f {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 810.596833] env[67899]: DEBUG nova.compute.provider_tree [None req-a1baea38-4329-40f0-8c18-ae18eafc1b0b tempest-ServersAdminNegativeTestJSON-448291398 tempest-ServersAdminNegativeTestJSON-448291398-project-member] Inventory has not changed in ProviderTree for provider: fffa0b42-f65d-4394-a98c-0df038b9ed4b {{(pid=67899) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 810.605659] env[67899]: DEBUG nova.scheduler.client.report [None req-a1baea38-4329-40f0-8c18-ae18eafc1b0b tempest-ServersAdminNegativeTestJSON-448291398 tempest-ServersAdminNegativeTestJSON-448291398-project-member] Inventory has not changed for provider fffa0b42-f65d-4394-a98c-0df038b9ed4b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67899) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 810.620860] env[67899]: DEBUG oslo_concurrency.lockutils [None req-a1baea38-4329-40f0-8c18-ae18eafc1b0b tempest-ServersAdminNegativeTestJSON-448291398 tempest-ServersAdminNegativeTestJSON-448291398-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.551s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 810.621460] env[67899]: ERROR nova.compute.manager [None req-a1baea38-4329-40f0-8c18-ae18eafc1b0b tempest-ServersAdminNegativeTestJSON-448291398 tempest-ServersAdminNegativeTestJSON-448291398-project-member] [instance: 91d5024f-9eac-4a56-b08f-c0f6a7eda775] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 810.621460] env[67899]: Faults: ['InvalidArgument']
[ 810.621460] env[67899]: ERROR nova.compute.manager [instance: 91d5024f-9eac-4a56-b08f-c0f6a7eda775] Traceback (most recent call last):
[ 810.621460] env[67899]: ERROR nova.compute.manager [instance: 91d5024f-9eac-4a56-b08f-c0f6a7eda775] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 810.621460] env[67899]: ERROR nova.compute.manager [instance: 91d5024f-9eac-4a56-b08f-c0f6a7eda775] self.driver.spawn(context, instance, image_meta,
[ 810.621460] env[67899]: ERROR nova.compute.manager [instance: 91d5024f-9eac-4a56-b08f-c0f6a7eda775] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 810.621460] env[67899]: ERROR nova.compute.manager [instance: 91d5024f-9eac-4a56-b08f-c0f6a7eda775] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 810.621460] env[67899]: ERROR nova.compute.manager [instance: 91d5024f-9eac-4a56-b08f-c0f6a7eda775] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 810.621460] env[67899]: ERROR nova.compute.manager [instance: 91d5024f-9eac-4a56-b08f-c0f6a7eda775] self._fetch_image_if_missing(context, vi)
[ 810.621460] env[67899]: ERROR nova.compute.manager [instance: 91d5024f-9eac-4a56-b08f-c0f6a7eda775] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 810.621460] env[67899]: ERROR nova.compute.manager [instance: 91d5024f-9eac-4a56-b08f-c0f6a7eda775] image_cache(vi, tmp_image_ds_loc)
[ 810.621460] env[67899]: ERROR nova.compute.manager [instance: 91d5024f-9eac-4a56-b08f-c0f6a7eda775] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 810.621460] env[67899]: ERROR nova.compute.manager [instance: 91d5024f-9eac-4a56-b08f-c0f6a7eda775] vm_util.copy_virtual_disk(
[ 810.621460] env[67899]: ERROR nova.compute.manager [instance: 91d5024f-9eac-4a56-b08f-c0f6a7eda775] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 810.621460] env[67899]: ERROR nova.compute.manager [instance: 91d5024f-9eac-4a56-b08f-c0f6a7eda775] session._wait_for_task(vmdk_copy_task)
[ 810.621460] env[67899]: ERROR nova.compute.manager [instance: 91d5024f-9eac-4a56-b08f-c0f6a7eda775] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 810.621460] env[67899]: ERROR nova.compute.manager [instance: 91d5024f-9eac-4a56-b08f-c0f6a7eda775] return self.wait_for_task(task_ref)
[ 810.621460] env[67899]: ERROR nova.compute.manager [instance: 91d5024f-9eac-4a56-b08f-c0f6a7eda775] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 810.621460] env[67899]: ERROR nova.compute.manager [instance: 91d5024f-9eac-4a56-b08f-c0f6a7eda775] return evt.wait()
[ 810.621460] env[67899]: ERROR nova.compute.manager [instance: 91d5024f-9eac-4a56-b08f-c0f6a7eda775] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 810.621460] env[67899]: ERROR nova.compute.manager [instance: 91d5024f-9eac-4a56-b08f-c0f6a7eda775] result = hub.switch()
[ 810.621460] env[67899]: ERROR nova.compute.manager [instance: 91d5024f-9eac-4a56-b08f-c0f6a7eda775] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 810.621460] env[67899]: ERROR nova.compute.manager [instance: 91d5024f-9eac-4a56-b08f-c0f6a7eda775] return self.greenlet.switch()
[ 810.621460] env[67899]: ERROR nova.compute.manager [instance: 91d5024f-9eac-4a56-b08f-c0f6a7eda775] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 810.621460] env[67899]: ERROR nova.compute.manager [instance: 91d5024f-9eac-4a56-b08f-c0f6a7eda775] self.f(*self.args, **self.kw)
[ 810.621460] env[67899]: ERROR nova.compute.manager [instance: 91d5024f-9eac-4a56-b08f-c0f6a7eda775] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 810.621460] env[67899]: ERROR nova.compute.manager [instance: 91d5024f-9eac-4a56-b08f-c0f6a7eda775] raise exceptions.translate_fault(task_info.error)
[ 810.621460] env[67899]: ERROR nova.compute.manager [instance: 91d5024f-9eac-4a56-b08f-c0f6a7eda775] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 810.621460] env[67899]: ERROR nova.compute.manager [instance: 91d5024f-9eac-4a56-b08f-c0f6a7eda775] Faults: ['InvalidArgument']
[ 810.621460] env[67899]: ERROR nova.compute.manager [instance: 91d5024f-9eac-4a56-b08f-c0f6a7eda775]
[ 810.622727] env[67899]: DEBUG nova.compute.utils [None req-a1baea38-4329-40f0-8c18-ae18eafc1b0b tempest-ServersAdminNegativeTestJSON-448291398 tempest-ServersAdminNegativeTestJSON-448291398-project-member] [instance: 91d5024f-9eac-4a56-b08f-c0f6a7eda775] VimFaultException {{(pid=67899) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 810.623586] env[67899]: DEBUG nova.compute.manager [None req-a1baea38-4329-40f0-8c18-ae18eafc1b0b tempest-ServersAdminNegativeTestJSON-448291398 tempest-ServersAdminNegativeTestJSON-448291398-project-member] [instance: 91d5024f-9eac-4a56-b08f-c0f6a7eda775] Build of instance 91d5024f-9eac-4a56-b08f-c0f6a7eda775 was re-scheduled: A specified parameter was not correct: fileType
[ 810.623586] env[67899]: Faults: ['InvalidArgument'] {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}}
[ 810.623948] env[67899]: DEBUG nova.compute.manager [None req-a1baea38-4329-40f0-8c18-ae18eafc1b0b tempest-ServersAdminNegativeTestJSON-448291398 tempest-ServersAdminNegativeTestJSON-448291398-project-member] [instance: 91d5024f-9eac-4a56-b08f-c0f6a7eda775] Unplugging VIFs for instance {{(pid=67899) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}}
[ 810.624134] env[67899]: DEBUG nova.compute.manager [None req-a1baea38-4329-40f0-8c18-ae18eafc1b0b tempest-ServersAdminNegativeTestJSON-448291398 tempest-ServersAdminNegativeTestJSON-448291398-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=67899) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}}
[ 810.624288] env[67899]: DEBUG nova.compute.manager [None req-a1baea38-4329-40f0-8c18-ae18eafc1b0b tempest-ServersAdminNegativeTestJSON-448291398 tempest-ServersAdminNegativeTestJSON-448291398-project-member] [instance: 91d5024f-9eac-4a56-b08f-c0f6a7eda775] Deallocating network for instance {{(pid=67899) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}}
[ 810.624451] env[67899]: DEBUG nova.network.neutron [None req-a1baea38-4329-40f0-8c18-ae18eafc1b0b tempest-ServersAdminNegativeTestJSON-448291398 tempest-ServersAdminNegativeTestJSON-448291398-project-member] [instance: 91d5024f-9eac-4a56-b08f-c0f6a7eda775] deallocate_for_instance() {{(pid=67899) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 811.179277] env[67899]: DEBUG nova.network.neutron [None req-a1baea38-4329-40f0-8c18-ae18eafc1b0b tempest-ServersAdminNegativeTestJSON-448291398 tempest-ServersAdminNegativeTestJSON-448291398-project-member] [instance: 91d5024f-9eac-4a56-b08f-c0f6a7eda775] Updating instance_info_cache with network_info: [] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 811.190793] env[67899]: INFO nova.compute.manager [None req-a1baea38-4329-40f0-8c18-ae18eafc1b0b tempest-ServersAdminNegativeTestJSON-448291398 tempest-ServersAdminNegativeTestJSON-448291398-project-member] [instance: 91d5024f-9eac-4a56-b08f-c0f6a7eda775] Took 0.57 seconds to deallocate network for instance.
[ 811.299122] env[67899]: INFO nova.scheduler.client.report [None req-a1baea38-4329-40f0-8c18-ae18eafc1b0b tempest-ServersAdminNegativeTestJSON-448291398 tempest-ServersAdminNegativeTestJSON-448291398-project-member] Deleted allocations for instance 91d5024f-9eac-4a56-b08f-c0f6a7eda775
[ 811.319061] env[67899]: DEBUG oslo_concurrency.lockutils [None req-a1baea38-4329-40f0-8c18-ae18eafc1b0b tempest-ServersAdminNegativeTestJSON-448291398 tempest-ServersAdminNegativeTestJSON-448291398-project-member] Lock "91d5024f-9eac-4a56-b08f-c0f6a7eda775" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 210.653s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 811.319061] env[67899]: DEBUG oslo_concurrency.lockutils [None req-1ad1fbbd-a859-4399-8171-00eb39c3bced tempest-ServersAdminNegativeTestJSON-448291398 tempest-ServersAdminNegativeTestJSON-448291398-project-member] Lock "91d5024f-9eac-4a56-b08f-c0f6a7eda775" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 11.963s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 811.319061] env[67899]: DEBUG oslo_concurrency.lockutils [None req-1ad1fbbd-a859-4399-8171-00eb39c3bced tempest-ServersAdminNegativeTestJSON-448291398 tempest-ServersAdminNegativeTestJSON-448291398-project-member] Acquiring lock "91d5024f-9eac-4a56-b08f-c0f6a7eda775-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 811.319061] env[67899]: DEBUG oslo_concurrency.lockutils [None req-1ad1fbbd-a859-4399-8171-00eb39c3bced tempest-ServersAdminNegativeTestJSON-448291398 tempest-ServersAdminNegativeTestJSON-448291398-project-member] Lock "91d5024f-9eac-4a56-b08f-c0f6a7eda775-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 811.319061] env[67899]: DEBUG oslo_concurrency.lockutils [None req-1ad1fbbd-a859-4399-8171-00eb39c3bced tempest-ServersAdminNegativeTestJSON-448291398 tempest-ServersAdminNegativeTestJSON-448291398-project-member] Lock "91d5024f-9eac-4a56-b08f-c0f6a7eda775-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 811.320153] env[67899]: INFO nova.compute.manager [None req-1ad1fbbd-a859-4399-8171-00eb39c3bced tempest-ServersAdminNegativeTestJSON-448291398 tempest-ServersAdminNegativeTestJSON-448291398-project-member] [instance: 91d5024f-9eac-4a56-b08f-c0f6a7eda775] Terminating instance
[ 811.321963] env[67899]: DEBUG nova.compute.manager [None req-1ad1fbbd-a859-4399-8171-00eb39c3bced tempest-ServersAdminNegativeTestJSON-448291398 tempest-ServersAdminNegativeTestJSON-448291398-project-member] [instance: 91d5024f-9eac-4a56-b08f-c0f6a7eda775] Start destroying the instance on the hypervisor. {{(pid=67899) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}}
[ 811.322409] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-1ad1fbbd-a859-4399-8171-00eb39c3bced tempest-ServersAdminNegativeTestJSON-448291398 tempest-ServersAdminNegativeTestJSON-448291398-project-member] [instance: 91d5024f-9eac-4a56-b08f-c0f6a7eda775] Destroying instance {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 811.322670] env[67899]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-e45fded0-20e5-4f5d-94fb-12613cedd9d4 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 811.331703] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6b5af06a-7583-445a-8f55-13ea489af6a6 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 811.342672] env[67899]: DEBUG nova.compute.manager [None req-b344734b-8a62-4885-9e7d-af38d446ab1b tempest-ServersWithSpecificFlavorTestJSON-1910448084 tempest-ServersWithSpecificFlavorTestJSON-1910448084-project-member] [instance: 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee] Starting instance... {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}}
[ 811.363720] env[67899]: WARNING nova.virt.vmwareapi.vmops [None req-1ad1fbbd-a859-4399-8171-00eb39c3bced tempest-ServersAdminNegativeTestJSON-448291398 tempest-ServersAdminNegativeTestJSON-448291398-project-member] [instance: 91d5024f-9eac-4a56-b08f-c0f6a7eda775] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 91d5024f-9eac-4a56-b08f-c0f6a7eda775 could not be found.
[ 811.363720] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-1ad1fbbd-a859-4399-8171-00eb39c3bced tempest-ServersAdminNegativeTestJSON-448291398 tempest-ServersAdminNegativeTestJSON-448291398-project-member] [instance: 91d5024f-9eac-4a56-b08f-c0f6a7eda775] Instance destroyed {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 811.363917] env[67899]: INFO nova.compute.manager [None req-1ad1fbbd-a859-4399-8171-00eb39c3bced tempest-ServersAdminNegativeTestJSON-448291398 tempest-ServersAdminNegativeTestJSON-448291398-project-member] [instance: 91d5024f-9eac-4a56-b08f-c0f6a7eda775] Took 0.04 seconds to destroy the instance on the hypervisor.
[ 811.364059] env[67899]: DEBUG oslo.service.loopingcall [None req-1ad1fbbd-a859-4399-8171-00eb39c3bced tempest-ServersAdminNegativeTestJSON-448291398 tempest-ServersAdminNegativeTestJSON-448291398-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=67899) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 811.364842] env[67899]: DEBUG nova.compute.manager [-] [instance: 91d5024f-9eac-4a56-b08f-c0f6a7eda775] Deallocating network for instance {{(pid=67899) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}}
[ 811.364842] env[67899]: DEBUG nova.network.neutron [-] [instance: 91d5024f-9eac-4a56-b08f-c0f6a7eda775] deallocate_for_instance() {{(pid=67899) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 811.393922] env[67899]: DEBUG nova.network.neutron [-] [instance: 91d5024f-9eac-4a56-b08f-c0f6a7eda775] Updating instance_info_cache with network_info: [] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 811.401464] env[67899]: INFO nova.compute.manager [-] [instance: 91d5024f-9eac-4a56-b08f-c0f6a7eda775] Took 0.04 seconds to deallocate network for instance.
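The "Waiting for function ... _deallocate_network_with_retries to return" line is oslo.service's loopingcall machinery driving a retried call; the pattern resembles its RetryDecorator, though the log alone does not show the exact retry policy Nova configures here. A hedged sketch with invented parameters:

    from oslo_service import loopingcall

    @loopingcall.RetryDecorator(max_retry_count=3, inc_sleep_time=1,
                                max_sleep_time=10,
                                exceptions=(ConnectionError,))
    def _deallocate_network_with_retries():
        # Each attempt is driven and logged by loopingcall, producing the
        # "Waiting for function ... to return." DEBUG line above.
        pass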
[ 811.403622] env[67899]: DEBUG oslo_concurrency.lockutils [None req-b344734b-8a62-4885-9e7d-af38d446ab1b tempest-ServersWithSpecificFlavorTestJSON-1910448084 tempest-ServersWithSpecificFlavorTestJSON-1910448084-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 811.403847] env[67899]: DEBUG oslo_concurrency.lockutils [None req-b344734b-8a62-4885-9e7d-af38d446ab1b tempest-ServersWithSpecificFlavorTestJSON-1910448084 tempest-ServersWithSpecificFlavorTestJSON-1910448084-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 811.406020] env[67899]: INFO nova.compute.claims [None req-b344734b-8a62-4885-9e7d-af38d446ab1b tempest-ServersWithSpecificFlavorTestJSON-1910448084 tempest-ServersWithSpecificFlavorTestJSON-1910448084-project-member] [instance: 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 811.525927] env[67899]: DEBUG oslo_concurrency.lockutils [None req-1ad1fbbd-a859-4399-8171-00eb39c3bced tempest-ServersAdminNegativeTestJSON-448291398 tempest-ServersAdminNegativeTestJSON-448291398-project-member] Lock "91d5024f-9eac-4a56-b08f-c0f6a7eda775" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.208s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 811.824978] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-569765eb-d0d8-4d46-8bcc-ecf7e340f851 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 811.833355] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6a534dfd-4620-4cdf-80df-e83bd1964cb6 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 811.862637] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-87685bdf-a797-4fc3-9e0d-74b93bd803d7 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 811.869706] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bf31b72b-2dd3-419e-9cc1-153a61b69335 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 811.883708] env[67899]: DEBUG nova.compute.provider_tree [None req-b344734b-8a62-4885-9e7d-af38d446ab1b tempest-ServersWithSpecificFlavorTestJSON-1910448084 tempest-ServersWithSpecificFlavorTestJSON-1910448084-project-member] Inventory has not changed in ProviderTree for provider: fffa0b42-f65d-4394-a98c-0df038b9ed4b {{(pid=67899) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 811.894035] env[67899]: DEBUG nova.scheduler.client.report [None req-b344734b-8a62-4885-9e7d-af38d446ab1b tempest-ServersWithSpecificFlavorTestJSON-1910448084 tempest-ServersWithSpecificFlavorTestJSON-1910448084-project-member] Inventory has not changed for provider fffa0b42-f65d-4394-a98c-0df038b9ed4b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67899) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 811.911379] env[67899]: DEBUG oslo_concurrency.lockutils [None req-b344734b-8a62-4885-9e7d-af38d446ab1b tempest-ServersWithSpecificFlavorTestJSON-1910448084 tempest-ServersWithSpecificFlavorTestJSON-1910448084-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.507s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 811.911594] env[67899]: DEBUG nova.compute.manager [None req-b344734b-8a62-4885-9e7d-af38d446ab1b tempest-ServersWithSpecificFlavorTestJSON-1910448084 tempest-ServersWithSpecificFlavorTestJSON-1910448084-project-member] [instance: 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee] Start building networks asynchronously for instance. {{(pid=67899) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}}
[ 811.954855] env[67899]: DEBUG nova.compute.utils [None req-b344734b-8a62-4885-9e7d-af38d446ab1b tempest-ServersWithSpecificFlavorTestJSON-1910448084 tempest-ServersWithSpecificFlavorTestJSON-1910448084-project-member] Using /dev/sd instead of None {{(pid=67899) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 811.960022] env[67899]: DEBUG nova.compute.manager [None req-b344734b-8a62-4885-9e7d-af38d446ab1b tempest-ServersWithSpecificFlavorTestJSON-1910448084 tempest-ServersWithSpecificFlavorTestJSON-1910448084-project-member] [instance: 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee] Allocating IP information in the background. {{(pid=67899) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}}
[ 811.960022] env[67899]: DEBUG nova.network.neutron [None req-b344734b-8a62-4885-9e7d-af38d446ab1b tempest-ServersWithSpecificFlavorTestJSON-1910448084 tempest-ServersWithSpecificFlavorTestJSON-1910448084-project-member] [instance: 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee] allocate_for_instance() {{(pid=67899) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 811.965587] env[67899]: DEBUG nova.compute.manager [None req-b344734b-8a62-4885-9e7d-af38d446ab1b tempest-ServersWithSpecificFlavorTestJSON-1910448084 tempest-ServersWithSpecificFlavorTestJSON-1910448084-project-member] [instance: 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee] Start building block device mappings for instance. {{(pid=67899) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}}
[ 812.035420] env[67899]: DEBUG nova.compute.manager [None req-b344734b-8a62-4885-9e7d-af38d446ab1b tempest-ServersWithSpecificFlavorTestJSON-1910448084 tempest-ServersWithSpecificFlavorTestJSON-1910448084-project-member] [instance: 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee] Start spawning the instance on the hypervisor. {{(pid=67899) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}}
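The "Claim successful" / "Aborting claim" pairs in this log reflect the resource tracker's claim objects, which reserve resources for a build and release them if it fails. A simplified, self-contained sketch of the pattern (not Nova's real classes or signatures):

    class Claim:
        """Toy version of the claim-as-context-manager pattern."""
        def __init__(self, tracker, instance):
            self.tracker, self.instance = tracker, instance

        def __enter__(self):
            return self

        def __exit__(self, exc_type, exc, tb):
            if exc_type is not None:
                self.abort()  # mirrors the "Aborting claim:" entry earlier
            return False  # never swallow the build failure

        def abort(self):
            # Release the tracked resources under the compute_resources lock.
            self.tracker.release(self.instance)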
[ 812.061606] env[67899]: DEBUG nova.virt.hardware [None req-b344734b-8a62-4885-9e7d-af38d446ab1b tempest-ServersWithSpecificFlavorTestJSON-1910448084 tempest-ServersWithSpecificFlavorTestJSON-1910448084-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T22:09:47Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='506754174',id=30,is_public=True,memory_mb=128,name='tempest-flavor_with_ephemeral_0-1983268426',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T22:07:13Z,direct_url=<?>,disk_format='vmdk',id=c655a05a-4a40-4b3f-b609-3ba8116ad90f,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c1aaa2970e964d7b86557399120d12c1',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2025-04-16T22:07:14Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 812.061853] env[67899]: DEBUG nova.virt.hardware [None req-b344734b-8a62-4885-9e7d-af38d446ab1b tempest-ServersWithSpecificFlavorTestJSON-1910448084 tempest-ServersWithSpecificFlavorTestJSON-1910448084-project-member] Flavor limits 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 812.062020] env[67899]: DEBUG nova.virt.hardware [None req-b344734b-8a62-4885-9e7d-af38d446ab1b tempest-ServersWithSpecificFlavorTestJSON-1910448084 tempest-ServersWithSpecificFlavorTestJSON-1910448084-project-member] Image limits 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 812.062229] env[67899]: DEBUG nova.virt.hardware [None req-b344734b-8a62-4885-9e7d-af38d446ab1b tempest-ServersWithSpecificFlavorTestJSON-1910448084 tempest-ServersWithSpecificFlavorTestJSON-1910448084-project-member] Flavor pref 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 812.062366] env[67899]: DEBUG nova.virt.hardware [None req-b344734b-8a62-4885-9e7d-af38d446ab1b tempest-ServersWithSpecificFlavorTestJSON-1910448084 tempest-ServersWithSpecificFlavorTestJSON-1910448084-project-member] Image pref 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 812.062540] env[67899]: DEBUG nova.virt.hardware [None req-b344734b-8a62-4885-9e7d-af38d446ab1b tempest-ServersWithSpecificFlavorTestJSON-1910448084 tempest-ServersWithSpecificFlavorTestJSON-1910448084-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 812.062753] env[67899]: DEBUG nova.virt.hardware [None req-b344734b-8a62-4885-9e7d-af38d446ab1b tempest-ServersWithSpecificFlavorTestJSON-1910448084 tempest-ServersWithSpecificFlavorTestJSON-1910448084-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 812.062926] env[67899]: DEBUG nova.virt.hardware [None req-b344734b-8a62-4885-9e7d-af38d446ab1b tempest-ServersWithSpecificFlavorTestJSON-1910448084 tempest-ServersWithSpecificFlavorTestJSON-1910448084-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67899) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 812.063126] env[67899]: DEBUG nova.virt.hardware [None req-b344734b-8a62-4885-9e7d-af38d446ab1b tempest-ServersWithSpecificFlavorTestJSON-1910448084 tempest-ServersWithSpecificFlavorTestJSON-1910448084-project-member] Got 1 possible topologies {{(pid=67899) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 812.063304] env[67899]: DEBUG nova.virt.hardware [None req-b344734b-8a62-4885-9e7d-af38d446ab1b tempest-ServersWithSpecificFlavorTestJSON-1910448084 tempest-ServersWithSpecificFlavorTestJSON-1910448084-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 812.063472] env[67899]: DEBUG nova.virt.hardware [None req-b344734b-8a62-4885-9e7d-af38d446ab1b tempest-ServersWithSpecificFlavorTestJSON-1910448084 tempest-ServersWithSpecificFlavorTestJSON-1910448084-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 812.064372] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f60f4f70-8a71-47ab-88f4-5cfbd54b7847 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 812.072107] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9d87dc68-1d74-4308-b9ed-a3f04fd55146 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 812.087508] env[67899]: DEBUG nova.policy [None req-b344734b-8a62-4885-9e7d-af38d446ab1b tempest-ServersWithSpecificFlavorTestJSON-1910448084 tempest-ServersWithSpecificFlavorTestJSON-1910448084-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '5836929e7e4b46428afa2814a4bb099c', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '8a8c0a55efd8416cb1e291da3dad237e', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67899) authorize /opt/stack/nova/nova/policy.py:203}}
[ 812.654173] env[67899]: DEBUG nova.network.neutron [None req-b344734b-8a62-4885-9e7d-af38d446ab1b tempest-ServersWithSpecificFlavorTestJSON-1910448084 tempest-ServersWithSpecificFlavorTestJSON-1910448084-project-member] [instance: 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee] Successfully created port: ca99ac7c-bb13-4419-a706-acf3d4ad3176 {{(pid=67899) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 813.528502] env[67899]: DEBUG nova.compute.manager [req-6c531996-d544-454f-ab8b-dd8139bf2c98 req-84079a9a-7db2-43a9-a6ce-d3a81b30afce service nova] [instance: 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee] Received event network-vif-plugged-ca99ac7c-bb13-4419-a706-acf3d4ad3176 {{(pid=67899) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}}
[ 813.528722] env[67899]: DEBUG oslo_concurrency.lockutils [req-6c531996-d544-454f-ab8b-dd8139bf2c98 req-84079a9a-7db2-43a9-a6ce-d3a81b30afce service nova] Acquiring lock "35b19ccb-4996-47a7-b1a7-6ffc9dd867ee-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
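The topology lines above pick a CPU layout for 1 vCPU with no flavor or image constraints (the unset 0:0:0 limits fall back to a 65536 maximum), yielding the single 1:1:1 result. A toy version of the enumeration (Nova's hardware.py applies the same product rule plus additional filters and preference sorting):

    def possible_topologies(vcpus, max_sockets=65536, max_cores=65536,
                            max_threads=65536):
        # Enumerate (sockets, cores, threads) whose product equals vcpus.
        for s in range(1, min(vcpus, max_sockets) + 1):
            for c in range(1, min(vcpus, max_cores) + 1):
                for t in range(1, min(vcpus, max_threads) + 1):
                    if s * c * t == vcpus:
                        yield (s, c, t)

    print(list(possible_topologies(1)))  # [(1, 1, 1)], as logged above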
"35b19ccb-4996-47a7-b1a7-6ffc9dd867ee-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 813.529281] env[67899]: DEBUG oslo_concurrency.lockutils [req-6c531996-d544-454f-ab8b-dd8139bf2c98 req-84079a9a-7db2-43a9-a6ce-d3a81b30afce service nova] Lock "35b19ccb-4996-47a7-b1a7-6ffc9dd867ee-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 813.529469] env[67899]: DEBUG oslo_concurrency.lockutils [req-6c531996-d544-454f-ab8b-dd8139bf2c98 req-84079a9a-7db2-43a9-a6ce-d3a81b30afce service nova] Lock "35b19ccb-4996-47a7-b1a7-6ffc9dd867ee-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 813.529622] env[67899]: DEBUG nova.compute.manager [req-6c531996-d544-454f-ab8b-dd8139bf2c98 req-84079a9a-7db2-43a9-a6ce-d3a81b30afce service nova] [instance: 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee] No waiting events found dispatching network-vif-plugged-ca99ac7c-bb13-4419-a706-acf3d4ad3176 {{(pid=67899) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 813.529790] env[67899]: WARNING nova.compute.manager [req-6c531996-d544-454f-ab8b-dd8139bf2c98 req-84079a9a-7db2-43a9-a6ce-d3a81b30afce service nova] [instance: 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee] Received unexpected event network-vif-plugged-ca99ac7c-bb13-4419-a706-acf3d4ad3176 for instance with vm_state building and task_state spawning. [ 813.660629] env[67899]: DEBUG nova.network.neutron [None req-b344734b-8a62-4885-9e7d-af38d446ab1b tempest-ServersWithSpecificFlavorTestJSON-1910448084 tempest-ServersWithSpecificFlavorTestJSON-1910448084-project-member] [instance: 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee] Successfully updated port: ca99ac7c-bb13-4419-a706-acf3d4ad3176 {{(pid=67899) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 813.673185] env[67899]: DEBUG oslo_concurrency.lockutils [None req-b344734b-8a62-4885-9e7d-af38d446ab1b tempest-ServersWithSpecificFlavorTestJSON-1910448084 tempest-ServersWithSpecificFlavorTestJSON-1910448084-project-member] Acquiring lock "refresh_cache-35b19ccb-4996-47a7-b1a7-6ffc9dd867ee" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 813.673185] env[67899]: DEBUG oslo_concurrency.lockutils [None req-b344734b-8a62-4885-9e7d-af38d446ab1b tempest-ServersWithSpecificFlavorTestJSON-1910448084 tempest-ServersWithSpecificFlavorTestJSON-1910448084-project-member] Acquired lock "refresh_cache-35b19ccb-4996-47a7-b1a7-6ffc9dd867ee" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 813.673185] env[67899]: DEBUG nova.network.neutron [None req-b344734b-8a62-4885-9e7d-af38d446ab1b tempest-ServersWithSpecificFlavorTestJSON-1910448084 tempest-ServersWithSpecificFlavorTestJSON-1910448084-project-member] [instance: 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee] Building network info cache for instance {{(pid=67899) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 813.733349] env[67899]: DEBUG nova.network.neutron [None req-b344734b-8a62-4885-9e7d-af38d446ab1b tempest-ServersWithSpecificFlavorTestJSON-1910448084 
tempest-ServersWithSpecificFlavorTestJSON-1910448084-project-member] [instance: 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee] Instance cache missing network info. {{(pid=67899) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 813.960701] env[67899]: DEBUG nova.network.neutron [None req-b344734b-8a62-4885-9e7d-af38d446ab1b tempest-ServersWithSpecificFlavorTestJSON-1910448084 tempest-ServersWithSpecificFlavorTestJSON-1910448084-project-member] [instance: 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee] Updating instance_info_cache with network_info: [{"id": "ca99ac7c-bb13-4419-a706-acf3d4ad3176", "address": "fa:16:3e:75:55:c4", "network": {"id": "12f3346f-55c2-4d93-a2f5-c978ad630416", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1907341002-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "8a8c0a55efd8416cb1e291da3dad237e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "33fdc099-7497-41c1-b40c-1558937132d4", "external-id": "nsx-vlan-transportzone-764", "segmentation_id": 764, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapca99ac7c-bb", "ovs_interfaceid": "ca99ac7c-bb13-4419-a706-acf3d4ad3176", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 813.979115] env[67899]: DEBUG oslo_concurrency.lockutils [None req-b344734b-8a62-4885-9e7d-af38d446ab1b tempest-ServersWithSpecificFlavorTestJSON-1910448084 tempest-ServersWithSpecificFlavorTestJSON-1910448084-project-member] Releasing lock "refresh_cache-35b19ccb-4996-47a7-b1a7-6ffc9dd867ee" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 813.979115] env[67899]: DEBUG nova.compute.manager [None req-b344734b-8a62-4885-9e7d-af38d446ab1b tempest-ServersWithSpecificFlavorTestJSON-1910448084 tempest-ServersWithSpecificFlavorTestJSON-1910448084-project-member] [instance: 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee] Instance network_info: |[{"id": "ca99ac7c-bb13-4419-a706-acf3d4ad3176", "address": "fa:16:3e:75:55:c4", "network": {"id": "12f3346f-55c2-4d93-a2f5-c978ad630416", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1907341002-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "8a8c0a55efd8416cb1e291da3dad237e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "33fdc099-7497-41c1-b40c-1558937132d4", "external-id": "nsx-vlan-transportzone-764", "segmentation_id": 764, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapca99ac7c-bb", "ovs_interfaceid": 
"ca99ac7c-bb13-4419-a706-acf3d4ad3176", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67899) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 813.979115] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-b344734b-8a62-4885-9e7d-af38d446ab1b tempest-ServersWithSpecificFlavorTestJSON-1910448084 tempest-ServersWithSpecificFlavorTestJSON-1910448084-project-member] [instance: 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:75:55:c4', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '33fdc099-7497-41c1-b40c-1558937132d4', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'ca99ac7c-bb13-4419-a706-acf3d4ad3176', 'vif_model': 'vmxnet3'}] {{(pid=67899) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 813.985133] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [None req-b344734b-8a62-4885-9e7d-af38d446ab1b tempest-ServersWithSpecificFlavorTestJSON-1910448084 tempest-ServersWithSpecificFlavorTestJSON-1910448084-project-member] Creating folder: Project (8a8c0a55efd8416cb1e291da3dad237e). Parent ref: group-v692900. {{(pid=67899) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 813.985828] env[67899]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-4b094675-690a-42c6-a037-313b3937f11f {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 813.999069] env[67899]: INFO nova.virt.vmwareapi.vm_util [None req-b344734b-8a62-4885-9e7d-af38d446ab1b tempest-ServersWithSpecificFlavorTestJSON-1910448084 tempest-ServersWithSpecificFlavorTestJSON-1910448084-project-member] Created folder: Project (8a8c0a55efd8416cb1e291da3dad237e) in parent group-v692900. [ 813.999424] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [None req-b344734b-8a62-4885-9e7d-af38d446ab1b tempest-ServersWithSpecificFlavorTestJSON-1910448084 tempest-ServersWithSpecificFlavorTestJSON-1910448084-project-member] Creating folder: Instances. Parent ref: group-v692944. {{(pid=67899) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 813.999749] env[67899]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-14f08b8a-7965-4d2e-97de-a85c53da7325 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 814.008286] env[67899]: INFO nova.virt.vmwareapi.vm_util [None req-b344734b-8a62-4885-9e7d-af38d446ab1b tempest-ServersWithSpecificFlavorTestJSON-1910448084 tempest-ServersWithSpecificFlavorTestJSON-1910448084-project-member] Created folder: Instances in parent group-v692944. [ 814.008687] env[67899]: DEBUG oslo.service.loopingcall [None req-b344734b-8a62-4885-9e7d-af38d446ab1b tempest-ServersWithSpecificFlavorTestJSON-1910448084 tempest-ServersWithSpecificFlavorTestJSON-1910448084-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=67899) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 814.008981] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee] Creating VM on the ESX host {{(pid=67899) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}}
[ 814.009295] env[67899]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-f082bd52-bf6e-4189-8351-2c9e703fe583 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 814.032186] env[67899]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){
[ 814.032186] env[67899]: value = "task-3467877"
[ 814.032186] env[67899]: _type = "Task"
[ 814.032186] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 814.038206] env[67899]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467877, 'name': CreateVM_Task} progress is 0%. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 814.541933] env[67899]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467877, 'name': CreateVM_Task, 'duration_secs': 0.391905} completed successfully. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 814.541933] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee] Created VM on the ESX host {{(pid=67899) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}}
[ 814.542799] env[67899]: DEBUG oslo_concurrency.lockutils [None req-b344734b-8a62-4885-9e7d-af38d446ab1b tempest-ServersWithSpecificFlavorTestJSON-1910448084 tempest-ServersWithSpecificFlavorTestJSON-1910448084-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 814.542991] env[67899]: DEBUG oslo_concurrency.lockutils [None req-b344734b-8a62-4885-9e7d-af38d446ab1b tempest-ServersWithSpecificFlavorTestJSON-1910448084 tempest-ServersWithSpecificFlavorTestJSON-1910448084-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 814.543381] env[67899]: DEBUG oslo_concurrency.lockutils [None req-b344734b-8a62-4885-9e7d-af38d446ab1b tempest-ServersWithSpecificFlavorTestJSON-1910448084 tempest-ServersWithSpecificFlavorTestJSON-1910448084-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}}
[ 814.543637] env[67899]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-e075da89-052c-4af9-b6cf-cda54fb2c09a {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 814.548441] env[67899]: DEBUG oslo_vmware.api [None req-b344734b-8a62-4885-9e7d-af38d446ab1b tempest-ServersWithSpecificFlavorTestJSON-1910448084 tempest-ServersWithSpecificFlavorTestJSON-1910448084-project-member] Waiting for the task: (returnval){
[ 814.548441] env[67899]: value = "session[5298859c-589e-10f3-4f70-bf47a7ea371b]520f68d1-80d0-cfa6-8b81-73bf38c7470c"
[ 814.548441] env[67899]: _type = "Task"
[ 814.548441] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 814.559935] env[67899]: DEBUG oslo_vmware.api [None req-b344734b-8a62-4885-9e7d-af38d446ab1b tempest-ServersWithSpecificFlavorTestJSON-1910448084 tempest-ServersWithSpecificFlavorTestJSON-1910448084-project-member] Task: {'id': session[5298859c-589e-10f3-4f70-bf47a7ea371b]520f68d1-80d0-cfa6-8b81-73bf38c7470c, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 815.058909] env[67899]: DEBUG oslo_concurrency.lockutils [None req-b344734b-8a62-4885-9e7d-af38d446ab1b tempest-ServersWithSpecificFlavorTestJSON-1910448084 tempest-ServersWithSpecificFlavorTestJSON-1910448084-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 815.059265] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-b344734b-8a62-4885-9e7d-af38d446ab1b tempest-ServersWithSpecificFlavorTestJSON-1910448084 tempest-ServersWithSpecificFlavorTestJSON-1910448084-project-member] [instance: 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee] Processing image c655a05a-4a40-4b3f-b609-3ba8116ad90f {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}}
[ 815.059402] env[67899]: DEBUG oslo_concurrency.lockutils [None req-b344734b-8a62-4885-9e7d-af38d446ab1b tempest-ServersWithSpecificFlavorTestJSON-1910448084 tempest-ServersWithSpecificFlavorTestJSON-1910448084-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 815.667242] env[67899]: DEBUG nova.compute.manager [req-63d28505-c31f-479e-ba1d-933571c8c2eb req-a48d4a67-fdaf-450d-ae06-35950fb07ba5 service nova] [instance: 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee] Received event network-changed-ca99ac7c-bb13-4419-a706-acf3d4ad3176 {{(pid=67899) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}}
[ 815.668082] env[67899]: DEBUG nova.compute.manager [req-63d28505-c31f-479e-ba1d-933571c8c2eb req-a48d4a67-fdaf-450d-ae06-35950fb07ba5 service nova] [instance: 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee] Refreshing instance network info cache due to event network-changed-ca99ac7c-bb13-4419-a706-acf3d4ad3176.
{{(pid=67899) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 815.668082] env[67899]: DEBUG oslo_concurrency.lockutils [req-63d28505-c31f-479e-ba1d-933571c8c2eb req-a48d4a67-fdaf-450d-ae06-35950fb07ba5 service nova] Acquiring lock "refresh_cache-35b19ccb-4996-47a7-b1a7-6ffc9dd867ee" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 815.668082] env[67899]: DEBUG oslo_concurrency.lockutils [req-63d28505-c31f-479e-ba1d-933571c8c2eb req-a48d4a67-fdaf-450d-ae06-35950fb07ba5 service nova] Acquired lock "refresh_cache-35b19ccb-4996-47a7-b1a7-6ffc9dd867ee" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 815.668082] env[67899]: DEBUG nova.network.neutron [req-63d28505-c31f-479e-ba1d-933571c8c2eb req-a48d4a67-fdaf-450d-ae06-35950fb07ba5 service nova] [instance: 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee] Refreshing network info cache for port ca99ac7c-bb13-4419-a706-acf3d4ad3176 {{(pid=67899) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 816.295255] env[67899]: DEBUG nova.network.neutron [req-63d28505-c31f-479e-ba1d-933571c8c2eb req-a48d4a67-fdaf-450d-ae06-35950fb07ba5 service nova] [instance: 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee] Updated VIF entry in instance network info cache for port ca99ac7c-bb13-4419-a706-acf3d4ad3176. {{(pid=67899) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 816.295255] env[67899]: DEBUG nova.network.neutron [req-63d28505-c31f-479e-ba1d-933571c8c2eb req-a48d4a67-fdaf-450d-ae06-35950fb07ba5 service nova] [instance: 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee] Updating instance_info_cache with network_info: [{"id": "ca99ac7c-bb13-4419-a706-acf3d4ad3176", "address": "fa:16:3e:75:55:c4", "network": {"id": "12f3346f-55c2-4d93-a2f5-c978ad630416", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1907341002-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "8a8c0a55efd8416cb1e291da3dad237e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "33fdc099-7497-41c1-b40c-1558937132d4", "external-id": "nsx-vlan-transportzone-764", "segmentation_id": 764, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapca99ac7c-bb", "ovs_interfaceid": "ca99ac7c-bb13-4419-a706-acf3d4ad3176", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 816.307378] env[67899]: DEBUG oslo_concurrency.lockutils [req-63d28505-c31f-479e-ba1d-933571c8c2eb req-a48d4a67-fdaf-450d-ae06-35950fb07ba5 service nova] Releasing lock "refresh_cache-35b19ccb-4996-47a7-b1a7-6ffc9dd867ee" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 818.161797] env[67899]: DEBUG oslo_concurrency.lockutils [None req-6a55d310-17fb-4564-8f2a-4ee92b2e7084 tempest-ListServersNegativeTestJSON-1933841315 
tempest-ListServersNegativeTestJSON-1933841315-project-member] Acquiring lock "4c50d280-8833-4c3f-8571-8ffa5c7ea2a4" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 818.162590] env[67899]: DEBUG oslo_concurrency.lockutils [None req-6a55d310-17fb-4564-8f2a-4ee92b2e7084 tempest-ListServersNegativeTestJSON-1933841315 tempest-ListServersNegativeTestJSON-1933841315-project-member] Lock "4c50d280-8833-4c3f-8571-8ffa5c7ea2a4" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 818.190814] env[67899]: DEBUG oslo_concurrency.lockutils [None req-6a55d310-17fb-4564-8f2a-4ee92b2e7084 tempest-ListServersNegativeTestJSON-1933841315 tempest-ListServersNegativeTestJSON-1933841315-project-member] Acquiring lock "4cbf5a4d-9466-4bc6-adc9-973759545cf4" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 818.191097] env[67899]: DEBUG oslo_concurrency.lockutils [None req-6a55d310-17fb-4564-8f2a-4ee92b2e7084 tempest-ListServersNegativeTestJSON-1933841315 tempest-ListServersNegativeTestJSON-1933841315-project-member] Lock "4cbf5a4d-9466-4bc6-adc9-973759545cf4" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 818.214377] env[67899]: DEBUG oslo_concurrency.lockutils [None req-6a55d310-17fb-4564-8f2a-4ee92b2e7084 tempest-ListServersNegativeTestJSON-1933841315 tempest-ListServersNegativeTestJSON-1933841315-project-member] Acquiring lock "3fabbf48-5df3-4e36-a9d8-494c221304b1" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 818.214858] env[67899]: DEBUG oslo_concurrency.lockutils [None req-6a55d310-17fb-4564-8f2a-4ee92b2e7084 tempest-ListServersNegativeTestJSON-1933841315 tempest-ListServersNegativeTestJSON-1933841315-project-member] Lock "3fabbf48-5df3-4e36-a9d8-494c221304b1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 820.435570] env[67899]: DEBUG oslo_concurrency.lockutils [None req-d6968a70-d2f0-4447-9223-19f375b45736 tempest-ImagesNegativeTestJSON-1293308713 tempest-ImagesNegativeTestJSON-1293308713-project-member] Acquiring lock "84cbacaa-08d2-4297-8777-150f433e4c04" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 821.377993] env[67899]: DEBUG oslo_concurrency.lockutils [None req-f8edce37-2ed9-44dd-a043-c9d900e4638c tempest-ServersTestMultiNic-1718995190 tempest-ServersTestMultiNic-1718995190-project-member] Acquiring lock "c29ae4c5-cc93-480c-8d60-96f6acba4346" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 825.636562] env[67899]: DEBUG oslo_concurrency.lockutils [None req-eab4a7d0-f125-4f7b-a1c8-0173bffaceb5 tempest-VolumesAssistedSnapshotsTest-398519601 tempest-VolumesAssistedSnapshotsTest-398519601-project-member] Acquiring lock "9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 827.907156] env[67899]: DEBUG oslo_concurrency.lockutils [None req-759dcb41-7bb0-4571-a61c-fe945b232dc3 tempest-FloatingIPsAssociationTestJSON-1366579974 tempest-FloatingIPsAssociationTestJSON-1366579974-project-member] Acquiring lock "913c5652-c8af-41a8-94f1-c0eba08aacdd" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 828.850051] env[67899]: DEBUG oslo_concurrency.lockutils [None req-d7e85cda-30d5-4fef-98a3-bf4a55ca3836 tempest-FloatingIPsAssociationNegativeTestJSON-531317357 tempest-FloatingIPsAssociationNegativeTestJSON-531317357-project-member] Acquiring lock "793d6f98-ed1b-4a78-bcd5-cb796441d64b" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 839.382278] env[67899]: DEBUG oslo_concurrency.lockutils [None req-1286ff4f-9afc-4e0a-adf5-531abb93692f tempest-ServersAdmin275Test-1560987778 tempest-ServersAdmin275Test-1560987778-project-member] Acquiring lock "e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 843.518837] env[67899]: DEBUG oslo_concurrency.lockutils [None req-07d725c6-cec6-431b-a0d6-d14f435b98e0 tempest-ServersWithSpecificFlavorTestJSON-1910448084 tempest-ServersWithSpecificFlavorTestJSON-1910448084-project-member] Acquiring lock "35b19ccb-4996-47a7-b1a7-6ffc9dd867ee" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 844.469383] env[67899]: DEBUG oslo_concurrency.lockutils [None req-e1a745fb-fb91-4883-a69a-2732d312f7cd tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Acquiring lock "4458efe7-18d4-4cfb-b131-e09d36124d68" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 850.119956] env[67899]: DEBUG oslo_concurrency.lockutils [None req-2d2f0f85-9ccf-46fc-88e6-864a4167c672 tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] Acquiring lock "7b6a4c60-1b40-44b8-b341-3dcaf1716c99" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
"7b6a4c60-1b40-44b8-b341-3dcaf1716c99" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 851.998156] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 851.998156] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Cleaning up deleted instances {{(pid=67899) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11198}} [ 852.014984] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] There are 0 instances to clean {{(pid=67899) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11207}} [ 852.015355] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 852.015516] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Cleaning up deleted instances with incomplete migration {{(pid=67899) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11236}} [ 852.028692] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 854.044839] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager.update_available_resource {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 854.062691] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 854.062691] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 854.062691] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 854.062691] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67899) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 
854.062691] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1a54041c-8e47-4ede-99eb-f2dd47912cd3 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 854.073146] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d963fc32-6a13-4bc0-8080-1b83ceeb9ba2 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 854.090458] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4ade9cb7-575e-47d4-bc1d-f21ae9a9f4be {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 854.100572] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b8eb3371-8452-46c7-afaf-c1633628fcba {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 854.137557] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180915MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=67899) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 854.137821] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 854.137923] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 854.200591] env[67899]: DEBUG oslo_concurrency.lockutils [None req-2eea373d-45cd-4959-b8b6-32ef12d95a25 tempest-ImagesOneServerNegativeTestJSON-1534430733 tempest-ImagesOneServerNegativeTestJSON-1534430733-project-member] Acquiring lock "183fd334-b0e1-479a-b38a-62f21c176d17" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 854.200845] env[67899]: DEBUG oslo_concurrency.lockutils [None req-2eea373d-45cd-4959-b8b6-32ef12d95a25 tempest-ImagesOneServerNegativeTestJSON-1534430733 tempest-ImagesOneServerNegativeTestJSON-1534430733-project-member] Lock "183fd334-b0e1-479a-b38a-62f21c176d17" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 854.233471] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 195a4a1e-3da7-4a69-a679-869346368195 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
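The usage numbers in this audit are internally consistent: the tracker lists ten tracked consumers, each holding the 1 vCPU / 128 MB / 1 GB allocation shown in the records around here, and the "Final resource view" reported a little later (used_ram=1792MB, used_disk=10GB, used_vcpus=10) follows directly once the 512 MB reserved host memory from the inventory data is added. A quick check:

```python
instances = 10                        # allocations listed by the tracker
used_ram_mb = 512 + 128 * instances   # 1792 -> "used_ram=1792MB"
used_disk_gb = 1 * instances          # 10   -> "used_disk=10GB"
used_vcpus = 1 * instances            # 10   -> "used_vcpus=10"
print(used_ram_mb, used_disk_gb, used_vcpus)
```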
{{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 854.233820] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 7a19bcfd-5544-4688-8edb-e12c567979ae actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 854.233820] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 84cbacaa-08d2-4297-8777-150f433e4c04 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 854.233956] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance c29ae4c5-cc93-480c-8d60-96f6acba4346 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 854.234049] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 854.234134] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 913c5652-c8af-41a8-94f1-c0eba08aacdd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 854.234258] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 793d6f98-ed1b-4a78-bcd5-cb796441d64b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 854.234542] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 854.234542] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 4458efe7-18d4-4cfb-b131-e09d36124d68 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 854.234657] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 854.249590] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 786676eb-ac36-48f6-874a-ab1ca15f2a9a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 854.268130] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 4cd5b80b-d1f4-4142-83fb-235523464667 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 854.282966] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 1589188b-8540-4afd-8050-ab47633593c0 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 854.294882] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance de1d5572-b82d-4edc-9c7e-a7e26c45a090 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 854.310105] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 8d2a9e20-82d3-44cf-a725-59804debe1cc has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 854.325653] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance e42425fa-6c50-4e76-842b-0bfcccb011c0 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
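Each allocation blob logged here ({'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}) is the per-consumer record the Placement service holds. A sketch of fetching one directly over Placement's real GET /allocations/{consumer_uuid} route; the endpoint URL and token handling are assumptions (Nova itself goes through its SchedulerReportClient rather than raw requests):

```python
import requests

PLACEMENT = 'http://placement.example.test/placement'  # assumed endpoint


def get_allocations(consumer_uuid, token):
    # Response maps resource-provider UUIDs to {'resources': {...}} blobs,
    # the same shape the resource tracker logs above.
    resp = requests.get(
        f'{PLACEMENT}/allocations/{consumer_uuid}',
        headers={'X-Auth-Token': token,
                 'OpenStack-API-Version': 'placement 1.28'})
    resp.raise_for_status()
    return resp.json()['allocations']
```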
{{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 854.339157] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 96d79732-9076-4715-aa1e-60001ffb17fb has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 854.353026] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance f7888060-430b-4b16-b9ca-059020615dee has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 854.364153] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 1505bcf5-f622-40ee-93c2-8dabf1dce8cb has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 854.398695] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 4c281caa-f99d-40d5-b004-13e7856a29f5 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 854.411356] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance aa6229be-c18c-4cf9-99a1-ca546b30d797 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 854.425741] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 862297c3-0b85-43eb-b364-303bb0c0b077 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 854.438692] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 641b8e97-b9e6-4ef0-a819-42d3a29429de has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 854.449154] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance d0ceaa4e-9c87-48de-bcc2-8bb537827c0a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 854.459672] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 9842d097-f4f2-4f60-aea0-08896a47ff53 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 854.469932] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance bb97988e-9f7f-4e4f-9904-fc560d0912ee has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 854.484756] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 37ab08db-50ab-4c30-9e18-05007c5d1c27 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 854.495196] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 854.508893] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 4cbf5a4d-9466-4bc6-adc9-973759545cf4 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 854.519592] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 3fabbf48-5df3-4e36-a9d8-494c221304b1 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
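The two record shapes repeated through this audit encode one branch: instances actively managed on this node keep their allocations, while instances the scheduler has placed here but that have not started yet are skipped rather than healed. A schematic of that decision; the names and conditions are illustrative, not the exact _remove_deleted_instances_allocations logic:

```python
def audit_allocation(instance, this_node, log):
    if instance.host is None:
        # Scheduled here, allocation already made, instance yet to start.
        log.debug('Skipping heal of allocation for %s', instance.uuid)
        return 'skip'
    if instance.node == this_node:
        log.debug('%s actively managed on this compute host', instance.uuid)
        return 'keep'
    return 'remove'  # stale allocation left by a deleted or moved instance
```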
{{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 854.530614] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 7b6a4c60-1b40-44b8-b341-3dcaf1716c99 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 854.541212] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 183fd334-b0e1-479a-b38a-62f21c176d17 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 854.541734] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67899) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 854.541937] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67899) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 854.991096] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1c2b1521-dac6-46bc-9de6-4f5dc93bcd84 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 854.998698] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9eaf821c-e4a8-4116-93cf-ed687be85084 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 855.030456] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ec0982b4-a6b3-48ef-82fc-875c7b372668 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 855.040414] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a8e5ce42-b853-4a2b-8a97-6044f2a2ded3 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 855.060045] env[67899]: DEBUG nova.compute.provider_tree [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Inventory has not changed in ProviderTree for provider: fffa0b42-f65d-4394-a98c-0df038b9ed4b {{(pid=67899) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 855.091570] env[67899]: DEBUG nova.scheduler.client.report [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Inventory has not changed for provider fffa0b42-f65d-4394-a98c-0df038b9ed4b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 
'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67899) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 855.107873] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67899) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 855.108164] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.970s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 856.059887] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 856.060139] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 856.061450] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Starting heal instance info cache {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 856.061450] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Rebuilding the list of instances to heal {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 856.092522] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 195a4a1e-3da7-4a69-a679-869346368195] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 856.092795] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 7a19bcfd-5544-4688-8edb-e12c567979ae] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 856.092936] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 84cbacaa-08d2-4297-8777-150f433e4c04] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 856.093517] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 856.093517] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda] Skipping network cache update for instance because it is Building. 
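The inventory just reported feeds Placement's capacity test, used + requested <= (total - reserved) * allocation_ratio, so this node can oversubscribe CPU four-fold but not RAM or disk:

```python
vcpu_capacity = (48 - 0) * 4.0       # 192 schedulable VCPUs
ram_capacity = (196590 - 512) * 1.0  # 196078 MB
disk_capacity = (400 - 0) * 1.0      # 400 GB total, though max_unit caps a
                                     # single allocation at 94 GB
```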
{{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 856.093517] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 913c5652-c8af-41a8-94f1-c0eba08aacdd] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 856.093517] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 793d6f98-ed1b-4a78-bcd5-cb796441d64b] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 856.093707] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 856.093707] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 4458efe7-18d4-4cfb-b131-e09d36124d68] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 856.093782] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 856.093902] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Didn't find any instances for network info cache update. 
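The heal pass above rebuilds its candidate list, skips every instance still in the Building state, and ends with nothing to refresh. Schematically (illustrative names, not the compute manager's exact code):

```python
def heal_instance_info_cache(instances, log):
    to_heal = []
    for inst in instances:
        if inst.vm_state == 'building':
            log.debug('Skipping network cache update for instance '
                      'because it is Building.')
            continue
        to_heal.append(inst)
    if not to_heal:
        log.debug("Didn't find any instances for network info cache update.")
    return to_heal
```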
{{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 856.094455] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 856.095738] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 856.996958] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 856.996958] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 857.997214] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 857.997214] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 857.997214] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] CONF.reclaim_instance_interval <= 0, skipping... 
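_reclaim_queued_deletes bows out immediately because deferred delete is disabled (nova's reclaim_instance_interval defaults to 0). The gate itself is a one-liner; a sketch in the oslo.service periodic-task style, with the class scaffolding reduced to the essentials:

```python
from oslo_config import cfg
from oslo_log import log as logging
from oslo_service import periodic_task

CONF = cfg.CONF
LOG = logging.getLogger(__name__)


class ComputeManagerSketch(periodic_task.PeriodicTasks):
    @periodic_task.periodic_task
    def _reclaim_queued_deletes(self, context):
        if CONF.reclaim_instance_interval <= 0:
            LOG.debug('CONF.reclaim_instance_interval <= 0, skipping...')
            return
        # otherwise: reclaim SOFT_DELETED instances older than the interval
```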
{{(pid=67899) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 858.862446] env[67899]: DEBUG oslo_concurrency.lockutils [None req-92a7a379-f753-4ac6-96db-b7811ca464e0 tempest-TenantUsagesTestJSON-377809165 tempest-TenantUsagesTestJSON-377809165-project-member] Acquiring lock "b9282eeb-09db-4138-a1f0-9e03828021b8" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 858.862680] env[67899]: DEBUG oslo_concurrency.lockutils [None req-92a7a379-f753-4ac6-96db-b7811ca464e0 tempest-TenantUsagesTestJSON-377809165 tempest-TenantUsagesTestJSON-377809165-project-member] Lock "b9282eeb-09db-4138-a1f0-9e03828021b8" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 858.954588] env[67899]: WARNING oslo_vmware.rw_handles [None req-4e823511-7455-43ab-874e-9acc64f4c5fe tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 858.954588] env[67899]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 858.954588] env[67899]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 858.954588] env[67899]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 858.954588] env[67899]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 858.954588] env[67899]: ERROR oslo_vmware.rw_handles response.begin() [ 858.954588] env[67899]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 858.954588] env[67899]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 858.954588] env[67899]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 858.954588] env[67899]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 858.954588] env[67899]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 858.954588] env[67899]: ERROR oslo_vmware.rw_handles [ 858.954998] env[67899]: DEBUG nova.virt.vmwareapi.images [None req-4e823511-7455-43ab-874e-9acc64f4c5fe tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 195a4a1e-3da7-4a69-a679-869346368195] Downloaded image file data c655a05a-4a40-4b3f-b609-3ba8116ad90f to vmware_temp/39119767-0f2e-4dbc-9d62-614a98d8011f/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk on the data store datastore1 {{(pid=67899) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 858.956682] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-4e823511-7455-43ab-874e-9acc64f4c5fe tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 195a4a1e-3da7-4a69-a679-869346368195] Caching image {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 858.956923] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [None req-4e823511-7455-43ab-874e-9acc64f4c5fe tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Copying 
Virtual Disk [datastore1] vmware_temp/39119767-0f2e-4dbc-9d62-614a98d8011f/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk to [datastore1] vmware_temp/39119767-0f2e-4dbc-9d62-614a98d8011f/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk {{(pid=67899) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 858.957818] env[67899]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-4106f9f3-4fe8-4e3c-bb75-95cf7f36e408 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 858.966299] env[67899]: DEBUG oslo_vmware.api [None req-4e823511-7455-43ab-874e-9acc64f4c5fe tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Waiting for the task: (returnval){ [ 858.966299] env[67899]: value = "task-3467878" [ 858.966299] env[67899]: _type = "Task" [ 858.966299] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 858.974190] env[67899]: DEBUG oslo_vmware.api [None req-4e823511-7455-43ab-874e-9acc64f4c5fe tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Task: {'id': task-3467878, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 859.477270] env[67899]: DEBUG oslo_vmware.exceptions [None req-4e823511-7455-43ab-874e-9acc64f4c5fe tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Fault InvalidArgument not matched. {{(pid=67899) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 859.481335] env[67899]: DEBUG oslo_concurrency.lockutils [None req-4e823511-7455-43ab-874e-9acc64f4c5fe tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 859.481335] env[67899]: ERROR nova.compute.manager [None req-4e823511-7455-43ab-874e-9acc64f4c5fe tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 195a4a1e-3da7-4a69-a679-869346368195] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 859.481335] env[67899]: Faults: ['InvalidArgument'] [ 859.481335] env[67899]: ERROR nova.compute.manager [instance: 195a4a1e-3da7-4a69-a679-869346368195] Traceback (most recent call last): [ 859.481335] env[67899]: ERROR nova.compute.manager [instance: 195a4a1e-3da7-4a69-a679-869346368195] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 859.481335] env[67899]: ERROR nova.compute.manager [instance: 195a4a1e-3da7-4a69-a679-869346368195] yield resources [ 859.481335] env[67899]: ERROR nova.compute.manager [instance: 195a4a1e-3da7-4a69-a679-869346368195] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 859.481335] env[67899]: ERROR nova.compute.manager [instance: 195a4a1e-3da7-4a69-a679-869346368195] self.driver.spawn(context, instance, image_meta, [ 859.481335] env[67899]: ERROR nova.compute.manager [instance: 195a4a1e-3da7-4a69-a679-869346368195] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 859.481335] 
env[67899]: ERROR nova.compute.manager [instance: 195a4a1e-3da7-4a69-a679-869346368195] self._vmops.spawn(context, instance, image_meta, injected_files, [ 859.481335] env[67899]: ERROR nova.compute.manager [instance: 195a4a1e-3da7-4a69-a679-869346368195] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 859.481335] env[67899]: ERROR nova.compute.manager [instance: 195a4a1e-3da7-4a69-a679-869346368195] self._fetch_image_if_missing(context, vi) [ 859.481335] env[67899]: ERROR nova.compute.manager [instance: 195a4a1e-3da7-4a69-a679-869346368195] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 859.481335] env[67899]: ERROR nova.compute.manager [instance: 195a4a1e-3da7-4a69-a679-869346368195] image_cache(vi, tmp_image_ds_loc) [ 859.481335] env[67899]: ERROR nova.compute.manager [instance: 195a4a1e-3da7-4a69-a679-869346368195] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 859.481335] env[67899]: ERROR nova.compute.manager [instance: 195a4a1e-3da7-4a69-a679-869346368195] vm_util.copy_virtual_disk( [ 859.481335] env[67899]: ERROR nova.compute.manager [instance: 195a4a1e-3da7-4a69-a679-869346368195] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 859.481335] env[67899]: ERROR nova.compute.manager [instance: 195a4a1e-3da7-4a69-a679-869346368195] session._wait_for_task(vmdk_copy_task) [ 859.481335] env[67899]: ERROR nova.compute.manager [instance: 195a4a1e-3da7-4a69-a679-869346368195] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 859.481335] env[67899]: ERROR nova.compute.manager [instance: 195a4a1e-3da7-4a69-a679-869346368195] return self.wait_for_task(task_ref) [ 859.481335] env[67899]: ERROR nova.compute.manager [instance: 195a4a1e-3da7-4a69-a679-869346368195] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 859.481335] env[67899]: ERROR nova.compute.manager [instance: 195a4a1e-3da7-4a69-a679-869346368195] return evt.wait() [ 859.481335] env[67899]: ERROR nova.compute.manager [instance: 195a4a1e-3da7-4a69-a679-869346368195] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 859.481335] env[67899]: ERROR nova.compute.manager [instance: 195a4a1e-3da7-4a69-a679-869346368195] result = hub.switch() [ 859.481335] env[67899]: ERROR nova.compute.manager [instance: 195a4a1e-3da7-4a69-a679-869346368195] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 859.481335] env[67899]: ERROR nova.compute.manager [instance: 195a4a1e-3da7-4a69-a679-869346368195] return self.greenlet.switch() [ 859.481335] env[67899]: ERROR nova.compute.manager [instance: 195a4a1e-3da7-4a69-a679-869346368195] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 859.481335] env[67899]: ERROR nova.compute.manager [instance: 195a4a1e-3da7-4a69-a679-869346368195] self.f(*self.args, **self.kw) [ 859.481335] env[67899]: ERROR nova.compute.manager [instance: 195a4a1e-3da7-4a69-a679-869346368195] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 859.481335] env[67899]: ERROR nova.compute.manager [instance: 195a4a1e-3da7-4a69-a679-869346368195] raise exceptions.translate_fault(task_info.error) [ 859.481335] env[67899]: ERROR nova.compute.manager [instance: 195a4a1e-3da7-4a69-a679-869346368195] 
oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 859.481335] env[67899]: ERROR nova.compute.manager [instance: 195a4a1e-3da7-4a69-a679-869346368195] Faults: ['InvalidArgument'] [ 859.481335] env[67899]: ERROR nova.compute.manager [instance: 195a4a1e-3da7-4a69-a679-869346368195] [ 859.481335] env[67899]: INFO nova.compute.manager [None req-4e823511-7455-43ab-874e-9acc64f4c5fe tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 195a4a1e-3da7-4a69-a679-869346368195] Terminating instance [ 859.482258] env[67899]: DEBUG oslo_concurrency.lockutils [None req-66773c7e-ea1c-4033-b311-33b753d740c6 tempest-AttachInterfacesUnderV243Test-1421382693 tempest-AttachInterfacesUnderV243Test-1421382693-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 859.482258] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-66773c7e-ea1c-4033-b311-33b753d740c6 tempest-AttachInterfacesUnderV243Test-1421382693 tempest-AttachInterfacesUnderV243Test-1421382693-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 859.483246] env[67899]: DEBUG nova.compute.manager [None req-4e823511-7455-43ab-874e-9acc64f4c5fe tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 195a4a1e-3da7-4a69-a679-869346368195] Start destroying the instance on the hypervisor. {{(pid=67899) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 859.483503] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-4e823511-7455-43ab-874e-9acc64f4c5fe tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 195a4a1e-3da7-4a69-a679-869346368195] Destroying instance {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 859.484064] env[67899]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-9a8db22e-f96e-4e09-b7f8-e41586ee0a5a {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 859.489162] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3474d13c-142d-4d43-8729-395ca20af9ad {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 859.494552] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-4e823511-7455-43ab-874e-9acc64f4c5fe tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 195a4a1e-3da7-4a69-a679-869346368195] Unregistering the VM {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 859.494795] env[67899]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-cc262a95-cc85-4fb0-b6e5-33a0b54baf8e {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 859.498153] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-66773c7e-ea1c-4033-b311-33b753d740c6 tempest-AttachInterfacesUnderV243Test-1421382693 tempest-AttachInterfacesUnderV243Test-1421382693-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67899) mkdir 
/opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 859.498153] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-66773c7e-ea1c-4033-b311-33b753d740c6 tempest-AttachInterfacesUnderV243Test-1421382693 tempest-AttachInterfacesUnderV243Test-1421382693-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67899) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 859.499565] env[67899]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-71501fe1-a5de-4328-a26d-e8d134f4a9b2 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 859.505854] env[67899]: DEBUG oslo_vmware.api [None req-66773c7e-ea1c-4033-b311-33b753d740c6 tempest-AttachInterfacesUnderV243Test-1421382693 tempest-AttachInterfacesUnderV243Test-1421382693-project-member] Waiting for the task: (returnval){ [ 859.505854] env[67899]: value = "session[5298859c-589e-10f3-4f70-bf47a7ea371b]522c3457-4645-f7ed-c850-ba424cdc93c1" [ 859.505854] env[67899]: _type = "Task" [ 859.505854] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 859.513037] env[67899]: DEBUG oslo_vmware.api [None req-66773c7e-ea1c-4033-b311-33b753d740c6 tempest-AttachInterfacesUnderV243Test-1421382693 tempest-AttachInterfacesUnderV243Test-1421382693-project-member] Task: {'id': session[5298859c-589e-10f3-4f70-bf47a7ea371b]522c3457-4645-f7ed-c850-ba424cdc93c1, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 859.564894] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-4e823511-7455-43ab-874e-9acc64f4c5fe tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 195a4a1e-3da7-4a69-a679-869346368195] Unregistered the VM {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 859.564955] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-4e823511-7455-43ab-874e-9acc64f4c5fe tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 195a4a1e-3da7-4a69-a679-869346368195] Deleting contents of the VM from datastore datastore1 {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 859.565136] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-4e823511-7455-43ab-874e-9acc64f4c5fe tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Deleting the datastore file [datastore1] 195a4a1e-3da7-4a69-a679-869346368195 {{(pid=67899) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 859.565397] env[67899]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-c3f3bcae-4be7-4d8d-9626-5b49686b2c1a {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 859.576026] env[67899]: DEBUG oslo_vmware.api [None req-4e823511-7455-43ab-874e-9acc64f4c5fe tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Waiting for the task: (returnval){ [ 859.576026] env[67899]: value = "task-3467880" [ 859.576026] env[67899]: _type = "Task" [ 859.576026] env[67899]: } to complete. 
{{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 859.585249] env[67899]: DEBUG oslo_vmware.api [None req-4e823511-7455-43ab-874e-9acc64f4c5fe tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Task: {'id': task-3467880, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 860.015425] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-66773c7e-ea1c-4033-b311-33b753d740c6 tempest-AttachInterfacesUnderV243Test-1421382693 tempest-AttachInterfacesUnderV243Test-1421382693-project-member] [instance: 7a19bcfd-5544-4688-8edb-e12c567979ae] Preparing fetch location {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 860.015774] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-66773c7e-ea1c-4033-b311-33b753d740c6 tempest-AttachInterfacesUnderV243Test-1421382693 tempest-AttachInterfacesUnderV243Test-1421382693-project-member] Creating directory with path [datastore1] vmware_temp/e5cfc786-1e82-4a80-829f-cc8407360acb/c655a05a-4a40-4b3f-b609-3ba8116ad90f {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 860.016044] env[67899]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-8c8deaf4-92de-445f-8896-b030663ab87a {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 860.029041] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-66773c7e-ea1c-4033-b311-33b753d740c6 tempest-AttachInterfacesUnderV243Test-1421382693 tempest-AttachInterfacesUnderV243Test-1421382693-project-member] Created directory with path [datastore1] vmware_temp/e5cfc786-1e82-4a80-829f-cc8407360acb/c655a05a-4a40-4b3f-b609-3ba8116ad90f {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 860.030379] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-66773c7e-ea1c-4033-b311-33b753d740c6 tempest-AttachInterfacesUnderV243Test-1421382693 tempest-AttachInterfacesUnderV243Test-1421382693-project-member] [instance: 7a19bcfd-5544-4688-8edb-e12c567979ae] Fetch image to [datastore1] vmware_temp/e5cfc786-1e82-4a80-829f-cc8407360acb/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 860.030379] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-66773c7e-ea1c-4033-b311-33b753d740c6 tempest-AttachInterfacesUnderV243Test-1421382693 tempest-AttachInterfacesUnderV243Test-1421382693-project-member] [instance: 7a19bcfd-5544-4688-8edb-e12c567979ae] Downloading image file data c655a05a-4a40-4b3f-b609-3ba8116ad90f to [datastore1] vmware_temp/e5cfc786-1e82-4a80-829f-cc8407360acb/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk on the data store datastore1 {{(pid=67899) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 860.031184] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-125c1b4b-560f-4f90-bbc0-7d369d4236d4 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 860.038552] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-29ec9f9a-f32c-4cf3-8f50-6766464d3dd3 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 860.049243] 
env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-08822c6a-3bfe-45ff-8130-5137d1c2f515 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 860.091076] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0b17c457-9966-4df5-a658-7e226e9d8597 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 860.098586] env[67899]: DEBUG oslo_vmware.api [None req-4e823511-7455-43ab-874e-9acc64f4c5fe tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Task: {'id': task-3467880, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.075253} completed successfully. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 860.100402] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-4e823511-7455-43ab-874e-9acc64f4c5fe tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Deleted the datastore file {{(pid=67899) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 860.101071] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-4e823511-7455-43ab-874e-9acc64f4c5fe tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 195a4a1e-3da7-4a69-a679-869346368195] Deleted contents of the VM from datastore datastore1 {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 860.101071] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-4e823511-7455-43ab-874e-9acc64f4c5fe tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 195a4a1e-3da7-4a69-a679-869346368195] Instance destroyed {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 860.101241] env[67899]: INFO nova.compute.manager [None req-4e823511-7455-43ab-874e-9acc64f4c5fe tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 195a4a1e-3da7-4a69-a679-869346368195] Took 0.62 seconds to destroy the instance on the hypervisor. 
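The CopyVirtualDisk_Task and DeleteDatastoreFile_Task records above follow oslo.vmware's invoke-then-poll pattern: the VIM call returns a task reference, the client polls it until it reports success or error, and on error the server-side fault is translated and raised, which is where the VimFaultException traceback earlier in this excerpt originates. A minimal schematic of that loop, assuming a hypothetical get_task_info helper rather than the real oslo.vmware internals:

    import time

    class TaskFailed(Exception):
        """Carries the server-side fault, e.g. InvalidArgument: fileType."""

    def wait_for_task(session, task_ref, poll_interval=0.5):
        # Poll the task until it leaves the queued/running states.
        while True:
            info = session.get_task_info(task_ref)  # hypothetical helper
            if info.state == 'success':
                return info.result
            if info.state == 'error':
                # oslo.vmware translates the fault at this point; this is
                # where 'A specified parameter was not correct: fileType'
                # surfaces as the exception seen in the traceback above.
                raise TaskFailed(info.error)
            # Otherwise the poller logs progress ('progress is 0%') and retries.
            time.sleep(poll_interval)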
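The paired 'Acquiring lock ...' / 'Lock ... acquired ... waited 0.000s' / 'Lock ... "released" ... held N.NNNs' records (for example around "compute_resources" just below) are emitted by oslo.concurrency's lock wrapper, whose lockutils.py:404/409/423 line numbers appear throughout this log. A minimal sketch of the pattern, using only the public lockutils.synchronized decorator; the function body is illustrative, not Nova's:

    from oslo_concurrency import lockutils

    @lockutils.synchronized('compute_resources')
    def update_resources():
        # Runs with the named in-process lock held; the wrapper logs the
        # DEBUG 'Acquiring'/'acquired'/'released' lines together with the
        # waited/held durations seen above.
        pass

    update_resources()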
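The 'Running periodic task ComputeManager._poll_...' records near the top of this excerpt come from oslo.service's periodic-task machinery, which scans a manager class for decorated methods and runs each due task on its own interval. A minimal sketch against the public oslo_service.periodic_task API; DemoManager and its task are hypothetical stand-ins for Nova's ComputeManager:

    from oslo_config import cfg
    from oslo_service import periodic_task

    CONF = cfg.CONF

    class DemoManager(periodic_task.PeriodicTasks):
        def __init__(self):
            super().__init__(CONF)

        # spacing is the minimum interval in seconds between runs.
        @periodic_task.periodic_task(spacing=60)
        def _poll_something(self, context):
            # A real task would reconcile instance state here.
            pass

    # The service loop calls this repeatedly; each due task is logged
    # ('Running periodic task ...') and then invoked.
    DemoManager().run_periodic_tasks(context=None)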
[ 860.103760] env[67899]: DEBUG nova.compute.claims [None req-4e823511-7455-43ab-874e-9acc64f4c5fe tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 195a4a1e-3da7-4a69-a679-869346368195] Aborting claim: {{(pid=67899) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 860.104141] env[67899]: DEBUG oslo_concurrency.lockutils [None req-4e823511-7455-43ab-874e-9acc64f4c5fe tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 860.104482] env[67899]: DEBUG oslo_concurrency.lockutils [None req-4e823511-7455-43ab-874e-9acc64f4c5fe tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 860.107372] env[67899]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-c4cf49ed-4310-4e74-83e7-b41bbe2cffe8 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 860.142691] env[67899]: DEBUG nova.virt.vmwareapi.images [None req-66773c7e-ea1c-4033-b311-33b753d740c6 tempest-AttachInterfacesUnderV243Test-1421382693 tempest-AttachInterfacesUnderV243Test-1421382693-project-member] [instance: 7a19bcfd-5544-4688-8edb-e12c567979ae] Downloading image file data c655a05a-4a40-4b3f-b609-3ba8116ad90f to the data store datastore1 {{(pid=67899) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 860.221645] env[67899]: DEBUG oslo_vmware.rw_handles [None req-66773c7e-ea1c-4033-b311-33b753d740c6 tempest-AttachInterfacesUnderV243Test-1421382693 tempest-AttachInterfacesUnderV243Test-1421382693-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/e5cfc786-1e82-4a80-829f-cc8407360acb/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67899) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 860.284105] env[67899]: DEBUG oslo_vmware.rw_handles [None req-66773c7e-ea1c-4033-b311-33b753d740c6 tempest-AttachInterfacesUnderV243Test-1421382693 tempest-AttachInterfacesUnderV243Test-1421382693-project-member] Completed reading data from the image iterator. {{(pid=67899) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 860.284341] env[67899]: DEBUG oslo_vmware.rw_handles [None req-66773c7e-ea1c-4033-b311-33b753d740c6 tempest-AttachInterfacesUnderV243Test-1421382693 tempest-AttachInterfacesUnderV243Test-1421382693-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/e5cfc786-1e82-4a80-829f-cc8407360acb/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67899) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 860.305420] env[67899]: DEBUG nova.scheduler.client.report [None req-4e823511-7455-43ab-874e-9acc64f4c5fe tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Refreshing inventories for resource provider fffa0b42-f65d-4394-a98c-0df038b9ed4b {{(pid=67899) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} [ 860.322134] env[67899]: DEBUG nova.scheduler.client.report [None req-4e823511-7455-43ab-874e-9acc64f4c5fe tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Updating ProviderTree inventory for provider fffa0b42-f65d-4394-a98c-0df038b9ed4b from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67899) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} [ 860.322394] env[67899]: DEBUG nova.compute.provider_tree [None req-4e823511-7455-43ab-874e-9acc64f4c5fe tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Updating inventory in ProviderTree for provider fffa0b42-f65d-4394-a98c-0df038b9ed4b with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67899) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 860.334227] env[67899]: DEBUG nova.scheduler.client.report [None req-4e823511-7455-43ab-874e-9acc64f4c5fe tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Refreshing aggregate associations for resource provider fffa0b42-f65d-4394-a98c-0df038b9ed4b, aggregates: None {{(pid=67899) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} [ 860.354670] env[67899]: DEBUG nova.scheduler.client.report [None req-4e823511-7455-43ab-874e-9acc64f4c5fe tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Refreshing trait associations for resource provider fffa0b42-f65d-4394-a98c-0df038b9ed4b, traits: COMPUTE_NODE,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SAME_HOST_COLD_MIGRATE,COMPUTE_IMAGE_TYPE_VMDK,COMPUTE_IMAGE_TYPE_ISO {{(pid=67899) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} [ 860.745021] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7e11e077-f11f-4f71-90c9-70dc42bbd41c {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 860.753678] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c7f3b3aa-2f10-4e7f-9ff9-4cba04c80877 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 860.783869] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0ac939a1-1a97-4850-a7eb-c1d1fff55ab1 {{(pid=67899) 
request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 860.791299] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-78125c38-d55e-4d9f-8222-19d6753d7e66 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 860.805049] env[67899]: DEBUG nova.compute.provider_tree [None req-4e823511-7455-43ab-874e-9acc64f4c5fe tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Inventory has not changed in ProviderTree for provider: fffa0b42-f65d-4394-a98c-0df038b9ed4b {{(pid=67899) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 860.815327] env[67899]: DEBUG nova.scheduler.client.report [None req-4e823511-7455-43ab-874e-9acc64f4c5fe tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Inventory has not changed for provider fffa0b42-f65d-4394-a98c-0df038b9ed4b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67899) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 860.834367] env[67899]: DEBUG oslo_concurrency.lockutils [None req-4e823511-7455-43ab-874e-9acc64f4c5fe tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.730s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 860.835186] env[67899]: ERROR nova.compute.manager [None req-4e823511-7455-43ab-874e-9acc64f4c5fe tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 195a4a1e-3da7-4a69-a679-869346368195] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 860.835186] env[67899]: Faults: ['InvalidArgument'] [ 860.835186] env[67899]: ERROR nova.compute.manager [instance: 195a4a1e-3da7-4a69-a679-869346368195] Traceback (most recent call last): [ 860.835186] env[67899]: ERROR nova.compute.manager [instance: 195a4a1e-3da7-4a69-a679-869346368195] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 860.835186] env[67899]: ERROR nova.compute.manager [instance: 195a4a1e-3da7-4a69-a679-869346368195] self.driver.spawn(context, instance, image_meta, [ 860.835186] env[67899]: ERROR nova.compute.manager [instance: 195a4a1e-3da7-4a69-a679-869346368195] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 860.835186] env[67899]: ERROR nova.compute.manager [instance: 195a4a1e-3da7-4a69-a679-869346368195] self._vmops.spawn(context, instance, image_meta, injected_files, [ 860.835186] env[67899]: ERROR nova.compute.manager [instance: 195a4a1e-3da7-4a69-a679-869346368195] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 860.835186] env[67899]: ERROR nova.compute.manager [instance: 195a4a1e-3da7-4a69-a679-869346368195] self._fetch_image_if_missing(context, vi) [ 860.835186] env[67899]: ERROR nova.compute.manager [instance: 195a4a1e-3da7-4a69-a679-869346368195] File 
"/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 860.835186] env[67899]: ERROR nova.compute.manager [instance: 195a4a1e-3da7-4a69-a679-869346368195] image_cache(vi, tmp_image_ds_loc) [ 860.835186] env[67899]: ERROR nova.compute.manager [instance: 195a4a1e-3da7-4a69-a679-869346368195] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 860.835186] env[67899]: ERROR nova.compute.manager [instance: 195a4a1e-3da7-4a69-a679-869346368195] vm_util.copy_virtual_disk( [ 860.835186] env[67899]: ERROR nova.compute.manager [instance: 195a4a1e-3da7-4a69-a679-869346368195] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 860.835186] env[67899]: ERROR nova.compute.manager [instance: 195a4a1e-3da7-4a69-a679-869346368195] session._wait_for_task(vmdk_copy_task) [ 860.835186] env[67899]: ERROR nova.compute.manager [instance: 195a4a1e-3da7-4a69-a679-869346368195] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 860.835186] env[67899]: ERROR nova.compute.manager [instance: 195a4a1e-3da7-4a69-a679-869346368195] return self.wait_for_task(task_ref) [ 860.835186] env[67899]: ERROR nova.compute.manager [instance: 195a4a1e-3da7-4a69-a679-869346368195] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 860.835186] env[67899]: ERROR nova.compute.manager [instance: 195a4a1e-3da7-4a69-a679-869346368195] return evt.wait() [ 860.835186] env[67899]: ERROR nova.compute.manager [instance: 195a4a1e-3da7-4a69-a679-869346368195] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 860.835186] env[67899]: ERROR nova.compute.manager [instance: 195a4a1e-3da7-4a69-a679-869346368195] result = hub.switch() [ 860.835186] env[67899]: ERROR nova.compute.manager [instance: 195a4a1e-3da7-4a69-a679-869346368195] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 860.835186] env[67899]: ERROR nova.compute.manager [instance: 195a4a1e-3da7-4a69-a679-869346368195] return self.greenlet.switch() [ 860.835186] env[67899]: ERROR nova.compute.manager [instance: 195a4a1e-3da7-4a69-a679-869346368195] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 860.835186] env[67899]: ERROR nova.compute.manager [instance: 195a4a1e-3da7-4a69-a679-869346368195] self.f(*self.args, **self.kw) [ 860.835186] env[67899]: ERROR nova.compute.manager [instance: 195a4a1e-3da7-4a69-a679-869346368195] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 860.835186] env[67899]: ERROR nova.compute.manager [instance: 195a4a1e-3da7-4a69-a679-869346368195] raise exceptions.translate_fault(task_info.error) [ 860.835186] env[67899]: ERROR nova.compute.manager [instance: 195a4a1e-3da7-4a69-a679-869346368195] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 860.835186] env[67899]: ERROR nova.compute.manager [instance: 195a4a1e-3da7-4a69-a679-869346368195] Faults: ['InvalidArgument'] [ 860.835186] env[67899]: ERROR nova.compute.manager [instance: 195a4a1e-3da7-4a69-a679-869346368195] [ 860.836095] env[67899]: DEBUG nova.compute.utils [None req-4e823511-7455-43ab-874e-9acc64f4c5fe tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 195a4a1e-3da7-4a69-a679-869346368195] VimFaultException {{(pid=67899) 
notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 860.837329] env[67899]: DEBUG nova.compute.manager [None req-4e823511-7455-43ab-874e-9acc64f4c5fe tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 195a4a1e-3da7-4a69-a679-869346368195] Build of instance 195a4a1e-3da7-4a69-a679-869346368195 was re-scheduled: A specified parameter was not correct: fileType [ 860.837329] env[67899]: Faults: ['InvalidArgument'] {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 860.837700] env[67899]: DEBUG nova.compute.manager [None req-4e823511-7455-43ab-874e-9acc64f4c5fe tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 195a4a1e-3da7-4a69-a679-869346368195] Unplugging VIFs for instance {{(pid=67899) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 860.837871] env[67899]: DEBUG nova.compute.manager [None req-4e823511-7455-43ab-874e-9acc64f4c5fe tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=67899) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 860.838139] env[67899]: DEBUG nova.compute.manager [None req-4e823511-7455-43ab-874e-9acc64f4c5fe tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 195a4a1e-3da7-4a69-a679-869346368195] Deallocating network for instance {{(pid=67899) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 860.838371] env[67899]: DEBUG nova.network.neutron [None req-4e823511-7455-43ab-874e-9acc64f4c5fe tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 195a4a1e-3da7-4a69-a679-869346368195] deallocate_for_instance() {{(pid=67899) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 861.067417] env[67899]: DEBUG oslo_concurrency.lockutils [None req-240bd615-613d-43bb-8393-08313ba6e663 tempest-ServerActionsTestJSON-1261190421 tempest-ServerActionsTestJSON-1261190421-project-member] Acquiring lock "cce79170-e329-4d7a-ab2d-fa6605068897" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 861.067646] env[67899]: DEBUG oslo_concurrency.lockutils [None req-240bd615-613d-43bb-8393-08313ba6e663 tempest-ServerActionsTestJSON-1261190421 tempest-ServerActionsTestJSON-1261190421-project-member] Lock "cce79170-e329-4d7a-ab2d-fa6605068897" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 861.302388] env[67899]: DEBUG nova.network.neutron [None req-4e823511-7455-43ab-874e-9acc64f4c5fe tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 195a4a1e-3da7-4a69-a679-869346368195] Updating instance_info_cache with network_info: [] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 861.324899] env[67899]: INFO nova.compute.manager [None req-4e823511-7455-43ab-874e-9acc64f4c5fe tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 195a4a1e-3da7-4a69-a679-869346368195] Took 0.49 seconds to 
deallocate network for instance. [ 861.444439] env[67899]: INFO nova.scheduler.client.report [None req-4e823511-7455-43ab-874e-9acc64f4c5fe tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Deleted allocations for instance 195a4a1e-3da7-4a69-a679-869346368195 [ 861.482738] env[67899]: DEBUG oslo_concurrency.lockutils [None req-4e823511-7455-43ab-874e-9acc64f4c5fe tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Lock "195a4a1e-3da7-4a69-a679-869346368195" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 258.485s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 861.483556] env[67899]: DEBUG oslo_concurrency.lockutils [None req-ed903cc6-3f3e-4b61-9c3e-0ca7ed507b17 tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Lock "195a4a1e-3da7-4a69-a679-869346368195" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 59.176s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 861.483786] env[67899]: DEBUG oslo_concurrency.lockutils [None req-ed903cc6-3f3e-4b61-9c3e-0ca7ed507b17 tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Acquiring lock "195a4a1e-3da7-4a69-a679-869346368195-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 861.483993] env[67899]: DEBUG oslo_concurrency.lockutils [None req-ed903cc6-3f3e-4b61-9c3e-0ca7ed507b17 tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Lock "195a4a1e-3da7-4a69-a679-869346368195-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 861.484174] env[67899]: DEBUG oslo_concurrency.lockutils [None req-ed903cc6-3f3e-4b61-9c3e-0ca7ed507b17 tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Lock "195a4a1e-3da7-4a69-a679-869346368195-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 861.486761] env[67899]: INFO nova.compute.manager [None req-ed903cc6-3f3e-4b61-9c3e-0ca7ed507b17 tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 195a4a1e-3da7-4a69-a679-869346368195] Terminating instance [ 861.487896] env[67899]: DEBUG nova.compute.manager [None req-ed903cc6-3f3e-4b61-9c3e-0ca7ed507b17 tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 195a4a1e-3da7-4a69-a679-869346368195] Start destroying the instance on the hypervisor. 
{{(pid=67899) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 861.488071] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-ed903cc6-3f3e-4b61-9c3e-0ca7ed507b17 tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 195a4a1e-3da7-4a69-a679-869346368195] Destroying instance {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 861.488584] env[67899]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-5ee3744f-3601-4e8a-95d3-eecf98b03267 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 861.500780] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fb0fa5a8-d435-40ee-a463-44655178251c {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 861.513616] env[67899]: DEBUG nova.compute.manager [None req-095428a0-9722-49dd-850d-4963abda9ddc tempest-ServerDiagnosticsTest-1813722271 tempest-ServerDiagnosticsTest-1813722271-project-member] [instance: 55dfe829-2e96-40d7-bef8-8e7556cbdab3] Starting instance... {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 861.535026] env[67899]: WARNING nova.virt.vmwareapi.vmops [None req-ed903cc6-3f3e-4b61-9c3e-0ca7ed507b17 tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 195a4a1e-3da7-4a69-a679-869346368195] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 195a4a1e-3da7-4a69-a679-869346368195 could not be found. [ 861.535026] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-ed903cc6-3f3e-4b61-9c3e-0ca7ed507b17 tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 195a4a1e-3da7-4a69-a679-869346368195] Instance destroyed {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 861.535462] env[67899]: INFO nova.compute.manager [None req-ed903cc6-3f3e-4b61-9c3e-0ca7ed507b17 tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 195a4a1e-3da7-4a69-a679-869346368195] Took 0.05 seconds to destroy the instance on the hypervisor. [ 861.535912] env[67899]: DEBUG oslo.service.loopingcall [None req-ed903cc6-3f3e-4b61-9c3e-0ca7ed507b17 tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=67899) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 861.536221] env[67899]: DEBUG nova.compute.manager [-] [instance: 195a4a1e-3da7-4a69-a679-869346368195] Deallocating network for instance {{(pid=67899) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 861.536493] env[67899]: DEBUG nova.network.neutron [-] [instance: 195a4a1e-3da7-4a69-a679-869346368195] deallocate_for_instance() {{(pid=67899) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 861.543180] env[67899]: DEBUG nova.compute.manager [None req-095428a0-9722-49dd-850d-4963abda9ddc tempest-ServerDiagnosticsTest-1813722271 tempest-ServerDiagnosticsTest-1813722271-project-member] [instance: 55dfe829-2e96-40d7-bef8-8e7556cbdab3] Instance disappeared before build. 
{{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 861.569181] env[67899]: DEBUG nova.network.neutron [-] [instance: 195a4a1e-3da7-4a69-a679-869346368195] Updating instance_info_cache with network_info: [] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 861.577639] env[67899]: INFO nova.compute.manager [-] [instance: 195a4a1e-3da7-4a69-a679-869346368195] Took 0.04 seconds to deallocate network for instance. [ 861.578796] env[67899]: DEBUG oslo_concurrency.lockutils [None req-095428a0-9722-49dd-850d-4963abda9ddc tempest-ServerDiagnosticsTest-1813722271 tempest-ServerDiagnosticsTest-1813722271-project-member] Lock "55dfe829-2e96-40d7-bef8-8e7556cbdab3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 212.075s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 861.591147] env[67899]: DEBUG nova.compute.manager [None req-0bc4c174-e0ab-457c-892d-224b06a89f6b tempest-ServersTestBootFromVolume-1946479315 tempest-ServersTestBootFromVolume-1946479315-project-member] [instance: f935bfef-3ca7-41fc-89be-c5c4e070a401] Starting instance... {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 861.619816] env[67899]: DEBUG nova.compute.manager [None req-0bc4c174-e0ab-457c-892d-224b06a89f6b tempest-ServersTestBootFromVolume-1946479315 tempest-ServersTestBootFromVolume-1946479315-project-member] [instance: f935bfef-3ca7-41fc-89be-c5c4e070a401] Instance disappeared before build. {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 861.645051] env[67899]: DEBUG oslo_concurrency.lockutils [None req-0bc4c174-e0ab-457c-892d-224b06a89f6b tempest-ServersTestBootFromVolume-1946479315 tempest-ServersTestBootFromVolume-1946479315-project-member] Lock "f935bfef-3ca7-41fc-89be-c5c4e070a401" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 211.380s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 861.655513] env[67899]: DEBUG nova.compute.manager [None req-2686c5c7-1784-4988-a00b-1b0654ffb429 tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] [instance: 66d3ec66-244d-4ffa-bd6f-7067f8955e67] Starting instance... {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 861.692018] env[67899]: DEBUG nova.compute.manager [None req-2686c5c7-1784-4988-a00b-1b0654ffb429 tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] [instance: 66d3ec66-244d-4ffa-bd6f-7067f8955e67] Instance disappeared before build. 
{{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 861.714225] env[67899]: DEBUG oslo_concurrency.lockutils [None req-ed903cc6-3f3e-4b61-9c3e-0ca7ed507b17 tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Lock "195a4a1e-3da7-4a69-a679-869346368195" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.231s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 861.722156] env[67899]: DEBUG oslo_concurrency.lockutils [None req-2686c5c7-1784-4988-a00b-1b0654ffb429 tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] Lock "66d3ec66-244d-4ffa-bd6f-7067f8955e67" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 211.444s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 861.731600] env[67899]: DEBUG nova.compute.manager [None req-6c2c8409-20d5-4bd8-9447-eb20abb6685f tempest-ServersV294TestFqdnHostnames-1781220341 tempest-ServersV294TestFqdnHostnames-1781220341-project-member] [instance: bbed830d-aa53-4ea4-93f8-d4b198a333cd] Starting instance... {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 861.759523] env[67899]: DEBUG nova.compute.manager [None req-6c2c8409-20d5-4bd8-9447-eb20abb6685f tempest-ServersV294TestFqdnHostnames-1781220341 tempest-ServersV294TestFqdnHostnames-1781220341-project-member] [instance: bbed830d-aa53-4ea4-93f8-d4b198a333cd] Instance disappeared before build. {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 861.784225] env[67899]: DEBUG oslo_concurrency.lockutils [None req-6c2c8409-20d5-4bd8-9447-eb20abb6685f tempest-ServersV294TestFqdnHostnames-1781220341 tempest-ServersV294TestFqdnHostnames-1781220341-project-member] Lock "bbed830d-aa53-4ea4-93f8-d4b198a333cd" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 209.740s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 861.793820] env[67899]: DEBUG nova.compute.manager [None req-88fcedb6-0a11-495c-89af-f775406e92ad tempest-ServerShowV254Test-945843174 tempest-ServerShowV254Test-945843174-project-member] [instance: 786676eb-ac36-48f6-874a-ab1ca15f2a9a] Starting instance... {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 861.825327] env[67899]: DEBUG nova.compute.manager [None req-88fcedb6-0a11-495c-89af-f775406e92ad tempest-ServerShowV254Test-945843174 tempest-ServerShowV254Test-945843174-project-member] [instance: 786676eb-ac36-48f6-874a-ab1ca15f2a9a] Instance disappeared before build. 
{{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 861.852467] env[67899]: DEBUG oslo_concurrency.lockutils [None req-88fcedb6-0a11-495c-89af-f775406e92ad tempest-ServerShowV254Test-945843174 tempest-ServerShowV254Test-945843174-project-member] Lock "786676eb-ac36-48f6-874a-ab1ca15f2a9a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 206.061s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 861.865352] env[67899]: DEBUG nova.compute.manager [None req-84b4cd7b-730e-4d10-a439-50bc81630706 tempest-ServersTestManualDisk-499941049 tempest-ServersTestManualDisk-499941049-project-member] [instance: 4cd5b80b-d1f4-4142-83fb-235523464667] Starting instance... {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 861.890456] env[67899]: DEBUG nova.compute.manager [None req-84b4cd7b-730e-4d10-a439-50bc81630706 tempest-ServersTestManualDisk-499941049 tempest-ServersTestManualDisk-499941049-project-member] [instance: 4cd5b80b-d1f4-4142-83fb-235523464667] Instance disappeared before build. {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 861.922447] env[67899]: DEBUG oslo_concurrency.lockutils [None req-84b4cd7b-730e-4d10-a439-50bc81630706 tempest-ServersTestManualDisk-499941049 tempest-ServersTestManualDisk-499941049-project-member] Lock "4cd5b80b-d1f4-4142-83fb-235523464667" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 205.409s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 861.935660] env[67899]: DEBUG nova.compute.manager [None req-97a39c26-6bb7-4c5d-8d9f-30d1a9b2c784 tempest-ServerDiagnosticsV248Test-725047975 tempest-ServerDiagnosticsV248Test-725047975-project-member] [instance: 1589188b-8540-4afd-8050-ab47633593c0] Starting instance... {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 861.962534] env[67899]: DEBUG nova.compute.manager [None req-97a39c26-6bb7-4c5d-8d9f-30d1a9b2c784 tempest-ServerDiagnosticsV248Test-725047975 tempest-ServerDiagnosticsV248Test-725047975-project-member] [instance: 1589188b-8540-4afd-8050-ab47633593c0] Instance disappeared before build. {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 861.987049] env[67899]: DEBUG oslo_concurrency.lockutils [None req-97a39c26-6bb7-4c5d-8d9f-30d1a9b2c784 tempest-ServerDiagnosticsV248Test-725047975 tempest-ServerDiagnosticsV248Test-725047975-project-member] Lock "1589188b-8540-4afd-8050-ab47633593c0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 204.171s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 861.997196] env[67899]: DEBUG nova.compute.manager [None req-8927c821-393f-439d-a9bd-800d953f5117 tempest-ServerGroupTestJSON-2086428554 tempest-ServerGroupTestJSON-2086428554-project-member] [instance: de1d5572-b82d-4edc-9c7e-a7e26c45a090] Starting instance... 
{{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 862.025479] env[67899]: DEBUG nova.compute.manager [None req-8927c821-393f-439d-a9bd-800d953f5117 tempest-ServerGroupTestJSON-2086428554 tempest-ServerGroupTestJSON-2086428554-project-member] [instance: de1d5572-b82d-4edc-9c7e-a7e26c45a090] Instance disappeared before build. {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 862.053750] env[67899]: DEBUG oslo_concurrency.lockutils [None req-8927c821-393f-439d-a9bd-800d953f5117 tempest-ServerGroupTestJSON-2086428554 tempest-ServerGroupTestJSON-2086428554-project-member] Lock "de1d5572-b82d-4edc-9c7e-a7e26c45a090" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 200.652s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 862.063319] env[67899]: DEBUG nova.compute.manager [None req-7b088029-0146-4c76-a894-7f015ede2f7e tempest-ServersTestFqdnHostnames-61628670 tempest-ServersTestFqdnHostnames-61628670-project-member] [instance: 8d2a9e20-82d3-44cf-a725-59804debe1cc] Starting instance... {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 862.132084] env[67899]: DEBUG oslo_concurrency.lockutils [None req-7b088029-0146-4c76-a894-7f015ede2f7e tempest-ServersTestFqdnHostnames-61628670 tempest-ServersTestFqdnHostnames-61628670-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 862.132367] env[67899]: DEBUG oslo_concurrency.lockutils [None req-7b088029-0146-4c76-a894-7f015ede2f7e tempest-ServersTestFqdnHostnames-61628670 tempest-ServersTestFqdnHostnames-61628670-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 862.133978] env[67899]: INFO nova.compute.claims [None req-7b088029-0146-4c76-a894-7f015ede2f7e tempest-ServersTestFqdnHostnames-61628670 tempest-ServersTestFqdnHostnames-61628670-project-member] [instance: 8d2a9e20-82d3-44cf-a725-59804debe1cc] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 862.648020] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d81d3dfe-339d-4762-b5e8-ed4e59a9cb01 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 862.655726] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d399836a-5dd0-4952-b7be-3238bf5e7a98 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 862.696762] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5a0d1f66-18f5-4a62-94f5-1c91c9bc468c {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 862.704254] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fb777874-b821-4544-bd96-714fabcc6adc {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 862.718127] 
env[67899]: DEBUG nova.compute.provider_tree [None req-7b088029-0146-4c76-a894-7f015ede2f7e tempest-ServersTestFqdnHostnames-61628670 tempest-ServersTestFqdnHostnames-61628670-project-member] Inventory has not changed in ProviderTree for provider: fffa0b42-f65d-4394-a98c-0df038b9ed4b {{(pid=67899) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 862.727441] env[67899]: DEBUG nova.scheduler.client.report [None req-7b088029-0146-4c76-a894-7f015ede2f7e tempest-ServersTestFqdnHostnames-61628670 tempest-ServersTestFqdnHostnames-61628670-project-member] Inventory has not changed for provider fffa0b42-f65d-4394-a98c-0df038b9ed4b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67899) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 862.741458] env[67899]: DEBUG oslo_concurrency.lockutils [None req-7b088029-0146-4c76-a894-7f015ede2f7e tempest-ServersTestFqdnHostnames-61628670 tempest-ServersTestFqdnHostnames-61628670-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.609s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 862.742021] env[67899]: DEBUG nova.compute.manager [None req-7b088029-0146-4c76-a894-7f015ede2f7e tempest-ServersTestFqdnHostnames-61628670 tempest-ServersTestFqdnHostnames-61628670-project-member] [instance: 8d2a9e20-82d3-44cf-a725-59804debe1cc] Start building networks asynchronously for instance. {{(pid=67899) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 862.783804] env[67899]: DEBUG nova.compute.utils [None req-7b088029-0146-4c76-a894-7f015ede2f7e tempest-ServersTestFqdnHostnames-61628670 tempest-ServersTestFqdnHostnames-61628670-project-member] Using /dev/sd instead of None {{(pid=67899) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 862.786260] env[67899]: DEBUG nova.compute.manager [None req-7b088029-0146-4c76-a894-7f015ede2f7e tempest-ServersTestFqdnHostnames-61628670 tempest-ServersTestFqdnHostnames-61628670-project-member] [instance: 8d2a9e20-82d3-44cf-a725-59804debe1cc] Allocating IP information in the background. {{(pid=67899) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 862.786455] env[67899]: DEBUG nova.network.neutron [None req-7b088029-0146-4c76-a894-7f015ede2f7e tempest-ServersTestFqdnHostnames-61628670 tempest-ServersTestFqdnHostnames-61628670-project-member] [instance: 8d2a9e20-82d3-44cf-a725-59804debe1cc] allocate_for_instance() {{(pid=67899) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 862.795150] env[67899]: DEBUG nova.compute.manager [None req-7b088029-0146-4c76-a894-7f015ede2f7e tempest-ServersTestFqdnHostnames-61628670 tempest-ServersTestFqdnHostnames-61628670-project-member] [instance: 8d2a9e20-82d3-44cf-a725-59804debe1cc] Start building block device mappings for instance. 
{{(pid=67899) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 862.875476] env[67899]: DEBUG nova.compute.manager [None req-7b088029-0146-4c76-a894-7f015ede2f7e tempest-ServersTestFqdnHostnames-61628670 tempest-ServersTestFqdnHostnames-61628670-project-member] [instance: 8d2a9e20-82d3-44cf-a725-59804debe1cc] Start spawning the instance on the hypervisor. {{(pid=67899) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 862.893250] env[67899]: DEBUG nova.policy [None req-7b088029-0146-4c76-a894-7f015ede2f7e tempest-ServersTestFqdnHostnames-61628670 tempest-ServersTestFqdnHostnames-61628670-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'f582973e918246b0951860c4e1798bbe', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '39f3f18b4d8d4f7286174fd7217b2d42', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67899) authorize /opt/stack/nova/nova/policy.py:203}} [ 862.899099] env[67899]: DEBUG nova.virt.hardware [None req-7b088029-0146-4c76-a894-7f015ede2f7e tempest-ServersTestFqdnHostnames-61628670 tempest-ServersTestFqdnHostnames-61628670-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T22:07:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T22:07:13Z,direct_url=,disk_format='vmdk',id=c655a05a-4a40-4b3f-b609-3ba8116ad90f,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c1aaa2970e964d7b86557399120d12c1',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T22:07:14Z,virtual_size=,visibility=), allow threads: False {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 862.899477] env[67899]: DEBUG nova.virt.hardware [None req-7b088029-0146-4c76-a894-7f015ede2f7e tempest-ServersTestFqdnHostnames-61628670 tempest-ServersTestFqdnHostnames-61628670-project-member] Flavor limits 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 862.899674] env[67899]: DEBUG nova.virt.hardware [None req-7b088029-0146-4c76-a894-7f015ede2f7e tempest-ServersTestFqdnHostnames-61628670 tempest-ServersTestFqdnHostnames-61628670-project-member] Image limits 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 862.899864] env[67899]: DEBUG nova.virt.hardware [None req-7b088029-0146-4c76-a894-7f015ede2f7e tempest-ServersTestFqdnHostnames-61628670 tempest-ServersTestFqdnHostnames-61628670-project-member] Flavor pref 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 862.900158] env[67899]: DEBUG nova.virt.hardware [None req-7b088029-0146-4c76-a894-7f015ede2f7e tempest-ServersTestFqdnHostnames-61628670 tempest-ServersTestFqdnHostnames-61628670-project-member] Image pref 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 862.900313] env[67899]: DEBUG nova.virt.hardware 
[None req-7b088029-0146-4c76-a894-7f015ede2f7e tempest-ServersTestFqdnHostnames-61628670 tempest-ServersTestFqdnHostnames-61628670-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 862.900591] env[67899]: DEBUG nova.virt.hardware [None req-7b088029-0146-4c76-a894-7f015ede2f7e tempest-ServersTestFqdnHostnames-61628670 tempest-ServersTestFqdnHostnames-61628670-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 862.900696] env[67899]: DEBUG nova.virt.hardware [None req-7b088029-0146-4c76-a894-7f015ede2f7e tempest-ServersTestFqdnHostnames-61628670 tempest-ServersTestFqdnHostnames-61628670-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67899) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 862.900921] env[67899]: DEBUG nova.virt.hardware [None req-7b088029-0146-4c76-a894-7f015ede2f7e tempest-ServersTestFqdnHostnames-61628670 tempest-ServersTestFqdnHostnames-61628670-project-member] Got 1 possible topologies {{(pid=67899) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 862.901105] env[67899]: DEBUG nova.virt.hardware [None req-7b088029-0146-4c76-a894-7f015ede2f7e tempest-ServersTestFqdnHostnames-61628670 tempest-ServersTestFqdnHostnames-61628670-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 862.901283] env[67899]: DEBUG nova.virt.hardware [None req-7b088029-0146-4c76-a894-7f015ede2f7e tempest-ServersTestFqdnHostnames-61628670 tempest-ServersTestFqdnHostnames-61628670-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 862.902160] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-05f01311-bb67-4316-b264-d5e0e14d90f4 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 862.912168] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-05c33cfc-52f7-4878-b8fc-724bf0874435 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 863.413221] env[67899]: DEBUG oslo_concurrency.lockutils [None req-b670e61a-c4b6-468d-bb53-a643f6c18318 tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Acquiring lock "4bd3cb98-1745-4c1a-8670-9849f70eb554" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 863.413516] env[67899]: DEBUG oslo_concurrency.lockutils [None req-b670e61a-c4b6-468d-bb53-a643f6c18318 tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Lock "4bd3cb98-1745-4c1a-8670-9849f70eb554" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 863.686509] 
env[67899]: DEBUG nova.network.neutron [None req-7b088029-0146-4c76-a894-7f015ede2f7e tempest-ServersTestFqdnHostnames-61628670 tempest-ServersTestFqdnHostnames-61628670-project-member] [instance: 8d2a9e20-82d3-44cf-a725-59804debe1cc] Successfully created port: 826c9830-4a37-420b-9521-536d1253acea {{(pid=67899) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 864.815839] env[67899]: DEBUG nova.network.neutron [None req-7b088029-0146-4c76-a894-7f015ede2f7e tempest-ServersTestFqdnHostnames-61628670 tempest-ServersTestFqdnHostnames-61628670-project-member] [instance: 8d2a9e20-82d3-44cf-a725-59804debe1cc] Successfully updated port: 826c9830-4a37-420b-9521-536d1253acea {{(pid=67899) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 864.833570] env[67899]: DEBUG oslo_concurrency.lockutils [None req-7b088029-0146-4c76-a894-7f015ede2f7e tempest-ServersTestFqdnHostnames-61628670 tempest-ServersTestFqdnHostnames-61628670-project-member] Acquiring lock "refresh_cache-8d2a9e20-82d3-44cf-a725-59804debe1cc" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 864.833737] env[67899]: DEBUG oslo_concurrency.lockutils [None req-7b088029-0146-4c76-a894-7f015ede2f7e tempest-ServersTestFqdnHostnames-61628670 tempest-ServersTestFqdnHostnames-61628670-project-member] Acquired lock "refresh_cache-8d2a9e20-82d3-44cf-a725-59804debe1cc" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 864.833900] env[67899]: DEBUG nova.network.neutron [None req-7b088029-0146-4c76-a894-7f015ede2f7e tempest-ServersTestFqdnHostnames-61628670 tempest-ServersTestFqdnHostnames-61628670-project-member] [instance: 8d2a9e20-82d3-44cf-a725-59804debe1cc] Building network info cache for instance {{(pid=67899) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 864.905154] env[67899]: DEBUG nova.network.neutron [None req-7b088029-0146-4c76-a894-7f015ede2f7e tempest-ServersTestFqdnHostnames-61628670 tempest-ServersTestFqdnHostnames-61628670-project-member] [instance: 8d2a9e20-82d3-44cf-a725-59804debe1cc] Instance cache missing network info. 
{{(pid=67899) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 865.238926] env[67899]: DEBUG nova.network.neutron [None req-7b088029-0146-4c76-a894-7f015ede2f7e tempest-ServersTestFqdnHostnames-61628670 tempest-ServersTestFqdnHostnames-61628670-project-member] [instance: 8d2a9e20-82d3-44cf-a725-59804debe1cc] Updating instance_info_cache with network_info: [{"id": "826c9830-4a37-420b-9521-536d1253acea", "address": "fa:16:3e:b8:fc:61", "network": {"id": "00f96a72-5051-4a3d-8e9c-1704f9265df2", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-2128137148-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "39f3f18b4d8d4f7286174fd7217b2d42", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "d646f9d5-d2ad-4c22-bea5-85a965334de6", "external-id": "nsx-vlan-transportzone-606", "segmentation_id": 606, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap826c9830-4a", "ovs_interfaceid": "826c9830-4a37-420b-9521-536d1253acea", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 865.256864] env[67899]: DEBUG oslo_concurrency.lockutils [None req-7b088029-0146-4c76-a894-7f015ede2f7e tempest-ServersTestFqdnHostnames-61628670 tempest-ServersTestFqdnHostnames-61628670-project-member] Releasing lock "refresh_cache-8d2a9e20-82d3-44cf-a725-59804debe1cc" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 865.257818] env[67899]: DEBUG nova.compute.manager [None req-7b088029-0146-4c76-a894-7f015ede2f7e tempest-ServersTestFqdnHostnames-61628670 tempest-ServersTestFqdnHostnames-61628670-project-member] [instance: 8d2a9e20-82d3-44cf-a725-59804debe1cc] Instance network_info: |[{"id": "826c9830-4a37-420b-9521-536d1253acea", "address": "fa:16:3e:b8:fc:61", "network": {"id": "00f96a72-5051-4a3d-8e9c-1704f9265df2", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-2128137148-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "39f3f18b4d8d4f7286174fd7217b2d42", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "d646f9d5-d2ad-4c22-bea5-85a965334de6", "external-id": "nsx-vlan-transportzone-606", "segmentation_id": 606, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap826c9830-4a", "ovs_interfaceid": "826c9830-4a37-420b-9521-536d1253acea", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67899) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1971}} [ 865.258515] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-7b088029-0146-4c76-a894-7f015ede2f7e tempest-ServersTestFqdnHostnames-61628670 tempest-ServersTestFqdnHostnames-61628670-project-member] [instance: 8d2a9e20-82d3-44cf-a725-59804debe1cc] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:b8:fc:61', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'd646f9d5-d2ad-4c22-bea5-85a965334de6', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '826c9830-4a37-420b-9521-536d1253acea', 'vif_model': 'vmxnet3'}] {{(pid=67899) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 865.269419] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [None req-7b088029-0146-4c76-a894-7f015ede2f7e tempest-ServersTestFqdnHostnames-61628670 tempest-ServersTestFqdnHostnames-61628670-project-member] Creating folder: Project (39f3f18b4d8d4f7286174fd7217b2d42). Parent ref: group-v692900. {{(pid=67899) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 865.270196] env[67899]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-28b01574-32e9-405a-9df6-9da1120abfdd {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 865.283740] env[67899]: INFO nova.virt.vmwareapi.vm_util [None req-7b088029-0146-4c76-a894-7f015ede2f7e tempest-ServersTestFqdnHostnames-61628670 tempest-ServersTestFqdnHostnames-61628670-project-member] Created folder: Project (39f3f18b4d8d4f7286174fd7217b2d42) in parent group-v692900. [ 865.283740] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [None req-7b088029-0146-4c76-a894-7f015ede2f7e tempest-ServersTestFqdnHostnames-61628670 tempest-ServersTestFqdnHostnames-61628670-project-member] Creating folder: Instances. Parent ref: group-v692947. {{(pid=67899) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 865.283740] env[67899]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-eb7066e7-a229-481e-b8f0-6bafaca568da {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 865.292783] env[67899]: INFO nova.virt.vmwareapi.vm_util [None req-7b088029-0146-4c76-a894-7f015ede2f7e tempest-ServersTestFqdnHostnames-61628670 tempest-ServersTestFqdnHostnames-61628670-project-member] Created folder: Instances in parent group-v692947. [ 865.292783] env[67899]: DEBUG oslo.service.loopingcall [None req-7b088029-0146-4c76-a894-7f015ede2f7e tempest-ServersTestFqdnHostnames-61628670 tempest-ServersTestFqdnHostnames-61628670-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67899) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 865.292783] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 8d2a9e20-82d3-44cf-a725-59804debe1cc] Creating VM on the ESX host {{(pid=67899) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 865.292783] env[67899]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-0bbfda65-c0a5-4ad4-8aff-7990bee19180 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 865.318990] env[67899]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 865.318990] env[67899]: value = "task-3467883" [ 865.318990] env[67899]: _type = "Task" [ 865.318990] env[67899]: } to complete. 
{{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 865.327326] env[67899]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467883, 'name': CreateVM_Task} progress is 0%. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 865.398084] env[67899]: DEBUG oslo_concurrency.lockutils [None req-6dd98723-db00-4d66-a748-819e3c251c34 tempest-ServersTestFqdnHostnames-61628670 tempest-ServersTestFqdnHostnames-61628670-project-member] Acquiring lock "8d2a9e20-82d3-44cf-a725-59804debe1cc" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 865.415613] env[67899]: DEBUG nova.compute.manager [req-272e9c79-becc-423f-b83f-4b25b53301b0 req-56d380d6-503a-401a-94a0-ee66807e03af service nova] [instance: 8d2a9e20-82d3-44cf-a725-59804debe1cc] Received event network-vif-plugged-826c9830-4a37-420b-9521-536d1253acea {{(pid=67899) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 865.415848] env[67899]: DEBUG oslo_concurrency.lockutils [req-272e9c79-becc-423f-b83f-4b25b53301b0 req-56d380d6-503a-401a-94a0-ee66807e03af service nova] Acquiring lock "8d2a9e20-82d3-44cf-a725-59804debe1cc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 865.416062] env[67899]: DEBUG oslo_concurrency.lockutils [req-272e9c79-becc-423f-b83f-4b25b53301b0 req-56d380d6-503a-401a-94a0-ee66807e03af service nova] Lock "8d2a9e20-82d3-44cf-a725-59804debe1cc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 865.416232] env[67899]: DEBUG oslo_concurrency.lockutils [req-272e9c79-becc-423f-b83f-4b25b53301b0 req-56d380d6-503a-401a-94a0-ee66807e03af service nova] Lock "8d2a9e20-82d3-44cf-a725-59804debe1cc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 865.416466] env[67899]: DEBUG nova.compute.manager [req-272e9c79-becc-423f-b83f-4b25b53301b0 req-56d380d6-503a-401a-94a0-ee66807e03af service nova] [instance: 8d2a9e20-82d3-44cf-a725-59804debe1cc] No waiting events found dispatching network-vif-plugged-826c9830-4a37-420b-9521-536d1253acea {{(pid=67899) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 865.416554] env[67899]: WARNING nova.compute.manager [req-272e9c79-becc-423f-b83f-4b25b53301b0 req-56d380d6-503a-401a-94a0-ee66807e03af service nova] [instance: 8d2a9e20-82d3-44cf-a725-59804debe1cc] Received unexpected event network-vif-plugged-826c9830-4a37-420b-9521-536d1253acea for instance with vm_state building and task_state deleting. 
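
The lockutils lines above all follow one shape: an "Acquiring lock" entry, an "acquired ... waited Ns" entry once the lock is won, and a '"released" ... held Ns' entry on exit, keyed by the instance UUID or a resource name such as "compute_resources". The following is a minimal Python sketch of that bookkeeping pattern, assuming a simple in-process registry; the names (timed_lock, _locks) are illustrative and this is not oslo.concurrency's actual implementation.

import threading
import time
from contextlib import contextmanager

_locks = {}                      # lock name -> threading.Lock
_registry_guard = threading.Lock()

@contextmanager
def timed_lock(name, caller):
    # Look up (or create) the named lock under a registry guard.
    with _registry_guard:
        lock = _locks.setdefault(name, threading.Lock())
    print(f'Acquiring lock "{name}" by "{caller}"')
    t0 = time.monotonic()
    lock.acquire()
    t1 = time.monotonic()
    print(f'Lock "{name}" acquired by "{caller}" :: waited {t1 - t0:.3f}s')
    try:
        yield
    finally:
        lock.release()
        held = time.monotonic() - t1
        print(f'Lock "{name}" "released" by "{caller}" :: held {held:.3f}s')

# Usage, mirroring the per-instance build locks in the log (hypothetical call):
# with timed_lock("8d2a9e20-82d3-44cf-a725-59804debe1cc",
#                 "_locked_do_build_and_run_instance"):
#     build_and_run_instance(...)

The long "held 206.061s" readings earlier in the section are exactly what this held-time measurement surfaces: the lock spans the whole build attempt, so a slow or abandoned build shows up directly in the release entry.
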
[ 865.615907] env[67899]: DEBUG oslo_concurrency.lockutils [None req-de06d2c0-f51d-4ed2-a5ca-e6993e42b706 tempest-ServerTagsTestJSON-4885625 tempest-ServerTagsTestJSON-4885625-project-member] Acquiring lock "0bde0bc7-8f34-4941-85f0-44fe5c67e398" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 865.615907] env[67899]: DEBUG oslo_concurrency.lockutils [None req-de06d2c0-f51d-4ed2-a5ca-e6993e42b706 tempest-ServerTagsTestJSON-4885625 tempest-ServerTagsTestJSON-4885625-project-member] Lock "0bde0bc7-8f34-4941-85f0-44fe5c67e398" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 865.829787] env[67899]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467883, 'name': CreateVM_Task, 'duration_secs': 0.370636} completed successfully. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 865.830090] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 8d2a9e20-82d3-44cf-a725-59804debe1cc] Created VM on the ESX host {{(pid=67899) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 865.834877] env[67899]: DEBUG oslo_concurrency.lockutils [None req-7b088029-0146-4c76-a894-7f015ede2f7e tempest-ServersTestFqdnHostnames-61628670 tempest-ServersTestFqdnHostnames-61628670-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 865.834877] env[67899]: DEBUG oslo_concurrency.lockutils [None req-7b088029-0146-4c76-a894-7f015ede2f7e tempest-ServersTestFqdnHostnames-61628670 tempest-ServersTestFqdnHostnames-61628670-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 865.834877] env[67899]: DEBUG oslo_concurrency.lockutils [None req-7b088029-0146-4c76-a894-7f015ede2f7e tempest-ServersTestFqdnHostnames-61628670 tempest-ServersTestFqdnHostnames-61628670-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 865.835228] env[67899]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-36383748-8323-46d3-af0b-09ae8c511c0e {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 865.839904] env[67899]: DEBUG oslo_vmware.api [None req-7b088029-0146-4c76-a894-7f015ede2f7e tempest-ServersTestFqdnHostnames-61628670 tempest-ServersTestFqdnHostnames-61628670-project-member] Waiting for the task: (returnval){ [ 865.839904] env[67899]: value = "session[5298859c-589e-10f3-4f70-bf47a7ea371b]526ffae2-9c74-1a62-2617-82709aa991bc" [ 865.839904] env[67899]: _type = "Task" [ 865.839904] env[67899]: } to complete. 
{{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 865.847478] env[67899]: DEBUG oslo_vmware.api [None req-7b088029-0146-4c76-a894-7f015ede2f7e tempest-ServersTestFqdnHostnames-61628670 tempest-ServersTestFqdnHostnames-61628670-project-member] Task: {'id': session[5298859c-589e-10f3-4f70-bf47a7ea371b]526ffae2-9c74-1a62-2617-82709aa991bc, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 866.354661] env[67899]: DEBUG oslo_concurrency.lockutils [None req-7b088029-0146-4c76-a894-7f015ede2f7e tempest-ServersTestFqdnHostnames-61628670 tempest-ServersTestFqdnHostnames-61628670-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 866.355008] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-7b088029-0146-4c76-a894-7f015ede2f7e tempest-ServersTestFqdnHostnames-61628670 tempest-ServersTestFqdnHostnames-61628670-project-member] [instance: 8d2a9e20-82d3-44cf-a725-59804debe1cc] Processing image c655a05a-4a40-4b3f-b609-3ba8116ad90f {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 866.355155] env[67899]: DEBUG oslo_concurrency.lockutils [None req-7b088029-0146-4c76-a894-7f015ede2f7e tempest-ServersTestFqdnHostnames-61628670 tempest-ServersTestFqdnHostnames-61628670-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 867.551952] env[67899]: DEBUG nova.compute.manager [req-6f85493e-a990-4064-96ca-38175cd52ede req-ed657bda-f33c-4f09-ac43-bb38ab2dc0b6 service nova] [instance: 8d2a9e20-82d3-44cf-a725-59804debe1cc] Received event network-changed-826c9830-4a37-420b-9521-536d1253acea {{(pid=67899) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 867.552225] env[67899]: DEBUG nova.compute.manager [req-6f85493e-a990-4064-96ca-38175cd52ede req-ed657bda-f33c-4f09-ac43-bb38ab2dc0b6 service nova] [instance: 8d2a9e20-82d3-44cf-a725-59804debe1cc] Refreshing instance network info cache due to event network-changed-826c9830-4a37-420b-9521-536d1253acea. 
{{(pid=67899) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 867.552390] env[67899]: DEBUG oslo_concurrency.lockutils [req-6f85493e-a990-4064-96ca-38175cd52ede req-ed657bda-f33c-4f09-ac43-bb38ab2dc0b6 service nova] Acquiring lock "refresh_cache-8d2a9e20-82d3-44cf-a725-59804debe1cc" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 867.552556] env[67899]: DEBUG oslo_concurrency.lockutils [req-6f85493e-a990-4064-96ca-38175cd52ede req-ed657bda-f33c-4f09-ac43-bb38ab2dc0b6 service nova] Acquired lock "refresh_cache-8d2a9e20-82d3-44cf-a725-59804debe1cc" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 867.552711] env[67899]: DEBUG nova.network.neutron [req-6f85493e-a990-4064-96ca-38175cd52ede req-ed657bda-f33c-4f09-ac43-bb38ab2dc0b6 service nova] [instance: 8d2a9e20-82d3-44cf-a725-59804debe1cc] Refreshing network info cache for port 826c9830-4a37-420b-9521-536d1253acea {{(pid=67899) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 867.895705] env[67899]: DEBUG nova.network.neutron [req-6f85493e-a990-4064-96ca-38175cd52ede req-ed657bda-f33c-4f09-ac43-bb38ab2dc0b6 service nova] [instance: 8d2a9e20-82d3-44cf-a725-59804debe1cc] Updated VIF entry in instance network info cache for port 826c9830-4a37-420b-9521-536d1253acea. {{(pid=67899) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 867.895864] env[67899]: DEBUG nova.network.neutron [req-6f85493e-a990-4064-96ca-38175cd52ede req-ed657bda-f33c-4f09-ac43-bb38ab2dc0b6 service nova] [instance: 8d2a9e20-82d3-44cf-a725-59804debe1cc] Updating instance_info_cache with network_info: [{"id": "826c9830-4a37-420b-9521-536d1253acea", "address": "fa:16:3e:b8:fc:61", "network": {"id": "00f96a72-5051-4a3d-8e9c-1704f9265df2", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-2128137148-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "39f3f18b4d8d4f7286174fd7217b2d42", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "d646f9d5-d2ad-4c22-bea5-85a965334de6", "external-id": "nsx-vlan-transportzone-606", "segmentation_id": 606, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap826c9830-4a", "ovs_interfaceid": "826c9830-4a37-420b-9521-536d1253acea", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 867.908658] env[67899]: DEBUG oslo_concurrency.lockutils [req-6f85493e-a990-4064-96ca-38175cd52ede req-ed657bda-f33c-4f09-ac43-bb38ab2dc0b6 service nova] Releasing lock "refresh_cache-8d2a9e20-82d3-44cf-a725-59804debe1cc" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 868.624368] env[67899]: DEBUG oslo_concurrency.lockutils [None req-a6798107-f863-4122-88f3-719cf462c07a tempest-VolumesAdminNegativeTest-1718927231 tempest-VolumesAdminNegativeTest-1718927231-project-member] 
Acquiring lock "b79e6007-10ac-4afe-a666-edef64685b22" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 868.624727] env[67899]: DEBUG oslo_concurrency.lockutils [None req-a6798107-f863-4122-88f3-719cf462c07a tempest-VolumesAdminNegativeTest-1718927231 tempest-VolumesAdminNegativeTest-1718927231-project-member] Lock "b79e6007-10ac-4afe-a666-edef64685b22" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 872.465545] env[67899]: DEBUG oslo_concurrency.lockutils [None req-7ec9887f-5049-4efc-a5b7-b56947cc8fb8 tempest-AttachVolumeNegativeTest-1960469577 tempest-AttachVolumeNegativeTest-1960469577-project-member] Acquiring lock "b9143ce6-0592-4cff-a2a1-64874734b214" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 872.465861] env[67899]: DEBUG oslo_concurrency.lockutils [None req-7ec9887f-5049-4efc-a5b7-b56947cc8fb8 tempest-AttachVolumeNegativeTest-1960469577 tempest-AttachVolumeNegativeTest-1960469577-project-member] Lock "b9143ce6-0592-4cff-a2a1-64874734b214" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 886.559077] env[67899]: DEBUG oslo_concurrency.lockutils [None req-66ac858c-b6a2-41dc-8464-b4d137e56bbf tempest-ServersTestJSON-1082368249 tempest-ServersTestJSON-1082368249-project-member] Acquiring lock "868ae015-d365-4a42-8f5d-72faa796fa37" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 886.559077] env[67899]: DEBUG oslo_concurrency.lockutils [None req-66ac858c-b6a2-41dc-8464-b4d137e56bbf tempest-ServersTestJSON-1082368249 tempest-ServersTestJSON-1082368249-project-member] Lock "868ae015-d365-4a42-8f5d-72faa796fa37" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 887.679265] env[67899]: DEBUG oslo_concurrency.lockutils [None req-bd38433a-6510-4fab-ad5c-04a8ac1f2888 tempest-ServerRescueTestJSONUnderV235-832255538 tempest-ServerRescueTestJSONUnderV235-832255538-project-member] Acquiring lock "a31cf212-7d4e-4f1c-b494-6b9739b2ef95" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 887.679580] env[67899]: DEBUG oslo_concurrency.lockutils [None req-bd38433a-6510-4fab-ad5c-04a8ac1f2888 tempest-ServerRescueTestJSONUnderV235-832255538 tempest-ServerRescueTestJSONUnderV235-832255538-project-member] Lock "a31cf212-7d4e-4f1c-b494-6b9739b2ef95" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67899) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 896.602381] env[67899]: DEBUG oslo_concurrency.lockutils [None req-f87ab63a-06ea-4a52-b5fe-661a784a2b15 tempest-AttachVolumeShelveTestJSON-10446956 tempest-AttachVolumeShelveTestJSON-10446956-project-member] Acquiring lock "cb9d29cb-20ee-4875-b993-49cafed344d4" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 896.602381] env[67899]: DEBUG oslo_concurrency.lockutils [None req-f87ab63a-06ea-4a52-b5fe-661a784a2b15 tempest-AttachVolumeShelveTestJSON-10446956 tempest-AttachVolumeShelveTestJSON-10446956-project-member] Lock "cb9d29cb-20ee-4875-b993-49cafed344d4" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 900.316536] env[67899]: DEBUG oslo_concurrency.lockutils [None req-f35050ea-7d61-4666-8771-22e4ef0303e0 tempest-ServersNegativeTestMultiTenantJSON-1576151238 tempest-ServersNegativeTestMultiTenantJSON-1576151238-project-member] Acquiring lock "eb285233-ef68-4426-827f-3320abe98cac" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 900.317237] env[67899]: DEBUG oslo_concurrency.lockutils [None req-f35050ea-7d61-4666-8771-22e4ef0303e0 tempest-ServersNegativeTestMultiTenantJSON-1576151238 tempest-ServersNegativeTestMultiTenantJSON-1576151238-project-member] Lock "eb285233-ef68-4426-827f-3320abe98cac" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 908.100093] env[67899]: WARNING oslo_vmware.rw_handles [None req-66773c7e-ea1c-4033-b311-33b753d740c6 tempest-AttachInterfacesUnderV243Test-1421382693 tempest-AttachInterfacesUnderV243Test-1421382693-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 908.100093] env[67899]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 908.100093] env[67899]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 908.100093] env[67899]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 908.100093] env[67899]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 908.100093] env[67899]: ERROR oslo_vmware.rw_handles response.begin() [ 908.100093] env[67899]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 908.100093] env[67899]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 908.100093] env[67899]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 908.100093] env[67899]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 908.100093] env[67899]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 908.100093] env[67899]: ERROR oslo_vmware.rw_handles [ 
908.100692] env[67899]: DEBUG nova.virt.vmwareapi.images [None req-66773c7e-ea1c-4033-b311-33b753d740c6 tempest-AttachInterfacesUnderV243Test-1421382693 tempest-AttachInterfacesUnderV243Test-1421382693-project-member] [instance: 7a19bcfd-5544-4688-8edb-e12c567979ae] Downloaded image file data c655a05a-4a40-4b3f-b609-3ba8116ad90f to vmware_temp/e5cfc786-1e82-4a80-829f-cc8407360acb/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk on the data store datastore1 {{(pid=67899) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 908.102294] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-66773c7e-ea1c-4033-b311-33b753d740c6 tempest-AttachInterfacesUnderV243Test-1421382693 tempest-AttachInterfacesUnderV243Test-1421382693-project-member] [instance: 7a19bcfd-5544-4688-8edb-e12c567979ae] Caching image {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 908.102539] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [None req-66773c7e-ea1c-4033-b311-33b753d740c6 tempest-AttachInterfacesUnderV243Test-1421382693 tempest-AttachInterfacesUnderV243Test-1421382693-project-member] Copying Virtual Disk [datastore1] vmware_temp/e5cfc786-1e82-4a80-829f-cc8407360acb/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk to [datastore1] vmware_temp/e5cfc786-1e82-4a80-829f-cc8407360acb/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk {{(pid=67899) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 908.102870] env[67899]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-889ac0c5-bd87-45d3-b16e-e732a2623b31 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 908.111182] env[67899]: DEBUG oslo_vmware.api [None req-66773c7e-ea1c-4033-b311-33b753d740c6 tempest-AttachInterfacesUnderV243Test-1421382693 tempest-AttachInterfacesUnderV243Test-1421382693-project-member] Waiting for the task: (returnval){ [ 908.111182] env[67899]: value = "task-3467884" [ 908.111182] env[67899]: _type = "Task" [ 908.111182] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 908.119163] env[67899]: DEBUG oslo_vmware.api [None req-66773c7e-ea1c-4033-b311-33b753d740c6 tempest-AttachInterfacesUnderV243Test-1421382693 tempest-AttachInterfacesUnderV243Test-1421382693-project-member] Task: {'id': task-3467884, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 908.621875] env[67899]: DEBUG oslo_vmware.exceptions [None req-66773c7e-ea1c-4033-b311-33b753d740c6 tempest-AttachInterfacesUnderV243Test-1421382693 tempest-AttachInterfacesUnderV243Test-1421382693-project-member] Fault InvalidArgument not matched. 
{{(pid=67899) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 908.622220] env[67899]: DEBUG oslo_concurrency.lockutils [None req-66773c7e-ea1c-4033-b311-33b753d740c6 tempest-AttachInterfacesUnderV243Test-1421382693 tempest-AttachInterfacesUnderV243Test-1421382693-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 908.622807] env[67899]: ERROR nova.compute.manager [None req-66773c7e-ea1c-4033-b311-33b753d740c6 tempest-AttachInterfacesUnderV243Test-1421382693 tempest-AttachInterfacesUnderV243Test-1421382693-project-member] [instance: 7a19bcfd-5544-4688-8edb-e12c567979ae] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 908.622807] env[67899]: Faults: ['InvalidArgument'] [ 908.622807] env[67899]: ERROR nova.compute.manager [instance: 7a19bcfd-5544-4688-8edb-e12c567979ae] Traceback (most recent call last): [ 908.622807] env[67899]: ERROR nova.compute.manager [instance: 7a19bcfd-5544-4688-8edb-e12c567979ae] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 908.622807] env[67899]: ERROR nova.compute.manager [instance: 7a19bcfd-5544-4688-8edb-e12c567979ae] yield resources [ 908.622807] env[67899]: ERROR nova.compute.manager [instance: 7a19bcfd-5544-4688-8edb-e12c567979ae] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 908.622807] env[67899]: ERROR nova.compute.manager [instance: 7a19bcfd-5544-4688-8edb-e12c567979ae] self.driver.spawn(context, instance, image_meta, [ 908.622807] env[67899]: ERROR nova.compute.manager [instance: 7a19bcfd-5544-4688-8edb-e12c567979ae] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 908.622807] env[67899]: ERROR nova.compute.manager [instance: 7a19bcfd-5544-4688-8edb-e12c567979ae] self._vmops.spawn(context, instance, image_meta, injected_files, [ 908.622807] env[67899]: ERROR nova.compute.manager [instance: 7a19bcfd-5544-4688-8edb-e12c567979ae] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 908.622807] env[67899]: ERROR nova.compute.manager [instance: 7a19bcfd-5544-4688-8edb-e12c567979ae] self._fetch_image_if_missing(context, vi) [ 908.622807] env[67899]: ERROR nova.compute.manager [instance: 7a19bcfd-5544-4688-8edb-e12c567979ae] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 908.622807] env[67899]: ERROR nova.compute.manager [instance: 7a19bcfd-5544-4688-8edb-e12c567979ae] image_cache(vi, tmp_image_ds_loc) [ 908.622807] env[67899]: ERROR nova.compute.manager [instance: 7a19bcfd-5544-4688-8edb-e12c567979ae] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 908.622807] env[67899]: ERROR nova.compute.manager [instance: 7a19bcfd-5544-4688-8edb-e12c567979ae] vm_util.copy_virtual_disk( [ 908.622807] env[67899]: ERROR nova.compute.manager [instance: 7a19bcfd-5544-4688-8edb-e12c567979ae] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 908.622807] env[67899]: ERROR nova.compute.manager [instance: 7a19bcfd-5544-4688-8edb-e12c567979ae] session._wait_for_task(vmdk_copy_task) [ 908.622807] env[67899]: ERROR nova.compute.manager [instance: 7a19bcfd-5544-4688-8edb-e12c567979ae] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 908.622807] env[67899]: ERROR nova.compute.manager [instance: 7a19bcfd-5544-4688-8edb-e12c567979ae] return self.wait_for_task(task_ref) [ 908.622807] env[67899]: ERROR nova.compute.manager [instance: 7a19bcfd-5544-4688-8edb-e12c567979ae] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 908.622807] env[67899]: ERROR nova.compute.manager [instance: 7a19bcfd-5544-4688-8edb-e12c567979ae] return evt.wait() [ 908.622807] env[67899]: ERROR nova.compute.manager [instance: 7a19bcfd-5544-4688-8edb-e12c567979ae] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 908.622807] env[67899]: ERROR nova.compute.manager [instance: 7a19bcfd-5544-4688-8edb-e12c567979ae] result = hub.switch() [ 908.622807] env[67899]: ERROR nova.compute.manager [instance: 7a19bcfd-5544-4688-8edb-e12c567979ae] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 908.622807] env[67899]: ERROR nova.compute.manager [instance: 7a19bcfd-5544-4688-8edb-e12c567979ae] return self.greenlet.switch() [ 908.622807] env[67899]: ERROR nova.compute.manager [instance: 7a19bcfd-5544-4688-8edb-e12c567979ae] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 908.622807] env[67899]: ERROR nova.compute.manager [instance: 7a19bcfd-5544-4688-8edb-e12c567979ae] self.f(*self.args, **self.kw) [ 908.622807] env[67899]: ERROR nova.compute.manager [instance: 7a19bcfd-5544-4688-8edb-e12c567979ae] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 908.622807] env[67899]: ERROR nova.compute.manager [instance: 7a19bcfd-5544-4688-8edb-e12c567979ae] raise exceptions.translate_fault(task_info.error) [ 908.622807] env[67899]: ERROR nova.compute.manager [instance: 7a19bcfd-5544-4688-8edb-e12c567979ae] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 908.622807] env[67899]: ERROR nova.compute.manager [instance: 7a19bcfd-5544-4688-8edb-e12c567979ae] Faults: ['InvalidArgument'] [ 908.622807] env[67899]: ERROR nova.compute.manager [instance: 7a19bcfd-5544-4688-8edb-e12c567979ae] [ 908.624232] env[67899]: INFO nova.compute.manager [None req-66773c7e-ea1c-4033-b311-33b753d740c6 tempest-AttachInterfacesUnderV243Test-1421382693 tempest-AttachInterfacesUnderV243Test-1421382693-project-member] [instance: 7a19bcfd-5544-4688-8edb-e12c567979ae] Terminating instance [ 908.624695] env[67899]: DEBUG oslo_concurrency.lockutils [None req-a5a53ea7-13aa-44a2-a12d-ac8f3fe9dd29 tempest-ImagesNegativeTestJSON-1293308713 tempest-ImagesNegativeTestJSON-1293308713-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 908.624914] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-a5a53ea7-13aa-44a2-a12d-ac8f3fe9dd29 tempest-ImagesNegativeTestJSON-1293308713 tempest-ImagesNegativeTestJSON-1293308713-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 908.625555] env[67899]: DEBUG nova.compute.manager [None req-66773c7e-ea1c-4033-b311-33b753d740c6 tempest-AttachInterfacesUnderV243Test-1421382693 
tempest-AttachInterfacesUnderV243Test-1421382693-project-member] [instance: 7a19bcfd-5544-4688-8edb-e12c567979ae] Start destroying the instance on the hypervisor. {{(pid=67899) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 908.625739] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-66773c7e-ea1c-4033-b311-33b753d740c6 tempest-AttachInterfacesUnderV243Test-1421382693 tempest-AttachInterfacesUnderV243Test-1421382693-project-member] [instance: 7a19bcfd-5544-4688-8edb-e12c567979ae] Destroying instance {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 908.625968] env[67899]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-64fe129c-d269-432c-b919-e52062e27463 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 908.628306] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-27e66f2b-bedf-4e70-b6e3-d14cdf2a9da1 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 908.636558] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-66773c7e-ea1c-4033-b311-33b753d740c6 tempest-AttachInterfacesUnderV243Test-1421382693 tempest-AttachInterfacesUnderV243Test-1421382693-project-member] [instance: 7a19bcfd-5544-4688-8edb-e12c567979ae] Unregistering the VM {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 908.636770] env[67899]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-0a379b4d-c5a9-46ee-8490-50406db91ea5 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 908.638978] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-a5a53ea7-13aa-44a2-a12d-ac8f3fe9dd29 tempest-ImagesNegativeTestJSON-1293308713 tempest-ImagesNegativeTestJSON-1293308713-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 908.639160] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-a5a53ea7-13aa-44a2-a12d-ac8f3fe9dd29 tempest-ImagesNegativeTestJSON-1293308713 tempest-ImagesNegativeTestJSON-1293308713-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67899) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 908.640161] env[67899]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-d58db125-4b42-48fa-ad3c-2d31baeaed2f {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 908.644892] env[67899]: DEBUG oslo_vmware.api [None req-a5a53ea7-13aa-44a2-a12d-ac8f3fe9dd29 tempest-ImagesNegativeTestJSON-1293308713 tempest-ImagesNegativeTestJSON-1293308713-project-member] Waiting for the task: (returnval){ [ 908.644892] env[67899]: value = "session[5298859c-589e-10f3-4f70-bf47a7ea371b]520d52bf-4b5b-f13a-c224-d2e60579a885" [ 908.644892] env[67899]: _type = "Task" [ 908.644892] env[67899]: } to complete. 
{{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 908.654447] env[67899]: DEBUG oslo_vmware.api [None req-a5a53ea7-13aa-44a2-a12d-ac8f3fe9dd29 tempest-ImagesNegativeTestJSON-1293308713 tempest-ImagesNegativeTestJSON-1293308713-project-member] Task: {'id': session[5298859c-589e-10f3-4f70-bf47a7ea371b]520d52bf-4b5b-f13a-c224-d2e60579a885, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 908.708686] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-66773c7e-ea1c-4033-b311-33b753d740c6 tempest-AttachInterfacesUnderV243Test-1421382693 tempest-AttachInterfacesUnderV243Test-1421382693-project-member] [instance: 7a19bcfd-5544-4688-8edb-e12c567979ae] Unregistered the VM {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 908.708915] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-66773c7e-ea1c-4033-b311-33b753d740c6 tempest-AttachInterfacesUnderV243Test-1421382693 tempest-AttachInterfacesUnderV243Test-1421382693-project-member] [instance: 7a19bcfd-5544-4688-8edb-e12c567979ae] Deleting contents of the VM from datastore datastore1 {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 908.709105] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-66773c7e-ea1c-4033-b311-33b753d740c6 tempest-AttachInterfacesUnderV243Test-1421382693 tempest-AttachInterfacesUnderV243Test-1421382693-project-member] Deleting the datastore file [datastore1] 7a19bcfd-5544-4688-8edb-e12c567979ae {{(pid=67899) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 908.709382] env[67899]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-1d2073db-0aa8-4f63-b2d2-819b221be744 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 908.715618] env[67899]: DEBUG oslo_vmware.api [None req-66773c7e-ea1c-4033-b311-33b753d740c6 tempest-AttachInterfacesUnderV243Test-1421382693 tempest-AttachInterfacesUnderV243Test-1421382693-project-member] Waiting for the task: (returnval){ [ 908.715618] env[67899]: value = "task-3467886" [ 908.715618] env[67899]: _type = "Task" [ 908.715618] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 908.722774] env[67899]: DEBUG oslo_vmware.api [None req-66773c7e-ea1c-4033-b311-33b753d740c6 tempest-AttachInterfacesUnderV243Test-1421382693 tempest-AttachInterfacesUnderV243Test-1421382693-project-member] Task: {'id': task-3467886, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 909.155456] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-a5a53ea7-13aa-44a2-a12d-ac8f3fe9dd29 tempest-ImagesNegativeTestJSON-1293308713 tempest-ImagesNegativeTestJSON-1293308713-project-member] [instance: 84cbacaa-08d2-4297-8777-150f433e4c04] Preparing fetch location {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 909.155757] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-a5a53ea7-13aa-44a2-a12d-ac8f3fe9dd29 tempest-ImagesNegativeTestJSON-1293308713 tempest-ImagesNegativeTestJSON-1293308713-project-member] Creating directory with path [datastore1] vmware_temp/66fcc6c6-2579-4b13-8b90-24713c101aa9/c655a05a-4a40-4b3f-b609-3ba8116ad90f {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 909.155955] env[67899]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-e46a3f69-18f5-4e4c-9092-06ca565dc487 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 909.169280] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-a5a53ea7-13aa-44a2-a12d-ac8f3fe9dd29 tempest-ImagesNegativeTestJSON-1293308713 tempest-ImagesNegativeTestJSON-1293308713-project-member] Created directory with path [datastore1] vmware_temp/66fcc6c6-2579-4b13-8b90-24713c101aa9/c655a05a-4a40-4b3f-b609-3ba8116ad90f {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 909.169475] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-a5a53ea7-13aa-44a2-a12d-ac8f3fe9dd29 tempest-ImagesNegativeTestJSON-1293308713 tempest-ImagesNegativeTestJSON-1293308713-project-member] [instance: 84cbacaa-08d2-4297-8777-150f433e4c04] Fetch image to [datastore1] vmware_temp/66fcc6c6-2579-4b13-8b90-24713c101aa9/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 909.169684] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-a5a53ea7-13aa-44a2-a12d-ac8f3fe9dd29 tempest-ImagesNegativeTestJSON-1293308713 tempest-ImagesNegativeTestJSON-1293308713-project-member] [instance: 84cbacaa-08d2-4297-8777-150f433e4c04] Downloading image file data c655a05a-4a40-4b3f-b609-3ba8116ad90f to [datastore1] vmware_temp/66fcc6c6-2579-4b13-8b90-24713c101aa9/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk on the data store datastore1 {{(pid=67899) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 909.170400] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fff04794-d5f0-4c2d-91c8-4a19bda54a4d {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 909.176948] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7a4e69b5-f1b6-4d3f-89eb-704da2050e27 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 909.185960] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0f2b568e-2f83-455f-8636-6f52d1623c7c {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 909.222842] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9748e7f6-650d-4494-8056-83f3dffbf909 
{{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 909.229772] env[67899]: DEBUG oslo_vmware.api [None req-66773c7e-ea1c-4033-b311-33b753d740c6 tempest-AttachInterfacesUnderV243Test-1421382693 tempest-AttachInterfacesUnderV243Test-1421382693-project-member] Task: {'id': task-3467886, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.065979} completed successfully. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 909.231375] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-66773c7e-ea1c-4033-b311-33b753d740c6 tempest-AttachInterfacesUnderV243Test-1421382693 tempest-AttachInterfacesUnderV243Test-1421382693-project-member] Deleted the datastore file {{(pid=67899) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 909.231698] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-66773c7e-ea1c-4033-b311-33b753d740c6 tempest-AttachInterfacesUnderV243Test-1421382693 tempest-AttachInterfacesUnderV243Test-1421382693-project-member] [instance: 7a19bcfd-5544-4688-8edb-e12c567979ae] Deleted contents of the VM from datastore datastore1 {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 909.231766] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-66773c7e-ea1c-4033-b311-33b753d740c6 tempest-AttachInterfacesUnderV243Test-1421382693 tempest-AttachInterfacesUnderV243Test-1421382693-project-member] [instance: 7a19bcfd-5544-4688-8edb-e12c567979ae] Instance destroyed {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 909.231907] env[67899]: INFO nova.compute.manager [None req-66773c7e-ea1c-4033-b311-33b753d740c6 tempest-AttachInterfacesUnderV243Test-1421382693 tempest-AttachInterfacesUnderV243Test-1421382693-project-member] [instance: 7a19bcfd-5544-4688-8edb-e12c567979ae] Took 0.61 seconds to destroy the instance on the hypervisor. 
[ 909.233680] env[67899]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-2d96b214-0947-4038-987c-0213cb9f4151 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 909.235570] env[67899]: DEBUG nova.compute.claims [None req-66773c7e-ea1c-4033-b311-33b753d740c6 tempest-AttachInterfacesUnderV243Test-1421382693 tempest-AttachInterfacesUnderV243Test-1421382693-project-member] [instance: 7a19bcfd-5544-4688-8edb-e12c567979ae] Aborting claim: {{(pid=67899) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 909.238035] env[67899]: DEBUG oslo_concurrency.lockutils [None req-66773c7e-ea1c-4033-b311-33b753d740c6 tempest-AttachInterfacesUnderV243Test-1421382693 tempest-AttachInterfacesUnderV243Test-1421382693-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 909.238035] env[67899]: DEBUG oslo_concurrency.lockutils [None req-66773c7e-ea1c-4033-b311-33b753d740c6 tempest-AttachInterfacesUnderV243Test-1421382693 tempest-AttachInterfacesUnderV243Test-1421382693-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 909.256385] env[67899]: DEBUG nova.virt.vmwareapi.images [None req-a5a53ea7-13aa-44a2-a12d-ac8f3fe9dd29 tempest-ImagesNegativeTestJSON-1293308713 tempest-ImagesNegativeTestJSON-1293308713-project-member] [instance: 84cbacaa-08d2-4297-8777-150f433e4c04] Downloading image file data c655a05a-4a40-4b3f-b609-3ba8116ad90f to the data store datastore1 {{(pid=67899) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 909.342048] env[67899]: DEBUG oslo_vmware.rw_handles [None req-a5a53ea7-13aa-44a2-a12d-ac8f3fe9dd29 tempest-ImagesNegativeTestJSON-1293308713 tempest-ImagesNegativeTestJSON-1293308713-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/66fcc6c6-2579-4b13-8b90-24713c101aa9/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67899) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 909.406514] env[67899]: DEBUG oslo_vmware.rw_handles [None req-a5a53ea7-13aa-44a2-a12d-ac8f3fe9dd29 tempest-ImagesNegativeTestJSON-1293308713 tempest-ImagesNegativeTestJSON-1293308713-project-member] Completed reading data from the image iterator. {{(pid=67899) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 909.406764] env[67899]: DEBUG oslo_vmware.rw_handles [None req-a5a53ea7-13aa-44a2-a12d-ac8f3fe9dd29 tempest-ImagesNegativeTestJSON-1293308713 tempest-ImagesNegativeTestJSON-1293308713-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/66fcc6c6-2579-4b13-8b90-24713c101aa9/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67899) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 909.676027] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-13534eaf-97f7-442b-8012-1d9e371dab7d {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 909.683354] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-305c68cf-b46a-4cc3-aa3a-949f31998168 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 909.713437] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2d11abf5-5367-4dec-b3aa-d5c213b41d4d {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 909.720537] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8383c656-7143-4114-89fc-81118e56beb0 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 909.733428] env[67899]: DEBUG nova.compute.provider_tree [None req-66773c7e-ea1c-4033-b311-33b753d740c6 tempest-AttachInterfacesUnderV243Test-1421382693 tempest-AttachInterfacesUnderV243Test-1421382693-project-member] Inventory has not changed in ProviderTree for provider: fffa0b42-f65d-4394-a98c-0df038b9ed4b {{(pid=67899) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 909.742130] env[67899]: DEBUG nova.scheduler.client.report [None req-66773c7e-ea1c-4033-b311-33b753d740c6 tempest-AttachInterfacesUnderV243Test-1421382693 tempest-AttachInterfacesUnderV243Test-1421382693-project-member] Inventory has not changed for provider fffa0b42-f65d-4394-a98c-0df038b9ed4b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67899) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 909.756027] env[67899]: DEBUG oslo_concurrency.lockutils [None req-66773c7e-ea1c-4033-b311-33b753d740c6 tempest-AttachInterfacesUnderV243Test-1421382693 tempest-AttachInterfacesUnderV243Test-1421382693-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.519s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 909.756027] env[67899]: ERROR nova.compute.manager [None req-66773c7e-ea1c-4033-b311-33b753d740c6 tempest-AttachInterfacesUnderV243Test-1421382693 tempest-AttachInterfacesUnderV243Test-1421382693-project-member] [instance: 7a19bcfd-5544-4688-8edb-e12c567979ae] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 909.756027] env[67899]: Faults: ['InvalidArgument'] [ 909.756027] env[67899]: ERROR nova.compute.manager [instance: 7a19bcfd-5544-4688-8edb-e12c567979ae] Traceback (most recent call last): [ 909.756027] env[67899]: ERROR nova.compute.manager [instance: 7a19bcfd-5544-4688-8edb-e12c567979ae] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in 
_build_and_run_instance [ 909.756027] env[67899]: ERROR nova.compute.manager [instance: 7a19bcfd-5544-4688-8edb-e12c567979ae] self.driver.spawn(context, instance, image_meta, [ 909.756027] env[67899]: ERROR nova.compute.manager [instance: 7a19bcfd-5544-4688-8edb-e12c567979ae] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 909.756027] env[67899]: ERROR nova.compute.manager [instance: 7a19bcfd-5544-4688-8edb-e12c567979ae] self._vmops.spawn(context, instance, image_meta, injected_files, [ 909.756027] env[67899]: ERROR nova.compute.manager [instance: 7a19bcfd-5544-4688-8edb-e12c567979ae] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 909.756027] env[67899]: ERROR nova.compute.manager [instance: 7a19bcfd-5544-4688-8edb-e12c567979ae] self._fetch_image_if_missing(context, vi) [ 909.756027] env[67899]: ERROR nova.compute.manager [instance: 7a19bcfd-5544-4688-8edb-e12c567979ae] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 909.756027] env[67899]: ERROR nova.compute.manager [instance: 7a19bcfd-5544-4688-8edb-e12c567979ae] image_cache(vi, tmp_image_ds_loc) [ 909.756027] env[67899]: ERROR nova.compute.manager [instance: 7a19bcfd-5544-4688-8edb-e12c567979ae] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 909.756027] env[67899]: ERROR nova.compute.manager [instance: 7a19bcfd-5544-4688-8edb-e12c567979ae] vm_util.copy_virtual_disk( [ 909.756027] env[67899]: ERROR nova.compute.manager [instance: 7a19bcfd-5544-4688-8edb-e12c567979ae] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 909.756027] env[67899]: ERROR nova.compute.manager [instance: 7a19bcfd-5544-4688-8edb-e12c567979ae] session._wait_for_task(vmdk_copy_task) [ 909.756027] env[67899]: ERROR nova.compute.manager [instance: 7a19bcfd-5544-4688-8edb-e12c567979ae] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 909.756027] env[67899]: ERROR nova.compute.manager [instance: 7a19bcfd-5544-4688-8edb-e12c567979ae] return self.wait_for_task(task_ref) [ 909.756027] env[67899]: ERROR nova.compute.manager [instance: 7a19bcfd-5544-4688-8edb-e12c567979ae] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 909.756027] env[67899]: ERROR nova.compute.manager [instance: 7a19bcfd-5544-4688-8edb-e12c567979ae] return evt.wait() [ 909.756027] env[67899]: ERROR nova.compute.manager [instance: 7a19bcfd-5544-4688-8edb-e12c567979ae] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 909.756027] env[67899]: ERROR nova.compute.manager [instance: 7a19bcfd-5544-4688-8edb-e12c567979ae] result = hub.switch() [ 909.756027] env[67899]: ERROR nova.compute.manager [instance: 7a19bcfd-5544-4688-8edb-e12c567979ae] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 909.756027] env[67899]: ERROR nova.compute.manager [instance: 7a19bcfd-5544-4688-8edb-e12c567979ae] return self.greenlet.switch() [ 909.756027] env[67899]: ERROR nova.compute.manager [instance: 7a19bcfd-5544-4688-8edb-e12c567979ae] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 909.756027] env[67899]: ERROR nova.compute.manager [instance: 7a19bcfd-5544-4688-8edb-e12c567979ae] self.f(*self.args, **self.kw) [ 909.756027] env[67899]: ERROR nova.compute.manager [instance: 
7a19bcfd-5544-4688-8edb-e12c567979ae] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 909.756027] env[67899]: ERROR nova.compute.manager [instance: 7a19bcfd-5544-4688-8edb-e12c567979ae] raise exceptions.translate_fault(task_info.error) [ 909.756027] env[67899]: ERROR nova.compute.manager [instance: 7a19bcfd-5544-4688-8edb-e12c567979ae] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 909.756027] env[67899]: ERROR nova.compute.manager [instance: 7a19bcfd-5544-4688-8edb-e12c567979ae] Faults: ['InvalidArgument'] [ 909.756027] env[67899]: ERROR nova.compute.manager [instance: 7a19bcfd-5544-4688-8edb-e12c567979ae] [ 909.756968] env[67899]: DEBUG nova.compute.utils [None req-66773c7e-ea1c-4033-b311-33b753d740c6 tempest-AttachInterfacesUnderV243Test-1421382693 tempest-AttachInterfacesUnderV243Test-1421382693-project-member] [instance: 7a19bcfd-5544-4688-8edb-e12c567979ae] VimFaultException {{(pid=67899) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 909.757969] env[67899]: DEBUG nova.compute.manager [None req-66773c7e-ea1c-4033-b311-33b753d740c6 tempest-AttachInterfacesUnderV243Test-1421382693 tempest-AttachInterfacesUnderV243Test-1421382693-project-member] [instance: 7a19bcfd-5544-4688-8edb-e12c567979ae] Build of instance 7a19bcfd-5544-4688-8edb-e12c567979ae was re-scheduled: A specified parameter was not correct: fileType [ 909.757969] env[67899]: Faults: ['InvalidArgument'] {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 909.758385] env[67899]: DEBUG nova.compute.manager [None req-66773c7e-ea1c-4033-b311-33b753d740c6 tempest-AttachInterfacesUnderV243Test-1421382693 tempest-AttachInterfacesUnderV243Test-1421382693-project-member] [instance: 7a19bcfd-5544-4688-8edb-e12c567979ae] Unplugging VIFs for instance {{(pid=67899) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 909.758537] env[67899]: DEBUG nova.compute.manager [None req-66773c7e-ea1c-4033-b311-33b753d740c6 tempest-AttachInterfacesUnderV243Test-1421382693 tempest-AttachInterfacesUnderV243Test-1421382693-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67899) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 909.758708] env[67899]: DEBUG nova.compute.manager [None req-66773c7e-ea1c-4033-b311-33b753d740c6 tempest-AttachInterfacesUnderV243Test-1421382693 tempest-AttachInterfacesUnderV243Test-1421382693-project-member] [instance: 7a19bcfd-5544-4688-8edb-e12c567979ae] Deallocating network for instance {{(pid=67899) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 909.758883] env[67899]: DEBUG nova.network.neutron [None req-66773c7e-ea1c-4033-b311-33b753d740c6 tempest-AttachInterfacesUnderV243Test-1421382693 tempest-AttachInterfacesUnderV243Test-1421382693-project-member] [instance: 7a19bcfd-5544-4688-8edb-e12c567979ae] deallocate_for_instance() {{(pid=67899) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 910.064715] env[67899]: DEBUG nova.network.neutron [None req-66773c7e-ea1c-4033-b311-33b753d740c6 tempest-AttachInterfacesUnderV243Test-1421382693 tempest-AttachInterfacesUnderV243Test-1421382693-project-member] [instance: 7a19bcfd-5544-4688-8edb-e12c567979ae] Updating instance_info_cache with network_info: [] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 910.078618] env[67899]: INFO nova.compute.manager [None req-66773c7e-ea1c-4033-b311-33b753d740c6 tempest-AttachInterfacesUnderV243Test-1421382693 tempest-AttachInterfacesUnderV243Test-1421382693-project-member] [instance: 7a19bcfd-5544-4688-8edb-e12c567979ae] Took 0.32 seconds to deallocate network for instance. [ 910.208108] env[67899]: INFO nova.scheduler.client.report [None req-66773c7e-ea1c-4033-b311-33b753d740c6 tempest-AttachInterfacesUnderV243Test-1421382693 tempest-AttachInterfacesUnderV243Test-1421382693-project-member] Deleted allocations for instance 7a19bcfd-5544-4688-8edb-e12c567979ae [ 910.232472] env[67899]: DEBUG oslo_concurrency.lockutils [None req-66773c7e-ea1c-4033-b311-33b753d740c6 tempest-AttachInterfacesUnderV243Test-1421382693 tempest-AttachInterfacesUnderV243Test-1421382693-project-member] Lock "7a19bcfd-5544-4688-8edb-e12c567979ae" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 299.038s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 910.232732] env[67899]: DEBUG oslo_concurrency.lockutils [None req-9db54cf0-1508-430f-b1cd-e962a2f376c7 tempest-AttachInterfacesUnderV243Test-1421382693 tempest-AttachInterfacesUnderV243Test-1421382693-project-member] Lock "7a19bcfd-5544-4688-8edb-e12c567979ae" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 101.614s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 910.232957] env[67899]: DEBUG oslo_concurrency.lockutils [None req-9db54cf0-1508-430f-b1cd-e962a2f376c7 tempest-AttachInterfacesUnderV243Test-1421382693 tempest-AttachInterfacesUnderV243Test-1421382693-project-member] Acquiring lock "7a19bcfd-5544-4688-8edb-e12c567979ae-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 910.233175] env[67899]: DEBUG oslo_concurrency.lockutils [None req-9db54cf0-1508-430f-b1cd-e962a2f376c7 tempest-AttachInterfacesUnderV243Test-1421382693 tempest-AttachInterfacesUnderV243Test-1421382693-project-member] Lock 
"7a19bcfd-5544-4688-8edb-e12c567979ae-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 910.233343] env[67899]: DEBUG oslo_concurrency.lockutils [None req-9db54cf0-1508-430f-b1cd-e962a2f376c7 tempest-AttachInterfacesUnderV243Test-1421382693 tempest-AttachInterfacesUnderV243Test-1421382693-project-member] Lock "7a19bcfd-5544-4688-8edb-e12c567979ae-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 910.237740] env[67899]: INFO nova.compute.manager [None req-9db54cf0-1508-430f-b1cd-e962a2f376c7 tempest-AttachInterfacesUnderV243Test-1421382693 tempest-AttachInterfacesUnderV243Test-1421382693-project-member] [instance: 7a19bcfd-5544-4688-8edb-e12c567979ae] Terminating instance [ 910.237988] env[67899]: DEBUG nova.compute.manager [None req-9db54cf0-1508-430f-b1cd-e962a2f376c7 tempest-AttachInterfacesUnderV243Test-1421382693 tempest-AttachInterfacesUnderV243Test-1421382693-project-member] [instance: 7a19bcfd-5544-4688-8edb-e12c567979ae] Start destroying the instance on the hypervisor. {{(pid=67899) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 910.238251] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-9db54cf0-1508-430f-b1cd-e962a2f376c7 tempest-AttachInterfacesUnderV243Test-1421382693 tempest-AttachInterfacesUnderV243Test-1421382693-project-member] [instance: 7a19bcfd-5544-4688-8edb-e12c567979ae] Destroying instance {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 910.238542] env[67899]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-1d2cf469-2a6a-4910-96f9-ef62f52083c0 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 910.248600] env[67899]: DEBUG nova.compute.manager [None req-903b8c8a-e8bb-4076-a4c9-50cf6e2f6cbb tempest-ServerAddressesTestJSON-560381368 tempest-ServerAddressesTestJSON-560381368-project-member] [instance: e42425fa-6c50-4e76-842b-0bfcccb011c0] Starting instance... {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 910.253662] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1c1c3de6-7475-429d-a80c-06cd5c3d1d85 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 910.284833] env[67899]: WARNING nova.virt.vmwareapi.vmops [None req-9db54cf0-1508-430f-b1cd-e962a2f376c7 tempest-AttachInterfacesUnderV243Test-1421382693 tempest-AttachInterfacesUnderV243Test-1421382693-project-member] [instance: 7a19bcfd-5544-4688-8edb-e12c567979ae] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 7a19bcfd-5544-4688-8edb-e12c567979ae could not be found. 
[ 910.284833] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-9db54cf0-1508-430f-b1cd-e962a2f376c7 tempest-AttachInterfacesUnderV243Test-1421382693 tempest-AttachInterfacesUnderV243Test-1421382693-project-member] [instance: 7a19bcfd-5544-4688-8edb-e12c567979ae] Instance destroyed {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 910.284833] env[67899]: INFO nova.compute.manager [None req-9db54cf0-1508-430f-b1cd-e962a2f376c7 tempest-AttachInterfacesUnderV243Test-1421382693 tempest-AttachInterfacesUnderV243Test-1421382693-project-member] [instance: 7a19bcfd-5544-4688-8edb-e12c567979ae] Took 0.05 seconds to destroy the instance on the hypervisor. [ 910.285262] env[67899]: DEBUG oslo.service.loopingcall [None req-9db54cf0-1508-430f-b1cd-e962a2f376c7 tempest-AttachInterfacesUnderV243Test-1421382693 tempest-AttachInterfacesUnderV243Test-1421382693-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=67899) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 910.285455] env[67899]: DEBUG nova.compute.manager [None req-903b8c8a-e8bb-4076-a4c9-50cf6e2f6cbb tempest-ServerAddressesTestJSON-560381368 tempest-ServerAddressesTestJSON-560381368-project-member] [instance: e42425fa-6c50-4e76-842b-0bfcccb011c0] Instance disappeared before build. {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 910.288404] env[67899]: DEBUG nova.compute.manager [-] [instance: 7a19bcfd-5544-4688-8edb-e12c567979ae] Deallocating network for instance {{(pid=67899) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 910.288404] env[67899]: DEBUG nova.network.neutron [-] [instance: 7a19bcfd-5544-4688-8edb-e12c567979ae] deallocate_for_instance() {{(pid=67899) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 910.317846] env[67899]: DEBUG nova.network.neutron [-] [instance: 7a19bcfd-5544-4688-8edb-e12c567979ae] Updating instance_info_cache with network_info: [] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 910.325290] env[67899]: DEBUG oslo_concurrency.lockutils [None req-903b8c8a-e8bb-4076-a4c9-50cf6e2f6cbb tempest-ServerAddressesTestJSON-560381368 tempest-ServerAddressesTestJSON-560381368-project-member] Lock "e42425fa-6c50-4e76-842b-0bfcccb011c0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 240.934s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 910.327431] env[67899]: INFO nova.compute.manager [-] [instance: 7a19bcfd-5544-4688-8edb-e12c567979ae] Took 0.04 seconds to deallocate network for instance. [ 910.342506] env[67899]: DEBUG nova.compute.manager [None req-06ca25cb-cd45-4102-b595-3289090f9f6e tempest-VolumesAdminNegativeTest-1718927231 tempest-VolumesAdminNegativeTest-1718927231-project-member] [instance: 96d79732-9076-4715-aa1e-60001ffb17fb] Starting instance... {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 910.375576] env[67899]: DEBUG nova.compute.manager [None req-06ca25cb-cd45-4102-b595-3289090f9f6e tempest-VolumesAdminNegativeTest-1718927231 tempest-VolumesAdminNegativeTest-1718927231-project-member] [instance: 96d79732-9076-4715-aa1e-60001ffb17fb] Instance disappeared before build. 
{{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 910.400241] env[67899]: DEBUG oslo_concurrency.lockutils [None req-06ca25cb-cd45-4102-b595-3289090f9f6e tempest-VolumesAdminNegativeTest-1718927231 tempest-VolumesAdminNegativeTest-1718927231-project-member] Lock "96d79732-9076-4715-aa1e-60001ffb17fb" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 239.913s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 910.414826] env[67899]: DEBUG nova.compute.manager [None req-4f014c71-1629-4992-b9cc-369400efe2b9 tempest-AttachVolumeNegativeTest-1960469577 tempest-AttachVolumeNegativeTest-1960469577-project-member] [instance: f7888060-430b-4b16-b9ca-059020615dee] Starting instance... {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 910.455298] env[67899]: DEBUG oslo_concurrency.lockutils [None req-9db54cf0-1508-430f-b1cd-e962a2f376c7 tempest-AttachInterfacesUnderV243Test-1421382693 tempest-AttachInterfacesUnderV243Test-1421382693-project-member] Lock "7a19bcfd-5544-4688-8edb-e12c567979ae" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.222s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 910.457339] env[67899]: DEBUG nova.compute.manager [None req-4f014c71-1629-4992-b9cc-369400efe2b9 tempest-AttachVolumeNegativeTest-1960469577 tempest-AttachVolumeNegativeTest-1960469577-project-member] [instance: f7888060-430b-4b16-b9ca-059020615dee] Instance disappeared before build. {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 910.478221] env[67899]: DEBUG oslo_concurrency.lockutils [None req-4f014c71-1629-4992-b9cc-369400efe2b9 tempest-AttachVolumeNegativeTest-1960469577 tempest-AttachVolumeNegativeTest-1960469577-project-member] Lock "f7888060-430b-4b16-b9ca-059020615dee" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 238.617s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 910.489677] env[67899]: DEBUG nova.compute.manager [None req-35796b21-4216-449c-a9a9-9fe6ecab3f1b tempest-ServerPasswordTestJSON-1822365130 tempest-ServerPasswordTestJSON-1822365130-project-member] [instance: 1505bcf5-f622-40ee-93c2-8dabf1dce8cb] Starting instance... {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 910.515046] env[67899]: DEBUG nova.compute.manager [None req-35796b21-4216-449c-a9a9-9fe6ecab3f1b tempest-ServerPasswordTestJSON-1822365130 tempest-ServerPasswordTestJSON-1822365130-project-member] [instance: 1505bcf5-f622-40ee-93c2-8dabf1dce8cb] Instance disappeared before build. 
{{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 910.535121] env[67899]: DEBUG oslo_concurrency.lockutils [None req-35796b21-4216-449c-a9a9-9fe6ecab3f1b tempest-ServerPasswordTestJSON-1822365130 tempest-ServerPasswordTestJSON-1822365130-project-member] Lock "1505bcf5-f622-40ee-93c2-8dabf1dce8cb" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 237.651s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 910.544122] env[67899]: DEBUG nova.compute.manager [None req-387e0548-0c20-4821-a34f-501df96dea85 tempest-AttachInterfacesV270Test-551286907 tempest-AttachInterfacesV270Test-551286907-project-member] [instance: 4c281caa-f99d-40d5-b004-13e7856a29f5] Starting instance... {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 910.566734] env[67899]: DEBUG nova.compute.manager [None req-387e0548-0c20-4821-a34f-501df96dea85 tempest-AttachInterfacesV270Test-551286907 tempest-AttachInterfacesV270Test-551286907-project-member] [instance: 4c281caa-f99d-40d5-b004-13e7856a29f5] Instance disappeared before build. {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 910.587739] env[67899]: DEBUG oslo_concurrency.lockutils [None req-387e0548-0c20-4821-a34f-501df96dea85 tempest-AttachInterfacesV270Test-551286907 tempest-AttachInterfacesV270Test-551286907-project-member] Lock "4c281caa-f99d-40d5-b004-13e7856a29f5" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 230.991s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 910.598529] env[67899]: DEBUG nova.compute.manager [None req-60abf12b-7e5a-42dd-96b7-3336f10ab46a tempest-InstanceActionsNegativeTestJSON-2021516092 tempest-InstanceActionsNegativeTestJSON-2021516092-project-member] [instance: aa6229be-c18c-4cf9-99a1-ca546b30d797] Starting instance... {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 910.623325] env[67899]: DEBUG nova.compute.manager [None req-60abf12b-7e5a-42dd-96b7-3336f10ab46a tempest-InstanceActionsNegativeTestJSON-2021516092 tempest-InstanceActionsNegativeTestJSON-2021516092-project-member] [instance: aa6229be-c18c-4cf9-99a1-ca546b30d797] Instance disappeared before build. {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 910.649294] env[67899]: DEBUG oslo_concurrency.lockutils [None req-60abf12b-7e5a-42dd-96b7-3336f10ab46a tempest-InstanceActionsNegativeTestJSON-2021516092 tempest-InstanceActionsNegativeTestJSON-2021516092-project-member] Lock "aa6229be-c18c-4cf9-99a1-ca546b30d797" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 229.028s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 910.660372] env[67899]: DEBUG nova.compute.manager [None req-828045de-c2cc-4771-be1a-f63e2fc9d20a tempest-ServersAdminTestJSON-374353603 tempest-ServersAdminTestJSON-374353603-project-member] [instance: 862297c3-0b85-43eb-b364-303bb0c0b077] Starting instance... 
{{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 910.695172] env[67899]: DEBUG nova.compute.manager [None req-828045de-c2cc-4771-be1a-f63e2fc9d20a tempest-ServersAdminTestJSON-374353603 tempest-ServersAdminTestJSON-374353603-project-member] [instance: 862297c3-0b85-43eb-b364-303bb0c0b077] Instance disappeared before build. {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 910.717013] env[67899]: DEBUG oslo_concurrency.lockutils [None req-828045de-c2cc-4771-be1a-f63e2fc9d20a tempest-ServersAdminTestJSON-374353603 tempest-ServersAdminTestJSON-374353603-project-member] Lock "862297c3-0b85-43eb-b364-303bb0c0b077" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 229.092s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 910.726575] env[67899]: DEBUG nova.compute.manager [None req-ec5b86f6-cc1c-4c25-84be-c14c32124342 tempest-ServersAdminTestJSON-374353603 tempest-ServersAdminTestJSON-374353603-project-member] [instance: 641b8e97-b9e6-4ef0-a819-42d3a29429de] Starting instance... {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 910.753308] env[67899]: DEBUG nova.compute.manager [None req-ec5b86f6-cc1c-4c25-84be-c14c32124342 tempest-ServersAdminTestJSON-374353603 tempest-ServersAdminTestJSON-374353603-project-member] [instance: 641b8e97-b9e6-4ef0-a819-42d3a29429de] Instance disappeared before build. {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 910.776375] env[67899]: DEBUG oslo_concurrency.lockutils [None req-ec5b86f6-cc1c-4c25-84be-c14c32124342 tempest-ServersAdminTestJSON-374353603 tempest-ServersAdminTestJSON-374353603-project-member] Lock "641b8e97-b9e6-4ef0-a819-42d3a29429de" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 227.805s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 910.786116] env[67899]: DEBUG nova.compute.manager [None req-59a8bff3-3a22-4f4e-b4f3-5e8968fb244c tempest-ServerRescueNegativeTestJSON-1208678464 tempest-ServerRescueNegativeTestJSON-1208678464-project-member] [instance: d0ceaa4e-9c87-48de-bcc2-8bb537827c0a] Starting instance... {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 910.815290] env[67899]: DEBUG nova.compute.manager [None req-59a8bff3-3a22-4f4e-b4f3-5e8968fb244c tempest-ServerRescueNegativeTestJSON-1208678464 tempest-ServerRescueNegativeTestJSON-1208678464-project-member] [instance: d0ceaa4e-9c87-48de-bcc2-8bb537827c0a] Instance disappeared before build. 
{{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 910.840381] env[67899]: DEBUG oslo_concurrency.lockutils [None req-59a8bff3-3a22-4f4e-b4f3-5e8968fb244c tempest-ServerRescueNegativeTestJSON-1208678464 tempest-ServerRescueNegativeTestJSON-1208678464-project-member] Lock "d0ceaa4e-9c87-48de-bcc2-8bb537827c0a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 219.021s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 910.849610] env[67899]: DEBUG nova.compute.manager [None req-80443c34-53f4-4de0-af11-620e89f407a2 tempest-ServerRescueNegativeTestJSON-1208678464 tempest-ServerRescueNegativeTestJSON-1208678464-project-member] [instance: 9842d097-f4f2-4f60-aea0-08896a47ff53] Starting instance... {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 910.874736] env[67899]: DEBUG nova.compute.manager [None req-80443c34-53f4-4de0-af11-620e89f407a2 tempest-ServerRescueNegativeTestJSON-1208678464 tempest-ServerRescueNegativeTestJSON-1208678464-project-member] [instance: 9842d097-f4f2-4f60-aea0-08896a47ff53] Instance disappeared before build. {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 910.898310] env[67899]: DEBUG oslo_concurrency.lockutils [None req-80443c34-53f4-4de0-af11-620e89f407a2 tempest-ServerRescueNegativeTestJSON-1208678464 tempest-ServerRescueNegativeTestJSON-1208678464-project-member] Lock "9842d097-f4f2-4f60-aea0-08896a47ff53" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 218.128s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 910.910020] env[67899]: DEBUG nova.compute.manager [None req-e7f4b054-51ad-4989-bc09-ccbf6811411e tempest-ServerShowV257Test-1183244838 tempest-ServerShowV257Test-1183244838-project-member] [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] Starting instance... 
{{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 910.972871] env[67899]: DEBUG oslo_concurrency.lockutils [None req-e7f4b054-51ad-4989-bc09-ccbf6811411e tempest-ServerShowV257Test-1183244838 tempest-ServerShowV257Test-1183244838-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 910.973140] env[67899]: DEBUG oslo_concurrency.lockutils [None req-e7f4b054-51ad-4989-bc09-ccbf6811411e tempest-ServerShowV257Test-1183244838 tempest-ServerShowV257Test-1183244838-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 910.974635] env[67899]: INFO nova.compute.claims [None req-e7f4b054-51ad-4989-bc09-ccbf6811411e tempest-ServerShowV257Test-1183244838 tempest-ServerShowV257Test-1183244838-project-member] [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 911.389131] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d6b86ea7-7316-411e-8eae-9e864ac5f37d {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 911.397557] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a4ec32b8-fb90-4558-99d5-de20c883cd29 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 911.434046] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-38bb6f16-6e47-48cb-b8ff-6789cf336d0d {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 911.441526] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ef948245-cdb6-4333-bf18-75f25957ee41 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 911.454444] env[67899]: DEBUG nova.compute.provider_tree [None req-e7f4b054-51ad-4989-bc09-ccbf6811411e tempest-ServerShowV257Test-1183244838 tempest-ServerShowV257Test-1183244838-project-member] Inventory has not changed in ProviderTree for provider: fffa0b42-f65d-4394-a98c-0df038b9ed4b {{(pid=67899) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 911.468610] env[67899]: DEBUG nova.scheduler.client.report [None req-e7f4b054-51ad-4989-bc09-ccbf6811411e tempest-ServerShowV257Test-1183244838 tempest-ServerShowV257Test-1183244838-project-member] Inventory has not changed for provider fffa0b42-f65d-4394-a98c-0df038b9ed4b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67899) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 911.484020] env[67899]: DEBUG oslo_concurrency.lockutils [None req-e7f4b054-51ad-4989-bc09-ccbf6811411e 
tempest-ServerShowV257Test-1183244838 tempest-ServerShowV257Test-1183244838-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.509s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 911.484020] env[67899]: DEBUG nova.compute.manager [None req-e7f4b054-51ad-4989-bc09-ccbf6811411e tempest-ServerShowV257Test-1183244838 tempest-ServerShowV257Test-1183244838-project-member] [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] Start building networks asynchronously for instance. {{(pid=67899) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 911.519715] env[67899]: DEBUG nova.compute.utils [None req-e7f4b054-51ad-4989-bc09-ccbf6811411e tempest-ServerShowV257Test-1183244838 tempest-ServerShowV257Test-1183244838-project-member] Using /dev/sd instead of None {{(pid=67899) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 911.520751] env[67899]: DEBUG nova.compute.manager [None req-e7f4b054-51ad-4989-bc09-ccbf6811411e tempest-ServerShowV257Test-1183244838 tempest-ServerShowV257Test-1183244838-project-member] [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] Not allocating networking since 'none' was specified. {{(pid=67899) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 911.538306] env[67899]: DEBUG nova.compute.manager [None req-e7f4b054-51ad-4989-bc09-ccbf6811411e tempest-ServerShowV257Test-1183244838 tempest-ServerShowV257Test-1183244838-project-member] [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] Start building block device mappings for instance. {{(pid=67899) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 911.607386] env[67899]: DEBUG nova.compute.manager [None req-e7f4b054-51ad-4989-bc09-ccbf6811411e tempest-ServerShowV257Test-1183244838 tempest-ServerShowV257Test-1183244838-project-member] [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] Start spawning the instance on the hypervisor. 
{{(pid=67899) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 911.632669] env[67899]: DEBUG nova.virt.hardware [None req-e7f4b054-51ad-4989-bc09-ccbf6811411e tempest-ServerShowV257Test-1183244838 tempest-ServerShowV257Test-1183244838-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T22:07:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T22:07:13Z,direct_url=,disk_format='vmdk',id=c655a05a-4a40-4b3f-b609-3ba8116ad90f,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c1aaa2970e964d7b86557399120d12c1',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T22:07:14Z,virtual_size=,visibility=), allow threads: False {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 911.632955] env[67899]: DEBUG nova.virt.hardware [None req-e7f4b054-51ad-4989-bc09-ccbf6811411e tempest-ServerShowV257Test-1183244838 tempest-ServerShowV257Test-1183244838-project-member] Flavor limits 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 911.633164] env[67899]: DEBUG nova.virt.hardware [None req-e7f4b054-51ad-4989-bc09-ccbf6811411e tempest-ServerShowV257Test-1183244838 tempest-ServerShowV257Test-1183244838-project-member] Image limits 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 911.633400] env[67899]: DEBUG nova.virt.hardware [None req-e7f4b054-51ad-4989-bc09-ccbf6811411e tempest-ServerShowV257Test-1183244838 tempest-ServerShowV257Test-1183244838-project-member] Flavor pref 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 911.633579] env[67899]: DEBUG nova.virt.hardware [None req-e7f4b054-51ad-4989-bc09-ccbf6811411e tempest-ServerShowV257Test-1183244838 tempest-ServerShowV257Test-1183244838-project-member] Image pref 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 911.633781] env[67899]: DEBUG nova.virt.hardware [None req-e7f4b054-51ad-4989-bc09-ccbf6811411e tempest-ServerShowV257Test-1183244838 tempest-ServerShowV257Test-1183244838-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 911.634104] env[67899]: DEBUG nova.virt.hardware [None req-e7f4b054-51ad-4989-bc09-ccbf6811411e tempest-ServerShowV257Test-1183244838 tempest-ServerShowV257Test-1183244838-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 911.634310] env[67899]: DEBUG nova.virt.hardware [None req-e7f4b054-51ad-4989-bc09-ccbf6811411e tempest-ServerShowV257Test-1183244838 tempest-ServerShowV257Test-1183244838-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67899) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 911.634514] env[67899]: DEBUG nova.virt.hardware [None req-e7f4b054-51ad-4989-bc09-ccbf6811411e 
tempest-ServerShowV257Test-1183244838 tempest-ServerShowV257Test-1183244838-project-member] Got 1 possible topologies {{(pid=67899) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 911.634706] env[67899]: DEBUG nova.virt.hardware [None req-e7f4b054-51ad-4989-bc09-ccbf6811411e tempest-ServerShowV257Test-1183244838 tempest-ServerShowV257Test-1183244838-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 911.634910] env[67899]: DEBUG nova.virt.hardware [None req-e7f4b054-51ad-4989-bc09-ccbf6811411e tempest-ServerShowV257Test-1183244838 tempest-ServerShowV257Test-1183244838-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 911.635861] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1172b77f-6801-492e-bfb7-9869cadbdc50 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 911.644122] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b8ad0cd6-ea18-40fe-8995-deb800ad7d83 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 911.657789] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-e7f4b054-51ad-4989-bc09-ccbf6811411e tempest-ServerShowV257Test-1183244838 tempest-ServerShowV257Test-1183244838-project-member] [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] Instance VIF info [] {{(pid=67899) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 911.663389] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [None req-e7f4b054-51ad-4989-bc09-ccbf6811411e tempest-ServerShowV257Test-1183244838 tempest-ServerShowV257Test-1183244838-project-member] Creating folder: Project (17d24f70b07244878758f128442b2e25). Parent ref: group-v692900. {{(pid=67899) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 911.663717] env[67899]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-db7e8591-599d-43bb-99e6-734acc59e54f {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 911.673242] env[67899]: INFO nova.virt.vmwareapi.vm_util [None req-e7f4b054-51ad-4989-bc09-ccbf6811411e tempest-ServerShowV257Test-1183244838 tempest-ServerShowV257Test-1183244838-project-member] Created folder: Project (17d24f70b07244878758f128442b2e25) in parent group-v692900. [ 911.673463] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [None req-e7f4b054-51ad-4989-bc09-ccbf6811411e tempest-ServerShowV257Test-1183244838 tempest-ServerShowV257Test-1183244838-project-member] Creating folder: Instances. Parent ref: group-v692950. {{(pid=67899) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 911.673820] env[67899]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-66b67541-8862-4943-a045-b8101848dd8b {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 911.682132] env[67899]: INFO nova.virt.vmwareapi.vm_util [None req-e7f4b054-51ad-4989-bc09-ccbf6811411e tempest-ServerShowV257Test-1183244838 tempest-ServerShowV257Test-1183244838-project-member] Created folder: Instances in parent group-v692950. 
[ 911.682361] env[67899]: DEBUG oslo.service.loopingcall [None req-e7f4b054-51ad-4989-bc09-ccbf6811411e tempest-ServerShowV257Test-1183244838 tempest-ServerShowV257Test-1183244838-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67899) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 911.682546] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] Creating VM on the ESX host {{(pid=67899) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 911.682770] env[67899]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-051bd20b-94f1-4044-8fa6-4c8a7b08c5ab {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 911.698610] env[67899]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 911.698610] env[67899]: value = "task-3467889" [ 911.698610] env[67899]: _type = "Task" [ 911.698610] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 911.705987] env[67899]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467889, 'name': CreateVM_Task} progress is 0%. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 912.212120] env[67899]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467889, 'name': CreateVM_Task, 'duration_secs': 0.27459} completed successfully. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 912.212120] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] Created VM on the ESX host {{(pid=67899) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 912.212120] env[67899]: DEBUG oslo_concurrency.lockutils [None req-e7f4b054-51ad-4989-bc09-ccbf6811411e tempest-ServerShowV257Test-1183244838 tempest-ServerShowV257Test-1183244838-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 912.212120] env[67899]: DEBUG oslo_concurrency.lockutils [None req-e7f4b054-51ad-4989-bc09-ccbf6811411e tempest-ServerShowV257Test-1183244838 tempest-ServerShowV257Test-1183244838-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 912.212323] env[67899]: DEBUG oslo_concurrency.lockutils [None req-e7f4b054-51ad-4989-bc09-ccbf6811411e tempest-ServerShowV257Test-1183244838 tempest-ServerShowV257Test-1183244838-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 912.213032] env[67899]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-ab1074f6-6a64-4b53-a0a0-50c49309410b {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 912.216948] env[67899]: DEBUG oslo_vmware.api [None req-e7f4b054-51ad-4989-bc09-ccbf6811411e tempest-ServerShowV257Test-1183244838 tempest-ServerShowV257Test-1183244838-project-member] Waiting for the task: (returnval){ [ 
912.216948] env[67899]: value = "session[5298859c-589e-10f3-4f70-bf47a7ea371b]52306f18-b611-c26f-823b-a0f47f22b95c" [ 912.216948] env[67899]: _type = "Task" [ 912.216948] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 912.227463] env[67899]: DEBUG oslo_vmware.api [None req-e7f4b054-51ad-4989-bc09-ccbf6811411e tempest-ServerShowV257Test-1183244838 tempest-ServerShowV257Test-1183244838-project-member] Task: {'id': session[5298859c-589e-10f3-4f70-bf47a7ea371b]52306f18-b611-c26f-823b-a0f47f22b95c, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 912.730306] env[67899]: DEBUG oslo_concurrency.lockutils [None req-e7f4b054-51ad-4989-bc09-ccbf6811411e tempest-ServerShowV257Test-1183244838 tempest-ServerShowV257Test-1183244838-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 912.730582] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-e7f4b054-51ad-4989-bc09-ccbf6811411e tempest-ServerShowV257Test-1183244838 tempest-ServerShowV257Test-1183244838-project-member] [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] Processing image c655a05a-4a40-4b3f-b609-3ba8116ad90f {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 912.730799] env[67899]: DEBUG oslo_concurrency.lockutils [None req-e7f4b054-51ad-4989-bc09-ccbf6811411e tempest-ServerShowV257Test-1183244838 tempest-ServerShowV257Test-1183244838-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 914.992140] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 915.018429] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 915.018626] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager.update_available_resource {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 915.032553] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 915.033119] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 915.033315] 
env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 915.033581] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67899) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 915.034662] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-897164e7-108d-4a83-9b27-403f6822fe6f {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 915.043455] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-07bf73e0-f6e3-44a2-8665-e0d9446f9499 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 915.058910] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-40978630-cf43-457a-8db0-f2ec328656e6 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 915.065365] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9e5cd65c-17a1-4845-9369-238f0294e388 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 915.101960] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180931MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=67899) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 915.102166] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 915.102739] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 915.182993] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 84cbacaa-08d2-4297-8777-150f433e4c04 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 915.183166] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance c29ae4c5-cc93-480c-8d60-96f6acba4346 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 915.183290] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 915.183411] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 913c5652-c8af-41a8-94f1-c0eba08aacdd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 915.183531] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 793d6f98-ed1b-4a78-bcd5-cb796441d64b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 915.183773] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 915.183997] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 4458efe7-18d4-4cfb-b131-e09d36124d68 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 915.184154] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 915.184280] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 8d2a9e20-82d3-44cf-a725-59804debe1cc actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 915.184401] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance bb97988e-9f7f-4e4f-9904-fc560d0912ee actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 915.205863] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 37ab08db-50ab-4c30-9e18-05007c5d1c27 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 915.229942] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 915.241777] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 4cbf5a4d-9466-4bc6-adc9-973759545cf4 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 915.258306] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 3fabbf48-5df3-4e36-a9d8-494c221304b1 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 915.269844] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 7b6a4c60-1b40-44b8-b341-3dcaf1716c99 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 915.285359] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 183fd334-b0e1-479a-b38a-62f21c176d17 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 915.297338] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance b9282eeb-09db-4138-a1f0-9e03828021b8 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 915.310640] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance cce79170-e329-4d7a-ab2d-fa6605068897 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 915.323448] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 4bd3cb98-1745-4c1a-8670-9849f70eb554 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 915.336103] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 0bde0bc7-8f34-4941-85f0-44fe5c67e398 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 915.347645] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance b79e6007-10ac-4afe-a666-edef64685b22 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 915.359092] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance b9143ce6-0592-4cff-a2a1-64874734b214 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 915.370157] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 868ae015-d365-4a42-8f5d-72faa796fa37 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 915.381415] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance a31cf212-7d4e-4f1c-b494-6b9739b2ef95 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 915.396166] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance cb9d29cb-20ee-4875-b993-49cafed344d4 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 915.408613] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance eb285233-ef68-4426-827f-3320abe98cac has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 915.408881] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67899) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 915.409045] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67899) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 915.526122] env[67899]: DEBUG oslo_concurrency.lockutils [None req-22229d4a-dfdd-42c9-a4ce-d252d1cd204b tempest-ServerShowV257Test-1183244838 tempest-ServerShowV257Test-1183244838-project-member] Acquiring lock "bb97988e-9f7f-4e4f-9904-fc560d0912ee" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 915.814803] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-492caa71-7d9a-464f-a798-8649d4add037 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 915.823126] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5ad8d045-c361-49e5-948f-c6794fa837c4 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 915.855903] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6bd464dc-b5b4-41d6-8672-704e526f9eda {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 915.863598] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eb7bca9b-e8b1-4b49-951c-55af081d1b7f {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 915.877089] env[67899]: DEBUG nova.compute.provider_tree [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Inventory has not changed in ProviderTree for provider: fffa0b42-f65d-4394-a98c-0df038b9ed4b {{(pid=67899) update_inventory
/opt/stack/nova/nova/compute/provider_tree.py:180}} [ 915.886921] env[67899]: DEBUG nova.scheduler.client.report [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Inventory has not changed for provider fffa0b42-f65d-4394-a98c-0df038b9ed4b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67899) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 915.905259] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67899) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 915.905456] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.803s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 917.883310] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 917.883651] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Starting heal instance info cache {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 917.883651] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Rebuilding the list of instances to heal {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 917.917176] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 84cbacaa-08d2-4297-8777-150f433e4c04] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 917.917248] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 917.917385] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 917.917512] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 913c5652-c8af-41a8-94f1-c0eba08aacdd] Skipping network cache update for instance because it is Building. 
{{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 917.917634] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 793d6f98-ed1b-4a78-bcd5-cb796441d64b] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 917.917754] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 917.917876] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 4458efe7-18d4-4cfb-b131-e09d36124d68] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 917.917993] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 917.921527] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 8d2a9e20-82d3-44cf-a725-59804debe1cc] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 917.921699] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 917.921838] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Didn't find any instances for network info cache update. 
{{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 917.923604] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 917.924081] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 917.925086] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 918.033708] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 918.316389] env[67899]: DEBUG oslo_concurrency.lockutils [None req-817f6f80-1cb7-455c-81df-97b9e82bf3f4 tempest-ServerAddressesNegativeTestJSON-374254023 tempest-ServerAddressesNegativeTestJSON-374254023-project-member] Acquiring lock "6c4977f7-c53d-4c96-9028-86d7561f0d0d" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 918.316772] env[67899]: DEBUG oslo_concurrency.lockutils [None req-817f6f80-1cb7-455c-81df-97b9e82bf3f4 tempest-ServerAddressesNegativeTestJSON-374254023 tempest-ServerAddressesNegativeTestJSON-374254023-project-member] Lock "6c4977f7-c53d-4c96-9028-86d7561f0d0d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 918.996168] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 918.996472] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] CONF.reclaim_instance_interval <= 0, skipping...
{{(pid=67899) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 919.996493] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 956.376374] env[67899]: WARNING oslo_vmware.rw_handles [None req-a5a53ea7-13aa-44a2-a12d-ac8f3fe9dd29 tempest-ImagesNegativeTestJSON-1293308713 tempest-ImagesNegativeTestJSON-1293308713-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 956.376374] env[67899]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 956.376374] env[67899]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 956.376374] env[67899]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 956.376374] env[67899]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 956.376374] env[67899]: ERROR oslo_vmware.rw_handles response.begin() [ 956.376374] env[67899]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 956.376374] env[67899]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 956.376374] env[67899]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 956.376374] env[67899]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 956.376374] env[67899]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 956.376374] env[67899]: ERROR oslo_vmware.rw_handles [ 956.376374] env[67899]: DEBUG nova.virt.vmwareapi.images [None req-a5a53ea7-13aa-44a2-a12d-ac8f3fe9dd29 tempest-ImagesNegativeTestJSON-1293308713 tempest-ImagesNegativeTestJSON-1293308713-project-member] [instance: 84cbacaa-08d2-4297-8777-150f433e4c04] Downloaded image file data c655a05a-4a40-4b3f-b609-3ba8116ad90f to vmware_temp/66fcc6c6-2579-4b13-8b90-24713c101aa9/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk on the data store datastore1 {{(pid=67899) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 956.378401] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-a5a53ea7-13aa-44a2-a12d-ac8f3fe9dd29 tempest-ImagesNegativeTestJSON-1293308713 tempest-ImagesNegativeTestJSON-1293308713-project-member] [instance: 84cbacaa-08d2-4297-8777-150f433e4c04] Caching image {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 956.378669] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [None req-a5a53ea7-13aa-44a2-a12d-ac8f3fe9dd29 tempest-ImagesNegativeTestJSON-1293308713 tempest-ImagesNegativeTestJSON-1293308713-project-member] Copying Virtual Disk [datastore1] vmware_temp/66fcc6c6-2579-4b13-8b90-24713c101aa9/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk to [datastore1] vmware_temp/66fcc6c6-2579-4b13-8b90-24713c101aa9/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk {{(pid=67899) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 956.379076] env[67899]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-c2c24662-76dd-4fcd-8278-3675f727e06f {{(pid=67899) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 956.389071] env[67899]: DEBUG oslo_vmware.api [None req-a5a53ea7-13aa-44a2-a12d-ac8f3fe9dd29 tempest-ImagesNegativeTestJSON-1293308713 tempest-ImagesNegativeTestJSON-1293308713-project-member] Waiting for the task: (returnval){ [ 956.389071] env[67899]: value = "task-3467890" [ 956.389071] env[67899]: _type = "Task" [ 956.389071] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 956.397020] env[67899]: DEBUG oslo_vmware.api [None req-a5a53ea7-13aa-44a2-a12d-ac8f3fe9dd29 tempest-ImagesNegativeTestJSON-1293308713 tempest-ImagesNegativeTestJSON-1293308713-project-member] Task: {'id': task-3467890, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 956.899566] env[67899]: DEBUG oslo_vmware.exceptions [None req-a5a53ea7-13aa-44a2-a12d-ac8f3fe9dd29 tempest-ImagesNegativeTestJSON-1293308713 tempest-ImagesNegativeTestJSON-1293308713-project-member] Fault InvalidArgument not matched. {{(pid=67899) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 956.899853] env[67899]: DEBUG oslo_concurrency.lockutils [None req-a5a53ea7-13aa-44a2-a12d-ac8f3fe9dd29 tempest-ImagesNegativeTestJSON-1293308713 tempest-ImagesNegativeTestJSON-1293308713-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 956.900409] env[67899]: ERROR nova.compute.manager [None req-a5a53ea7-13aa-44a2-a12d-ac8f3fe9dd29 tempest-ImagesNegativeTestJSON-1293308713 tempest-ImagesNegativeTestJSON-1293308713-project-member] [instance: 84cbacaa-08d2-4297-8777-150f433e4c04] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 956.900409] env[67899]: Faults: ['InvalidArgument'] [ 956.900409] env[67899]: ERROR nova.compute.manager [instance: 84cbacaa-08d2-4297-8777-150f433e4c04] Traceback (most recent call last): [ 956.900409] env[67899]: ERROR nova.compute.manager [instance: 84cbacaa-08d2-4297-8777-150f433e4c04] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 956.900409] env[67899]: ERROR nova.compute.manager [instance: 84cbacaa-08d2-4297-8777-150f433e4c04] yield resources [ 956.900409] env[67899]: ERROR nova.compute.manager [instance: 84cbacaa-08d2-4297-8777-150f433e4c04] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 956.900409] env[67899]: ERROR nova.compute.manager [instance: 84cbacaa-08d2-4297-8777-150f433e4c04] self.driver.spawn(context, instance, image_meta, [ 956.900409] env[67899]: ERROR nova.compute.manager [instance: 84cbacaa-08d2-4297-8777-150f433e4c04] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 956.900409] env[67899]: ERROR nova.compute.manager [instance: 84cbacaa-08d2-4297-8777-150f433e4c04] self._vmops.spawn(context, instance, image_meta, injected_files, [ 956.900409] env[67899]: ERROR nova.compute.manager [instance: 84cbacaa-08d2-4297-8777-150f433e4c04] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 956.900409] env[67899]: ERROR nova.compute.manager [instance: 84cbacaa-08d2-4297-8777-150f433e4c04] 
self._fetch_image_if_missing(context, vi) [ 956.900409] env[67899]: ERROR nova.compute.manager [instance: 84cbacaa-08d2-4297-8777-150f433e4c04] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 956.900409] env[67899]: ERROR nova.compute.manager [instance: 84cbacaa-08d2-4297-8777-150f433e4c04] image_cache(vi, tmp_image_ds_loc) [ 956.900409] env[67899]: ERROR nova.compute.manager [instance: 84cbacaa-08d2-4297-8777-150f433e4c04] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 956.900409] env[67899]: ERROR nova.compute.manager [instance: 84cbacaa-08d2-4297-8777-150f433e4c04] vm_util.copy_virtual_disk( [ 956.900409] env[67899]: ERROR nova.compute.manager [instance: 84cbacaa-08d2-4297-8777-150f433e4c04] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 956.900409] env[67899]: ERROR nova.compute.manager [instance: 84cbacaa-08d2-4297-8777-150f433e4c04] session._wait_for_task(vmdk_copy_task) [ 956.900409] env[67899]: ERROR nova.compute.manager [instance: 84cbacaa-08d2-4297-8777-150f433e4c04] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 956.900409] env[67899]: ERROR nova.compute.manager [instance: 84cbacaa-08d2-4297-8777-150f433e4c04] return self.wait_for_task(task_ref) [ 956.900409] env[67899]: ERROR nova.compute.manager [instance: 84cbacaa-08d2-4297-8777-150f433e4c04] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 956.900409] env[67899]: ERROR nova.compute.manager [instance: 84cbacaa-08d2-4297-8777-150f433e4c04] return evt.wait() [ 956.900409] env[67899]: ERROR nova.compute.manager [instance: 84cbacaa-08d2-4297-8777-150f433e4c04] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 956.900409] env[67899]: ERROR nova.compute.manager [instance: 84cbacaa-08d2-4297-8777-150f433e4c04] result = hub.switch() [ 956.900409] env[67899]: ERROR nova.compute.manager [instance: 84cbacaa-08d2-4297-8777-150f433e4c04] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 956.900409] env[67899]: ERROR nova.compute.manager [instance: 84cbacaa-08d2-4297-8777-150f433e4c04] return self.greenlet.switch() [ 956.900409] env[67899]: ERROR nova.compute.manager [instance: 84cbacaa-08d2-4297-8777-150f433e4c04] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 956.900409] env[67899]: ERROR nova.compute.manager [instance: 84cbacaa-08d2-4297-8777-150f433e4c04] self.f(*self.args, **self.kw) [ 956.900409] env[67899]: ERROR nova.compute.manager [instance: 84cbacaa-08d2-4297-8777-150f433e4c04] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 956.900409] env[67899]: ERROR nova.compute.manager [instance: 84cbacaa-08d2-4297-8777-150f433e4c04] raise exceptions.translate_fault(task_info.error) [ 956.900409] env[67899]: ERROR nova.compute.manager [instance: 84cbacaa-08d2-4297-8777-150f433e4c04] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 956.900409] env[67899]: ERROR nova.compute.manager [instance: 84cbacaa-08d2-4297-8777-150f433e4c04] Faults: ['InvalidArgument'] [ 956.900409] env[67899]: ERROR nova.compute.manager [instance: 84cbacaa-08d2-4297-8777-150f433e4c04] [ 956.901550] env[67899]: INFO nova.compute.manager [None req-a5a53ea7-13aa-44a2-a12d-ac8f3fe9dd29 
tempest-ImagesNegativeTestJSON-1293308713 tempest-ImagesNegativeTestJSON-1293308713-project-member] [instance: 84cbacaa-08d2-4297-8777-150f433e4c04] Terminating instance [ 956.902609] env[67899]: DEBUG oslo_concurrency.lockutils [None req-1a665adc-0197-431c-b7d3-cbe5f770e3d4 tempest-FloatingIPsAssociationTestJSON-1366579974 tempest-FloatingIPsAssociationTestJSON-1366579974-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 956.902609] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-1a665adc-0197-431c-b7d3-cbe5f770e3d4 tempest-FloatingIPsAssociationTestJSON-1366579974 tempest-FloatingIPsAssociationTestJSON-1366579974-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 956.902778] env[67899]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-e01c8d9d-424a-4ea5-8e8f-6a4d50a65b8e {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 956.904996] env[67899]: DEBUG nova.compute.manager [None req-a5a53ea7-13aa-44a2-a12d-ac8f3fe9dd29 tempest-ImagesNegativeTestJSON-1293308713 tempest-ImagesNegativeTestJSON-1293308713-project-member] [instance: 84cbacaa-08d2-4297-8777-150f433e4c04] Start destroying the instance on the hypervisor. {{(pid=67899) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 956.905198] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-a5a53ea7-13aa-44a2-a12d-ac8f3fe9dd29 tempest-ImagesNegativeTestJSON-1293308713 tempest-ImagesNegativeTestJSON-1293308713-project-member] [instance: 84cbacaa-08d2-4297-8777-150f433e4c04] Destroying instance {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 956.905925] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-911ff98e-5574-4d79-a825-2806939a3a90 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 956.913776] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-a5a53ea7-13aa-44a2-a12d-ac8f3fe9dd29 tempest-ImagesNegativeTestJSON-1293308713 tempest-ImagesNegativeTestJSON-1293308713-project-member] [instance: 84cbacaa-08d2-4297-8777-150f433e4c04] Unregistering the VM {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 956.914070] env[67899]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-c3b103f5-b085-4e6d-8f54-1b17fdc4926e {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 956.916233] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-1a665adc-0197-431c-b7d3-cbe5f770e3d4 tempest-FloatingIPsAssociationTestJSON-1366579974 tempest-FloatingIPsAssociationTestJSON-1366579974-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 956.916403] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-1a665adc-0197-431c-b7d3-cbe5f770e3d4 tempest-FloatingIPsAssociationTestJSON-1366579974 tempest-FloatingIPsAssociationTestJSON-1366579974-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=67899) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 956.917322] env[67899]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-6fafc7bc-5a75-4b1b-befc-61cab07324d2 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 956.921767] env[67899]: DEBUG oslo_vmware.api [None req-1a665adc-0197-431c-b7d3-cbe5f770e3d4 tempest-FloatingIPsAssociationTestJSON-1366579974 tempest-FloatingIPsAssociationTestJSON-1366579974-project-member] Waiting for the task: (returnval){ [ 956.921767] env[67899]: value = "session[5298859c-589e-10f3-4f70-bf47a7ea371b]525036da-f880-1f75-9176-4ad25536ab61" [ 956.921767] env[67899]: _type = "Task" [ 956.921767] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 956.928723] env[67899]: DEBUG oslo_vmware.api [None req-1a665adc-0197-431c-b7d3-cbe5f770e3d4 tempest-FloatingIPsAssociationTestJSON-1366579974 tempest-FloatingIPsAssociationTestJSON-1366579974-project-member] Task: {'id': session[5298859c-589e-10f3-4f70-bf47a7ea371b]525036da-f880-1f75-9176-4ad25536ab61, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 956.985479] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-a5a53ea7-13aa-44a2-a12d-ac8f3fe9dd29 tempest-ImagesNegativeTestJSON-1293308713 tempest-ImagesNegativeTestJSON-1293308713-project-member] [instance: 84cbacaa-08d2-4297-8777-150f433e4c04] Unregistered the VM {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 956.985719] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-a5a53ea7-13aa-44a2-a12d-ac8f3fe9dd29 tempest-ImagesNegativeTestJSON-1293308713 tempest-ImagesNegativeTestJSON-1293308713-project-member] [instance: 84cbacaa-08d2-4297-8777-150f433e4c04] Deleting contents of the VM from datastore datastore1 {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 956.985903] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-a5a53ea7-13aa-44a2-a12d-ac8f3fe9dd29 tempest-ImagesNegativeTestJSON-1293308713 tempest-ImagesNegativeTestJSON-1293308713-project-member] Deleting the datastore file [datastore1] 84cbacaa-08d2-4297-8777-150f433e4c04 {{(pid=67899) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 956.986186] env[67899]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-3a2d89ef-27cb-4e87-9a7f-83407ce50491 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 956.995432] env[67899]: DEBUG oslo_vmware.api [None req-a5a53ea7-13aa-44a2-a12d-ac8f3fe9dd29 tempest-ImagesNegativeTestJSON-1293308713 tempest-ImagesNegativeTestJSON-1293308713-project-member] Waiting for the task: (returnval){ [ 956.995432] env[67899]: value = "task-3467892" [ 956.995432] env[67899]: _type = "Task" [ 956.995432] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 957.003640] env[67899]: DEBUG oslo_vmware.api [None req-a5a53ea7-13aa-44a2-a12d-ac8f3fe9dd29 tempest-ImagesNegativeTestJSON-1293308713 tempest-ImagesNegativeTestJSON-1293308713-project-member] Task: {'id': task-3467892, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 957.432152] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-1a665adc-0197-431c-b7d3-cbe5f770e3d4 tempest-FloatingIPsAssociationTestJSON-1366579974 tempest-FloatingIPsAssociationTestJSON-1366579974-project-member] [instance: 913c5652-c8af-41a8-94f1-c0eba08aacdd] Preparing fetch location {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 957.432506] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-1a665adc-0197-431c-b7d3-cbe5f770e3d4 tempest-FloatingIPsAssociationTestJSON-1366579974 tempest-FloatingIPsAssociationTestJSON-1366579974-project-member] Creating directory with path [datastore1] vmware_temp/5ad9fda9-9b46-4709-91a3-efe3a0412786/c655a05a-4a40-4b3f-b609-3ba8116ad90f {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 957.432602] env[67899]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-41f84bf8-a659-4453-9d9a-ff9ca9841483 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 957.443569] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-1a665adc-0197-431c-b7d3-cbe5f770e3d4 tempest-FloatingIPsAssociationTestJSON-1366579974 tempest-FloatingIPsAssociationTestJSON-1366579974-project-member] Created directory with path [datastore1] vmware_temp/5ad9fda9-9b46-4709-91a3-efe3a0412786/c655a05a-4a40-4b3f-b609-3ba8116ad90f {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 957.443749] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-1a665adc-0197-431c-b7d3-cbe5f770e3d4 tempest-FloatingIPsAssociationTestJSON-1366579974 tempest-FloatingIPsAssociationTestJSON-1366579974-project-member] [instance: 913c5652-c8af-41a8-94f1-c0eba08aacdd] Fetch image to [datastore1] vmware_temp/5ad9fda9-9b46-4709-91a3-efe3a0412786/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 957.443930] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-1a665adc-0197-431c-b7d3-cbe5f770e3d4 tempest-FloatingIPsAssociationTestJSON-1366579974 tempest-FloatingIPsAssociationTestJSON-1366579974-project-member] [instance: 913c5652-c8af-41a8-94f1-c0eba08aacdd] Downloading image file data c655a05a-4a40-4b3f-b609-3ba8116ad90f to [datastore1] vmware_temp/5ad9fda9-9b46-4709-91a3-efe3a0412786/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk on the data store datastore1 {{(pid=67899) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 957.444635] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-731943ab-953e-48bc-af9b-cae805411058 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 957.451021] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cda4d6b9-a517-46c7-8494-9e31884d3cbc {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 957.459850] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-508ae803-ee6d-4f0c-a7b7-ca5b6376afad {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 957.490223] env[67899]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2c2befbe-c25a-48a4-b3a6-3ba04a69f00d {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 957.495682] env[67899]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-ebcc87d6-c36f-447a-86cf-bfc16f4156ca {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 957.505500] env[67899]: DEBUG oslo_vmware.api [None req-a5a53ea7-13aa-44a2-a12d-ac8f3fe9dd29 tempest-ImagesNegativeTestJSON-1293308713 tempest-ImagesNegativeTestJSON-1293308713-project-member] Task: {'id': task-3467892, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.079325} completed successfully. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 957.505737] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-a5a53ea7-13aa-44a2-a12d-ac8f3fe9dd29 tempest-ImagesNegativeTestJSON-1293308713 tempest-ImagesNegativeTestJSON-1293308713-project-member] Deleted the datastore file {{(pid=67899) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 957.505914] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-a5a53ea7-13aa-44a2-a12d-ac8f3fe9dd29 tempest-ImagesNegativeTestJSON-1293308713 tempest-ImagesNegativeTestJSON-1293308713-project-member] [instance: 84cbacaa-08d2-4297-8777-150f433e4c04] Deleted contents of the VM from datastore datastore1 {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 957.506128] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-a5a53ea7-13aa-44a2-a12d-ac8f3fe9dd29 tempest-ImagesNegativeTestJSON-1293308713 tempest-ImagesNegativeTestJSON-1293308713-project-member] [instance: 84cbacaa-08d2-4297-8777-150f433e4c04] Instance destroyed {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 957.506358] env[67899]: INFO nova.compute.manager [None req-a5a53ea7-13aa-44a2-a12d-ac8f3fe9dd29 tempest-ImagesNegativeTestJSON-1293308713 tempest-ImagesNegativeTestJSON-1293308713-project-member] [instance: 84cbacaa-08d2-4297-8777-150f433e4c04] Took 0.60 seconds to destroy the instance on the hypervisor. 
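The usage figures in the resource-tracker entries earlier in this section follow from simple arithmetic: ten active allocations of {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}} plus the 512 MB 'reserved' in the MEMORY_MB inventory yield exactly the used_ram=1792MB, used_disk=10GB and used_vcpus=10 reported in the "Final resource view" line, and each inventory's schedulable capacity is total * allocation_ratio - reserved. A minimal Python sketch of that bookkeeping; summarize_usage() and capacity() are made-up illustrative helpers, not Nova's ResourceTracker code:

    # Illustrative sketch only: reproduces the arithmetic behind the
    # "Final resource view" and placement inventory log lines above.
    RESERVED_MEMORY_MB = 512  # matches 'reserved': 512 in the MEMORY_MB inventory

    def summarize_usage(allocations):
        # Sum the per-instance placement allocations into a usage view.
        used = {"VCPU": 0, "MEMORY_MB": RESERVED_MEMORY_MB, "DISK_GB": 0}
        for alloc in allocations:
            for rc, amount in alloc["resources"].items():
                used[rc] += amount
        return used

    def capacity(inventory):
        # Schedulable capacity for one resource class:
        # total * allocation_ratio - reserved.
        return (inventory["total"] * inventory["allocation_ratio"]
                - inventory["reserved"])

    # Ten instances, each holding {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}:
    allocations = [{"resources": {"DISK_GB": 1, "MEMORY_MB": 128, "VCPU": 1}}] * 10
    print(summarize_usage(allocations))
    # {'VCPU': 10, 'MEMORY_MB': 1792, 'DISK_GB': 10}, i.e. used_vcpus=10,
    # used_ram=1792MB, used_disk=10GB as in the "Final resource view" entry.

    print(capacity({"total": 48, "reserved": 0, "allocation_ratio": 4.0}))
    # 192.0, so 48 physical vCPUs at a 4.0 allocation ratio; with only 10
    # allocated vCPUs the node is far from saturated.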
[ 957.508501] env[67899]: DEBUG nova.compute.claims [None req-a5a53ea7-13aa-44a2-a12d-ac8f3fe9dd29 tempest-ImagesNegativeTestJSON-1293308713 tempest-ImagesNegativeTestJSON-1293308713-project-member] [instance: 84cbacaa-08d2-4297-8777-150f433e4c04] Aborting claim: {{(pid=67899) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 957.508664] env[67899]: DEBUG oslo_concurrency.lockutils [None req-a5a53ea7-13aa-44a2-a12d-ac8f3fe9dd29 tempest-ImagesNegativeTestJSON-1293308713 tempest-ImagesNegativeTestJSON-1293308713-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 957.508873] env[67899]: DEBUG oslo_concurrency.lockutils [None req-a5a53ea7-13aa-44a2-a12d-ac8f3fe9dd29 tempest-ImagesNegativeTestJSON-1293308713 tempest-ImagesNegativeTestJSON-1293308713-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 957.522271] env[67899]: DEBUG nova.virt.vmwareapi.images [None req-1a665adc-0197-431c-b7d3-cbe5f770e3d4 tempest-FloatingIPsAssociationTestJSON-1366579974 tempest-FloatingIPsAssociationTestJSON-1366579974-project-member] [instance: 913c5652-c8af-41a8-94f1-c0eba08aacdd] Downloading image file data c655a05a-4a40-4b3f-b609-3ba8116ad90f to the data store datastore1 {{(pid=67899) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 957.590289] env[67899]: DEBUG oslo_vmware.rw_handles [None req-1a665adc-0197-431c-b7d3-cbe5f770e3d4 tempest-FloatingIPsAssociationTestJSON-1366579974 tempest-FloatingIPsAssociationTestJSON-1366579974-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/5ad9fda9-9b46-4709-91a3-efe3a0412786/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67899) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 957.658088] env[67899]: DEBUG oslo_vmware.rw_handles [None req-1a665adc-0197-431c-b7d3-cbe5f770e3d4 tempest-FloatingIPsAssociationTestJSON-1366579974 tempest-FloatingIPsAssociationTestJSON-1366579974-project-member] Completed reading data from the image iterator. {{(pid=67899) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 957.658317] env[67899]: DEBUG oslo_vmware.rw_handles [None req-1a665adc-0197-431c-b7d3-cbe5f770e3d4 tempest-FloatingIPsAssociationTestJSON-1366579974 tempest-FloatingIPsAssociationTestJSON-1366579974-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/5ad9fda9-9b46-4709-91a3-efe3a0412786/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67899) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 957.954493] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-65d4f19f-055b-4d6a-8022-e8c54fb97449 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 957.967934] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-535c895c-2dc6-4e64-abe7-44859d4b502b {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 958.007943] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0ab2dd23-08bd-4004-a5e4-e88d2f54fe3b {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 958.015492] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e6dbca2a-864e-4463-8fec-b7d99b0685ab {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 958.028576] env[67899]: DEBUG nova.compute.provider_tree [None req-a5a53ea7-13aa-44a2-a12d-ac8f3fe9dd29 tempest-ImagesNegativeTestJSON-1293308713 tempest-ImagesNegativeTestJSON-1293308713-project-member] Inventory has not changed in ProviderTree for provider: fffa0b42-f65d-4394-a98c-0df038b9ed4b {{(pid=67899) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 958.037536] env[67899]: DEBUG nova.scheduler.client.report [None req-a5a53ea7-13aa-44a2-a12d-ac8f3fe9dd29 tempest-ImagesNegativeTestJSON-1293308713 tempest-ImagesNegativeTestJSON-1293308713-project-member] Inventory has not changed for provider fffa0b42-f65d-4394-a98c-0df038b9ed4b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67899) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 958.051363] env[67899]: DEBUG oslo_concurrency.lockutils [None req-a5a53ea7-13aa-44a2-a12d-ac8f3fe9dd29 tempest-ImagesNegativeTestJSON-1293308713 tempest-ImagesNegativeTestJSON-1293308713-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.542s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 958.051881] env[67899]: ERROR nova.compute.manager [None req-a5a53ea7-13aa-44a2-a12d-ac8f3fe9dd29 tempest-ImagesNegativeTestJSON-1293308713 tempest-ImagesNegativeTestJSON-1293308713-project-member] [instance: 84cbacaa-08d2-4297-8777-150f433e4c04] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 958.051881] env[67899]: Faults: ['InvalidArgument'] [ 958.051881] env[67899]: ERROR nova.compute.manager [instance: 84cbacaa-08d2-4297-8777-150f433e4c04] Traceback (most recent call last): [ 958.051881] env[67899]: ERROR nova.compute.manager [instance: 84cbacaa-08d2-4297-8777-150f433e4c04] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 958.051881] env[67899]: ERROR 
nova.compute.manager [instance: 84cbacaa-08d2-4297-8777-150f433e4c04] self.driver.spawn(context, instance, image_meta, [ 958.051881] env[67899]: ERROR nova.compute.manager [instance: 84cbacaa-08d2-4297-8777-150f433e4c04] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 958.051881] env[67899]: ERROR nova.compute.manager [instance: 84cbacaa-08d2-4297-8777-150f433e4c04] self._vmops.spawn(context, instance, image_meta, injected_files, [ 958.051881] env[67899]: ERROR nova.compute.manager [instance: 84cbacaa-08d2-4297-8777-150f433e4c04] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 958.051881] env[67899]: ERROR nova.compute.manager [instance: 84cbacaa-08d2-4297-8777-150f433e4c04] self._fetch_image_if_missing(context, vi) [ 958.051881] env[67899]: ERROR nova.compute.manager [instance: 84cbacaa-08d2-4297-8777-150f433e4c04] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 958.051881] env[67899]: ERROR nova.compute.manager [instance: 84cbacaa-08d2-4297-8777-150f433e4c04] image_cache(vi, tmp_image_ds_loc) [ 958.051881] env[67899]: ERROR nova.compute.manager [instance: 84cbacaa-08d2-4297-8777-150f433e4c04] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 958.051881] env[67899]: ERROR nova.compute.manager [instance: 84cbacaa-08d2-4297-8777-150f433e4c04] vm_util.copy_virtual_disk( [ 958.051881] env[67899]: ERROR nova.compute.manager [instance: 84cbacaa-08d2-4297-8777-150f433e4c04] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 958.051881] env[67899]: ERROR nova.compute.manager [instance: 84cbacaa-08d2-4297-8777-150f433e4c04] session._wait_for_task(vmdk_copy_task) [ 958.051881] env[67899]: ERROR nova.compute.manager [instance: 84cbacaa-08d2-4297-8777-150f433e4c04] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 958.051881] env[67899]: ERROR nova.compute.manager [instance: 84cbacaa-08d2-4297-8777-150f433e4c04] return self.wait_for_task(task_ref) [ 958.051881] env[67899]: ERROR nova.compute.manager [instance: 84cbacaa-08d2-4297-8777-150f433e4c04] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 958.051881] env[67899]: ERROR nova.compute.manager [instance: 84cbacaa-08d2-4297-8777-150f433e4c04] return evt.wait() [ 958.051881] env[67899]: ERROR nova.compute.manager [instance: 84cbacaa-08d2-4297-8777-150f433e4c04] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 958.051881] env[67899]: ERROR nova.compute.manager [instance: 84cbacaa-08d2-4297-8777-150f433e4c04] result = hub.switch() [ 958.051881] env[67899]: ERROR nova.compute.manager [instance: 84cbacaa-08d2-4297-8777-150f433e4c04] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 958.051881] env[67899]: ERROR nova.compute.manager [instance: 84cbacaa-08d2-4297-8777-150f433e4c04] return self.greenlet.switch() [ 958.051881] env[67899]: ERROR nova.compute.manager [instance: 84cbacaa-08d2-4297-8777-150f433e4c04] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 958.051881] env[67899]: ERROR nova.compute.manager [instance: 84cbacaa-08d2-4297-8777-150f433e4c04] self.f(*self.args, **self.kw) [ 958.051881] env[67899]: ERROR nova.compute.manager [instance: 84cbacaa-08d2-4297-8777-150f433e4c04] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 958.051881] env[67899]: ERROR nova.compute.manager [instance: 84cbacaa-08d2-4297-8777-150f433e4c04] raise exceptions.translate_fault(task_info.error) [ 958.051881] env[67899]: ERROR nova.compute.manager [instance: 84cbacaa-08d2-4297-8777-150f433e4c04] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 958.051881] env[67899]: ERROR nova.compute.manager [instance: 84cbacaa-08d2-4297-8777-150f433e4c04] Faults: ['InvalidArgument'] [ 958.051881] env[67899]: ERROR nova.compute.manager [instance: 84cbacaa-08d2-4297-8777-150f433e4c04] [ 958.052928] env[67899]: DEBUG nova.compute.utils [None req-a5a53ea7-13aa-44a2-a12d-ac8f3fe9dd29 tempest-ImagesNegativeTestJSON-1293308713 tempest-ImagesNegativeTestJSON-1293308713-project-member] [instance: 84cbacaa-08d2-4297-8777-150f433e4c04] VimFaultException {{(pid=67899) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 958.054034] env[67899]: DEBUG nova.compute.manager [None req-a5a53ea7-13aa-44a2-a12d-ac8f3fe9dd29 tempest-ImagesNegativeTestJSON-1293308713 tempest-ImagesNegativeTestJSON-1293308713-project-member] [instance: 84cbacaa-08d2-4297-8777-150f433e4c04] Build of instance 84cbacaa-08d2-4297-8777-150f433e4c04 was re-scheduled: A specified parameter was not correct: fileType [ 958.054034] env[67899]: Faults: ['InvalidArgument'] {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 958.054479] env[67899]: DEBUG nova.compute.manager [None req-a5a53ea7-13aa-44a2-a12d-ac8f3fe9dd29 tempest-ImagesNegativeTestJSON-1293308713 tempest-ImagesNegativeTestJSON-1293308713-project-member] [instance: 84cbacaa-08d2-4297-8777-150f433e4c04] Unplugging VIFs for instance {{(pid=67899) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 958.054651] env[67899]: DEBUG nova.compute.manager [None req-a5a53ea7-13aa-44a2-a12d-ac8f3fe9dd29 tempest-ImagesNegativeTestJSON-1293308713 tempest-ImagesNegativeTestJSON-1293308713-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67899) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 958.054820] env[67899]: DEBUG nova.compute.manager [None req-a5a53ea7-13aa-44a2-a12d-ac8f3fe9dd29 tempest-ImagesNegativeTestJSON-1293308713 tempest-ImagesNegativeTestJSON-1293308713-project-member] [instance: 84cbacaa-08d2-4297-8777-150f433e4c04] Deallocating network for instance {{(pid=67899) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 958.054980] env[67899]: DEBUG nova.network.neutron [None req-a5a53ea7-13aa-44a2-a12d-ac8f3fe9dd29 tempest-ImagesNegativeTestJSON-1293308713 tempest-ImagesNegativeTestJSON-1293308713-project-member] [instance: 84cbacaa-08d2-4297-8777-150f433e4c04] deallocate_for_instance() {{(pid=67899) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 958.421802] env[67899]: DEBUG nova.network.neutron [None req-a5a53ea7-13aa-44a2-a12d-ac8f3fe9dd29 tempest-ImagesNegativeTestJSON-1293308713 tempest-ImagesNegativeTestJSON-1293308713-project-member] [instance: 84cbacaa-08d2-4297-8777-150f433e4c04] Updating instance_info_cache with network_info: [] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 958.437476] env[67899]: INFO nova.compute.manager [None req-a5a53ea7-13aa-44a2-a12d-ac8f3fe9dd29 tempest-ImagesNegativeTestJSON-1293308713 tempest-ImagesNegativeTestJSON-1293308713-project-member] [instance: 84cbacaa-08d2-4297-8777-150f433e4c04] Took 0.38 seconds to deallocate network for instance. [ 958.557250] env[67899]: INFO nova.scheduler.client.report [None req-a5a53ea7-13aa-44a2-a12d-ac8f3fe9dd29 tempest-ImagesNegativeTestJSON-1293308713 tempest-ImagesNegativeTestJSON-1293308713-project-member] Deleted allocations for instance 84cbacaa-08d2-4297-8777-150f433e4c04 [ 958.575996] env[67899]: DEBUG oslo_concurrency.lockutils [None req-a5a53ea7-13aa-44a2-a12d-ac8f3fe9dd29 tempest-ImagesNegativeTestJSON-1293308713 tempest-ImagesNegativeTestJSON-1293308713-project-member] Lock "84cbacaa-08d2-4297-8777-150f433e4c04" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 337.431s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 958.577093] env[67899]: DEBUG oslo_concurrency.lockutils [None req-d6968a70-d2f0-4447-9223-19f375b45736 tempest-ImagesNegativeTestJSON-1293308713 tempest-ImagesNegativeTestJSON-1293308713-project-member] Lock "84cbacaa-08d2-4297-8777-150f433e4c04" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 138.142s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 958.577333] env[67899]: DEBUG oslo_concurrency.lockutils [None req-d6968a70-d2f0-4447-9223-19f375b45736 tempest-ImagesNegativeTestJSON-1293308713 tempest-ImagesNegativeTestJSON-1293308713-project-member] Acquiring lock "84cbacaa-08d2-4297-8777-150f433e4c04-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 958.577537] env[67899]: DEBUG oslo_concurrency.lockutils [None req-d6968a70-d2f0-4447-9223-19f375b45736 tempest-ImagesNegativeTestJSON-1293308713 tempest-ImagesNegativeTestJSON-1293308713-project-member] Lock "84cbacaa-08d2-4297-8777-150f433e4c04-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 
0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 958.577949] env[67899]: DEBUG oslo_concurrency.lockutils [None req-d6968a70-d2f0-4447-9223-19f375b45736 tempest-ImagesNegativeTestJSON-1293308713 tempest-ImagesNegativeTestJSON-1293308713-project-member] Lock "84cbacaa-08d2-4297-8777-150f433e4c04-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 958.579780] env[67899]: INFO nova.compute.manager [None req-d6968a70-d2f0-4447-9223-19f375b45736 tempest-ImagesNegativeTestJSON-1293308713 tempest-ImagesNegativeTestJSON-1293308713-project-member] [instance: 84cbacaa-08d2-4297-8777-150f433e4c04] Terminating instance [ 958.581730] env[67899]: DEBUG nova.compute.manager [None req-d6968a70-d2f0-4447-9223-19f375b45736 tempest-ImagesNegativeTestJSON-1293308713 tempest-ImagesNegativeTestJSON-1293308713-project-member] [instance: 84cbacaa-08d2-4297-8777-150f433e4c04] Start destroying the instance on the hypervisor. {{(pid=67899) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 958.581866] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-d6968a70-d2f0-4447-9223-19f375b45736 tempest-ImagesNegativeTestJSON-1293308713 tempest-ImagesNegativeTestJSON-1293308713-project-member] [instance: 84cbacaa-08d2-4297-8777-150f433e4c04] Destroying instance {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 958.582465] env[67899]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-e1126679-80bf-43cb-92e4-b4efde5509ae {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 958.587481] env[67899]: DEBUG nova.compute.manager [None req-9f2dbd61-5a36-44ea-b13c-793c018e3d97 tempest-ServerMetadataNegativeTestJSON-1266469450 tempest-ServerMetadataNegativeTestJSON-1266469450-project-member] [instance: 37ab08db-50ab-4c30-9e18-05007c5d1c27] Starting instance... {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 958.596026] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eb4340ff-ecd7-4fd3-b740-48cbb604f98d {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 958.624157] env[67899]: WARNING nova.virt.vmwareapi.vmops [None req-d6968a70-d2f0-4447-9223-19f375b45736 tempest-ImagesNegativeTestJSON-1293308713 tempest-ImagesNegativeTestJSON-1293308713-project-member] [instance: 84cbacaa-08d2-4297-8777-150f433e4c04] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 84cbacaa-08d2-4297-8777-150f433e4c04 could not be found. [ 958.624774] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-d6968a70-d2f0-4447-9223-19f375b45736 tempest-ImagesNegativeTestJSON-1293308713 tempest-ImagesNegativeTestJSON-1293308713-project-member] [instance: 84cbacaa-08d2-4297-8777-150f433e4c04] Instance destroyed {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 958.624774] env[67899]: INFO nova.compute.manager [None req-d6968a70-d2f0-4447-9223-19f375b45736 tempest-ImagesNegativeTestJSON-1293308713 tempest-ImagesNegativeTestJSON-1293308713-project-member] [instance: 84cbacaa-08d2-4297-8777-150f433e4c04] Took 0.04 seconds to destroy the instance on the hypervisor. 
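The traceback above ends inside oslo.vmware's task poller: _poll_task reads the TaskInfo that vCenter reports for the CopyVirtualDisk task and, because the task finished in an error state, raises the fault translated from task_info.error as VimFaultException ("A specified parameter was not correct: fileType", fault InvalidArgument). A minimal sketch of that polling pattern, using hypothetical helper names rather than oslo.vmware's real internals:

import time

class VimFaultException(Exception):
    # Stand-in for oslo_vmware.exceptions.VimFaultException.
    def __init__(self, fault_list, message):
        super().__init__(message)
        self.fault_list = fault_list

def poll_task(get_task_info, interval=0.5):
    # get_task_info is a hypothetical callable returning an object with
    # .state and .error, like the TaskInfo oslo.vmware reads from vCenter.
    while True:
        info = get_task_info()
        if info.state == "success":
            return info
        if info.state == "error":
            # Analogue of: raise exceptions.translate_fault(task_info.error)
            raise VimFaultException([info.error.fault_name],
                                    info.error.localized_message)
        time.sleep(interval)  # 'queued' or 'running': keep polling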
[ 958.624774] env[67899]: DEBUG oslo.service.loopingcall [None req-d6968a70-d2f0-4447-9223-19f375b45736 tempest-ImagesNegativeTestJSON-1293308713 tempest-ImagesNegativeTestJSON-1293308713-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=67899) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 958.626997] env[67899]: DEBUG nova.compute.manager [-] [instance: 84cbacaa-08d2-4297-8777-150f433e4c04] Deallocating network for instance {{(pid=67899) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}}
[ 958.627125] env[67899]: DEBUG nova.network.neutron [-] [instance: 84cbacaa-08d2-4297-8777-150f433e4c04] deallocate_for_instance() {{(pid=67899) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 958.640158] env[67899]: DEBUG oslo_concurrency.lockutils [None req-9f2dbd61-5a36-44ea-b13c-793c018e3d97 tempest-ServerMetadataNegativeTestJSON-1266469450 tempest-ServerMetadataNegativeTestJSON-1266469450-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 958.640412] env[67899]: DEBUG oslo_concurrency.lockutils [None req-9f2dbd61-5a36-44ea-b13c-793c018e3d97 tempest-ServerMetadataNegativeTestJSON-1266469450 tempest-ServerMetadataNegativeTestJSON-1266469450-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 958.641880] env[67899]: INFO nova.compute.claims [None req-9f2dbd61-5a36-44ea-b13c-793c018e3d97 tempest-ServerMetadataNegativeTestJSON-1266469450 tempest-ServerMetadataNegativeTestJSON-1266469450-project-member] [instance: 37ab08db-50ab-4c30-9e18-05007c5d1c27] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 958.661602] env[67899]: DEBUG nova.network.neutron [-] [instance: 84cbacaa-08d2-4297-8777-150f433e4c04] Updating instance_info_cache with network_info: [] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 958.676841] env[67899]: INFO nova.compute.manager [-] [instance: 84cbacaa-08d2-4297-8777-150f433e4c04] Took 0.05 seconds to deallocate network for instance.
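The lockutils messages above trace the resource tracker serializing its claim for the new instance 37ab08db-50ab-4c30-9e18-05007c5d1c27 under the host-wide "compute_resources" lock before reporting "Claim successful". A rough sketch of that claim pattern, using a plain threading.Lock and simplified dict bookkeeping rather than nova's actual ResourceTracker:

import threading

_compute_resources = threading.Lock()  # plays the role of the "compute_resources" lock
# Free capacity roughly matching the inventory in this log (MEMORY_MB minus reserved).
_free = {"VCPU": 48, "MEMORY_MB": 196078, "DISK_GB": 400}

def instance_claim(requested):
    # Atomically reserve resources for one instance; raise if insufficient.
    with _compute_resources:  # "lock acquired ... :: waited 0.000s"
        if any(_free[rc] < amount for rc, amount in requested.items()):
            raise RuntimeError("insufficient resources for claim")
        for rc, amount in requested.items():
            _free[rc] -= amount
        return dict(requested)  # "Claim successful on node ..."

claim = instance_claim({"VCPU": 1, "MEMORY_MB": 128, "DISK_GB": 1})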
[ 958.773318] env[67899]: DEBUG oslo_concurrency.lockutils [None req-d6968a70-d2f0-4447-9223-19f375b45736 tempest-ImagesNegativeTestJSON-1293308713 tempest-ImagesNegativeTestJSON-1293308713-project-member] Lock "84cbacaa-08d2-4297-8777-150f433e4c04" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.196s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 959.036102] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fbea3bf2-5f93-4cd7-8b25-6d62b4ebe525 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 959.044270] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-51da7063-5db3-4df4-b952-364be2f8afcf {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 959.075771] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-685c7562-9398-4d58-afe6-85e59f5a7031 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 959.082946] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cde1e0ca-1ce7-4698-98be-2d291453c629 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 959.095959] env[67899]: DEBUG nova.compute.provider_tree [None req-9f2dbd61-5a36-44ea-b13c-793c018e3d97 tempest-ServerMetadataNegativeTestJSON-1266469450 tempest-ServerMetadataNegativeTestJSON-1266469450-project-member] Inventory has not changed in ProviderTree for provider: fffa0b42-f65d-4394-a98c-0df038b9ed4b {{(pid=67899) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 959.104968] env[67899]: DEBUG nova.scheduler.client.report [None req-9f2dbd61-5a36-44ea-b13c-793c018e3d97 tempest-ServerMetadataNegativeTestJSON-1266469450 tempest-ServerMetadataNegativeTestJSON-1266469450-project-member] Inventory has not changed for provider fffa0b42-f65d-4394-a98c-0df038b9ed4b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67899) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 959.119034] env[67899]: DEBUG oslo_concurrency.lockutils [None req-9f2dbd61-5a36-44ea-b13c-793c018e3d97 tempest-ServerMetadataNegativeTestJSON-1266469450 tempest-ServerMetadataNegativeTestJSON-1266469450-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.479s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 959.119502] env[67899]: DEBUG nova.compute.manager [None req-9f2dbd61-5a36-44ea-b13c-793c018e3d97 tempest-ServerMetadataNegativeTestJSON-1266469450 tempest-ServerMetadataNegativeTestJSON-1266469450-project-member] [instance: 37ab08db-50ab-4c30-9e18-05007c5d1c27] Start building networks asynchronously for instance. {{(pid=67899) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}}
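"Inventory has not changed" above means the report client compared the inventory it would publish for provider fffa0b42-f65d-4394-a98c-0df038b9ed4b against what it last recorded and skipped the update. A toy illustration of that comparison, assuming the dict-of-resource-classes layout shown in the log line (not nova's exact code):

# Inventory record as it appears in the log for this provider.
current = {
    'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16,
             'step_size': 1, 'allocation_ratio': 4.0},
    'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1,
                  'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0},
    'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94,
                'step_size': 1, 'allocation_ratio': 1.0},
}

def inventory_changed(desired, existing):
    # True if any resource class or any field differs; dict equality is
    # deep for these nested dicts, so a plain comparison suffices here.
    return desired != existing

if not inventory_changed(current, current):
    print("Inventory has not changed for provider "
          "fffa0b42-f65d-4394-a98c-0df038b9ed4b")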
[ 959.151304] env[67899]: DEBUG nova.compute.utils [None req-9f2dbd61-5a36-44ea-b13c-793c018e3d97 tempest-ServerMetadataNegativeTestJSON-1266469450 tempest-ServerMetadataNegativeTestJSON-1266469450-project-member] Using /dev/sd instead of None {{(pid=67899) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 959.152790] env[67899]: DEBUG nova.compute.manager [None req-9f2dbd61-5a36-44ea-b13c-793c018e3d97 tempest-ServerMetadataNegativeTestJSON-1266469450 tempest-ServerMetadataNegativeTestJSON-1266469450-project-member] [instance: 37ab08db-50ab-4c30-9e18-05007c5d1c27] Allocating IP information in the background. {{(pid=67899) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}}
[ 959.152981] env[67899]: DEBUG nova.network.neutron [None req-9f2dbd61-5a36-44ea-b13c-793c018e3d97 tempest-ServerMetadataNegativeTestJSON-1266469450 tempest-ServerMetadataNegativeTestJSON-1266469450-project-member] [instance: 37ab08db-50ab-4c30-9e18-05007c5d1c27] allocate_for_instance() {{(pid=67899) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 959.161202] env[67899]: DEBUG nova.compute.manager [None req-9f2dbd61-5a36-44ea-b13c-793c018e3d97 tempest-ServerMetadataNegativeTestJSON-1266469450 tempest-ServerMetadataNegativeTestJSON-1266469450-project-member] [instance: 37ab08db-50ab-4c30-9e18-05007c5d1c27] Start building block device mappings for instance. {{(pid=67899) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}}
[ 959.230404] env[67899]: DEBUG nova.compute.manager [None req-9f2dbd61-5a36-44ea-b13c-793c018e3d97 tempest-ServerMetadataNegativeTestJSON-1266469450 tempest-ServerMetadataNegativeTestJSON-1266469450-project-member] [instance: 37ab08db-50ab-4c30-9e18-05007c5d1c27] Start spawning the instance on the hypervisor.
{{(pid=67899) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 959.254785] env[67899]: DEBUG nova.policy [None req-9f2dbd61-5a36-44ea-b13c-793c018e3d97 tempest-ServerMetadataNegativeTestJSON-1266469450 tempest-ServerMetadataNegativeTestJSON-1266469450-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a9bde0c3c4644eec9a7d62f405f81f52', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '2a1bb847e0034dcdb632882fb106d511', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67899) authorize /opt/stack/nova/nova/policy.py:203}} [ 959.260794] env[67899]: DEBUG nova.virt.hardware [None req-9f2dbd61-5a36-44ea-b13c-793c018e3d97 tempest-ServerMetadataNegativeTestJSON-1266469450 tempest-ServerMetadataNegativeTestJSON-1266469450-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T22:07:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T22:07:13Z,direct_url=,disk_format='vmdk',id=c655a05a-4a40-4b3f-b609-3ba8116ad90f,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c1aaa2970e964d7b86557399120d12c1',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T22:07:14Z,virtual_size=,visibility=), allow threads: False {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 959.261067] env[67899]: DEBUG nova.virt.hardware [None req-9f2dbd61-5a36-44ea-b13c-793c018e3d97 tempest-ServerMetadataNegativeTestJSON-1266469450 tempest-ServerMetadataNegativeTestJSON-1266469450-project-member] Flavor limits 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 959.261246] env[67899]: DEBUG nova.virt.hardware [None req-9f2dbd61-5a36-44ea-b13c-793c018e3d97 tempest-ServerMetadataNegativeTestJSON-1266469450 tempest-ServerMetadataNegativeTestJSON-1266469450-project-member] Image limits 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 959.261513] env[67899]: DEBUG nova.virt.hardware [None req-9f2dbd61-5a36-44ea-b13c-793c018e3d97 tempest-ServerMetadataNegativeTestJSON-1266469450 tempest-ServerMetadataNegativeTestJSON-1266469450-project-member] Flavor pref 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 959.261581] env[67899]: DEBUG nova.virt.hardware [None req-9f2dbd61-5a36-44ea-b13c-793c018e3d97 tempest-ServerMetadataNegativeTestJSON-1266469450 tempest-ServerMetadataNegativeTestJSON-1266469450-project-member] Image pref 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 959.261703] env[67899]: DEBUG nova.virt.hardware [None req-9f2dbd61-5a36-44ea-b13c-793c018e3d97 tempest-ServerMetadataNegativeTestJSON-1266469450 tempest-ServerMetadataNegativeTestJSON-1266469450-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67899) 
get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 959.261905] env[67899]: DEBUG nova.virt.hardware [None req-9f2dbd61-5a36-44ea-b13c-793c018e3d97 tempest-ServerMetadataNegativeTestJSON-1266469450 tempest-ServerMetadataNegativeTestJSON-1266469450-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 959.262074] env[67899]: DEBUG nova.virt.hardware [None req-9f2dbd61-5a36-44ea-b13c-793c018e3d97 tempest-ServerMetadataNegativeTestJSON-1266469450 tempest-ServerMetadataNegativeTestJSON-1266469450-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67899) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 959.262247] env[67899]: DEBUG nova.virt.hardware [None req-9f2dbd61-5a36-44ea-b13c-793c018e3d97 tempest-ServerMetadataNegativeTestJSON-1266469450 tempest-ServerMetadataNegativeTestJSON-1266469450-project-member] Got 1 possible topologies {{(pid=67899) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 959.262405] env[67899]: DEBUG nova.virt.hardware [None req-9f2dbd61-5a36-44ea-b13c-793c018e3d97 tempest-ServerMetadataNegativeTestJSON-1266469450 tempest-ServerMetadataNegativeTestJSON-1266469450-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 959.262572] env[67899]: DEBUG nova.virt.hardware [None req-9f2dbd61-5a36-44ea-b13c-793c018e3d97 tempest-ServerMetadataNegativeTestJSON-1266469450 tempest-ServerMetadataNegativeTestJSON-1266469450-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 959.263435] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2fbb0f19-c5e1-4be9-966e-7251434e9759 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 959.271509] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1622cac4-f96e-4498-9760-c7e86a5b253b {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 959.941663] env[67899]: DEBUG nova.network.neutron [None req-9f2dbd61-5a36-44ea-b13c-793c018e3d97 tempest-ServerMetadataNegativeTestJSON-1266469450 tempest-ServerMetadataNegativeTestJSON-1266469450-project-member] [instance: 37ab08db-50ab-4c30-9e18-05007c5d1c27] Successfully created port: 461efbd1-e524-4f43-81ae-0627f345720c {{(pid=67899) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 960.771605] env[67899]: DEBUG nova.compute.manager [req-5682d1cb-febb-4911-af56-3c4941143069 req-91a56881-b777-4799-9e8e-1e87ff3414b6 service nova] [instance: 37ab08db-50ab-4c30-9e18-05007c5d1c27] Received event network-vif-plugged-461efbd1-e524-4f43-81ae-0627f345720c {{(pid=67899) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 960.771890] env[67899]: DEBUG oslo_concurrency.lockutils [req-5682d1cb-febb-4911-af56-3c4941143069 req-91a56881-b777-4799-9e8e-1e87ff3414b6 service nova] Acquiring lock "37ab08db-50ab-4c30-9e18-05007c5d1c27-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67899) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 960.772057] env[67899]: DEBUG oslo_concurrency.lockutils [req-5682d1cb-febb-4911-af56-3c4941143069 req-91a56881-b777-4799-9e8e-1e87ff3414b6 service nova] Lock "37ab08db-50ab-4c30-9e18-05007c5d1c27-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 960.772230] env[67899]: DEBUG oslo_concurrency.lockutils [req-5682d1cb-febb-4911-af56-3c4941143069 req-91a56881-b777-4799-9e8e-1e87ff3414b6 service nova] Lock "37ab08db-50ab-4c30-9e18-05007c5d1c27-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 960.772393] env[67899]: DEBUG nova.compute.manager [req-5682d1cb-febb-4911-af56-3c4941143069 req-91a56881-b777-4799-9e8e-1e87ff3414b6 service nova] [instance: 37ab08db-50ab-4c30-9e18-05007c5d1c27] No waiting events found dispatching network-vif-plugged-461efbd1-e524-4f43-81ae-0627f345720c {{(pid=67899) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 960.772551] env[67899]: WARNING nova.compute.manager [req-5682d1cb-febb-4911-af56-3c4941143069 req-91a56881-b777-4799-9e8e-1e87ff3414b6 service nova] [instance: 37ab08db-50ab-4c30-9e18-05007c5d1c27] Received unexpected event network-vif-plugged-461efbd1-e524-4f43-81ae-0627f345720c for instance with vm_state building and task_state spawning. [ 960.906827] env[67899]: DEBUG nova.network.neutron [None req-9f2dbd61-5a36-44ea-b13c-793c018e3d97 tempest-ServerMetadataNegativeTestJSON-1266469450 tempest-ServerMetadataNegativeTestJSON-1266469450-project-member] [instance: 37ab08db-50ab-4c30-9e18-05007c5d1c27] Successfully updated port: 461efbd1-e524-4f43-81ae-0627f345720c {{(pid=67899) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 960.924392] env[67899]: DEBUG oslo_concurrency.lockutils [None req-9f2dbd61-5a36-44ea-b13c-793c018e3d97 tempest-ServerMetadataNegativeTestJSON-1266469450 tempest-ServerMetadataNegativeTestJSON-1266469450-project-member] Acquiring lock "refresh_cache-37ab08db-50ab-4c30-9e18-05007c5d1c27" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 960.924392] env[67899]: DEBUG oslo_concurrency.lockutils [None req-9f2dbd61-5a36-44ea-b13c-793c018e3d97 tempest-ServerMetadataNegativeTestJSON-1266469450 tempest-ServerMetadataNegativeTestJSON-1266469450-project-member] Acquired lock "refresh_cache-37ab08db-50ab-4c30-9e18-05007c5d1c27" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 960.924392] env[67899]: DEBUG nova.network.neutron [None req-9f2dbd61-5a36-44ea-b13c-793c018e3d97 tempest-ServerMetadataNegativeTestJSON-1266469450 tempest-ServerMetadataNegativeTestJSON-1266469450-project-member] [instance: 37ab08db-50ab-4c30-9e18-05007c5d1c27] Building network info cache for instance {{(pid=67899) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 960.990275] env[67899]: DEBUG nova.network.neutron [None req-9f2dbd61-5a36-44ea-b13c-793c018e3d97 tempest-ServerMetadataNegativeTestJSON-1266469450 tempest-ServerMetadataNegativeTestJSON-1266469450-project-member] [instance: 37ab08db-50ab-4c30-9e18-05007c5d1c27] Instance cache missing network info. 
{{(pid=67899) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 961.218922] env[67899]: DEBUG nova.network.neutron [None req-9f2dbd61-5a36-44ea-b13c-793c018e3d97 tempest-ServerMetadataNegativeTestJSON-1266469450 tempest-ServerMetadataNegativeTestJSON-1266469450-project-member] [instance: 37ab08db-50ab-4c30-9e18-05007c5d1c27] Updating instance_info_cache with network_info: [{"id": "461efbd1-e524-4f43-81ae-0627f345720c", "address": "fa:16:3e:83:f8:d0", "network": {"id": "f7b7998a-031a-4bb0-9556-ae33d16aa727", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-328690567-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "2a1bb847e0034dcdb632882fb106d511", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "4ad9ee0f-6a58-4a7b-bda3-5249b8cef84e", "external-id": "nsx-vlan-transportzone-354", "segmentation_id": 354, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap461efbd1-e5", "ovs_interfaceid": "461efbd1-e524-4f43-81ae-0627f345720c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 961.236205] env[67899]: DEBUG oslo_concurrency.lockutils [None req-9f2dbd61-5a36-44ea-b13c-793c018e3d97 tempest-ServerMetadataNegativeTestJSON-1266469450 tempest-ServerMetadataNegativeTestJSON-1266469450-project-member] Releasing lock "refresh_cache-37ab08db-50ab-4c30-9e18-05007c5d1c27" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 961.236205] env[67899]: DEBUG nova.compute.manager [None req-9f2dbd61-5a36-44ea-b13c-793c018e3d97 tempest-ServerMetadataNegativeTestJSON-1266469450 tempest-ServerMetadataNegativeTestJSON-1266469450-project-member] [instance: 37ab08db-50ab-4c30-9e18-05007c5d1c27] Instance network_info: |[{"id": "461efbd1-e524-4f43-81ae-0627f345720c", "address": "fa:16:3e:83:f8:d0", "network": {"id": "f7b7998a-031a-4bb0-9556-ae33d16aa727", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-328690567-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "2a1bb847e0034dcdb632882fb106d511", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "4ad9ee0f-6a58-4a7b-bda3-5249b8cef84e", "external-id": "nsx-vlan-transportzone-354", "segmentation_id": 354, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap461efbd1-e5", "ovs_interfaceid": "461efbd1-e524-4f43-81ae-0627f345720c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": 
{}}]| {{(pid=67899) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 961.236205] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-9f2dbd61-5a36-44ea-b13c-793c018e3d97 tempest-ServerMetadataNegativeTestJSON-1266469450 tempest-ServerMetadataNegativeTestJSON-1266469450-project-member] [instance: 37ab08db-50ab-4c30-9e18-05007c5d1c27] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:83:f8:d0', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '4ad9ee0f-6a58-4a7b-bda3-5249b8cef84e', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '461efbd1-e524-4f43-81ae-0627f345720c', 'vif_model': 'vmxnet3'}] {{(pid=67899) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 961.244063] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [None req-9f2dbd61-5a36-44ea-b13c-793c018e3d97 tempest-ServerMetadataNegativeTestJSON-1266469450 tempest-ServerMetadataNegativeTestJSON-1266469450-project-member] Creating folder: Project (2a1bb847e0034dcdb632882fb106d511). Parent ref: group-v692900. {{(pid=67899) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 961.244794] env[67899]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-2bd4ec15-ef5d-488d-b62f-d97e5fd1dd97 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 961.260679] env[67899]: INFO nova.virt.vmwareapi.vm_util [None req-9f2dbd61-5a36-44ea-b13c-793c018e3d97 tempest-ServerMetadataNegativeTestJSON-1266469450 tempest-ServerMetadataNegativeTestJSON-1266469450-project-member] Created folder: Project (2a1bb847e0034dcdb632882fb106d511) in parent group-v692900. [ 961.260679] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [None req-9f2dbd61-5a36-44ea-b13c-793c018e3d97 tempest-ServerMetadataNegativeTestJSON-1266469450 tempest-ServerMetadataNegativeTestJSON-1266469450-project-member] Creating folder: Instances. Parent ref: group-v692953. {{(pid=67899) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 961.260679] env[67899]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-ce262330-dd80-4d6b-9cdd-6ed206e0b1f4 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 961.267639] env[67899]: INFO nova.virt.vmwareapi.vm_util [None req-9f2dbd61-5a36-44ea-b13c-793c018e3d97 tempest-ServerMetadataNegativeTestJSON-1266469450 tempest-ServerMetadataNegativeTestJSON-1266469450-project-member] Created folder: Instances in parent group-v692953. [ 961.268097] env[67899]: DEBUG oslo.service.loopingcall [None req-9f2dbd61-5a36-44ea-b13c-793c018e3d97 tempest-ServerMetadataNegativeTestJSON-1266469450 tempest-ServerMetadataNegativeTestJSON-1266469450-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=67899) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 961.268729] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 37ab08db-50ab-4c30-9e18-05007c5d1c27] Creating VM on the ESX host {{(pid=67899) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 961.268729] env[67899]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-02ae81dc-4aba-4619-a52c-b07963691394 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 961.288074] env[67899]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 961.288074] env[67899]: value = "task-3467895" [ 961.288074] env[67899]: _type = "Task" [ 961.288074] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 961.298823] env[67899]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467895, 'name': CreateVM_Task} progress is 0%. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 961.798021] env[67899]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467895, 'name': CreateVM_Task, 'duration_secs': 0.291672} completed successfully. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 961.799096] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 37ab08db-50ab-4c30-9e18-05007c5d1c27] Created VM on the ESX host {{(pid=67899) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 961.799096] env[67899]: DEBUG oslo_concurrency.lockutils [None req-9f2dbd61-5a36-44ea-b13c-793c018e3d97 tempest-ServerMetadataNegativeTestJSON-1266469450 tempest-ServerMetadataNegativeTestJSON-1266469450-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 961.799096] env[67899]: DEBUG oslo_concurrency.lockutils [None req-9f2dbd61-5a36-44ea-b13c-793c018e3d97 tempest-ServerMetadataNegativeTestJSON-1266469450 tempest-ServerMetadataNegativeTestJSON-1266469450-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 961.799510] env[67899]: DEBUG oslo_concurrency.lockutils [None req-9f2dbd61-5a36-44ea-b13c-793c018e3d97 tempest-ServerMetadataNegativeTestJSON-1266469450 tempest-ServerMetadataNegativeTestJSON-1266469450-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 961.799573] env[67899]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-2e94f2cf-a633-47bb-a267-2fea602a3500 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 961.804287] env[67899]: DEBUG oslo_vmware.api [None req-9f2dbd61-5a36-44ea-b13c-793c018e3d97 tempest-ServerMetadataNegativeTestJSON-1266469450 tempest-ServerMetadataNegativeTestJSON-1266469450-project-member] Waiting for the task: (returnval){ [ 961.804287] env[67899]: value = "session[5298859c-589e-10f3-4f70-bf47a7ea371b]52097c69-efa7-42b1-79f0-7f6e77dde3b3" [ 961.804287] env[67899]: _type = "Task" [ 961.804287] 
env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 961.811851] env[67899]: DEBUG oslo_vmware.api [None req-9f2dbd61-5a36-44ea-b13c-793c018e3d97 tempest-ServerMetadataNegativeTestJSON-1266469450 tempest-ServerMetadataNegativeTestJSON-1266469450-project-member] Task: {'id': session[5298859c-589e-10f3-4f70-bf47a7ea371b]52097c69-efa7-42b1-79f0-7f6e77dde3b3, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 962.318262] env[67899]: DEBUG oslo_concurrency.lockutils [None req-9f2dbd61-5a36-44ea-b13c-793c018e3d97 tempest-ServerMetadataNegativeTestJSON-1266469450 tempest-ServerMetadataNegativeTestJSON-1266469450-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 962.318553] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-9f2dbd61-5a36-44ea-b13c-793c018e3d97 tempest-ServerMetadataNegativeTestJSON-1266469450 tempest-ServerMetadataNegativeTestJSON-1266469450-project-member] [instance: 37ab08db-50ab-4c30-9e18-05007c5d1c27] Processing image c655a05a-4a40-4b3f-b609-3ba8116ad90f {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 962.318798] env[67899]: DEBUG oslo_concurrency.lockutils [None req-9f2dbd61-5a36-44ea-b13c-793c018e3d97 tempest-ServerMetadataNegativeTestJSON-1266469450 tempest-ServerMetadataNegativeTestJSON-1266469450-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 962.844549] env[67899]: DEBUG nova.compute.manager [req-c4c9081a-30ac-411a-b611-b62a164ce91f req-6a23a0b1-de78-4445-8622-50261cfdfa6d service nova] [instance: 37ab08db-50ab-4c30-9e18-05007c5d1c27] Received event network-changed-461efbd1-e524-4f43-81ae-0627f345720c {{(pid=67899) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 962.844737] env[67899]: DEBUG nova.compute.manager [req-c4c9081a-30ac-411a-b611-b62a164ce91f req-6a23a0b1-de78-4445-8622-50261cfdfa6d service nova] [instance: 37ab08db-50ab-4c30-9e18-05007c5d1c27] Refreshing instance network info cache due to event network-changed-461efbd1-e524-4f43-81ae-0627f345720c. 
{{(pid=67899) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 962.844945] env[67899]: DEBUG oslo_concurrency.lockutils [req-c4c9081a-30ac-411a-b611-b62a164ce91f req-6a23a0b1-de78-4445-8622-50261cfdfa6d service nova] Acquiring lock "refresh_cache-37ab08db-50ab-4c30-9e18-05007c5d1c27" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 962.845105] env[67899]: DEBUG oslo_concurrency.lockutils [req-c4c9081a-30ac-411a-b611-b62a164ce91f req-6a23a0b1-de78-4445-8622-50261cfdfa6d service nova] Acquired lock "refresh_cache-37ab08db-50ab-4c30-9e18-05007c5d1c27" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 962.849290] env[67899]: DEBUG nova.network.neutron [req-c4c9081a-30ac-411a-b611-b62a164ce91f req-6a23a0b1-de78-4445-8622-50261cfdfa6d service nova] [instance: 37ab08db-50ab-4c30-9e18-05007c5d1c27] Refreshing network info cache for port 461efbd1-e524-4f43-81ae-0627f345720c {{(pid=67899) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 963.243359] env[67899]: DEBUG nova.network.neutron [req-c4c9081a-30ac-411a-b611-b62a164ce91f req-6a23a0b1-de78-4445-8622-50261cfdfa6d service nova] [instance: 37ab08db-50ab-4c30-9e18-05007c5d1c27] Updated VIF entry in instance network info cache for port 461efbd1-e524-4f43-81ae-0627f345720c. {{(pid=67899) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 963.243359] env[67899]: DEBUG nova.network.neutron [req-c4c9081a-30ac-411a-b611-b62a164ce91f req-6a23a0b1-de78-4445-8622-50261cfdfa6d service nova] [instance: 37ab08db-50ab-4c30-9e18-05007c5d1c27] Updating instance_info_cache with network_info: [{"id": "461efbd1-e524-4f43-81ae-0627f345720c", "address": "fa:16:3e:83:f8:d0", "network": {"id": "f7b7998a-031a-4bb0-9556-ae33d16aa727", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-328690567-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "2a1bb847e0034dcdb632882fb106d511", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "4ad9ee0f-6a58-4a7b-bda3-5249b8cef84e", "external-id": "nsx-vlan-transportzone-354", "segmentation_id": 354, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap461efbd1-e5", "ovs_interfaceid": "461efbd1-e524-4f43-81ae-0627f345720c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 963.255998] env[67899]: DEBUG oslo_concurrency.lockutils [req-c4c9081a-30ac-411a-b611-b62a164ce91f req-6a23a0b1-de78-4445-8622-50261cfdfa6d service nova] Releasing lock "refresh_cache-37ab08db-50ab-4c30-9e18-05007c5d1c27" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 965.970342] env[67899]: DEBUG oslo_concurrency.lockutils [None req-d00a6b81-c8df-494b-a8a0-988649b40bae tempest-ImagesOneServerTestJSON-886727362 tempest-ImagesOneServerTestJSON-886727362-project-member] 
Acquiring lock "ec826735-4cc4-4847-8750-c5480e62134a" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 965.971841] env[67899]: DEBUG oslo_concurrency.lockutils [None req-d00a6b81-c8df-494b-a8a0-988649b40bae tempest-ImagesOneServerTestJSON-886727362 tempest-ImagesOneServerTestJSON-886727362-project-member] Lock "ec826735-4cc4-4847-8750-c5480e62134a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.002s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 966.347105] env[67899]: DEBUG oslo_concurrency.lockutils [None req-4c053ca0-ad20-49b5-8063-64ecb8b0fcd4 tempest-ServerMetadataNegativeTestJSON-1266469450 tempest-ServerMetadataNegativeTestJSON-1266469450-project-member] Acquiring lock "37ab08db-50ab-4c30-9e18-05007c5d1c27" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 976.997109] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 976.997109] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 976.997449] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager.update_available_resource {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 977.008553] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 977.008755] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 977.008919] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 977.009088] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67899) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 977.010256] env[67899]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-393067ac-53b8-4f7e-8df2-c763ddb07933 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 977.019454] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bcf934c8-baf5-4b2f-9c08-678a282a712b {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 977.034742] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6f9986e2-647f-4c3a-888e-2d8040ac8882 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 977.040891] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9db31044-b57e-4e81-a70b-5cee190458f3 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 977.069603] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180913MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=67899) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 977.069754] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 977.069964] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 977.143398] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance c29ae4c5-cc93-480c-8d60-96f6acba4346 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 977.143549] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 977.143673] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 913c5652-c8af-41a8-94f1-c0eba08aacdd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 977.143793] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 793d6f98-ed1b-4a78-bcd5-cb796441d64b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 977.143912] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 977.144221] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 4458efe7-18d4-4cfb-b131-e09d36124d68 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 977.144418] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 977.144542] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 8d2a9e20-82d3-44cf-a725-59804debe1cc actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 977.144657] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance bb97988e-9f7f-4e4f-9904-fc560d0912ee actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 977.144768] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 37ab08db-50ab-4c30-9e18-05007c5d1c27 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 977.159660] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 977.172251] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 4cbf5a4d-9466-4bc6-adc9-973759545cf4 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 977.182918] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 3fabbf48-5df3-4e36-a9d8-494c221304b1 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 977.194563] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 7b6a4c60-1b40-44b8-b341-3dcaf1716c99 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 977.204642] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 183fd334-b0e1-479a-b38a-62f21c176d17 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 977.216017] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance b9282eeb-09db-4138-a1f0-9e03828021b8 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 977.225976] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance cce79170-e329-4d7a-ab2d-fa6605068897 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 977.235508] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 4bd3cb98-1745-4c1a-8670-9849f70eb554 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 977.246042] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 0bde0bc7-8f34-4941-85f0-44fe5c67e398 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 977.256508] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance b79e6007-10ac-4afe-a666-edef64685b22 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 977.265612] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance b9143ce6-0592-4cff-a2a1-64874734b214 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 977.275146] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 868ae015-d365-4a42-8f5d-72faa796fa37 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 977.284049] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance a31cf212-7d4e-4f1c-b494-6b9739b2ef95 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 977.293264] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance cb9d29cb-20ee-4875-b993-49cafed344d4 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 977.302806] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance eb285233-ef68-4426-827f-3320abe98cac has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 977.312173] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 6c4977f7-c53d-4c96-9028-86d7561f0d0d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 977.321178] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance ec826735-4cc4-4847-8750-c5480e62134a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 977.321408] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67899) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 977.321555] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67899) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 977.618516] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8a53e697-6616-4478-9f77-488143543134 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 977.626073] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8661df91-f11e-4f5b-b980-0e00beed7bc0 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 977.655703] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7982bc69-b9f0-42c5-b0f1-dbd8b4208882 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 977.662619] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6b662250-c5b2-4c79-b96f-f9fdda771650 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 977.675515] env[67899]: DEBUG nova.compute.provider_tree [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Inventory has not changed in ProviderTree for provider: fffa0b42-f65d-4394-a98c-0df038b9ed4b {{(pid=67899) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 977.683878] env[67899]: DEBUG nova.scheduler.client.report [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Inventory has not changed for provider fffa0b42-f65d-4394-a98c-0df038b9ed4b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 
'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67899) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 977.697363] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67899) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 977.697363] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.627s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 978.691608] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 978.691874] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 978.691991] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Starting heal instance info cache {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 978.692126] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Rebuilding the list of instances to heal {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 978.714156] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 978.720058] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 978.720058] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 913c5652-c8af-41a8-94f1-c0eba08aacdd] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 978.720058] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 793d6f98-ed1b-4a78-bcd5-cb796441d64b] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 978.720058] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf] Skipping network cache update for instance because it is Building. 
{{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 978.720058] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 4458efe7-18d4-4cfb-b131-e09d36124d68] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 978.720058] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 978.720058] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 8d2a9e20-82d3-44cf-a725-59804debe1cc] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 978.720058] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 978.720058] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 37ab08db-50ab-4c30-9e18-05007c5d1c27] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 978.720058] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Didn't find any instances for network info cache update. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 978.720058] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 978.996011] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 978.996292] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 978.996446] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=67899) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 981.997453] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1003.994819] env[67899]: WARNING oslo_vmware.rw_handles [None req-1a665adc-0197-431c-b7d3-cbe5f770e3d4 tempest-FloatingIPsAssociationTestJSON-1366579974 tempest-FloatingIPsAssociationTestJSON-1366579974-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1003.994819] env[67899]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1003.994819] env[67899]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1003.994819] env[67899]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1003.994819] env[67899]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1003.994819] env[67899]: ERROR oslo_vmware.rw_handles response.begin() [ 1003.994819] env[67899]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1003.994819] env[67899]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1003.994819] env[67899]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1003.994819] env[67899]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1003.994819] env[67899]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1003.994819] env[67899]: ERROR oslo_vmware.rw_handles [ 1003.995456] env[67899]: DEBUG nova.virt.vmwareapi.images [None req-1a665adc-0197-431c-b7d3-cbe5f770e3d4 tempest-FloatingIPsAssociationTestJSON-1366579974 tempest-FloatingIPsAssociationTestJSON-1366579974-project-member] [instance: 913c5652-c8af-41a8-94f1-c0eba08aacdd] Downloaded image file data c655a05a-4a40-4b3f-b609-3ba8116ad90f to vmware_temp/5ad9fda9-9b46-4709-91a3-efe3a0412786/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk on the data store datastore1 {{(pid=67899) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1003.998142] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-1a665adc-0197-431c-b7d3-cbe5f770e3d4 tempest-FloatingIPsAssociationTestJSON-1366579974 tempest-FloatingIPsAssociationTestJSON-1366579974-project-member] [instance: 913c5652-c8af-41a8-94f1-c0eba08aacdd] Caching image {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1003.998395] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [None req-1a665adc-0197-431c-b7d3-cbe5f770e3d4 tempest-FloatingIPsAssociationTestJSON-1366579974 tempest-FloatingIPsAssociationTestJSON-1366579974-project-member] Copying Virtual Disk [datastore1] vmware_temp/5ad9fda9-9b46-4709-91a3-efe3a0412786/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk to [datastore1] vmware_temp/5ad9fda9-9b46-4709-91a3-efe3a0412786/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk {{(pid=67899) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1003.998679] env[67899]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with 
opID=oslo.vmware-e4bd9d82-1725-4ac4-ae45-27645defbc34 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1004.007169] env[67899]: DEBUG oslo_vmware.api [None req-1a665adc-0197-431c-b7d3-cbe5f770e3d4 tempest-FloatingIPsAssociationTestJSON-1366579974 tempest-FloatingIPsAssociationTestJSON-1366579974-project-member] Waiting for the task: (returnval){ [ 1004.007169] env[67899]: value = "task-3467896" [ 1004.007169] env[67899]: _type = "Task" [ 1004.007169] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1004.014862] env[67899]: DEBUG oslo_vmware.api [None req-1a665adc-0197-431c-b7d3-cbe5f770e3d4 tempest-FloatingIPsAssociationTestJSON-1366579974 tempest-FloatingIPsAssociationTestJSON-1366579974-project-member] Task: {'id': task-3467896, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1004.519192] env[67899]: DEBUG oslo_vmware.exceptions [None req-1a665adc-0197-431c-b7d3-cbe5f770e3d4 tempest-FloatingIPsAssociationTestJSON-1366579974 tempest-FloatingIPsAssociationTestJSON-1366579974-project-member] Fault InvalidArgument not matched. {{(pid=67899) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1004.519478] env[67899]: DEBUG oslo_concurrency.lockutils [None req-1a665adc-0197-431c-b7d3-cbe5f770e3d4 tempest-FloatingIPsAssociationTestJSON-1366579974 tempest-FloatingIPsAssociationTestJSON-1366579974-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1004.520046] env[67899]: ERROR nova.compute.manager [None req-1a665adc-0197-431c-b7d3-cbe5f770e3d4 tempest-FloatingIPsAssociationTestJSON-1366579974 tempest-FloatingIPsAssociationTestJSON-1366579974-project-member] [instance: 913c5652-c8af-41a8-94f1-c0eba08aacdd] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1004.520046] env[67899]: Faults: ['InvalidArgument'] [ 1004.520046] env[67899]: ERROR nova.compute.manager [instance: 913c5652-c8af-41a8-94f1-c0eba08aacdd] Traceback (most recent call last): [ 1004.520046] env[67899]: ERROR nova.compute.manager [instance: 913c5652-c8af-41a8-94f1-c0eba08aacdd] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1004.520046] env[67899]: ERROR nova.compute.manager [instance: 913c5652-c8af-41a8-94f1-c0eba08aacdd] yield resources [ 1004.520046] env[67899]: ERROR nova.compute.manager [instance: 913c5652-c8af-41a8-94f1-c0eba08aacdd] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1004.520046] env[67899]: ERROR nova.compute.manager [instance: 913c5652-c8af-41a8-94f1-c0eba08aacdd] self.driver.spawn(context, instance, image_meta, [ 1004.520046] env[67899]: ERROR nova.compute.manager [instance: 913c5652-c8af-41a8-94f1-c0eba08aacdd] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1004.520046] env[67899]: ERROR nova.compute.manager [instance: 913c5652-c8af-41a8-94f1-c0eba08aacdd] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1004.520046] env[67899]: ERROR nova.compute.manager [instance: 913c5652-c8af-41a8-94f1-c0eba08aacdd] File 
"/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1004.520046] env[67899]: ERROR nova.compute.manager [instance: 913c5652-c8af-41a8-94f1-c0eba08aacdd] self._fetch_image_if_missing(context, vi) [ 1004.520046] env[67899]: ERROR nova.compute.manager [instance: 913c5652-c8af-41a8-94f1-c0eba08aacdd] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1004.520046] env[67899]: ERROR nova.compute.manager [instance: 913c5652-c8af-41a8-94f1-c0eba08aacdd] image_cache(vi, tmp_image_ds_loc) [ 1004.520046] env[67899]: ERROR nova.compute.manager [instance: 913c5652-c8af-41a8-94f1-c0eba08aacdd] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1004.520046] env[67899]: ERROR nova.compute.manager [instance: 913c5652-c8af-41a8-94f1-c0eba08aacdd] vm_util.copy_virtual_disk( [ 1004.520046] env[67899]: ERROR nova.compute.manager [instance: 913c5652-c8af-41a8-94f1-c0eba08aacdd] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1004.520046] env[67899]: ERROR nova.compute.manager [instance: 913c5652-c8af-41a8-94f1-c0eba08aacdd] session._wait_for_task(vmdk_copy_task) [ 1004.520046] env[67899]: ERROR nova.compute.manager [instance: 913c5652-c8af-41a8-94f1-c0eba08aacdd] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1004.520046] env[67899]: ERROR nova.compute.manager [instance: 913c5652-c8af-41a8-94f1-c0eba08aacdd] return self.wait_for_task(task_ref) [ 1004.520046] env[67899]: ERROR nova.compute.manager [instance: 913c5652-c8af-41a8-94f1-c0eba08aacdd] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1004.520046] env[67899]: ERROR nova.compute.manager [instance: 913c5652-c8af-41a8-94f1-c0eba08aacdd] return evt.wait() [ 1004.520046] env[67899]: ERROR nova.compute.manager [instance: 913c5652-c8af-41a8-94f1-c0eba08aacdd] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1004.520046] env[67899]: ERROR nova.compute.manager [instance: 913c5652-c8af-41a8-94f1-c0eba08aacdd] result = hub.switch() [ 1004.520046] env[67899]: ERROR nova.compute.manager [instance: 913c5652-c8af-41a8-94f1-c0eba08aacdd] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1004.520046] env[67899]: ERROR nova.compute.manager [instance: 913c5652-c8af-41a8-94f1-c0eba08aacdd] return self.greenlet.switch() [ 1004.520046] env[67899]: ERROR nova.compute.manager [instance: 913c5652-c8af-41a8-94f1-c0eba08aacdd] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1004.520046] env[67899]: ERROR nova.compute.manager [instance: 913c5652-c8af-41a8-94f1-c0eba08aacdd] self.f(*self.args, **self.kw) [ 1004.520046] env[67899]: ERROR nova.compute.manager [instance: 913c5652-c8af-41a8-94f1-c0eba08aacdd] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1004.520046] env[67899]: ERROR nova.compute.manager [instance: 913c5652-c8af-41a8-94f1-c0eba08aacdd] raise exceptions.translate_fault(task_info.error) [ 1004.520046] env[67899]: ERROR nova.compute.manager [instance: 913c5652-c8af-41a8-94f1-c0eba08aacdd] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1004.520046] env[67899]: ERROR nova.compute.manager [instance: 913c5652-c8af-41a8-94f1-c0eba08aacdd] Faults: ['InvalidArgument'] [ 1004.520046] 
env[67899]: ERROR nova.compute.manager [instance: 913c5652-c8af-41a8-94f1-c0eba08aacdd] [ 1004.520887] env[67899]: INFO nova.compute.manager [None req-1a665adc-0197-431c-b7d3-cbe5f770e3d4 tempest-FloatingIPsAssociationTestJSON-1366579974 tempest-FloatingIPsAssociationTestJSON-1366579974-project-member] [instance: 913c5652-c8af-41a8-94f1-c0eba08aacdd] Terminating instance [ 1004.521882] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3d107220-4fe1-4bd4-b86d-3bd578f6f6b3 tempest-VolumesAssistedSnapshotsTest-398519601 tempest-VolumesAssistedSnapshotsTest-398519601-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1004.522099] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-3d107220-4fe1-4bd4-b86d-3bd578f6f6b3 tempest-VolumesAssistedSnapshotsTest-398519601 tempest-VolumesAssistedSnapshotsTest-398519601-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1004.522340] env[67899]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-266dae93-d6b7-413f-8f82-639fe4845a4d {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1004.524607] env[67899]: DEBUG nova.compute.manager [None req-1a665adc-0197-431c-b7d3-cbe5f770e3d4 tempest-FloatingIPsAssociationTestJSON-1366579974 tempest-FloatingIPsAssociationTestJSON-1366579974-project-member] [instance: 913c5652-c8af-41a8-94f1-c0eba08aacdd] Start destroying the instance on the hypervisor. {{(pid=67899) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1004.524794] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-1a665adc-0197-431c-b7d3-cbe5f770e3d4 tempest-FloatingIPsAssociationTestJSON-1366579974 tempest-FloatingIPsAssociationTestJSON-1366579974-project-member] [instance: 913c5652-c8af-41a8-94f1-c0eba08aacdd] Destroying instance {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1004.525533] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-84194bfb-9169-4454-b271-63887069dd61 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1004.532498] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-1a665adc-0197-431c-b7d3-cbe5f770e3d4 tempest-FloatingIPsAssociationTestJSON-1366579974 tempest-FloatingIPsAssociationTestJSON-1366579974-project-member] [instance: 913c5652-c8af-41a8-94f1-c0eba08aacdd] Unregistering the VM {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1004.532720] env[67899]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-2a056a12-18b4-4a32-ae58-30d60d54cb33 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1004.534970] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-3d107220-4fe1-4bd4-b86d-3bd578f6f6b3 tempest-VolumesAssistedSnapshotsTest-398519601 tempest-VolumesAssistedSnapshotsTest-398519601-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1004.535153] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None 
req-3d107220-4fe1-4bd4-b86d-3bd578f6f6b3 tempest-VolumesAssistedSnapshotsTest-398519601 tempest-VolumesAssistedSnapshotsTest-398519601-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67899) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1004.536080] env[67899]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-4c23b7b9-2b2b-434d-a435-23ce3062851a {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1004.542029] env[67899]: DEBUG oslo_vmware.api [None req-3d107220-4fe1-4bd4-b86d-3bd578f6f6b3 tempest-VolumesAssistedSnapshotsTest-398519601 tempest-VolumesAssistedSnapshotsTest-398519601-project-member] Waiting for the task: (returnval){ [ 1004.542029] env[67899]: value = "session[5298859c-589e-10f3-4f70-bf47a7ea371b]520cd36f-5e08-5a01-f70c-4d0365d60938" [ 1004.542029] env[67899]: _type = "Task" [ 1004.542029] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1004.548027] env[67899]: DEBUG oslo_vmware.api [None req-3d107220-4fe1-4bd4-b86d-3bd578f6f6b3 tempest-VolumesAssistedSnapshotsTest-398519601 tempest-VolumesAssistedSnapshotsTest-398519601-project-member] Task: {'id': session[5298859c-589e-10f3-4f70-bf47a7ea371b]520cd36f-5e08-5a01-f70c-4d0365d60938, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1004.741092] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-1a665adc-0197-431c-b7d3-cbe5f770e3d4 tempest-FloatingIPsAssociationTestJSON-1366579974 tempest-FloatingIPsAssociationTestJSON-1366579974-project-member] [instance: 913c5652-c8af-41a8-94f1-c0eba08aacdd] Unregistered the VM {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1004.741092] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-1a665adc-0197-431c-b7d3-cbe5f770e3d4 tempest-FloatingIPsAssociationTestJSON-1366579974 tempest-FloatingIPsAssociationTestJSON-1366579974-project-member] [instance: 913c5652-c8af-41a8-94f1-c0eba08aacdd] Deleting contents of the VM from datastore datastore1 {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1004.741092] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-1a665adc-0197-431c-b7d3-cbe5f770e3d4 tempest-FloatingIPsAssociationTestJSON-1366579974 tempest-FloatingIPsAssociationTestJSON-1366579974-project-member] Deleting the datastore file [datastore1] 913c5652-c8af-41a8-94f1-c0eba08aacdd {{(pid=67899) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1004.741278] env[67899]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-4510a504-1378-4225-a889-2e4cda59190f {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1004.748046] env[67899]: DEBUG oslo_vmware.api [None req-1a665adc-0197-431c-b7d3-cbe5f770e3d4 tempest-FloatingIPsAssociationTestJSON-1366579974 tempest-FloatingIPsAssociationTestJSON-1366579974-project-member] Waiting for the task: (returnval){ [ 1004.748046] env[67899]: value = "task-3467898" [ 1004.748046] env[67899]: _type = "Task" [ 1004.748046] env[67899]: } to complete. 
{{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1004.755388] env[67899]: DEBUG oslo_vmware.api [None req-1a665adc-0197-431c-b7d3-cbe5f770e3d4 tempest-FloatingIPsAssociationTestJSON-1366579974 tempest-FloatingIPsAssociationTestJSON-1366579974-project-member] Task: {'id': task-3467898, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1005.051104] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-3d107220-4fe1-4bd4-b86d-3bd578f6f6b3 tempest-VolumesAssistedSnapshotsTest-398519601 tempest-VolumesAssistedSnapshotsTest-398519601-project-member] [instance: 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda] Preparing fetch location {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1005.051388] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-3d107220-4fe1-4bd4-b86d-3bd578f6f6b3 tempest-VolumesAssistedSnapshotsTest-398519601 tempest-VolumesAssistedSnapshotsTest-398519601-project-member] Creating directory with path [datastore1] vmware_temp/9ea035de-a5f3-404c-a1e0-c082cb43693d/c655a05a-4a40-4b3f-b609-3ba8116ad90f {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1005.051658] env[67899]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-99b482ee-99da-4887-b05f-0c8fcca0674e {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1005.062835] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-3d107220-4fe1-4bd4-b86d-3bd578f6f6b3 tempest-VolumesAssistedSnapshotsTest-398519601 tempest-VolumesAssistedSnapshotsTest-398519601-project-member] Created directory with path [datastore1] vmware_temp/9ea035de-a5f3-404c-a1e0-c082cb43693d/c655a05a-4a40-4b3f-b609-3ba8116ad90f {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1005.063047] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-3d107220-4fe1-4bd4-b86d-3bd578f6f6b3 tempest-VolumesAssistedSnapshotsTest-398519601 tempest-VolumesAssistedSnapshotsTest-398519601-project-member] [instance: 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda] Fetch image to [datastore1] vmware_temp/9ea035de-a5f3-404c-a1e0-c082cb43693d/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1005.063254] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-3d107220-4fe1-4bd4-b86d-3bd578f6f6b3 tempest-VolumesAssistedSnapshotsTest-398519601 tempest-VolumesAssistedSnapshotsTest-398519601-project-member] [instance: 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda] Downloading image file data c655a05a-4a40-4b3f-b609-3ba8116ad90f to [datastore1] vmware_temp/9ea035de-a5f3-404c-a1e0-c082cb43693d/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk on the data store datastore1 {{(pid=67899) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1005.063983] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-642c9510-e19d-4e87-942a-e67c346d8f6e {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1005.070540] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5242be4d-3d9b-4644-bff0-39bc61bf566e {{(pid=67899) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1005.079510] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b9a86f7c-0fa5-4bae-87d9-13d79ef513cd {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1005.110407] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ce88b06f-5ff5-4b37-8ebd-c8d0c2818185 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1005.116721] env[67899]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-c388076a-2b74-4a54-9f08-f9932b5f06ef {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1005.137959] env[67899]: DEBUG nova.virt.vmwareapi.images [None req-3d107220-4fe1-4bd4-b86d-3bd578f6f6b3 tempest-VolumesAssistedSnapshotsTest-398519601 tempest-VolumesAssistedSnapshotsTest-398519601-project-member] [instance: 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda] Downloading image file data c655a05a-4a40-4b3f-b609-3ba8116ad90f to the data store datastore1 {{(pid=67899) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1005.199289] env[67899]: DEBUG oslo_vmware.rw_handles [None req-3d107220-4fe1-4bd4-b86d-3bd578f6f6b3 tempest-VolumesAssistedSnapshotsTest-398519601 tempest-VolumesAssistedSnapshotsTest-398519601-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/9ea035de-a5f3-404c-a1e0-c082cb43693d/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67899) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1005.261908] env[67899]: DEBUG oslo_vmware.rw_handles [None req-3d107220-4fe1-4bd4-b86d-3bd578f6f6b3 tempest-VolumesAssistedSnapshotsTest-398519601 tempest-VolumesAssistedSnapshotsTest-398519601-project-member] Completed reading data from the image iterator. {{(pid=67899) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1005.262316] env[67899]: DEBUG oslo_vmware.rw_handles [None req-3d107220-4fe1-4bd4-b86d-3bd578f6f6b3 tempest-VolumesAssistedSnapshotsTest-398519601 tempest-VolumesAssistedSnapshotsTest-398519601-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/9ea035de-a5f3-404c-a1e0-c082cb43693d/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67899) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1005.266068] env[67899]: DEBUG oslo_vmware.api [None req-1a665adc-0197-431c-b7d3-cbe5f770e3d4 tempest-FloatingIPsAssociationTestJSON-1366579974 tempest-FloatingIPsAssociationTestJSON-1366579974-project-member] Task: {'id': task-3467898, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.068297} completed successfully. 
{{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1005.267023] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-1a665adc-0197-431c-b7d3-cbe5f770e3d4 tempest-FloatingIPsAssociationTestJSON-1366579974 tempest-FloatingIPsAssociationTestJSON-1366579974-project-member] Deleted the datastore file {{(pid=67899) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1005.267023] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-1a665adc-0197-431c-b7d3-cbe5f770e3d4 tempest-FloatingIPsAssociationTestJSON-1366579974 tempest-FloatingIPsAssociationTestJSON-1366579974-project-member] [instance: 913c5652-c8af-41a8-94f1-c0eba08aacdd] Deleted contents of the VM from datastore datastore1 {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1005.267023] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-1a665adc-0197-431c-b7d3-cbe5f770e3d4 tempest-FloatingIPsAssociationTestJSON-1366579974 tempest-FloatingIPsAssociationTestJSON-1366579974-project-member] [instance: 913c5652-c8af-41a8-94f1-c0eba08aacdd] Instance destroyed {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1005.267023] env[67899]: INFO nova.compute.manager [None req-1a665adc-0197-431c-b7d3-cbe5f770e3d4 tempest-FloatingIPsAssociationTestJSON-1366579974 tempest-FloatingIPsAssociationTestJSON-1366579974-project-member] [instance: 913c5652-c8af-41a8-94f1-c0eba08aacdd] Took 0.74 seconds to destroy the instance on the hypervisor. [ 1005.269052] env[67899]: DEBUG nova.compute.claims [None req-1a665adc-0197-431c-b7d3-cbe5f770e3d4 tempest-FloatingIPsAssociationTestJSON-1366579974 tempest-FloatingIPsAssociationTestJSON-1366579974-project-member] [instance: 913c5652-c8af-41a8-94f1-c0eba08aacdd] Aborting claim: {{(pid=67899) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1005.269223] env[67899]: DEBUG oslo_concurrency.lockutils [None req-1a665adc-0197-431c-b7d3-cbe5f770e3d4 tempest-FloatingIPsAssociationTestJSON-1366579974 tempest-FloatingIPsAssociationTestJSON-1366579974-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1005.269430] env[67899]: DEBUG oslo_concurrency.lockutils [None req-1a665adc-0197-431c-b7d3-cbe5f770e3d4 tempest-FloatingIPsAssociationTestJSON-1366579974 tempest-FloatingIPsAssociationTestJSON-1366579974-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1005.595619] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c6450391-c819-45ee-aa38-e6c840084ce6 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1005.603486] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0173d587-e860-480e-9a12-222d8dd1b5f8 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1005.634472] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d2a8c444-93dd-4c28-9b70-d52d8d10b5bb {{(pid=67899) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1005.641891] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-152a77e0-d3e6-408d-b519-99ec20df5037 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1005.655160] env[67899]: DEBUG nova.compute.provider_tree [None req-1a665adc-0197-431c-b7d3-cbe5f770e3d4 tempest-FloatingIPsAssociationTestJSON-1366579974 tempest-FloatingIPsAssociationTestJSON-1366579974-project-member] Inventory has not changed in ProviderTree for provider: fffa0b42-f65d-4394-a98c-0df038b9ed4b {{(pid=67899) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1005.665370] env[67899]: DEBUG nova.scheduler.client.report [None req-1a665adc-0197-431c-b7d3-cbe5f770e3d4 tempest-FloatingIPsAssociationTestJSON-1366579974 tempest-FloatingIPsAssociationTestJSON-1366579974-project-member] Inventory has not changed for provider fffa0b42-f65d-4394-a98c-0df038b9ed4b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67899) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1005.678999] env[67899]: DEBUG oslo_concurrency.lockutils [None req-1a665adc-0197-431c-b7d3-cbe5f770e3d4 tempest-FloatingIPsAssociationTestJSON-1366579974 tempest-FloatingIPsAssociationTestJSON-1366579974-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.409s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1005.679544] env[67899]: ERROR nova.compute.manager [None req-1a665adc-0197-431c-b7d3-cbe5f770e3d4 tempest-FloatingIPsAssociationTestJSON-1366579974 tempest-FloatingIPsAssociationTestJSON-1366579974-project-member] [instance: 913c5652-c8af-41a8-94f1-c0eba08aacdd] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1005.679544] env[67899]: Faults: ['InvalidArgument'] [ 1005.679544] env[67899]: ERROR nova.compute.manager [instance: 913c5652-c8af-41a8-94f1-c0eba08aacdd] Traceback (most recent call last): [ 1005.679544] env[67899]: ERROR nova.compute.manager [instance: 913c5652-c8af-41a8-94f1-c0eba08aacdd] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1005.679544] env[67899]: ERROR nova.compute.manager [instance: 913c5652-c8af-41a8-94f1-c0eba08aacdd] self.driver.spawn(context, instance, image_meta, [ 1005.679544] env[67899]: ERROR nova.compute.manager [instance: 913c5652-c8af-41a8-94f1-c0eba08aacdd] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1005.679544] env[67899]: ERROR nova.compute.manager [instance: 913c5652-c8af-41a8-94f1-c0eba08aacdd] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1005.679544] env[67899]: ERROR nova.compute.manager [instance: 913c5652-c8af-41a8-94f1-c0eba08aacdd] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1005.679544] env[67899]: ERROR nova.compute.manager [instance: 913c5652-c8af-41a8-94f1-c0eba08aacdd] 
self._fetch_image_if_missing(context, vi) [ 1005.679544] env[67899]: ERROR nova.compute.manager [instance: 913c5652-c8af-41a8-94f1-c0eba08aacdd] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1005.679544] env[67899]: ERROR nova.compute.manager [instance: 913c5652-c8af-41a8-94f1-c0eba08aacdd] image_cache(vi, tmp_image_ds_loc) [ 1005.679544] env[67899]: ERROR nova.compute.manager [instance: 913c5652-c8af-41a8-94f1-c0eba08aacdd] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1005.679544] env[67899]: ERROR nova.compute.manager [instance: 913c5652-c8af-41a8-94f1-c0eba08aacdd] vm_util.copy_virtual_disk( [ 1005.679544] env[67899]: ERROR nova.compute.manager [instance: 913c5652-c8af-41a8-94f1-c0eba08aacdd] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1005.679544] env[67899]: ERROR nova.compute.manager [instance: 913c5652-c8af-41a8-94f1-c0eba08aacdd] session._wait_for_task(vmdk_copy_task) [ 1005.679544] env[67899]: ERROR nova.compute.manager [instance: 913c5652-c8af-41a8-94f1-c0eba08aacdd] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1005.679544] env[67899]: ERROR nova.compute.manager [instance: 913c5652-c8af-41a8-94f1-c0eba08aacdd] return self.wait_for_task(task_ref) [ 1005.679544] env[67899]: ERROR nova.compute.manager [instance: 913c5652-c8af-41a8-94f1-c0eba08aacdd] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1005.679544] env[67899]: ERROR nova.compute.manager [instance: 913c5652-c8af-41a8-94f1-c0eba08aacdd] return evt.wait() [ 1005.679544] env[67899]: ERROR nova.compute.manager [instance: 913c5652-c8af-41a8-94f1-c0eba08aacdd] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1005.679544] env[67899]: ERROR nova.compute.manager [instance: 913c5652-c8af-41a8-94f1-c0eba08aacdd] result = hub.switch() [ 1005.679544] env[67899]: ERROR nova.compute.manager [instance: 913c5652-c8af-41a8-94f1-c0eba08aacdd] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1005.679544] env[67899]: ERROR nova.compute.manager [instance: 913c5652-c8af-41a8-94f1-c0eba08aacdd] return self.greenlet.switch() [ 1005.679544] env[67899]: ERROR nova.compute.manager [instance: 913c5652-c8af-41a8-94f1-c0eba08aacdd] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1005.679544] env[67899]: ERROR nova.compute.manager [instance: 913c5652-c8af-41a8-94f1-c0eba08aacdd] self.f(*self.args, **self.kw) [ 1005.679544] env[67899]: ERROR nova.compute.manager [instance: 913c5652-c8af-41a8-94f1-c0eba08aacdd] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1005.679544] env[67899]: ERROR nova.compute.manager [instance: 913c5652-c8af-41a8-94f1-c0eba08aacdd] raise exceptions.translate_fault(task_info.error) [ 1005.679544] env[67899]: ERROR nova.compute.manager [instance: 913c5652-c8af-41a8-94f1-c0eba08aacdd] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1005.679544] env[67899]: ERROR nova.compute.manager [instance: 913c5652-c8af-41a8-94f1-c0eba08aacdd] Faults: ['InvalidArgument'] [ 1005.679544] env[67899]: ERROR nova.compute.manager [instance: 913c5652-c8af-41a8-94f1-c0eba08aacdd] [ 1005.680342] env[67899]: DEBUG nova.compute.utils [None 
req-1a665adc-0197-431c-b7d3-cbe5f770e3d4 tempest-FloatingIPsAssociationTestJSON-1366579974 tempest-FloatingIPsAssociationTestJSON-1366579974-project-member] [instance: 913c5652-c8af-41a8-94f1-c0eba08aacdd] VimFaultException {{(pid=67899) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1005.681732] env[67899]: DEBUG nova.compute.manager [None req-1a665adc-0197-431c-b7d3-cbe5f770e3d4 tempest-FloatingIPsAssociationTestJSON-1366579974 tempest-FloatingIPsAssociationTestJSON-1366579974-project-member] [instance: 913c5652-c8af-41a8-94f1-c0eba08aacdd] Build of instance 913c5652-c8af-41a8-94f1-c0eba08aacdd was re-scheduled: A specified parameter was not correct: fileType [ 1005.681732] env[67899]: Faults: ['InvalidArgument'] {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1005.682112] env[67899]: DEBUG nova.compute.manager [None req-1a665adc-0197-431c-b7d3-cbe5f770e3d4 tempest-FloatingIPsAssociationTestJSON-1366579974 tempest-FloatingIPsAssociationTestJSON-1366579974-project-member] [instance: 913c5652-c8af-41a8-94f1-c0eba08aacdd] Unplugging VIFs for instance {{(pid=67899) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1005.682284] env[67899]: DEBUG nova.compute.manager [None req-1a665adc-0197-431c-b7d3-cbe5f770e3d4 tempest-FloatingIPsAssociationTestJSON-1366579974 tempest-FloatingIPsAssociationTestJSON-1366579974-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=67899) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1005.682451] env[67899]: DEBUG nova.compute.manager [None req-1a665adc-0197-431c-b7d3-cbe5f770e3d4 tempest-FloatingIPsAssociationTestJSON-1366579974 tempest-FloatingIPsAssociationTestJSON-1366579974-project-member] [instance: 913c5652-c8af-41a8-94f1-c0eba08aacdd] Deallocating network for instance {{(pid=67899) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1005.682615] env[67899]: DEBUG nova.network.neutron [None req-1a665adc-0197-431c-b7d3-cbe5f770e3d4 tempest-FloatingIPsAssociationTestJSON-1366579974 tempest-FloatingIPsAssociationTestJSON-1366579974-project-member] [instance: 913c5652-c8af-41a8-94f1-c0eba08aacdd] deallocate_for_instance() {{(pid=67899) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1006.056981] env[67899]: DEBUG nova.network.neutron [None req-1a665adc-0197-431c-b7d3-cbe5f770e3d4 tempest-FloatingIPsAssociationTestJSON-1366579974 tempest-FloatingIPsAssociationTestJSON-1366579974-project-member] [instance: 913c5652-c8af-41a8-94f1-c0eba08aacdd] Updating instance_info_cache with network_info: [] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1006.068345] env[67899]: INFO nova.compute.manager [None req-1a665adc-0197-431c-b7d3-cbe5f770e3d4 tempest-FloatingIPsAssociationTestJSON-1366579974 tempest-FloatingIPsAssociationTestJSON-1366579974-project-member] [instance: 913c5652-c8af-41a8-94f1-c0eba08aacdd] Took 0.39 seconds to deallocate network for instance. 
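The "Inventory has not changed" records above carry the full provider inventory for fffa0b42-f65d-4394-a98c-0df038b9ed4b. The capacity the scheduler can place against follows from (total - reserved) * allocation_ratio per resource class, with max_unit capping any single allocation; that is why the final resource view reports 10 of 48 vCPUs allocated while the node remains far from its schedulable limit. A minimal sketch of that arithmetic, assuming the standard Placement capacity rule (plain Python, illustrative helper, not Nova or Placement code):

# Recompute schedulable capacity from the inventory dict logged above.
# Illustrative only: (total - reserved) * allocation_ratio is the standard
# Placement capacity rule, but this helper is not part of Nova.
INVENTORY = {
    'VCPU': {'total': 48, 'reserved': 0, 'max_unit': 16, 'allocation_ratio': 4.0},
    'MEMORY_MB': {'total': 196590, 'reserved': 512, 'max_unit': 65530, 'allocation_ratio': 1.0},
    'DISK_GB': {'total': 400, 'reserved': 0, 'max_unit': 94, 'allocation_ratio': 1.0},
}

def effective_capacity(inventory):
    """Schedulable capacity per resource class: (total - reserved) * ratio."""
    return {rc: int((v['total'] - v['reserved']) * v['allocation_ratio'])
            for rc, v in inventory.items()}

for rc, cap in effective_capacity(INVENTORY).items():
    # max_unit still limits what one allocation may request (16 vCPU here).
    print(f"{rc}: capacity={cap}, max_unit={INVENTORY[rc]['max_unit']}")

With the logged values this yields 192 VCPU, 196078 MEMORY_MB and 400 DISK_GB, so the ten outstanding m1.nano-sized allocations (1 VCPU / 128 MB / 1 GB each, as in the "Skipping heal of allocation" records) consume only a small fraction of the provider.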
[ 1006.174312] env[67899]: INFO nova.scheduler.client.report [None req-1a665adc-0197-431c-b7d3-cbe5f770e3d4 tempest-FloatingIPsAssociationTestJSON-1366579974 tempest-FloatingIPsAssociationTestJSON-1366579974-project-member] Deleted allocations for instance 913c5652-c8af-41a8-94f1-c0eba08aacdd [ 1006.198091] env[67899]: DEBUG oslo_concurrency.lockutils [None req-1a665adc-0197-431c-b7d3-cbe5f770e3d4 tempest-FloatingIPsAssociationTestJSON-1366579974 tempest-FloatingIPsAssociationTestJSON-1366579974-project-member] Lock "913c5652-c8af-41a8-94f1-c0eba08aacdd" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 377.506s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1006.199569] env[67899]: DEBUG oslo_concurrency.lockutils [None req-759dcb41-7bb0-4571-a61c-fe945b232dc3 tempest-FloatingIPsAssociationTestJSON-1366579974 tempest-FloatingIPsAssociationTestJSON-1366579974-project-member] Lock "913c5652-c8af-41a8-94f1-c0eba08aacdd" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 178.292s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1006.199803] env[67899]: DEBUG oslo_concurrency.lockutils [None req-759dcb41-7bb0-4571-a61c-fe945b232dc3 tempest-FloatingIPsAssociationTestJSON-1366579974 tempest-FloatingIPsAssociationTestJSON-1366579974-project-member] Acquiring lock "913c5652-c8af-41a8-94f1-c0eba08aacdd-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1006.200014] env[67899]: DEBUG oslo_concurrency.lockutils [None req-759dcb41-7bb0-4571-a61c-fe945b232dc3 tempest-FloatingIPsAssociationTestJSON-1366579974 tempest-FloatingIPsAssociationTestJSON-1366579974-project-member] Lock "913c5652-c8af-41a8-94f1-c0eba08aacdd-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1006.200197] env[67899]: DEBUG oslo_concurrency.lockutils [None req-759dcb41-7bb0-4571-a61c-fe945b232dc3 tempest-FloatingIPsAssociationTestJSON-1366579974 tempest-FloatingIPsAssociationTestJSON-1366579974-project-member] Lock "913c5652-c8af-41a8-94f1-c0eba08aacdd-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1006.202122] env[67899]: INFO nova.compute.manager [None req-759dcb41-7bb0-4571-a61c-fe945b232dc3 tempest-FloatingIPsAssociationTestJSON-1366579974 tempest-FloatingIPsAssociationTestJSON-1366579974-project-member] [instance: 913c5652-c8af-41a8-94f1-c0eba08aacdd] Terminating instance [ 1006.203810] env[67899]: DEBUG nova.compute.manager [None req-759dcb41-7bb0-4571-a61c-fe945b232dc3 tempest-FloatingIPsAssociationTestJSON-1366579974 tempest-FloatingIPsAssociationTestJSON-1366579974-project-member] [instance: 913c5652-c8af-41a8-94f1-c0eba08aacdd] Start destroying the instance on the hypervisor.
{{(pid=67899) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1006.204084] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-759dcb41-7bb0-4571-a61c-fe945b232dc3 tempest-FloatingIPsAssociationTestJSON-1366579974 tempest-FloatingIPsAssociationTestJSON-1366579974-project-member] [instance: 913c5652-c8af-41a8-94f1-c0eba08aacdd] Destroying instance {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1006.204798] env[67899]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-a7475e7e-7ed6-40b2-bd7f-27cd5dbfd2fe {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1006.212442] env[67899]: DEBUG nova.compute.manager [None req-6a55d310-17fb-4564-8f2a-4ee92b2e7084 tempest-ListServersNegativeTestJSON-1933841315 tempest-ListServersNegativeTestJSON-1933841315-project-member] [instance: 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4] Starting instance... {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1006.217242] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ce419063-d54c-4ea2-8d61-af1d4d661546 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1006.247030] env[67899]: WARNING nova.virt.vmwareapi.vmops [None req-759dcb41-7bb0-4571-a61c-fe945b232dc3 tempest-FloatingIPsAssociationTestJSON-1366579974 tempest-FloatingIPsAssociationTestJSON-1366579974-project-member] [instance: 913c5652-c8af-41a8-94f1-c0eba08aacdd] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 913c5652-c8af-41a8-94f1-c0eba08aacdd could not be found. [ 1006.247030] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-759dcb41-7bb0-4571-a61c-fe945b232dc3 tempest-FloatingIPsAssociationTestJSON-1366579974 tempest-FloatingIPsAssociationTestJSON-1366579974-project-member] [instance: 913c5652-c8af-41a8-94f1-c0eba08aacdd] Instance destroyed {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1006.247030] env[67899]: INFO nova.compute.manager [None req-759dcb41-7bb0-4571-a61c-fe945b232dc3 tempest-FloatingIPsAssociationTestJSON-1366579974 tempest-FloatingIPsAssociationTestJSON-1366579974-project-member] [instance: 913c5652-c8af-41a8-94f1-c0eba08aacdd] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1006.247147] env[67899]: DEBUG oslo.service.loopingcall [None req-759dcb41-7bb0-4571-a61c-fe945b232dc3 tempest-FloatingIPsAssociationTestJSON-1366579974 tempest-FloatingIPsAssociationTestJSON-1366579974-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return.
{{(pid=67899) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1006.247381] env[67899]: DEBUG nova.compute.manager [-] [instance: 913c5652-c8af-41a8-94f1-c0eba08aacdd] Deallocating network for instance {{(pid=67899) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1006.247847] env[67899]: DEBUG nova.network.neutron [-] [instance: 913c5652-c8af-41a8-94f1-c0eba08aacdd] deallocate_for_instance() {{(pid=67899) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1006.268099] env[67899]: DEBUG oslo_concurrency.lockutils [None req-6a55d310-17fb-4564-8f2a-4ee92b2e7084 tempest-ListServersNegativeTestJSON-1933841315 tempest-ListServersNegativeTestJSON-1933841315-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1006.268347] env[67899]: DEBUG oslo_concurrency.lockutils [None req-6a55d310-17fb-4564-8f2a-4ee92b2e7084 tempest-ListServersNegativeTestJSON-1933841315 tempest-ListServersNegativeTestJSON-1933841315-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1006.269809] env[67899]: INFO nova.compute.claims [None req-6a55d310-17fb-4564-8f2a-4ee92b2e7084 tempest-ListServersNegativeTestJSON-1933841315 tempest-ListServersNegativeTestJSON-1933841315-project-member] [instance: 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1006.285663] env[67899]: DEBUG nova.network.neutron [-] [instance: 913c5652-c8af-41a8-94f1-c0eba08aacdd] Updating instance_info_cache with network_info: [] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1006.293790] env[67899]: INFO nova.compute.manager [-] [instance: 913c5652-c8af-41a8-94f1-c0eba08aacdd] Took 0.05 seconds to deallocate network for instance. 
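Note how the destroy path above stays idempotent: the backend lookup raises InstanceNotFound because the earlier cleanup of the failed spawn already removed the VM, the driver logs the WARNING "Instance does not exist on backend" and still reports "Instance destroyed", so terminate completes in 0.04 seconds instead of failing. A self-contained sketch of that pattern, using placeholder names rather than the real nova.virt.vmwareapi API:

# Sketch of the idempotent destroy pattern visible in the log: a VM that is
# already gone on the backend is treated as destroyed, not as an error.
# All names here are illustrative scaffolding, not Nova code.
import logging

logging.basicConfig(level=logging.DEBUG)
LOG = logging.getLogger('destroy-sketch')

class InstanceNotFound(Exception):
    """Stands in for nova.exception.InstanceNotFound."""

def find_vm(instance_uuid):
    # Stub backend lookup: pretend a failed spawn already removed the VM.
    raise InstanceNotFound(f'Instance {instance_uuid} could not be found.')

def destroy_instance(instance_uuid):
    try:
        vm_ref = find_vm(instance_uuid)
        LOG.debug('Unregistering the VM %s', vm_ref)
    except InstanceNotFound as exc:
        # Mirrors the WARNING above: a missing backend VM is treated as
        # already destroyed, which keeps terminate safe to run after a
        # failed or rescheduled spawn.
        LOG.warning('Instance does not exist on backend: %s', exc)
    LOG.debug('Instance destroyed')

destroy_instance('913c5652-c8af-41a8-94f1-c0eba08aacdd')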
[ 1006.430570] env[67899]: DEBUG oslo_concurrency.lockutils [None req-759dcb41-7bb0-4571-a61c-fe945b232dc3 tempest-FloatingIPsAssociationTestJSON-1366579974 tempest-FloatingIPsAssociationTestJSON-1366579974-project-member] Lock "913c5652-c8af-41a8-94f1-c0eba08aacdd" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.231s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1006.650548] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9666514e-cad0-4775-9491-543d5b2e6a52 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1006.658186] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eaa55faf-f272-4a30-9a16-6bf6399a13d4 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1006.690010] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6a03cac4-9d58-40c1-8d78-bc9dd8272c4b {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1006.697216] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-04a8f6a7-04f1-4a99-86a2-cb6a5ae6c5a9 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1006.710305] env[67899]: DEBUG nova.compute.provider_tree [None req-6a55d310-17fb-4564-8f2a-4ee92b2e7084 tempest-ListServersNegativeTestJSON-1933841315 tempest-ListServersNegativeTestJSON-1933841315-project-member] Inventory has not changed in ProviderTree for provider: fffa0b42-f65d-4394-a98c-0df038b9ed4b {{(pid=67899) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1006.718153] env[67899]: DEBUG nova.scheduler.client.report [None req-6a55d310-17fb-4564-8f2a-4ee92b2e7084 tempest-ListServersNegativeTestJSON-1933841315 tempest-ListServersNegativeTestJSON-1933841315-project-member] Inventory has not changed for provider fffa0b42-f65d-4394-a98c-0df038b9ed4b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67899) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1006.731269] env[67899]: DEBUG oslo_concurrency.lockutils [None req-6a55d310-17fb-4564-8f2a-4ee92b2e7084 tempest-ListServersNegativeTestJSON-1933841315 tempest-ListServersNegativeTestJSON-1933841315-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.463s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1006.731743] env[67899]: DEBUG nova.compute.manager [None req-6a55d310-17fb-4564-8f2a-4ee92b2e7084 tempest-ListServersNegativeTestJSON-1933841315 tempest-ListServersNegativeTestJSON-1933841315-project-member] [instance: 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4] Start building networks asynchronously for instance.
{{(pid=67899) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1006.765463] env[67899]: DEBUG nova.compute.utils [None req-6a55d310-17fb-4564-8f2a-4ee92b2e7084 tempest-ListServersNegativeTestJSON-1933841315 tempest-ListServersNegativeTestJSON-1933841315-project-member] Using /dev/sd instead of None {{(pid=67899) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1006.766878] env[67899]: DEBUG nova.compute.manager [None req-6a55d310-17fb-4564-8f2a-4ee92b2e7084 tempest-ListServersNegativeTestJSON-1933841315 tempest-ListServersNegativeTestJSON-1933841315-project-member] [instance: 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4] Allocating IP information in the background. {{(pid=67899) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1006.767060] env[67899]: DEBUG nova.network.neutron [None req-6a55d310-17fb-4564-8f2a-4ee92b2e7084 tempest-ListServersNegativeTestJSON-1933841315 tempest-ListServersNegativeTestJSON-1933841315-project-member] [instance: 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4] allocate_for_instance() {{(pid=67899) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1006.778216] env[67899]: DEBUG nova.compute.manager [None req-6a55d310-17fb-4564-8f2a-4ee92b2e7084 tempest-ListServersNegativeTestJSON-1933841315 tempest-ListServersNegativeTestJSON-1933841315-project-member] [instance: 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4] Start building block device mappings for instance. {{(pid=67899) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1006.843983] env[67899]: DEBUG nova.compute.manager [None req-6a55d310-17fb-4564-8f2a-4ee92b2e7084 tempest-ListServersNegativeTestJSON-1933841315 tempest-ListServersNegativeTestJSON-1933841315-project-member] [instance: 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4] Start spawning the instance on the hypervisor. 
{{(pid=67899) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1006.872816] env[67899]: DEBUG nova.virt.hardware [None req-6a55d310-17fb-4564-8f2a-4ee92b2e7084 tempest-ListServersNegativeTestJSON-1933841315 tempest-ListServersNegativeTestJSON-1933841315-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T22:07:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T22:07:13Z,direct_url=,disk_format='vmdk',id=c655a05a-4a40-4b3f-b609-3ba8116ad90f,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c1aaa2970e964d7b86557399120d12c1',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T22:07:14Z,virtual_size=,visibility=), allow threads: False {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1006.873481] env[67899]: DEBUG nova.virt.hardware [None req-6a55d310-17fb-4564-8f2a-4ee92b2e7084 tempest-ListServersNegativeTestJSON-1933841315 tempest-ListServersNegativeTestJSON-1933841315-project-member] Flavor limits 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1006.873481] env[67899]: DEBUG nova.virt.hardware [None req-6a55d310-17fb-4564-8f2a-4ee92b2e7084 tempest-ListServersNegativeTestJSON-1933841315 tempest-ListServersNegativeTestJSON-1933841315-project-member] Image limits 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1006.873481] env[67899]: DEBUG nova.virt.hardware [None req-6a55d310-17fb-4564-8f2a-4ee92b2e7084 tempest-ListServersNegativeTestJSON-1933841315 tempest-ListServersNegativeTestJSON-1933841315-project-member] Flavor pref 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1006.873601] env[67899]: DEBUG nova.virt.hardware [None req-6a55d310-17fb-4564-8f2a-4ee92b2e7084 tempest-ListServersNegativeTestJSON-1933841315 tempest-ListServersNegativeTestJSON-1933841315-project-member] Image pref 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1006.873738] env[67899]: DEBUG nova.virt.hardware [None req-6a55d310-17fb-4564-8f2a-4ee92b2e7084 tempest-ListServersNegativeTestJSON-1933841315 tempest-ListServersNegativeTestJSON-1933841315-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1006.873941] env[67899]: DEBUG nova.virt.hardware [None req-6a55d310-17fb-4564-8f2a-4ee92b2e7084 tempest-ListServersNegativeTestJSON-1933841315 tempest-ListServersNegativeTestJSON-1933841315-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1006.874205] env[67899]: DEBUG nova.virt.hardware [None req-6a55d310-17fb-4564-8f2a-4ee92b2e7084 tempest-ListServersNegativeTestJSON-1933841315 tempest-ListServersNegativeTestJSON-1933841315-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67899) _get_possible_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:471}} [ 1006.874365] env[67899]: DEBUG nova.virt.hardware [None req-6a55d310-17fb-4564-8f2a-4ee92b2e7084 tempest-ListServersNegativeTestJSON-1933841315 tempest-ListServersNegativeTestJSON-1933841315-project-member] Got 1 possible topologies {{(pid=67899) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1006.874538] env[67899]: DEBUG nova.virt.hardware [None req-6a55d310-17fb-4564-8f2a-4ee92b2e7084 tempest-ListServersNegativeTestJSON-1933841315 tempest-ListServersNegativeTestJSON-1933841315-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1006.874710] env[67899]: DEBUG nova.virt.hardware [None req-6a55d310-17fb-4564-8f2a-4ee92b2e7084 tempest-ListServersNegativeTestJSON-1933841315 tempest-ListServersNegativeTestJSON-1933841315-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1006.875585] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-08c26b7a-9446-40d6-83f3-fca56cb95e5d {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1006.886064] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0b7697d8-b35f-4034-955d-a384a8460523 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1007.048571] env[67899]: DEBUG nova.policy [None req-6a55d310-17fb-4564-8f2a-4ee92b2e7084 tempest-ListServersNegativeTestJSON-1933841315 tempest-ListServersNegativeTestJSON-1933841315-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '8a4b8ff0472f4a1387b14e093ae81b56', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '564f52a655fd4628be7e299552d47772', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67899) authorize /opt/stack/nova/nova/policy.py:203}} [ 1007.496035] env[67899]: DEBUG nova.network.neutron [None req-6a55d310-17fb-4564-8f2a-4ee92b2e7084 tempest-ListServersNegativeTestJSON-1933841315 tempest-ListServersNegativeTestJSON-1933841315-project-member] [instance: 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4] Successfully created port: b48e27d8-eb97-4ea6-9464-0f5c1ca213d4 {{(pid=67899) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1008.748504] env[67899]: DEBUG nova.compute.manager [req-c4af99fc-9508-4969-aba4-2a0042a47c39 req-5b545db8-762b-49e3-8802-3feb14f845c4 service nova] [instance: 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4] Received event network-vif-plugged-b48e27d8-eb97-4ea6-9464-0f5c1ca213d4 {{(pid=67899) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1008.749147] env[67899]: DEBUG oslo_concurrency.lockutils [req-c4af99fc-9508-4969-aba4-2a0042a47c39 req-5b545db8-762b-49e3-8802-3feb14f845c4 service nova] Acquiring lock "4c50d280-8833-4c3f-8571-8ffa5c7ea2a4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67899) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1008.749147] env[67899]: DEBUG oslo_concurrency.lockutils [req-c4af99fc-9508-4969-aba4-2a0042a47c39 req-5b545db8-762b-49e3-8802-3feb14f845c4 service nova] Lock "4c50d280-8833-4c3f-8571-8ffa5c7ea2a4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1008.749147] env[67899]: DEBUG oslo_concurrency.lockutils [req-c4af99fc-9508-4969-aba4-2a0042a47c39 req-5b545db8-762b-49e3-8802-3feb14f845c4 service nova] Lock "4c50d280-8833-4c3f-8571-8ffa5c7ea2a4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1008.749147] env[67899]: DEBUG nova.compute.manager [req-c4af99fc-9508-4969-aba4-2a0042a47c39 req-5b545db8-762b-49e3-8802-3feb14f845c4 service nova] [instance: 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4] No waiting events found dispatching network-vif-plugged-b48e27d8-eb97-4ea6-9464-0f5c1ca213d4 {{(pid=67899) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1008.749328] env[67899]: WARNING nova.compute.manager [req-c4af99fc-9508-4969-aba4-2a0042a47c39 req-5b545db8-762b-49e3-8802-3feb14f845c4 service nova] [instance: 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4] Received unexpected event network-vif-plugged-b48e27d8-eb97-4ea6-9464-0f5c1ca213d4 for instance with vm_state building and task_state spawning. [ 1008.912276] env[67899]: DEBUG nova.network.neutron [None req-6a55d310-17fb-4564-8f2a-4ee92b2e7084 tempest-ListServersNegativeTestJSON-1933841315 tempest-ListServersNegativeTestJSON-1933841315-project-member] [instance: 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4] Successfully updated port: b48e27d8-eb97-4ea6-9464-0f5c1ca213d4 {{(pid=67899) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1008.932697] env[67899]: DEBUG oslo_concurrency.lockutils [None req-6a55d310-17fb-4564-8f2a-4ee92b2e7084 tempest-ListServersNegativeTestJSON-1933841315 tempest-ListServersNegativeTestJSON-1933841315-project-member] Acquiring lock "refresh_cache-4c50d280-8833-4c3f-8571-8ffa5c7ea2a4" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1008.932697] env[67899]: DEBUG oslo_concurrency.lockutils [None req-6a55d310-17fb-4564-8f2a-4ee92b2e7084 tempest-ListServersNegativeTestJSON-1933841315 tempest-ListServersNegativeTestJSON-1933841315-project-member] Acquired lock "refresh_cache-4c50d280-8833-4c3f-8571-8ffa5c7ea2a4" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1008.932697] env[67899]: DEBUG nova.network.neutron [None req-6a55d310-17fb-4564-8f2a-4ee92b2e7084 tempest-ListServersNegativeTestJSON-1933841315 tempest-ListServersNegativeTestJSON-1933841315-project-member] [instance: 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4] Building network info cache for instance {{(pid=67899) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1009.020425] env[67899]: DEBUG nova.network.neutron [None req-6a55d310-17fb-4564-8f2a-4ee92b2e7084 tempest-ListServersNegativeTestJSON-1933841315 tempest-ListServersNegativeTestJSON-1933841315-project-member] [instance: 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4] Instance cache missing network info. 
{{(pid=67899) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1009.418443] env[67899]: DEBUG nova.network.neutron [None req-6a55d310-17fb-4564-8f2a-4ee92b2e7084 tempest-ListServersNegativeTestJSON-1933841315 tempest-ListServersNegativeTestJSON-1933841315-project-member] [instance: 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4] Updating instance_info_cache with network_info: [{"id": "b48e27d8-eb97-4ea6-9464-0f5c1ca213d4", "address": "fa:16:3e:06:9f:c9", "network": {"id": "d0839757-d0e4-4910-a0fa-07a195b64a66", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1313326550-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "564f52a655fd4628be7e299552d47772", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f0ef5aba-bd9a-42ff-a1a0-5e763986d70a", "external-id": "nsx-vlan-transportzone-209", "segmentation_id": 209, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb48e27d8-eb", "ovs_interfaceid": "b48e27d8-eb97-4ea6-9464-0f5c1ca213d4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1009.437099] env[67899]: DEBUG oslo_concurrency.lockutils [None req-6a55d310-17fb-4564-8f2a-4ee92b2e7084 tempest-ListServersNegativeTestJSON-1933841315 tempest-ListServersNegativeTestJSON-1933841315-project-member] Releasing lock "refresh_cache-4c50d280-8833-4c3f-8571-8ffa5c7ea2a4" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1009.437416] env[67899]: DEBUG nova.compute.manager [None req-6a55d310-17fb-4564-8f2a-4ee92b2e7084 tempest-ListServersNegativeTestJSON-1933841315 tempest-ListServersNegativeTestJSON-1933841315-project-member] [instance: 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4] Instance network_info: |[{"id": "b48e27d8-eb97-4ea6-9464-0f5c1ca213d4", "address": "fa:16:3e:06:9f:c9", "network": {"id": "d0839757-d0e4-4910-a0fa-07a195b64a66", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1313326550-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "564f52a655fd4628be7e299552d47772", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f0ef5aba-bd9a-42ff-a1a0-5e763986d70a", "external-id": "nsx-vlan-transportzone-209", "segmentation_id": 209, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb48e27d8-eb", "ovs_interfaceid": "b48e27d8-eb97-4ea6-9464-0f5c1ca213d4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67899) 
_allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1009.437801] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-6a55d310-17fb-4564-8f2a-4ee92b2e7084 tempest-ListServersNegativeTestJSON-1933841315 tempest-ListServersNegativeTestJSON-1933841315-project-member] [instance: 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:06:9f:c9', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'f0ef5aba-bd9a-42ff-a1a0-5e763986d70a', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'b48e27d8-eb97-4ea6-9464-0f5c1ca213d4', 'vif_model': 'vmxnet3'}] {{(pid=67899) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1009.445919] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [None req-6a55d310-17fb-4564-8f2a-4ee92b2e7084 tempest-ListServersNegativeTestJSON-1933841315 tempest-ListServersNegativeTestJSON-1933841315-project-member] Creating folder: Project (564f52a655fd4628be7e299552d47772). Parent ref: group-v692900. {{(pid=67899) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1009.446617] env[67899]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-e57965b9-5353-4c4c-90ba-b25ab6aa39ca {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1009.460363] env[67899]: INFO nova.virt.vmwareapi.vm_util [None req-6a55d310-17fb-4564-8f2a-4ee92b2e7084 tempest-ListServersNegativeTestJSON-1933841315 tempest-ListServersNegativeTestJSON-1933841315-project-member] Created folder: Project (564f52a655fd4628be7e299552d47772) in parent group-v692900. [ 1009.460641] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [None req-6a55d310-17fb-4564-8f2a-4ee92b2e7084 tempest-ListServersNegativeTestJSON-1933841315 tempest-ListServersNegativeTestJSON-1933841315-project-member] Creating folder: Instances. Parent ref: group-v692956. {{(pid=67899) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1009.460920] env[67899]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-b166b616-4e42-4ef5-a13c-1de6ce5c94d4 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1009.472910] env[67899]: INFO nova.virt.vmwareapi.vm_util [None req-6a55d310-17fb-4564-8f2a-4ee92b2e7084 tempest-ListServersNegativeTestJSON-1933841315 tempest-ListServersNegativeTestJSON-1933841315-project-member] Created folder: Instances in parent group-v692956. [ 1009.473154] env[67899]: DEBUG oslo.service.loopingcall [None req-6a55d310-17fb-4564-8f2a-4ee92b2e7084 tempest-ListServersNegativeTestJSON-1933841315 tempest-ListServersNegativeTestJSON-1933841315-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=67899) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1009.473384] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4] Creating VM on the ESX host {{(pid=67899) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1009.473550] env[67899]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-5e2e7813-f90b-4375-bb54-796ea82636e0 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1009.493308] env[67899]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1009.493308] env[67899]: value = "task-3467901" [ 1009.493308] env[67899]: _type = "Task" [ 1009.493308] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1009.502300] env[67899]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467901, 'name': CreateVM_Task} progress is 0%. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1010.007615] env[67899]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467901, 'name': CreateVM_Task, 'duration_secs': 0.310684} completed successfully. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1010.007615] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4] Created VM on the ESX host {{(pid=67899) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1010.007615] env[67899]: DEBUG oslo_concurrency.lockutils [None req-6a55d310-17fb-4564-8f2a-4ee92b2e7084 tempest-ListServersNegativeTestJSON-1933841315 tempest-ListServersNegativeTestJSON-1933841315-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1010.008488] env[67899]: DEBUG oslo_concurrency.lockutils [None req-6a55d310-17fb-4564-8f2a-4ee92b2e7084 tempest-ListServersNegativeTestJSON-1933841315 tempest-ListServersNegativeTestJSON-1933841315-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1010.010300] env[67899]: DEBUG oslo_concurrency.lockutils [None req-6a55d310-17fb-4564-8f2a-4ee92b2e7084 tempest-ListServersNegativeTestJSON-1933841315 tempest-ListServersNegativeTestJSON-1933841315-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1010.010300] env[67899]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-d2a6ac24-a08a-4287-be8a-62d047ad74df {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1010.017645] env[67899]: DEBUG oslo_vmware.api [None req-6a55d310-17fb-4564-8f2a-4ee92b2e7084 tempest-ListServersNegativeTestJSON-1933841315 tempest-ListServersNegativeTestJSON-1933841315-project-member] Waiting for the task: (returnval){ [ 1010.017645] env[67899]: value = "session[5298859c-589e-10f3-4f70-bf47a7ea371b]52a8d625-1d49-ce39-d162-7951fe8d7e13" [ 1010.017645] env[67899]: _type = "Task" [ 1010.017645] 
env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1010.026399] env[67899]: DEBUG oslo_vmware.api [None req-6a55d310-17fb-4564-8f2a-4ee92b2e7084 tempest-ListServersNegativeTestJSON-1933841315 tempest-ListServersNegativeTestJSON-1933841315-project-member] Task: {'id': session[5298859c-589e-10f3-4f70-bf47a7ea371b]52a8d625-1d49-ce39-d162-7951fe8d7e13, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1010.526394] env[67899]: DEBUG oslo_concurrency.lockutils [None req-6a55d310-17fb-4564-8f2a-4ee92b2e7084 tempest-ListServersNegativeTestJSON-1933841315 tempest-ListServersNegativeTestJSON-1933841315-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1010.526663] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-6a55d310-17fb-4564-8f2a-4ee92b2e7084 tempest-ListServersNegativeTestJSON-1933841315 tempest-ListServersNegativeTestJSON-1933841315-project-member] [instance: 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4] Processing image c655a05a-4a40-4b3f-b609-3ba8116ad90f {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1010.526877] env[67899]: DEBUG oslo_concurrency.lockutils [None req-6a55d310-17fb-4564-8f2a-4ee92b2e7084 tempest-ListServersNegativeTestJSON-1933841315 tempest-ListServersNegativeTestJSON-1933841315-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1010.918530] env[67899]: DEBUG nova.compute.manager [req-6ef67bf4-43ea-41fc-942c-c2361c082de2 req-7b857b4a-8ab5-49ad-bbf9-a593972fb879 service nova] [instance: 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4] Received event network-changed-b48e27d8-eb97-4ea6-9464-0f5c1ca213d4 {{(pid=67899) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1010.918728] env[67899]: DEBUG nova.compute.manager [req-6ef67bf4-43ea-41fc-942c-c2361c082de2 req-7b857b4a-8ab5-49ad-bbf9-a593972fb879 service nova] [instance: 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4] Refreshing instance network info cache due to event network-changed-b48e27d8-eb97-4ea6-9464-0f5c1ca213d4. 
{{(pid=67899) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1010.918935] env[67899]: DEBUG oslo_concurrency.lockutils [req-6ef67bf4-43ea-41fc-942c-c2361c082de2 req-7b857b4a-8ab5-49ad-bbf9-a593972fb879 service nova] Acquiring lock "refresh_cache-4c50d280-8833-4c3f-8571-8ffa5c7ea2a4" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1010.919098] env[67899]: DEBUG oslo_concurrency.lockutils [req-6ef67bf4-43ea-41fc-942c-c2361c082de2 req-7b857b4a-8ab5-49ad-bbf9-a593972fb879 service nova] Acquired lock "refresh_cache-4c50d280-8833-4c3f-8571-8ffa5c7ea2a4" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1010.919235] env[67899]: DEBUG nova.network.neutron [req-6ef67bf4-43ea-41fc-942c-c2361c082de2 req-7b857b4a-8ab5-49ad-bbf9-a593972fb879 service nova] [instance: 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4] Refreshing network info cache for port b48e27d8-eb97-4ea6-9464-0f5c1ca213d4 {{(pid=67899) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1011.302180] env[67899]: DEBUG nova.network.neutron [req-6ef67bf4-43ea-41fc-942c-c2361c082de2 req-7b857b4a-8ab5-49ad-bbf9-a593972fb879 service nova] [instance: 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4] Updated VIF entry in instance network info cache for port b48e27d8-eb97-4ea6-9464-0f5c1ca213d4. {{(pid=67899) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1011.302600] env[67899]: DEBUG nova.network.neutron [req-6ef67bf4-43ea-41fc-942c-c2361c082de2 req-7b857b4a-8ab5-49ad-bbf9-a593972fb879 service nova] [instance: 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4] Updating instance_info_cache with network_info: [{"id": "b48e27d8-eb97-4ea6-9464-0f5c1ca213d4", "address": "fa:16:3e:06:9f:c9", "network": {"id": "d0839757-d0e4-4910-a0fa-07a195b64a66", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1313326550-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "564f52a655fd4628be7e299552d47772", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f0ef5aba-bd9a-42ff-a1a0-5e763986d70a", "external-id": "nsx-vlan-transportzone-209", "segmentation_id": 209, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb48e27d8-eb", "ovs_interfaceid": "b48e27d8-eb97-4ea6-9464-0f5c1ca213d4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1011.313920] env[67899]: DEBUG oslo_concurrency.lockutils [req-6ef67bf4-43ea-41fc-942c-c2361c082de2 req-7b857b4a-8ab5-49ad-bbf9-a593972fb879 service nova] Releasing lock "refresh_cache-4c50d280-8833-4c3f-8571-8ffa5c7ea2a4" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1012.803042] env[67899]: DEBUG oslo_concurrency.lockutils [None req-95565c18-9c93-4bdb-95f2-e13f15fb26fd tempest-ServerShowV247Test-219304978 tempest-ServerShowV247Test-219304978-project-member] Acquiring 
lock "c7ad553b-2149-4211-aee3-057ea83069f5" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1012.803379] env[67899]: DEBUG oslo_concurrency.lockutils [None req-95565c18-9c93-4bdb-95f2-e13f15fb26fd tempest-ServerShowV247Test-219304978 tempest-ServerShowV247Test-219304978-project-member] Lock "c7ad553b-2149-4211-aee3-057ea83069f5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1012.996507] env[67899]: DEBUG oslo_concurrency.lockutils [None req-d5b8e164-f48b-45b4-a730-66d8dd108eb7 tempest-ServerShowV247Test-219304978 tempest-ServerShowV247Test-219304978-project-member] Acquiring lock "b9df90e1-da9a-47c3-8920-84f20ef5c588" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1012.996859] env[67899]: DEBUG oslo_concurrency.lockutils [None req-d5b8e164-f48b-45b4-a730-66d8dd108eb7 tempest-ServerShowV247Test-219304978 tempest-ServerShowV247Test-219304978-project-member] Lock "b9df90e1-da9a-47c3-8920-84f20ef5c588" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1015.365070] env[67899]: DEBUG oslo_concurrency.lockutils [None req-cfd44001-16ed-4e48-a22e-fe9b4ad3c923 tempest-ListServersNegativeTestJSON-1933841315 tempest-ListServersNegativeTestJSON-1933841315-project-member] Acquiring lock "4c50d280-8833-4c3f-8571-8ffa5c7ea2a4" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1019.930017] env[67899]: DEBUG oslo_concurrency.lockutils [None req-faf2a278-e187-4285-9418-771dae793d05 tempest-ServersTestMultiNic-1718995190 tempest-ServersTestMultiNic-1718995190-project-member] Acquiring lock "9b4ec0f5-35d7-4ba9-bc46-47cd2a73219c" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1019.930341] env[67899]: DEBUG oslo_concurrency.lockutils [None req-faf2a278-e187-4285-9418-771dae793d05 tempest-ServersTestMultiNic-1718995190 tempest-ServersTestMultiNic-1718995190-project-member] Lock "9b4ec0f5-35d7-4ba9-bc46-47cd2a73219c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1027.093646] env[67899]: DEBUG oslo_concurrency.lockutils [None req-9561e73b-0abb-4ff8-958b-bc925c7916af tempest-MigrationsAdminTest-102068815 tempest-MigrationsAdminTest-102068815-project-member] Acquiring lock "ce0c59ed-7bb2-49cc-a158-dda0da4f88cf" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1027.093646] env[67899]: DEBUG 
oslo_concurrency.lockutils [None req-9561e73b-0abb-4ff8-958b-bc925c7916af tempest-MigrationsAdminTest-102068815 tempest-MigrationsAdminTest-102068815-project-member] Lock "ce0c59ed-7bb2-49cc-a158-dda0da4f88cf" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1032.694790] env[67899]: DEBUG oslo_concurrency.lockutils [None req-06314e14-3ed8-4bad-9823-95f7a4342101 tempest-SecurityGroupsTestJSON-311274219 tempest-SecurityGroupsTestJSON-311274219-project-member] Acquiring lock "db21b229-2664-4947-96c8-c1e92f97917e" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1032.695078] env[67899]: DEBUG oslo_concurrency.lockutils [None req-06314e14-3ed8-4bad-9823-95f7a4342101 tempest-SecurityGroupsTestJSON-311274219 tempest-SecurityGroupsTestJSON-311274219-project-member] Lock "db21b229-2664-4947-96c8-c1e92f97917e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1037.998985] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1037.999271] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1038.991548] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1038.991818] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1039.031158] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1039.031420] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Starting heal instance info cache {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1039.031475] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Rebuilding the list of instances to heal {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1039.060035] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] Skipping network cache update for 
instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1039.060258] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1039.060398] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 793d6f98-ed1b-4a78-bcd5-cb796441d64b] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1039.060526] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1039.060648] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 4458efe7-18d4-4cfb-b131-e09d36124d68] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1039.060766] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1039.060883] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 8d2a9e20-82d3-44cf-a725-59804debe1cc] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1039.061012] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1039.061161] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 37ab08db-50ab-4c30-9e18-05007c5d1c27] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1039.061278] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1039.061394] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Didn't find any instances for network info cache update. 
{{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1039.061863] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager.update_available_resource {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1039.075007] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1039.079017] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1039.079017] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1039.079017] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67899) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1039.079017] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-94fc7d88-2c47-4943-87ad-8884c7790f5b {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1039.085950] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2004a334-8688-43cb-9c6a-9ff3e2090791 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1039.102338] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c370068d-c20c-4dff-8d5c-a97514267595 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1039.109151] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-add2ef12-44c7-498f-8f13-23f4f34bf6bf {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1039.139599] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180885MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=67899) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1039.139759] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 
1039.140013] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1039.235831] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance c29ae4c5-cc93-480c-8d60-96f6acba4346 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1039.236014] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1039.237134] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 793d6f98-ed1b-4a78-bcd5-cb796441d64b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1039.237134] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1039.237134] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 4458efe7-18d4-4cfb-b131-e09d36124d68 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1039.237134] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1039.237134] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 8d2a9e20-82d3-44cf-a725-59804debe1cc actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1039.237134] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance bb97988e-9f7f-4e4f-9904-fc560d0912ee actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1039.237134] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 37ab08db-50ab-4c30-9e18-05007c5d1c27 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1039.237134] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1039.260441] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 7b6a4c60-1b40-44b8-b341-3dcaf1716c99 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1039.273604] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 183fd334-b0e1-479a-b38a-62f21c176d17 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1039.286812] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance b9282eeb-09db-4138-a1f0-9e03828021b8 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1039.302822] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance cce79170-e329-4d7a-ab2d-fa6605068897 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1039.314010] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 4bd3cb98-1745-4c1a-8670-9849f70eb554 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1039.327113] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 0bde0bc7-8f34-4941-85f0-44fe5c67e398 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1039.339159] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance b79e6007-10ac-4afe-a666-edef64685b22 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1039.351085] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance b9143ce6-0592-4cff-a2a1-64874734b214 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1039.367836] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 868ae015-d365-4a42-8f5d-72faa796fa37 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1039.379746] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance a31cf212-7d4e-4f1c-b494-6b9739b2ef95 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1039.398108] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance cb9d29cb-20ee-4875-b993-49cafed344d4 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1039.406298] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance eb285233-ef68-4426-827f-3320abe98cac has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1039.426314] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 6c4977f7-c53d-4c96-9028-86d7561f0d0d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1039.439546] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance ec826735-4cc4-4847-8750-c5480e62134a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1039.451587] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance c7ad553b-2149-4211-aee3-057ea83069f5 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1039.464224] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance b9df90e1-da9a-47c3-8920-84f20ef5c588 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1039.479341] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 9b4ec0f5-35d7-4ba9-bc46-47cd2a73219c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1039.520400] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance ce0c59ed-7bb2-49cc-a158-dda0da4f88cf has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1039.534549] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance db21b229-2664-4947-96c8-c1e92f97917e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1039.534784] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67899) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1039.537017] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67899) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1039.924178] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-db7659e0-d2fe-45ff-81be-92ef27bc3761 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1039.932777] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5dfc6875-bb98-4758-9432-e851d701ab7c {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1039.962174] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-325126ef-ac81-4932-a203-53588834d14c {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1039.969565] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2348f33b-c54d-4bba-a2f2-6f262b4aa3c4 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1039.982647] env[67899]: DEBUG nova.compute.provider_tree [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Inventory has not changed in ProviderTree for provider: fffa0b42-f65d-4394-a98c-0df038b9ed4b {{(pid=67899) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1039.992498] env[67899]: DEBUG nova.scheduler.client.report [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Inventory has not changed for provider fffa0b42-f65d-4394-a98c-0df038b9ed4b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67899) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1040.014587] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67899) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1040.014587] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.874s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1040.948210] env[67899]: DEBUG oslo_service.periodic_task [None 
req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1040.948564] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1040.996119] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1040.996298] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=67899) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1041.915837] env[67899]: DEBUG oslo_concurrency.lockutils [None req-90e84af9-2c10-4820-98ac-ee806bc146c3 tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Acquiring lock "94ebdda8-5b9c-4ffa-be45-571ec9ba9f81" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1041.915970] env[67899]: DEBUG oslo_concurrency.lockutils [None req-90e84af9-2c10-4820-98ac-ee806bc146c3 tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Lock "94ebdda8-5b9c-4ffa-be45-571ec9ba9f81" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1041.996786] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1045.022193] env[67899]: DEBUG oslo_concurrency.lockutils [None req-ded5771e-a4f5-4a81-924b-bdb96277cb6f tempest-DeleteServersAdminTestJSON-329788213 tempest-DeleteServersAdminTestJSON-329788213-project-member] Acquiring lock "928c018d-ec75-42c6-8e55-e38bb5947bcf" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1045.022552] env[67899]: DEBUG oslo_concurrency.lockutils [None req-ded5771e-a4f5-4a81-924b-bdb96277cb6f tempest-DeleteServersAdminTestJSON-329788213 tempest-DeleteServersAdminTestJSON-329788213-project-member] Lock "928c018d-ec75-42c6-8e55-e38bb5947bcf" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1047.920686] env[67899]: DEBUG oslo_concurrency.lockutils [None req-1f3aed5d-5130-4849-8586-e54f4f6d3927 tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] Acquiring lock "04bee4b3-88b9-4f8c-b5d7-3955a158a2d5" by 
"nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1047.921055] env[67899]: DEBUG oslo_concurrency.lockutils [None req-1f3aed5d-5130-4849-8586-e54f4f6d3927 tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] Lock "04bee4b3-88b9-4f8c-b5d7-3955a158a2d5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1051.994083] env[67899]: WARNING oslo_vmware.rw_handles [None req-3d107220-4fe1-4bd4-b86d-3bd578f6f6b3 tempest-VolumesAssistedSnapshotsTest-398519601 tempest-VolumesAssistedSnapshotsTest-398519601-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1051.994083] env[67899]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1051.994083] env[67899]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1051.994083] env[67899]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1051.994083] env[67899]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1051.994083] env[67899]: ERROR oslo_vmware.rw_handles response.begin() [ 1051.994083] env[67899]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1051.994083] env[67899]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1051.994083] env[67899]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1051.994083] env[67899]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1051.994083] env[67899]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1051.994083] env[67899]: ERROR oslo_vmware.rw_handles [ 1051.994881] env[67899]: DEBUG nova.virt.vmwareapi.images [None req-3d107220-4fe1-4bd4-b86d-3bd578f6f6b3 tempest-VolumesAssistedSnapshotsTest-398519601 tempest-VolumesAssistedSnapshotsTest-398519601-project-member] [instance: 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda] Downloaded image file data c655a05a-4a40-4b3f-b609-3ba8116ad90f to vmware_temp/9ea035de-a5f3-404c-a1e0-c082cb43693d/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk on the data store datastore1 {{(pid=67899) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1051.999030] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-3d107220-4fe1-4bd4-b86d-3bd578f6f6b3 tempest-VolumesAssistedSnapshotsTest-398519601 tempest-VolumesAssistedSnapshotsTest-398519601-project-member] [instance: 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda] Caching image {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1051.999030] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [None req-3d107220-4fe1-4bd4-b86d-3bd578f6f6b3 tempest-VolumesAssistedSnapshotsTest-398519601 tempest-VolumesAssistedSnapshotsTest-398519601-project-member] Copying Virtual Disk [datastore1] vmware_temp/9ea035de-a5f3-404c-a1e0-c082cb43693d/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk to [datastore1] 
vmware_temp/9ea035de-a5f3-404c-a1e0-c082cb43693d/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk {{(pid=67899) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1051.999030] env[67899]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-d0beb593-d821-4e5c-a811-45e781d5507f {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1052.006132] env[67899]: DEBUG oslo_vmware.api [None req-3d107220-4fe1-4bd4-b86d-3bd578f6f6b3 tempest-VolumesAssistedSnapshotsTest-398519601 tempest-VolumesAssistedSnapshotsTest-398519601-project-member] Waiting for the task: (returnval){ [ 1052.006132] env[67899]: value = "task-3467902" [ 1052.006132] env[67899]: _type = "Task" [ 1052.006132] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1052.014062] env[67899]: DEBUG oslo_vmware.api [None req-3d107220-4fe1-4bd4-b86d-3bd578f6f6b3 tempest-VolumesAssistedSnapshotsTest-398519601 tempest-VolumesAssistedSnapshotsTest-398519601-project-member] Task: {'id': task-3467902, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1052.146044] env[67899]: DEBUG oslo_concurrency.lockutils [None req-bf08a2ad-94a1-4a30-a63d-7b81c98afb6a tempest-ServerActionsTestOtherA-1954250680 tempest-ServerActionsTestOtherA-1954250680-project-member] Acquiring lock "a43ea307-5b84-4c8c-9f28-255980bfd51a" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1052.146044] env[67899]: DEBUG oslo_concurrency.lockutils [None req-bf08a2ad-94a1-4a30-a63d-7b81c98afb6a tempest-ServerActionsTestOtherA-1954250680 tempest-ServerActionsTestOtherA-1954250680-project-member] Lock "a43ea307-5b84-4c8c-9f28-255980bfd51a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1052.516601] env[67899]: DEBUG oslo_vmware.exceptions [None req-3d107220-4fe1-4bd4-b86d-3bd578f6f6b3 tempest-VolumesAssistedSnapshotsTest-398519601 tempest-VolumesAssistedSnapshotsTest-398519601-project-member] Fault InvalidArgument not matched. 
{{(pid=67899) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1052.516953] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3d107220-4fe1-4bd4-b86d-3bd578f6f6b3 tempest-VolumesAssistedSnapshotsTest-398519601 tempest-VolumesAssistedSnapshotsTest-398519601-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1052.517552] env[67899]: ERROR nova.compute.manager [None req-3d107220-4fe1-4bd4-b86d-3bd578f6f6b3 tempest-VolumesAssistedSnapshotsTest-398519601 tempest-VolumesAssistedSnapshotsTest-398519601-project-member] [instance: 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1052.517552] env[67899]: Faults: ['InvalidArgument'] [ 1052.517552] env[67899]: ERROR nova.compute.manager [instance: 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda] Traceback (most recent call last): [ 1052.517552] env[67899]: ERROR nova.compute.manager [instance: 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1052.517552] env[67899]: ERROR nova.compute.manager [instance: 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda] yield resources [ 1052.517552] env[67899]: ERROR nova.compute.manager [instance: 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1052.517552] env[67899]: ERROR nova.compute.manager [instance: 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda] self.driver.spawn(context, instance, image_meta, [ 1052.517552] env[67899]: ERROR nova.compute.manager [instance: 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1052.517552] env[67899]: ERROR nova.compute.manager [instance: 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1052.517552] env[67899]: ERROR nova.compute.manager [instance: 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1052.517552] env[67899]: ERROR nova.compute.manager [instance: 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda] self._fetch_image_if_missing(context, vi) [ 1052.517552] env[67899]: ERROR nova.compute.manager [instance: 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1052.517552] env[67899]: ERROR nova.compute.manager [instance: 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda] image_cache(vi, tmp_image_ds_loc) [ 1052.517552] env[67899]: ERROR nova.compute.manager [instance: 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1052.517552] env[67899]: ERROR nova.compute.manager [instance: 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda] vm_util.copy_virtual_disk( [ 1052.517552] env[67899]: ERROR nova.compute.manager [instance: 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1052.517552] env[67899]: ERROR nova.compute.manager [instance: 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda] session._wait_for_task(vmdk_copy_task) [ 1052.517552] env[67899]: ERROR nova.compute.manager [instance: 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1052.517552] env[67899]: ERROR nova.compute.manager [instance: 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda] return self.wait_for_task(task_ref) [ 1052.517552] env[67899]: ERROR nova.compute.manager [instance: 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1052.517552] env[67899]: ERROR nova.compute.manager [instance: 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda] return evt.wait() [ 1052.517552] env[67899]: ERROR nova.compute.manager [instance: 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1052.517552] env[67899]: ERROR nova.compute.manager [instance: 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda] result = hub.switch() [ 1052.517552] env[67899]: ERROR nova.compute.manager [instance: 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1052.517552] env[67899]: ERROR nova.compute.manager [instance: 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda] return self.greenlet.switch() [ 1052.517552] env[67899]: ERROR nova.compute.manager [instance: 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1052.517552] env[67899]: ERROR nova.compute.manager [instance: 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda] self.f(*self.args, **self.kw) [ 1052.517552] env[67899]: ERROR nova.compute.manager [instance: 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1052.517552] env[67899]: ERROR nova.compute.manager [instance: 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda] raise exceptions.translate_fault(task_info.error) [ 1052.517552] env[67899]: ERROR nova.compute.manager [instance: 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1052.517552] env[67899]: ERROR nova.compute.manager [instance: 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda] Faults: ['InvalidArgument'] [ 1052.517552] env[67899]: ERROR nova.compute.manager [instance: 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda] [ 1052.518356] env[67899]: INFO nova.compute.manager [None req-3d107220-4fe1-4bd4-b86d-3bd578f6f6b3 tempest-VolumesAssistedSnapshotsTest-398519601 tempest-VolumesAssistedSnapshotsTest-398519601-project-member] [instance: 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda] Terminating instance [ 1052.519652] env[67899]: DEBUG oslo_concurrency.lockutils [None req-4613260e-d1e4-4cb4-8d4e-d3da4a59a597 tempest-ServersTestMultiNic-1718995190 tempest-ServersTestMultiNic-1718995190-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1052.519896] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-4613260e-d1e4-4cb4-8d4e-d3da4a59a597 tempest-ServersTestMultiNic-1718995190 tempest-ServersTestMultiNic-1718995190-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1052.520548] env[67899]: DEBUG nova.compute.manager [None req-3d107220-4fe1-4bd4-b86d-3bd578f6f6b3 tempest-VolumesAssistedSnapshotsTest-398519601 
tempest-VolumesAssistedSnapshotsTest-398519601-project-member] [instance: 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda] Start destroying the instance on the hypervisor. {{(pid=67899) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1052.520784] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-3d107220-4fe1-4bd4-b86d-3bd578f6f6b3 tempest-VolumesAssistedSnapshotsTest-398519601 tempest-VolumesAssistedSnapshotsTest-398519601-project-member] [instance: 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda] Destroying instance {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1052.521065] env[67899]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-32b48415-b798-4175-b682-a98940408737 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1052.524195] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-542c5a20-2a15-4ca9-80fd-7b5240a922f5 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1052.532892] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-3d107220-4fe1-4bd4-b86d-3bd578f6f6b3 tempest-VolumesAssistedSnapshotsTest-398519601 tempest-VolumesAssistedSnapshotsTest-398519601-project-member] [instance: 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda] Unregistering the VM {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1052.533141] env[67899]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-7728a7f3-612a-4112-9c36-08593f277efd {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1052.535311] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-4613260e-d1e4-4cb4-8d4e-d3da4a59a597 tempest-ServersTestMultiNic-1718995190 tempest-ServersTestMultiNic-1718995190-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1052.535574] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-4613260e-d1e4-4cb4-8d4e-d3da4a59a597 tempest-ServersTestMultiNic-1718995190 tempest-ServersTestMultiNic-1718995190-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67899) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1052.536347] env[67899]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-22d920d8-181a-440a-aa90-7bce1699393e {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1052.541702] env[67899]: DEBUG oslo_vmware.api [None req-4613260e-d1e4-4cb4-8d4e-d3da4a59a597 tempest-ServersTestMultiNic-1718995190 tempest-ServersTestMultiNic-1718995190-project-member] Waiting for the task: (returnval){ [ 1052.541702] env[67899]: value = "session[5298859c-589e-10f3-4f70-bf47a7ea371b]5293ff3e-ee39-030f-ec91-ad7dcdeb483f" [ 1052.541702] env[67899]: _type = "Task" [ 1052.541702] env[67899]: } to complete. 
{{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1052.549514] env[67899]: DEBUG oslo_vmware.api [None req-4613260e-d1e4-4cb4-8d4e-d3da4a59a597 tempest-ServersTestMultiNic-1718995190 tempest-ServersTestMultiNic-1718995190-project-member] Task: {'id': session[5298859c-589e-10f3-4f70-bf47a7ea371b]5293ff3e-ee39-030f-ec91-ad7dcdeb483f, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1052.601539] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-3d107220-4fe1-4bd4-b86d-3bd578f6f6b3 tempest-VolumesAssistedSnapshotsTest-398519601 tempest-VolumesAssistedSnapshotsTest-398519601-project-member] [instance: 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda] Unregistered the VM {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1052.601768] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-3d107220-4fe1-4bd4-b86d-3bd578f6f6b3 tempest-VolumesAssistedSnapshotsTest-398519601 tempest-VolumesAssistedSnapshotsTest-398519601-project-member] [instance: 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda] Deleting contents of the VM from datastore datastore1 {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1052.601946] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-3d107220-4fe1-4bd4-b86d-3bd578f6f6b3 tempest-VolumesAssistedSnapshotsTest-398519601 tempest-VolumesAssistedSnapshotsTest-398519601-project-member] Deleting the datastore file [datastore1] 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda {{(pid=67899) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1052.602238] env[67899]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-bfe0ee34-b0b1-4ef3-aaf5-c8670f7c1c48 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1052.609144] env[67899]: DEBUG oslo_vmware.api [None req-3d107220-4fe1-4bd4-b86d-3bd578f6f6b3 tempest-VolumesAssistedSnapshotsTest-398519601 tempest-VolumesAssistedSnapshotsTest-398519601-project-member] Waiting for the task: (returnval){ [ 1052.609144] env[67899]: value = "task-3467904" [ 1052.609144] env[67899]: _type = "Task" [ 1052.609144] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1052.619923] env[67899]: DEBUG oslo_vmware.api [None req-3d107220-4fe1-4bd4-b86d-3bd578f6f6b3 tempest-VolumesAssistedSnapshotsTest-398519601 tempest-VolumesAssistedSnapshotsTest-398519601-project-member] Task: {'id': task-3467904, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1053.052377] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-4613260e-d1e4-4cb4-8d4e-d3da4a59a597 tempest-ServersTestMultiNic-1718995190 tempest-ServersTestMultiNic-1718995190-project-member] [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] Preparing fetch location {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1053.052670] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-4613260e-d1e4-4cb4-8d4e-d3da4a59a597 tempest-ServersTestMultiNic-1718995190 tempest-ServersTestMultiNic-1718995190-project-member] Creating directory with path [datastore1] vmware_temp/2b1923a6-263e-4c86-be16-b1e2f8087fbe/c655a05a-4a40-4b3f-b609-3ba8116ad90f {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1053.052886] env[67899]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-a4963d3b-eb10-4c96-b273-43035733bb00 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1053.064592] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-4613260e-d1e4-4cb4-8d4e-d3da4a59a597 tempest-ServersTestMultiNic-1718995190 tempest-ServersTestMultiNic-1718995190-project-member] Created directory with path [datastore1] vmware_temp/2b1923a6-263e-4c86-be16-b1e2f8087fbe/c655a05a-4a40-4b3f-b609-3ba8116ad90f {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1053.064793] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-4613260e-d1e4-4cb4-8d4e-d3da4a59a597 tempest-ServersTestMultiNic-1718995190 tempest-ServersTestMultiNic-1718995190-project-member] [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] Fetch image to [datastore1] vmware_temp/2b1923a6-263e-4c86-be16-b1e2f8087fbe/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1053.064962] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-4613260e-d1e4-4cb4-8d4e-d3da4a59a597 tempest-ServersTestMultiNic-1718995190 tempest-ServersTestMultiNic-1718995190-project-member] [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] Downloading image file data c655a05a-4a40-4b3f-b609-3ba8116ad90f to [datastore1] vmware_temp/2b1923a6-263e-4c86-be16-b1e2f8087fbe/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk on the data store datastore1 {{(pid=67899) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1053.065734] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d373e115-9049-46af-83b0-84f14c31b1a1 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1053.072657] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-600fe46b-ed4a-4cd4-8f64-6f74c00a9c43 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1053.085831] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-485226e6-a729-4930-9441-a7f2504e20c4 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1053.119548] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b254b5ef-0def-4c88-9e83-72c796670464 {{(pid=67899) 
request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1053.126611] env[67899]: DEBUG oslo_vmware.api [None req-3d107220-4fe1-4bd4-b86d-3bd578f6f6b3 tempest-VolumesAssistedSnapshotsTest-398519601 tempest-VolumesAssistedSnapshotsTest-398519601-project-member] Task: {'id': task-3467904, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.080413} completed successfully. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1053.128080] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-3d107220-4fe1-4bd4-b86d-3bd578f6f6b3 tempest-VolumesAssistedSnapshotsTest-398519601 tempest-VolumesAssistedSnapshotsTest-398519601-project-member] Deleted the datastore file {{(pid=67899) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1053.128273] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-3d107220-4fe1-4bd4-b86d-3bd578f6f6b3 tempest-VolumesAssistedSnapshotsTest-398519601 tempest-VolumesAssistedSnapshotsTest-398519601-project-member] [instance: 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda] Deleted contents of the VM from datastore datastore1 {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1053.128440] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-3d107220-4fe1-4bd4-b86d-3bd578f6f6b3 tempest-VolumesAssistedSnapshotsTest-398519601 tempest-VolumesAssistedSnapshotsTest-398519601-project-member] [instance: 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda] Instance destroyed {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1053.128612] env[67899]: INFO nova.compute.manager [None req-3d107220-4fe1-4bd4-b86d-3bd578f6f6b3 tempest-VolumesAssistedSnapshotsTest-398519601 tempest-VolumesAssistedSnapshotsTest-398519601-project-member] [instance: 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda] Took 0.61 seconds to destroy the instance on the hypervisor. 
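The records above trace the driver submitting FileManager.DeleteDatastoreFile_Task and then polling it ("Waiting for the task ... progress is 0%" followed by "completed successfully ... duration_secs: 0.080413"). Below is a minimal sketch of that poll-until-done pattern; retrieve_task_info, TaskFailed, and the 0.5s interval are illustrative assumptions, not oslo.vmware's actual internals — per the file:line trailers, the real loop is oslo_vmware.api.VMwareAPISession.wait_for_task delegating to _poll_task on an eventlet looping call.

import time

class TaskFailed(Exception):
    """Raised when vCenter reports the polled task in an 'error' state."""

def wait_for_task(task_ref, retrieve_task_info, poll_interval=0.5):
    # Poll until the task leaves its queued/running states, mirroring the
    # "Waiting for the task ... progress is 0% ... completed successfully"
    # records in this log.
    while True:
        # Hypothetical helper standing in for the PropertyCollector reads
        # oslo.vmware performs on each poll, e.g. returning
        # {'state': 'running', 'progress': 0} or {'state': 'success', ...}.
        info = retrieve_task_info(task_ref)
        if info["state"] == "success":
            return info  # the real _poll_task logs duration_secs here
        if info["state"] == "error":
            # In oslo.vmware this becomes translate_fault(task_info.error),
            # which is how the InvalidArgument fault earlier in this log
            # surfaced in nova.compute.manager as a VimFaultException.
            raise TaskFailed(info.get("error"))
        time.sleep(poll_interval)  # the real loop yields via eventlet, not sleep

The tracebacks earlier in this log show exactly this path: session._wait_for_task(vmdk_copy_task) blocks in wait_for_task, and the error branch re-raises the translated fault into the compute manager's _build_and_run_instance.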
[ 1053.131062] env[67899]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-e7035563-1ba5-4f1c-b2cf-9e2dd52d1c58 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1053.132561] env[67899]: DEBUG nova.compute.claims [None req-3d107220-4fe1-4bd4-b86d-3bd578f6f6b3 tempest-VolumesAssistedSnapshotsTest-398519601 tempest-VolumesAssistedSnapshotsTest-398519601-project-member] [instance: 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda] Aborting claim: {{(pid=67899) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1053.132635] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3d107220-4fe1-4bd4-b86d-3bd578f6f6b3 tempest-VolumesAssistedSnapshotsTest-398519601 tempest-VolumesAssistedSnapshotsTest-398519601-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1053.132842] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3d107220-4fe1-4bd4-b86d-3bd578f6f6b3 tempest-VolumesAssistedSnapshotsTest-398519601 tempest-VolumesAssistedSnapshotsTest-398519601-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1053.157402] env[67899]: DEBUG nova.virt.vmwareapi.images [None req-4613260e-d1e4-4cb4-8d4e-d3da4a59a597 tempest-ServersTestMultiNic-1718995190 tempest-ServersTestMultiNic-1718995190-project-member] [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] Downloading image file data c655a05a-4a40-4b3f-b609-3ba8116ad90f to the data store datastore1 {{(pid=67899) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1053.236872] env[67899]: DEBUG oslo_vmware.rw_handles [None req-4613260e-d1e4-4cb4-8d4e-d3da4a59a597 tempest-ServersTestMultiNic-1718995190 tempest-ServersTestMultiNic-1718995190-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/2b1923a6-263e-4c86-be16-b1e2f8087fbe/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67899) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1053.306400] env[67899]: DEBUG oslo_vmware.rw_handles [None req-4613260e-d1e4-4cb4-8d4e-d3da4a59a597 tempest-ServersTestMultiNic-1718995190 tempest-ServersTestMultiNic-1718995190-project-member] Completed reading data from the image iterator. {{(pid=67899) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1053.306723] env[67899]: DEBUG oslo_vmware.rw_handles [None req-4613260e-d1e4-4cb4-8d4e-d3da4a59a597 tempest-ServersTestMultiNic-1718995190 tempest-ServersTestMultiNic-1718995190-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/2b1923a6-263e-4c86-be16-b1e2f8087fbe/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67899) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1053.695309] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bc8b4657-7111-4e69-95df-d3287728f853 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1053.704051] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2d61b14c-1efb-41d7-a798-5aac01a3918c {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1053.733734] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c45fe676-9e4a-4fe7-9a80-433aaf2d9b7f {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1053.740756] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bdce9e41-8fcf-49c1-bb05-832b1125d584 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1053.754028] env[67899]: DEBUG nova.compute.provider_tree [None req-3d107220-4fe1-4bd4-b86d-3bd578f6f6b3 tempest-VolumesAssistedSnapshotsTest-398519601 tempest-VolumesAssistedSnapshotsTest-398519601-project-member] Inventory has not changed in ProviderTree for provider: fffa0b42-f65d-4394-a98c-0df038b9ed4b {{(pid=67899) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1053.763021] env[67899]: DEBUG nova.scheduler.client.report [None req-3d107220-4fe1-4bd4-b86d-3bd578f6f6b3 tempest-VolumesAssistedSnapshotsTest-398519601 tempest-VolumesAssistedSnapshotsTest-398519601-project-member] Inventory has not changed for provider fffa0b42-f65d-4394-a98c-0df038b9ed4b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67899) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1053.776564] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3d107220-4fe1-4bd4-b86d-3bd578f6f6b3 tempest-VolumesAssistedSnapshotsTest-398519601 tempest-VolumesAssistedSnapshotsTest-398519601-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.644s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1053.777120] env[67899]: ERROR nova.compute.manager [None req-3d107220-4fe1-4bd4-b86d-3bd578f6f6b3 tempest-VolumesAssistedSnapshotsTest-398519601 tempest-VolumesAssistedSnapshotsTest-398519601-project-member] [instance: 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1053.777120] env[67899]: Faults: ['InvalidArgument'] [ 1053.777120] env[67899]: ERROR nova.compute.manager [instance: 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda] Traceback (most recent call last): [ 1053.777120] env[67899]: ERROR nova.compute.manager [instance: 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in 
_build_and_run_instance [ 1053.777120] env[67899]: ERROR nova.compute.manager [instance: 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda] self.driver.spawn(context, instance, image_meta, [ 1053.777120] env[67899]: ERROR nova.compute.manager [instance: 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1053.777120] env[67899]: ERROR nova.compute.manager [instance: 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1053.777120] env[67899]: ERROR nova.compute.manager [instance: 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1053.777120] env[67899]: ERROR nova.compute.manager [instance: 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda] self._fetch_image_if_missing(context, vi) [ 1053.777120] env[67899]: ERROR nova.compute.manager [instance: 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1053.777120] env[67899]: ERROR nova.compute.manager [instance: 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda] image_cache(vi, tmp_image_ds_loc) [ 1053.777120] env[67899]: ERROR nova.compute.manager [instance: 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1053.777120] env[67899]: ERROR nova.compute.manager [instance: 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda] vm_util.copy_virtual_disk( [ 1053.777120] env[67899]: ERROR nova.compute.manager [instance: 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1053.777120] env[67899]: ERROR nova.compute.manager [instance: 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda] session._wait_for_task(vmdk_copy_task) [ 1053.777120] env[67899]: ERROR nova.compute.manager [instance: 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1053.777120] env[67899]: ERROR nova.compute.manager [instance: 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda] return self.wait_for_task(task_ref) [ 1053.777120] env[67899]: ERROR nova.compute.manager [instance: 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1053.777120] env[67899]: ERROR nova.compute.manager [instance: 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda] return evt.wait() [ 1053.777120] env[67899]: ERROR nova.compute.manager [instance: 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1053.777120] env[67899]: ERROR nova.compute.manager [instance: 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda] result = hub.switch() [ 1053.777120] env[67899]: ERROR nova.compute.manager [instance: 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1053.777120] env[67899]: ERROR nova.compute.manager [instance: 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda] return self.greenlet.switch() [ 1053.777120] env[67899]: ERROR nova.compute.manager [instance: 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1053.777120] env[67899]: ERROR nova.compute.manager [instance: 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda] self.f(*self.args, **self.kw) [ 1053.777120] env[67899]: ERROR nova.compute.manager [instance: 
9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1053.777120] env[67899]: ERROR nova.compute.manager [instance: 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda] raise exceptions.translate_fault(task_info.error) [ 1053.777120] env[67899]: ERROR nova.compute.manager [instance: 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1053.777120] env[67899]: ERROR nova.compute.manager [instance: 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda] Faults: ['InvalidArgument'] [ 1053.777120] env[67899]: ERROR nova.compute.manager [instance: 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda] [ 1053.777995] env[67899]: DEBUG nova.compute.utils [None req-3d107220-4fe1-4bd4-b86d-3bd578f6f6b3 tempest-VolumesAssistedSnapshotsTest-398519601 tempest-VolumesAssistedSnapshotsTest-398519601-project-member] [instance: 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda] VimFaultException {{(pid=67899) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1053.779396] env[67899]: DEBUG nova.compute.manager [None req-3d107220-4fe1-4bd4-b86d-3bd578f6f6b3 tempest-VolumesAssistedSnapshotsTest-398519601 tempest-VolumesAssistedSnapshotsTest-398519601-project-member] [instance: 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda] Build of instance 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda was re-scheduled: A specified parameter was not correct: fileType [ 1053.779396] env[67899]: Faults: ['InvalidArgument'] {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1053.779953] env[67899]: DEBUG nova.compute.manager [None req-3d107220-4fe1-4bd4-b86d-3bd578f6f6b3 tempest-VolumesAssistedSnapshotsTest-398519601 tempest-VolumesAssistedSnapshotsTest-398519601-project-member] [instance: 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda] Unplugging VIFs for instance {{(pid=67899) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1053.780219] env[67899]: DEBUG nova.compute.manager [None req-3d107220-4fe1-4bd4-b86d-3bd578f6f6b3 tempest-VolumesAssistedSnapshotsTest-398519601 tempest-VolumesAssistedSnapshotsTest-398519601-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67899) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1053.780415] env[67899]: DEBUG nova.compute.manager [None req-3d107220-4fe1-4bd4-b86d-3bd578f6f6b3 tempest-VolumesAssistedSnapshotsTest-398519601 tempest-VolumesAssistedSnapshotsTest-398519601-project-member] [instance: 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda] Deallocating network for instance {{(pid=67899) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1053.780580] env[67899]: DEBUG nova.network.neutron [None req-3d107220-4fe1-4bd4-b86d-3bd578f6f6b3 tempest-VolumesAssistedSnapshotsTest-398519601 tempest-VolumesAssistedSnapshotsTest-398519601-project-member] [instance: 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda] deallocate_for_instance() {{(pid=67899) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1054.125027] env[67899]: DEBUG nova.network.neutron [None req-3d107220-4fe1-4bd4-b86d-3bd578f6f6b3 tempest-VolumesAssistedSnapshotsTest-398519601 tempest-VolumesAssistedSnapshotsTest-398519601-project-member] [instance: 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda] Updating instance_info_cache with network_info: [] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1054.138307] env[67899]: INFO nova.compute.manager [None req-3d107220-4fe1-4bd4-b86d-3bd578f6f6b3 tempest-VolumesAssistedSnapshotsTest-398519601 tempest-VolumesAssistedSnapshotsTest-398519601-project-member] [instance: 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda] Took 0.36 seconds to deallocate network for instance. [ 1054.236544] env[67899]: INFO nova.scheduler.client.report [None req-3d107220-4fe1-4bd4-b86d-3bd578f6f6b3 tempest-VolumesAssistedSnapshotsTest-398519601 tempest-VolumesAssistedSnapshotsTest-398519601-project-member] Deleted allocations for instance 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda [ 1054.258461] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3d107220-4fe1-4bd4-b86d-3bd578f6f6b3 tempest-VolumesAssistedSnapshotsTest-398519601 tempest-VolumesAssistedSnapshotsTest-398519601-project-member] Lock "9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 427.279s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1054.259813] env[67899]: DEBUG oslo_concurrency.lockutils [None req-eab4a7d0-f125-4f7b-a1c8-0173bffaceb5 tempest-VolumesAssistedSnapshotsTest-398519601 tempest-VolumesAssistedSnapshotsTest-398519601-project-member] Lock "9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 228.623s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1054.260050] env[67899]: DEBUG oslo_concurrency.lockutils [None req-eab4a7d0-f125-4f7b-a1c8-0173bffaceb5 tempest-VolumesAssistedSnapshotsTest-398519601 tempest-VolumesAssistedSnapshotsTest-398519601-project-member] Acquiring lock "9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1054.260254] env[67899]: DEBUG oslo_concurrency.lockutils [None req-eab4a7d0-f125-4f7b-a1c8-0173bffaceb5 tempest-VolumesAssistedSnapshotsTest-398519601 tempest-VolumesAssistedSnapshotsTest-398519601-project-member] Lock "9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda-events" 
acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1054.260701] env[67899]: DEBUG oslo_concurrency.lockutils [None req-eab4a7d0-f125-4f7b-a1c8-0173bffaceb5 tempest-VolumesAssistedSnapshotsTest-398519601 tempest-VolumesAssistedSnapshotsTest-398519601-project-member] Lock "9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1054.262443] env[67899]: INFO nova.compute.manager [None req-eab4a7d0-f125-4f7b-a1c8-0173bffaceb5 tempest-VolumesAssistedSnapshotsTest-398519601 tempest-VolumesAssistedSnapshotsTest-398519601-project-member] [instance: 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda] Terminating instance [ 1054.267469] env[67899]: DEBUG nova.compute.manager [None req-eab4a7d0-f125-4f7b-a1c8-0173bffaceb5 tempest-VolumesAssistedSnapshotsTest-398519601 tempest-VolumesAssistedSnapshotsTest-398519601-project-member] [instance: 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda] Start destroying the instance on the hypervisor. {{(pid=67899) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1054.268067] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-eab4a7d0-f125-4f7b-a1c8-0173bffaceb5 tempest-VolumesAssistedSnapshotsTest-398519601 tempest-VolumesAssistedSnapshotsTest-398519601-project-member] [instance: 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda] Destroying instance {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1054.268154] env[67899]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-8df910ea-6d32-499b-98a0-8bcaca8dd888 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1054.274064] env[67899]: DEBUG nova.compute.manager [None req-6a55d310-17fb-4564-8f2a-4ee92b2e7084 tempest-ListServersNegativeTestJSON-1933841315 tempest-ListServersNegativeTestJSON-1933841315-project-member] [instance: 4cbf5a4d-9466-4bc6-adc9-973759545cf4] Starting instance... {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1054.280672] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4abeb29e-a9b6-42ca-a565-28c48079f17e {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1054.299683] env[67899]: DEBUG nova.compute.manager [None req-6a55d310-17fb-4564-8f2a-4ee92b2e7084 tempest-ListServersNegativeTestJSON-1933841315 tempest-ListServersNegativeTestJSON-1933841315-project-member] [instance: 4cbf5a4d-9466-4bc6-adc9-973759545cf4] Instance disappeared before build. {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1054.310280] env[67899]: WARNING nova.virt.vmwareapi.vmops [None req-eab4a7d0-f125-4f7b-a1c8-0173bffaceb5 tempest-VolumesAssistedSnapshotsTest-398519601 tempest-VolumesAssistedSnapshotsTest-398519601-project-member] [instance: 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda could not be found. 
[ 1054.310491] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-eab4a7d0-f125-4f7b-a1c8-0173bffaceb5 tempest-VolumesAssistedSnapshotsTest-398519601 tempest-VolumesAssistedSnapshotsTest-398519601-project-member] [instance: 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda] Instance destroyed {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1054.310664] env[67899]: INFO nova.compute.manager [None req-eab4a7d0-f125-4f7b-a1c8-0173bffaceb5 tempest-VolumesAssistedSnapshotsTest-398519601 tempest-VolumesAssistedSnapshotsTest-398519601-project-member] [instance: 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1054.310903] env[67899]: DEBUG oslo.service.loopingcall [None req-eab4a7d0-f125-4f7b-a1c8-0173bffaceb5 tempest-VolumesAssistedSnapshotsTest-398519601 tempest-VolumesAssistedSnapshotsTest-398519601-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=67899) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1054.311153] env[67899]: DEBUG nova.compute.manager [-] [instance: 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda] Deallocating network for instance {{(pid=67899) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1054.311254] env[67899]: DEBUG nova.network.neutron [-] [instance: 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda] deallocate_for_instance() {{(pid=67899) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1054.340731] env[67899]: DEBUG oslo_concurrency.lockutils [None req-6a55d310-17fb-4564-8f2a-4ee92b2e7084 tempest-ListServersNegativeTestJSON-1933841315 tempest-ListServersNegativeTestJSON-1933841315-project-member] Lock "4cbf5a4d-9466-4bc6-adc9-973759545cf4" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 236.150s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1054.349871] env[67899]: DEBUG nova.network.neutron [-] [instance: 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda] Updating instance_info_cache with network_info: [] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1054.359774] env[67899]: INFO nova.compute.manager [-] [instance: 9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda] Took 0.05 seconds to deallocate network for instance. [ 1054.364968] env[67899]: DEBUG nova.compute.manager [None req-6a55d310-17fb-4564-8f2a-4ee92b2e7084 tempest-ListServersNegativeTestJSON-1933841315 tempest-ListServersNegativeTestJSON-1933841315-project-member] [instance: 3fabbf48-5df3-4e36-a9d8-494c221304b1] Starting instance... {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1054.413727] env[67899]: DEBUG nova.compute.manager [None req-6a55d310-17fb-4564-8f2a-4ee92b2e7084 tempest-ListServersNegativeTestJSON-1933841315 tempest-ListServersNegativeTestJSON-1933841315-project-member] [instance: 3fabbf48-5df3-4e36-a9d8-494c221304b1] Instance disappeared before build. 
{{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1054.438969] env[67899]: DEBUG oslo_concurrency.lockutils [None req-6a55d310-17fb-4564-8f2a-4ee92b2e7084 tempest-ListServersNegativeTestJSON-1933841315 tempest-ListServersNegativeTestJSON-1933841315-project-member] Lock "3fabbf48-5df3-4e36-a9d8-494c221304b1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 236.224s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1054.449550] env[67899]: DEBUG nova.compute.manager [None req-2d2f0f85-9ccf-46fc-88e6-864a4167c672 tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] [instance: 7b6a4c60-1b40-44b8-b341-3dcaf1716c99] Starting instance... {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1054.481758] env[67899]: DEBUG nova.compute.manager [None req-2d2f0f85-9ccf-46fc-88e6-864a4167c672 tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] [instance: 7b6a4c60-1b40-44b8-b341-3dcaf1716c99] Instance disappeared before build. {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1054.488617] env[67899]: DEBUG oslo_concurrency.lockutils [None req-eab4a7d0-f125-4f7b-a1c8-0173bffaceb5 tempest-VolumesAssistedSnapshotsTest-398519601 tempest-VolumesAssistedSnapshotsTest-398519601-project-member] Lock "9085d0d1-bb7f-4bdd-b6a7-a550fe97ffda" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.229s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1054.507988] env[67899]: DEBUG oslo_concurrency.lockutils [None req-2d2f0f85-9ccf-46fc-88e6-864a4167c672 tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] Lock "7b6a4c60-1b40-44b8-b341-3dcaf1716c99" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 204.387s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1054.519156] env[67899]: DEBUG nova.compute.manager [None req-2eea373d-45cd-4959-b8b6-32ef12d95a25 tempest-ImagesOneServerNegativeTestJSON-1534430733 tempest-ImagesOneServerNegativeTestJSON-1534430733-project-member] [instance: 183fd334-b0e1-479a-b38a-62f21c176d17] Starting instance... {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1054.543695] env[67899]: DEBUG nova.compute.manager [None req-2eea373d-45cd-4959-b8b6-32ef12d95a25 tempest-ImagesOneServerNegativeTestJSON-1534430733 tempest-ImagesOneServerNegativeTestJSON-1534430733-project-member] [instance: 183fd334-b0e1-479a-b38a-62f21c176d17] Instance disappeared before build. 
{{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1054.567806] env[67899]: DEBUG oslo_concurrency.lockutils [None req-2eea373d-45cd-4959-b8b6-32ef12d95a25 tempest-ImagesOneServerNegativeTestJSON-1534430733 tempest-ImagesOneServerNegativeTestJSON-1534430733-project-member] Lock "183fd334-b0e1-479a-b38a-62f21c176d17" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 200.367s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1054.578597] env[67899]: DEBUG nova.compute.manager [None req-92a7a379-f753-4ac6-96db-b7811ca464e0 tempest-TenantUsagesTestJSON-377809165 tempest-TenantUsagesTestJSON-377809165-project-member] [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] Starting instance... {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1054.647089] env[67899]: DEBUG oslo_concurrency.lockutils [None req-92a7a379-f753-4ac6-96db-b7811ca464e0 tempest-TenantUsagesTestJSON-377809165 tempest-TenantUsagesTestJSON-377809165-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1054.647342] env[67899]: DEBUG oslo_concurrency.lockutils [None req-92a7a379-f753-4ac6-96db-b7811ca464e0 tempest-TenantUsagesTestJSON-377809165 tempest-TenantUsagesTestJSON-377809165-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1054.648973] env[67899]: INFO nova.compute.claims [None req-92a7a379-f753-4ac6-96db-b7811ca464e0 tempest-TenantUsagesTestJSON-377809165 tempest-TenantUsagesTestJSON-377809165-project-member] [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1055.020662] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3fe634df-dc8a-4c7c-9cb6-d598a1ad6ede {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1055.028416] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-22fb2a2f-a4c0-4c2a-bd4e-5087f5ad9e0f {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1055.060365] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aeef842a-bc2f-414f-a834-d5d7f50f869d {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1055.068448] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8e8819c3-a55e-49c8-b7dd-d0ff1945d4ff {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1055.082658] env[67899]: DEBUG nova.compute.provider_tree [None req-92a7a379-f753-4ac6-96db-b7811ca464e0 tempest-TenantUsagesTestJSON-377809165 tempest-TenantUsagesTestJSON-377809165-project-member] Inventory has not changed in ProviderTree for provider: fffa0b42-f65d-4394-a98c-0df038b9ed4b {{(pid=67899) update_inventory 
/opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1055.091380] env[67899]: DEBUG nova.scheduler.client.report [None req-92a7a379-f753-4ac6-96db-b7811ca464e0 tempest-TenantUsagesTestJSON-377809165 tempest-TenantUsagesTestJSON-377809165-project-member] Inventory has not changed for provider fffa0b42-f65d-4394-a98c-0df038b9ed4b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67899) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1055.109423] env[67899]: DEBUG oslo_concurrency.lockutils [None req-92a7a379-f753-4ac6-96db-b7811ca464e0 tempest-TenantUsagesTestJSON-377809165 tempest-TenantUsagesTestJSON-377809165-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.462s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1055.109938] env[67899]: DEBUG nova.compute.manager [None req-92a7a379-f753-4ac6-96db-b7811ca464e0 tempest-TenantUsagesTestJSON-377809165 tempest-TenantUsagesTestJSON-377809165-project-member] [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] Start building networks asynchronously for instance. {{(pid=67899) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1055.147276] env[67899]: DEBUG nova.compute.utils [None req-92a7a379-f753-4ac6-96db-b7811ca464e0 tempest-TenantUsagesTestJSON-377809165 tempest-TenantUsagesTestJSON-377809165-project-member] Using /dev/sd instead of None {{(pid=67899) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1055.149021] env[67899]: DEBUG nova.compute.manager [None req-92a7a379-f753-4ac6-96db-b7811ca464e0 tempest-TenantUsagesTestJSON-377809165 tempest-TenantUsagesTestJSON-377809165-project-member] [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] Allocating IP information in the background. {{(pid=67899) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1055.149021] env[67899]: DEBUG nova.network.neutron [None req-92a7a379-f753-4ac6-96db-b7811ca464e0 tempest-TenantUsagesTestJSON-377809165 tempest-TenantUsagesTestJSON-377809165-project-member] [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] allocate_for_instance() {{(pid=67899) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1055.157654] env[67899]: DEBUG nova.compute.manager [None req-92a7a379-f753-4ac6-96db-b7811ca464e0 tempest-TenantUsagesTestJSON-377809165 tempest-TenantUsagesTestJSON-377809165-project-member] [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] Start building block device mappings for instance. {{(pid=67899) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1055.236629] env[67899]: DEBUG nova.compute.manager [None req-92a7a379-f753-4ac6-96db-b7811ca464e0 tempest-TenantUsagesTestJSON-377809165 tempest-TenantUsagesTestJSON-377809165-project-member] [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] Start spawning the instance on the hypervisor. 
{{(pid=67899) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1055.247818] env[67899]: DEBUG nova.policy [None req-92a7a379-f753-4ac6-96db-b7811ca464e0 tempest-TenantUsagesTestJSON-377809165 tempest-TenantUsagesTestJSON-377809165-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '68e7483fb2504e15b4d282574fb30052', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0230c8cd26234d5bb08064361fe78ad5', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67899) authorize /opt/stack/nova/nova/policy.py:203}} [ 1055.259145] env[67899]: DEBUG nova.virt.hardware [None req-92a7a379-f753-4ac6-96db-b7811ca464e0 tempest-TenantUsagesTestJSON-377809165 tempest-TenantUsagesTestJSON-377809165-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T22:07:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T22:07:13Z,direct_url=,disk_format='vmdk',id=c655a05a-4a40-4b3f-b609-3ba8116ad90f,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c1aaa2970e964d7b86557399120d12c1',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T22:07:14Z,virtual_size=,visibility=), allow threads: False {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1055.259389] env[67899]: DEBUG nova.virt.hardware [None req-92a7a379-f753-4ac6-96db-b7811ca464e0 tempest-TenantUsagesTestJSON-377809165 tempest-TenantUsagesTestJSON-377809165-project-member] Flavor limits 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1055.259544] env[67899]: DEBUG nova.virt.hardware [None req-92a7a379-f753-4ac6-96db-b7811ca464e0 tempest-TenantUsagesTestJSON-377809165 tempest-TenantUsagesTestJSON-377809165-project-member] Image limits 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1055.259722] env[67899]: DEBUG nova.virt.hardware [None req-92a7a379-f753-4ac6-96db-b7811ca464e0 tempest-TenantUsagesTestJSON-377809165 tempest-TenantUsagesTestJSON-377809165-project-member] Flavor pref 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1055.259863] env[67899]: DEBUG nova.virt.hardware [None req-92a7a379-f753-4ac6-96db-b7811ca464e0 tempest-TenantUsagesTestJSON-377809165 tempest-TenantUsagesTestJSON-377809165-project-member] Image pref 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1055.260010] env[67899]: DEBUG nova.virt.hardware [None req-92a7a379-f753-4ac6-96db-b7811ca464e0 tempest-TenantUsagesTestJSON-377809165 tempest-TenantUsagesTestJSON-377809165-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1055.260276] env[67899]: DEBUG nova.virt.hardware [None 
req-92a7a379-f753-4ac6-96db-b7811ca464e0 tempest-TenantUsagesTestJSON-377809165 tempest-TenantUsagesTestJSON-377809165-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1055.260434] env[67899]: DEBUG nova.virt.hardware [None req-92a7a379-f753-4ac6-96db-b7811ca464e0 tempest-TenantUsagesTestJSON-377809165 tempest-TenantUsagesTestJSON-377809165-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67899) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1055.260598] env[67899]: DEBUG nova.virt.hardware [None req-92a7a379-f753-4ac6-96db-b7811ca464e0 tempest-TenantUsagesTestJSON-377809165 tempest-TenantUsagesTestJSON-377809165-project-member] Got 1 possible topologies {{(pid=67899) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1055.260757] env[67899]: DEBUG nova.virt.hardware [None req-92a7a379-f753-4ac6-96db-b7811ca464e0 tempest-TenantUsagesTestJSON-377809165 tempest-TenantUsagesTestJSON-377809165-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1055.260995] env[67899]: DEBUG nova.virt.hardware [None req-92a7a379-f753-4ac6-96db-b7811ca464e0 tempest-TenantUsagesTestJSON-377809165 tempest-TenantUsagesTestJSON-377809165-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1055.261844] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ebc725f2-0da8-4288-a54b-aed224550a5f {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1055.270504] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3dfec5f4-2092-455b-a8c5-99fbb5cac48b {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1055.668104] env[67899]: DEBUG nova.network.neutron [None req-92a7a379-f753-4ac6-96db-b7811ca464e0 tempest-TenantUsagesTestJSON-377809165 tempest-TenantUsagesTestJSON-377809165-project-member] [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] Successfully created port: d989bd9c-71bc-401e-951f-522fbd4539f1 {{(pid=67899) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1056.050520] env[67899]: DEBUG oslo_concurrency.lockutils [None req-d13baaa7-778b-4748-89cd-771e48d59c26 tempest-TenantUsagesTestJSON-377809165 tempest-TenantUsagesTestJSON-377809165-project-member] Acquiring lock "b9282eeb-09db-4138-a1f0-9e03828021b8" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1056.303209] env[67899]: DEBUG nova.compute.manager [req-426e9728-8954-427a-b861-e3f4b58098c7 req-d4c5bf89-4d36-4d3d-bb07-a5280393c539 service nova] [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] Received event network-vif-plugged-d989bd9c-71bc-401e-951f-522fbd4539f1 {{(pid=67899) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1056.303209] env[67899]: DEBUG oslo_concurrency.lockutils [req-426e9728-8954-427a-b861-e3f4b58098c7 
req-d4c5bf89-4d36-4d3d-bb07-a5280393c539 service nova] Acquiring lock "b9282eeb-09db-4138-a1f0-9e03828021b8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1056.303209] env[67899]: DEBUG oslo_concurrency.lockutils [req-426e9728-8954-427a-b861-e3f4b58098c7 req-d4c5bf89-4d36-4d3d-bb07-a5280393c539 service nova] Lock "b9282eeb-09db-4138-a1f0-9e03828021b8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1056.303209] env[67899]: DEBUG oslo_concurrency.lockutils [req-426e9728-8954-427a-b861-e3f4b58098c7 req-d4c5bf89-4d36-4d3d-bb07-a5280393c539 service nova] Lock "b9282eeb-09db-4138-a1f0-9e03828021b8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1056.303209] env[67899]: DEBUG nova.compute.manager [req-426e9728-8954-427a-b861-e3f4b58098c7 req-d4c5bf89-4d36-4d3d-bb07-a5280393c539 service nova] [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] No waiting events found dispatching network-vif-plugged-d989bd9c-71bc-401e-951f-522fbd4539f1 {{(pid=67899) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1056.303982] env[67899]: WARNING nova.compute.manager [req-426e9728-8954-427a-b861-e3f4b58098c7 req-d4c5bf89-4d36-4d3d-bb07-a5280393c539 service nova] [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] Received unexpected event network-vif-plugged-d989bd9c-71bc-401e-951f-522fbd4539f1 for instance with vm_state building and task_state deleting. 
[ 1056.410207] env[67899]: DEBUG nova.network.neutron [None req-92a7a379-f753-4ac6-96db-b7811ca464e0 tempest-TenantUsagesTestJSON-377809165 tempest-TenantUsagesTestJSON-377809165-project-member] [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] Successfully updated port: d989bd9c-71bc-401e-951f-522fbd4539f1 {{(pid=67899) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1056.422697] env[67899]: DEBUG oslo_concurrency.lockutils [None req-92a7a379-f753-4ac6-96db-b7811ca464e0 tempest-TenantUsagesTestJSON-377809165 tempest-TenantUsagesTestJSON-377809165-project-member] Acquiring lock "refresh_cache-b9282eeb-09db-4138-a1f0-9e03828021b8" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1056.422888] env[67899]: DEBUG oslo_concurrency.lockutils [None req-92a7a379-f753-4ac6-96db-b7811ca464e0 tempest-TenantUsagesTestJSON-377809165 tempest-TenantUsagesTestJSON-377809165-project-member] Acquired lock "refresh_cache-b9282eeb-09db-4138-a1f0-9e03828021b8" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1056.423077] env[67899]: DEBUG nova.network.neutron [None req-92a7a379-f753-4ac6-96db-b7811ca464e0 tempest-TenantUsagesTestJSON-377809165 tempest-TenantUsagesTestJSON-377809165-project-member] [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] Building network info cache for instance {{(pid=67899) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1056.468047] env[67899]: DEBUG nova.network.neutron [None req-92a7a379-f753-4ac6-96db-b7811ca464e0 tempest-TenantUsagesTestJSON-377809165 tempest-TenantUsagesTestJSON-377809165-project-member] [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] Instance cache missing network info. {{(pid=67899) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1056.678195] env[67899]: DEBUG nova.network.neutron [None req-92a7a379-f753-4ac6-96db-b7811ca464e0 tempest-TenantUsagesTestJSON-377809165 tempest-TenantUsagesTestJSON-377809165-project-member] [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] Updating instance_info_cache with network_info: [{"id": "d989bd9c-71bc-401e-951f-522fbd4539f1", "address": "fa:16:3e:26:04:cb", "network": {"id": "87a2c8e7-d332-4028-bf39-1da2df6ff034", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.112", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "c1aaa2970e964d7b86557399120d12c1", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "2180b40f-2bb0-47da-ba80-c2fbe7f98af0", "external-id": "nsx-vlan-transportzone-970", "segmentation_id": 970, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd989bd9c-71", "ovs_interfaceid": "d989bd9c-71bc-401e-951f-522fbd4539f1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1056.696971] env[67899]: DEBUG oslo_concurrency.lockutils [None req-92a7a379-f753-4ac6-96db-b7811ca464e0 tempest-TenantUsagesTestJSON-377809165 
tempest-TenantUsagesTestJSON-377809165-project-member] Releasing lock "refresh_cache-b9282eeb-09db-4138-a1f0-9e03828021b8" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1056.697488] env[67899]: DEBUG nova.compute.manager [None req-92a7a379-f753-4ac6-96db-b7811ca464e0 tempest-TenantUsagesTestJSON-377809165 tempest-TenantUsagesTestJSON-377809165-project-member] [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] Instance network_info: |[{"id": "d989bd9c-71bc-401e-951f-522fbd4539f1", "address": "fa:16:3e:26:04:cb", "network": {"id": "87a2c8e7-d332-4028-bf39-1da2df6ff034", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.112", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "c1aaa2970e964d7b86557399120d12c1", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "2180b40f-2bb0-47da-ba80-c2fbe7f98af0", "external-id": "nsx-vlan-transportzone-970", "segmentation_id": 970, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd989bd9c-71", "ovs_interfaceid": "d989bd9c-71bc-401e-951f-522fbd4539f1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67899) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1056.698381] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-92a7a379-f753-4ac6-96db-b7811ca464e0 tempest-TenantUsagesTestJSON-377809165 tempest-TenantUsagesTestJSON-377809165-project-member] [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:26:04:cb', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '2180b40f-2bb0-47da-ba80-c2fbe7f98af0', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'd989bd9c-71bc-401e-951f-522fbd4539f1', 'vif_model': 'vmxnet3'}] {{(pid=67899) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1056.710798] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [None req-92a7a379-f753-4ac6-96db-b7811ca464e0 tempest-TenantUsagesTestJSON-377809165 tempest-TenantUsagesTestJSON-377809165-project-member] Creating folder: Project (0230c8cd26234d5bb08064361fe78ad5). Parent ref: group-v692900. {{(pid=67899) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1056.711605] env[67899]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-a4d70e76-bc71-46cf-8c46-212cdc71961d {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1056.723540] env[67899]: INFO nova.virt.vmwareapi.vm_util [None req-92a7a379-f753-4ac6-96db-b7811ca464e0 tempest-TenantUsagesTestJSON-377809165 tempest-TenantUsagesTestJSON-377809165-project-member] Created folder: Project (0230c8cd26234d5bb08064361fe78ad5) in parent group-v692900. [ 1056.723869] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [None req-92a7a379-f753-4ac6-96db-b7811ca464e0 tempest-TenantUsagesTestJSON-377809165 tempest-TenantUsagesTestJSON-377809165-project-member] Creating folder: Instances. Parent ref: group-v692959. 
{{(pid=67899) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1056.724263] env[67899]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-79674c34-38f4-410f-8fd2-ff50b22b9b86 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1056.735539] env[67899]: INFO nova.virt.vmwareapi.vm_util [None req-92a7a379-f753-4ac6-96db-b7811ca464e0 tempest-TenantUsagesTestJSON-377809165 tempest-TenantUsagesTestJSON-377809165-project-member] Created folder: Instances in parent group-v692959. [ 1056.735690] env[67899]: DEBUG oslo.service.loopingcall [None req-92a7a379-f753-4ac6-96db-b7811ca464e0 tempest-TenantUsagesTestJSON-377809165 tempest-TenantUsagesTestJSON-377809165-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67899) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1056.735879] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] Creating VM on the ESX host {{(pid=67899) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1056.736109] env[67899]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-bd64f500-1319-4755-b4a1-bf9dc916a72c {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1056.755868] env[67899]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1056.755868] env[67899]: value = "task-3467907" [ 1056.755868] env[67899]: _type = "Task" [ 1056.755868] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1056.764053] env[67899]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467907, 'name': CreateVM_Task} progress is 0%. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1057.268757] env[67899]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467907, 'name': CreateVM_Task, 'duration_secs': 0.299904} completed successfully. 
{{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1057.268964] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] Created VM on the ESX host {{(pid=67899) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1057.270223] env[67899]: DEBUG oslo_concurrency.lockutils [None req-92a7a379-f753-4ac6-96db-b7811ca464e0 tempest-TenantUsagesTestJSON-377809165 tempest-TenantUsagesTestJSON-377809165-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1057.270442] env[67899]: DEBUG oslo_concurrency.lockutils [None req-92a7a379-f753-4ac6-96db-b7811ca464e0 tempest-TenantUsagesTestJSON-377809165 tempest-TenantUsagesTestJSON-377809165-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1057.270759] env[67899]: DEBUG oslo_concurrency.lockutils [None req-92a7a379-f753-4ac6-96db-b7811ca464e0 tempest-TenantUsagesTestJSON-377809165 tempest-TenantUsagesTestJSON-377809165-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1057.271027] env[67899]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-dae068f9-c9ca-43d2-98c2-25e7d1fcae3f {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1057.276350] env[67899]: DEBUG oslo_vmware.api [None req-92a7a379-f753-4ac6-96db-b7811ca464e0 tempest-TenantUsagesTestJSON-377809165 tempest-TenantUsagesTestJSON-377809165-project-member] Waiting for the task: (returnval){ [ 1057.276350] env[67899]: value = "session[5298859c-589e-10f3-4f70-bf47a7ea371b]5241fc9d-5c7d-2bbd-c4e1-8ab1a582276b" [ 1057.276350] env[67899]: _type = "Task" [ 1057.276350] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1057.284120] env[67899]: DEBUG oslo_vmware.api [None req-92a7a379-f753-4ac6-96db-b7811ca464e0 tempest-TenantUsagesTestJSON-377809165 tempest-TenantUsagesTestJSON-377809165-project-member] Task: {'id': session[5298859c-589e-10f3-4f70-bf47a7ea371b]5241fc9d-5c7d-2bbd-c4e1-8ab1a582276b, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1057.788745] env[67899]: DEBUG oslo_concurrency.lockutils [None req-92a7a379-f753-4ac6-96db-b7811ca464e0 tempest-TenantUsagesTestJSON-377809165 tempest-TenantUsagesTestJSON-377809165-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1057.789167] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-92a7a379-f753-4ac6-96db-b7811ca464e0 tempest-TenantUsagesTestJSON-377809165 tempest-TenantUsagesTestJSON-377809165-project-member] [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] Processing image c655a05a-4a40-4b3f-b609-3ba8116ad90f {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1057.789494] env[67899]: DEBUG oslo_concurrency.lockutils [None req-92a7a379-f753-4ac6-96db-b7811ca464e0 tempest-TenantUsagesTestJSON-377809165 tempest-TenantUsagesTestJSON-377809165-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1058.369513] env[67899]: DEBUG nova.compute.manager [req-b3aa5ca1-d6b9-4325-91d4-a98c10d7902b req-1239e2c5-e063-4e80-8b8e-b273151ec25c service nova] [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] Received event network-changed-d989bd9c-71bc-401e-951f-522fbd4539f1 {{(pid=67899) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1058.369919] env[67899]: DEBUG nova.compute.manager [req-b3aa5ca1-d6b9-4325-91d4-a98c10d7902b req-1239e2c5-e063-4e80-8b8e-b273151ec25c service nova] [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] Refreshing instance network info cache due to event network-changed-d989bd9c-71bc-401e-951f-522fbd4539f1. {{(pid=67899) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1058.369966] env[67899]: DEBUG oslo_concurrency.lockutils [req-b3aa5ca1-d6b9-4325-91d4-a98c10d7902b req-1239e2c5-e063-4e80-8b8e-b273151ec25c service nova] Acquiring lock "refresh_cache-b9282eeb-09db-4138-a1f0-9e03828021b8" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1058.370237] env[67899]: DEBUG oslo_concurrency.lockutils [req-b3aa5ca1-d6b9-4325-91d4-a98c10d7902b req-1239e2c5-e063-4e80-8b8e-b273151ec25c service nova] Acquired lock "refresh_cache-b9282eeb-09db-4138-a1f0-9e03828021b8" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1058.370327] env[67899]: DEBUG nova.network.neutron [req-b3aa5ca1-d6b9-4325-91d4-a98c10d7902b req-1239e2c5-e063-4e80-8b8e-b273151ec25c service nova] [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] Refreshing network info cache for port d989bd9c-71bc-401e-951f-522fbd4539f1 {{(pid=67899) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1058.723511] env[67899]: DEBUG nova.network.neutron [req-b3aa5ca1-d6b9-4325-91d4-a98c10d7902b req-1239e2c5-e063-4e80-8b8e-b273151ec25c service nova] [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] Updated VIF entry in instance network info cache for port d989bd9c-71bc-401e-951f-522fbd4539f1. 
{{(pid=67899) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1058.723897] env[67899]: DEBUG nova.network.neutron [req-b3aa5ca1-d6b9-4325-91d4-a98c10d7902b req-1239e2c5-e063-4e80-8b8e-b273151ec25c service nova] [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] Updating instance_info_cache with network_info: [{"id": "d989bd9c-71bc-401e-951f-522fbd4539f1", "address": "fa:16:3e:26:04:cb", "network": {"id": "87a2c8e7-d332-4028-bf39-1da2df6ff034", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.112", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "c1aaa2970e964d7b86557399120d12c1", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "2180b40f-2bb0-47da-ba80-c2fbe7f98af0", "external-id": "nsx-vlan-transportzone-970", "segmentation_id": 970, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd989bd9c-71", "ovs_interfaceid": "d989bd9c-71bc-401e-951f-522fbd4539f1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1058.734857] env[67899]: DEBUG oslo_concurrency.lockutils [req-b3aa5ca1-d6b9-4325-91d4-a98c10d7902b req-1239e2c5-e063-4e80-8b8e-b273151ec25c service nova] Releasing lock "refresh_cache-b9282eeb-09db-4138-a1f0-9e03828021b8" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1061.521099] env[67899]: DEBUG oslo_concurrency.lockutils [None req-b4663fb3-1872-4a4f-8f96-67cc37309cba tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Acquiring lock "6fda2654-4579-4b9a-a97c-97e0128fff14" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1061.521393] env[67899]: DEBUG oslo_concurrency.lockutils [None req-b4663fb3-1872-4a4f-8f96-67cc37309cba tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Lock "6fda2654-4579-4b9a-a97c-97e0128fff14" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1068.977320] env[67899]: DEBUG oslo_concurrency.lockutils [None req-5f2bc5d9-fe94-4b02-bd3a-588599063fca tempest-ListServerFiltersTestJSON-144845862 tempest-ListServerFiltersTestJSON-144845862-project-member] Acquiring lock "97ec7119-2dc8-49ac-921f-b28d04ffd056" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1068.978108] env[67899]: DEBUG oslo_concurrency.lockutils [None req-5f2bc5d9-fe94-4b02-bd3a-588599063fca tempest-ListServerFiltersTestJSON-144845862 tempest-ListServerFiltersTestJSON-144845862-project-member] Lock "97ec7119-2dc8-49ac-921f-b28d04ffd056" acquired by 
"nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.002s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1069.485604] env[67899]: DEBUG oslo_concurrency.lockutils [None req-58e55a74-0ae7-4975-a668-f0c164e6d586 tempest-ListServerFiltersTestJSON-144845862 tempest-ListServerFiltersTestJSON-144845862-project-member] Acquiring lock "f03010e7-fd45-4959-b6fb-4c7b3fc833c5" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1069.485856] env[67899]: DEBUG oslo_concurrency.lockutils [None req-58e55a74-0ae7-4975-a668-f0c164e6d586 tempest-ListServerFiltersTestJSON-144845862 tempest-ListServerFiltersTestJSON-144845862-project-member] Lock "f03010e7-fd45-4959-b6fb-4c7b3fc833c5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1070.328356] env[67899]: DEBUG oslo_concurrency.lockutils [None req-9c89c833-8745-4a62-a174-5899420c4e70 tempest-ListServerFiltersTestJSON-144845862 tempest-ListServerFiltersTestJSON-144845862-project-member] Acquiring lock "3526174d-17e3-4a54-92dc-0556334ce315" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1070.328701] env[67899]: DEBUG oslo_concurrency.lockutils [None req-9c89c833-8745-4a62-a174-5899420c4e70 tempest-ListServerFiltersTestJSON-144845862 tempest-ListServerFiltersTestJSON-144845862-project-member] Lock "3526174d-17e3-4a54-92dc-0556334ce315" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1072.496107] env[67899]: DEBUG oslo_concurrency.lockutils [None req-c5ad700b-c912-48e9-ad1b-7ebfc9667984 tempest-AttachVolumeNegativeTest-1960469577 tempest-AttachVolumeNegativeTest-1960469577-project-member] Acquiring lock "cf23465f-b46c-4360-8949-2af3b9ba44c9" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1072.496470] env[67899]: DEBUG oslo_concurrency.lockutils [None req-c5ad700b-c912-48e9-ad1b-7ebfc9667984 tempest-AttachVolumeNegativeTest-1960469577 tempest-AttachVolumeNegativeTest-1960469577-project-member] Lock "cf23465f-b46c-4360-8949-2af3b9ba44c9" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1090.127951] env[67899]: DEBUG oslo_concurrency.lockutils [None req-73aa364e-3e8b-42af-87c1-1be08a2292ea tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] Acquiring lock "2d202778-6d31-4d2f-b249-60925737da42" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1090.127951] 
env[67899]: DEBUG oslo_concurrency.lockutils [None req-73aa364e-3e8b-42af-87c1-1be08a2292ea tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] Lock "2d202778-6d31-4d2f-b249-60925737da42" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1093.638370] env[67899]: DEBUG oslo_concurrency.lockutils [None req-f67ac1f9-ebb2-48ce-af7a-e9c6130d953f tempest-ServerActionsV293TestJSON-1827485530 tempest-ServerActionsV293TestJSON-1827485530-project-member] Acquiring lock "5ead1ba5-49a8-41a8-b984-cb5408683a25" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1093.638370] env[67899]: DEBUG oslo_concurrency.lockutils [None req-f67ac1f9-ebb2-48ce-af7a-e9c6130d953f tempest-ServerActionsV293TestJSON-1827485530 tempest-ServerActionsV293TestJSON-1827485530-project-member] Lock "5ead1ba5-49a8-41a8-b984-cb5408683a25" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1094.962611] env[67899]: DEBUG oslo_concurrency.lockutils [None req-c755e67f-ad2e-43b3-b059-3c50cb0e4fec tempest-AttachVolumeShelveTestJSON-10446956 tempest-AttachVolumeShelveTestJSON-10446956-project-member] Acquiring lock "8e66e2d5-aa60-474f-b77f-4a477e2d0f8e" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1094.962900] env[67899]: DEBUG oslo_concurrency.lockutils [None req-c755e67f-ad2e-43b3-b059-3c50cb0e4fec tempest-AttachVolumeShelveTestJSON-10446956 tempest-AttachVolumeShelveTestJSON-10446956-project-member] Lock "8e66e2d5-aa60-474f-b77f-4a477e2d0f8e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1098.992738] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1099.996432] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1099.996703] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Starting heal instance info cache {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1099.996790] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Rebuilding the list of instances to heal {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1100.027784] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] 
[instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1100.027948] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 793d6f98-ed1b-4a78-bcd5-cb796441d64b] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1100.028998] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1100.029298] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 4458efe7-18d4-4cfb-b131-e09d36124d68] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1100.029337] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1100.030187] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 8d2a9e20-82d3-44cf-a725-59804debe1cc] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1100.030563] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1100.030563] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 37ab08db-50ab-4c30-9e18-05007c5d1c27] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1100.030702] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1100.030829] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1100.030999] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Didn't find any instances for network info cache update. 
{{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1100.031596] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1100.031717] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1100.031868] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1100.996935] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1100.997302] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=67899) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1100.998334] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager.update_available_resource {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1101.017292] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1101.017750] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1101.017885] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1101.018044] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67899) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1101.019558] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ff39a150-c26c-4e22-82da-2550d5370b77 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1101.029910] env[67899]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-84aa5e7b-6b95-4c9c-b197-7146f3c4ceaa {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1101.044395] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-25170cda-dc33-4856-885f-0a794699363a {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1101.051686] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3207411c-ffd8-4c8f-a062-52b444c9d277 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1101.082869] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180941MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=67899) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1101.083043] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1101.083247] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1101.161973] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance c29ae4c5-cc93-480c-8d60-96f6acba4346 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1101.162284] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 793d6f98-ed1b-4a78-bcd5-cb796441d64b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1101.162528] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1101.162760] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 4458efe7-18d4-4cfb-b131-e09d36124d68 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1101.162866] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1101.162989] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 8d2a9e20-82d3-44cf-a725-59804debe1cc actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1101.163126] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance bb97988e-9f7f-4e4f-9904-fc560d0912ee actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1101.163286] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 37ab08db-50ab-4c30-9e18-05007c5d1c27 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1101.163406] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1101.163519] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance b9282eeb-09db-4138-a1f0-9e03828021b8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1101.175042] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 6c4977f7-c53d-4c96-9028-86d7561f0d0d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1101.185607] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance ec826735-4cc4-4847-8750-c5480e62134a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1101.196112] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance c7ad553b-2149-4211-aee3-057ea83069f5 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1101.206846] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance b9df90e1-da9a-47c3-8920-84f20ef5c588 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1101.216925] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 9b4ec0f5-35d7-4ba9-bc46-47cd2a73219c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1101.227668] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance ce0c59ed-7bb2-49cc-a158-dda0da4f88cf has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1101.237775] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance db21b229-2664-4947-96c8-c1e92f97917e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1101.247937] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 94ebdda8-5b9c-4ffa-be45-571ec9ba9f81 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1101.258591] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 928c018d-ec75-42c6-8e55-e38bb5947bcf has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1101.269640] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 04bee4b3-88b9-4f8c-b5d7-3955a158a2d5 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1101.279186] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance a43ea307-5b84-4c8c-9f28-255980bfd51a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1101.289627] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 6fda2654-4579-4b9a-a97c-97e0128fff14 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1101.301318] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 97ec7119-2dc8-49ac-921f-b28d04ffd056 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1101.312538] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance f03010e7-fd45-4959-b6fb-4c7b3fc833c5 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1101.322895] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 3526174d-17e3-4a54-92dc-0556334ce315 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1101.333658] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance cf23465f-b46c-4360-8949-2af3b9ba44c9 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1101.344012] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 2d202778-6d31-4d2f-b249-60925737da42 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1101.354714] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 5ead1ba5-49a8-41a8-b984-cb5408683a25 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1101.365324] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 8e66e2d5-aa60-474f-b77f-4a477e2d0f8e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1101.365604] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67899) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1101.365758] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67899) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1101.715433] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5115c015-9339-469f-aa0a-d5c2ae4f9b01 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1101.722922] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8fe3cadc-2fbe-4530-8f3a-1e74fad63c30 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1101.753442] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b0cf2046-4706-4f77-956d-982f5818723e {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1101.760876] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a974f35c-2293-4c59-9a46-d09805b74688 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1101.774947] env[67899]: DEBUG nova.compute.provider_tree [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Inventory has not changed in ProviderTree for provider: 
fffa0b42-f65d-4394-a98c-0df038b9ed4b {{(pid=67899) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1101.784287] env[67899]: DEBUG nova.scheduler.client.report [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Inventory has not changed for provider fffa0b42-f65d-4394-a98c-0df038b9ed4b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67899) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1101.798177] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67899) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1101.798385] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.715s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1102.798469] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1102.997031] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1103.176946] env[67899]: WARNING oslo_vmware.rw_handles [None req-4613260e-d1e4-4cb4-8d4e-d3da4a59a597 tempest-ServersTestMultiNic-1718995190 tempest-ServersTestMultiNic-1718995190-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1103.176946] env[67899]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1103.176946] env[67899]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1103.176946] env[67899]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1103.176946] env[67899]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1103.176946] env[67899]: ERROR oslo_vmware.rw_handles response.begin() [ 1103.176946] env[67899]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1103.176946] env[67899]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1103.176946] env[67899]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1103.176946] env[67899]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1103.176946] env[67899]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1103.176946] env[67899]: ERROR oslo_vmware.rw_handles 
[ 1103.176946] env[67899]: DEBUG nova.virt.vmwareapi.images [None req-4613260e-d1e4-4cb4-8d4e-d3da4a59a597 tempest-ServersTestMultiNic-1718995190 tempest-ServersTestMultiNic-1718995190-project-member] [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] Downloaded image file data c655a05a-4a40-4b3f-b609-3ba8116ad90f to vmware_temp/2b1923a6-263e-4c86-be16-b1e2f8087fbe/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk on the data store datastore1 {{(pid=67899) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1103.179009] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-4613260e-d1e4-4cb4-8d4e-d3da4a59a597 tempest-ServersTestMultiNic-1718995190 tempest-ServersTestMultiNic-1718995190-project-member] [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] Caching image {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1103.179270] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [None req-4613260e-d1e4-4cb4-8d4e-d3da4a59a597 tempest-ServersTestMultiNic-1718995190 tempest-ServersTestMultiNic-1718995190-project-member] Copying Virtual Disk [datastore1] vmware_temp/2b1923a6-263e-4c86-be16-b1e2f8087fbe/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk to [datastore1] vmware_temp/2b1923a6-263e-4c86-be16-b1e2f8087fbe/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk {{(pid=67899) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1103.179606] env[67899]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-ea33a208-aa9d-464d-967a-83b7fc5f2308 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1103.189253] env[67899]: DEBUG oslo_vmware.api [None req-4613260e-d1e4-4cb4-8d4e-d3da4a59a597 tempest-ServersTestMultiNic-1718995190 tempest-ServersTestMultiNic-1718995190-project-member] Waiting for the task: (returnval){ [ 1103.189253] env[67899]: value = "task-3467918" [ 1103.189253] env[67899]: _type = "Task" [ 1103.189253] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1103.197677] env[67899]: DEBUG oslo_vmware.api [None req-4613260e-d1e4-4cb4-8d4e-d3da4a59a597 tempest-ServersTestMultiNic-1718995190 tempest-ServersTestMultiNic-1718995190-project-member] Task: {'id': task-3467918, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1103.699587] env[67899]: DEBUG oslo_vmware.exceptions [None req-4613260e-d1e4-4cb4-8d4e-d3da4a59a597 tempest-ServersTestMultiNic-1718995190 tempest-ServersTestMultiNic-1718995190-project-member] Fault InvalidArgument not matched. 
{{(pid=67899) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1103.699925] env[67899]: DEBUG oslo_concurrency.lockutils [None req-4613260e-d1e4-4cb4-8d4e-d3da4a59a597 tempest-ServersTestMultiNic-1718995190 tempest-ServersTestMultiNic-1718995190-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1103.700472] env[67899]: ERROR nova.compute.manager [None req-4613260e-d1e4-4cb4-8d4e-d3da4a59a597 tempest-ServersTestMultiNic-1718995190 tempest-ServersTestMultiNic-1718995190-project-member] [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1103.700472] env[67899]: Faults: ['InvalidArgument'] [ 1103.700472] env[67899]: ERROR nova.compute.manager [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] Traceback (most recent call last): [ 1103.700472] env[67899]: ERROR nova.compute.manager [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1103.700472] env[67899]: ERROR nova.compute.manager [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] yield resources [ 1103.700472] env[67899]: ERROR nova.compute.manager [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1103.700472] env[67899]: ERROR nova.compute.manager [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] self.driver.spawn(context, instance, image_meta, [ 1103.700472] env[67899]: ERROR nova.compute.manager [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1103.700472] env[67899]: ERROR nova.compute.manager [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1103.700472] env[67899]: ERROR nova.compute.manager [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1103.700472] env[67899]: ERROR nova.compute.manager [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] self._fetch_image_if_missing(context, vi) [ 1103.700472] env[67899]: ERROR nova.compute.manager [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1103.700472] env[67899]: ERROR nova.compute.manager [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] image_cache(vi, tmp_image_ds_loc) [ 1103.700472] env[67899]: ERROR nova.compute.manager [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1103.700472] env[67899]: ERROR nova.compute.manager [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] vm_util.copy_virtual_disk( [ 1103.700472] env[67899]: ERROR nova.compute.manager [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1103.700472] env[67899]: ERROR nova.compute.manager [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] session._wait_for_task(vmdk_copy_task) [ 1103.700472] env[67899]: ERROR nova.compute.manager [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", 
line 157, in _wait_for_task [ 1103.700472] env[67899]: ERROR nova.compute.manager [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] return self.wait_for_task(task_ref) [ 1103.700472] env[67899]: ERROR nova.compute.manager [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1103.700472] env[67899]: ERROR nova.compute.manager [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] return evt.wait() [ 1103.700472] env[67899]: ERROR nova.compute.manager [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1103.700472] env[67899]: ERROR nova.compute.manager [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] result = hub.switch() [ 1103.700472] env[67899]: ERROR nova.compute.manager [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1103.700472] env[67899]: ERROR nova.compute.manager [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] return self.greenlet.switch() [ 1103.700472] env[67899]: ERROR nova.compute.manager [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1103.700472] env[67899]: ERROR nova.compute.manager [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] self.f(*self.args, **self.kw) [ 1103.700472] env[67899]: ERROR nova.compute.manager [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1103.700472] env[67899]: ERROR nova.compute.manager [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] raise exceptions.translate_fault(task_info.error) [ 1103.700472] env[67899]: ERROR nova.compute.manager [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1103.700472] env[67899]: ERROR nova.compute.manager [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] Faults: ['InvalidArgument'] [ 1103.700472] env[67899]: ERROR nova.compute.manager [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] [ 1103.701519] env[67899]: INFO nova.compute.manager [None req-4613260e-d1e4-4cb4-8d4e-d3da4a59a597 tempest-ServersTestMultiNic-1718995190 tempest-ServersTestMultiNic-1718995190-project-member] [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] Terminating instance [ 1103.702320] env[67899]: DEBUG oslo_concurrency.lockutils [None req-c90bdb57-6703-4d5e-a52f-aee9c0fb7dc8 tempest-FloatingIPsAssociationNegativeTestJSON-531317357 tempest-FloatingIPsAssociationNegativeTestJSON-531317357-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1103.702535] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-c90bdb57-6703-4d5e-a52f-aee9c0fb7dc8 tempest-FloatingIPsAssociationNegativeTestJSON-531317357 tempest-FloatingIPsAssociationNegativeTestJSON-531317357-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1103.703152] env[67899]: DEBUG nova.compute.manager [None req-4613260e-d1e4-4cb4-8d4e-d3da4a59a597 tempest-ServersTestMultiNic-1718995190 
tempest-ServersTestMultiNic-1718995190-project-member] [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] Start destroying the instance on the hypervisor. {{(pid=67899) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1103.703337] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-4613260e-d1e4-4cb4-8d4e-d3da4a59a597 tempest-ServersTestMultiNic-1718995190 tempest-ServersTestMultiNic-1718995190-project-member] [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] Destroying instance {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1103.703558] env[67899]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-f374d311-4ac1-4e92-a021-820c0ba3a058 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1103.705897] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-61ed714f-dc9f-4cce-a3a1-36e77ae35c64 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1103.713085] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-4613260e-d1e4-4cb4-8d4e-d3da4a59a597 tempest-ServersTestMultiNic-1718995190 tempest-ServersTestMultiNic-1718995190-project-member] [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] Unregistering the VM {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1103.713311] env[67899]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-028c1f8f-25b7-4b55-ae0e-fb34301026e3 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1103.715570] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-c90bdb57-6703-4d5e-a52f-aee9c0fb7dc8 tempest-FloatingIPsAssociationNegativeTestJSON-531317357 tempest-FloatingIPsAssociationNegativeTestJSON-531317357-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1103.715735] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-c90bdb57-6703-4d5e-a52f-aee9c0fb7dc8 tempest-FloatingIPsAssociationNegativeTestJSON-531317357 tempest-FloatingIPsAssociationNegativeTestJSON-531317357-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67899) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1103.716672] env[67899]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-be8e82ff-a76d-4f89-8d1e-ecf83efd5ade {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1103.721787] env[67899]: DEBUG oslo_vmware.api [None req-c90bdb57-6703-4d5e-a52f-aee9c0fb7dc8 tempest-FloatingIPsAssociationNegativeTestJSON-531317357 tempest-FloatingIPsAssociationNegativeTestJSON-531317357-project-member] Waiting for the task: (returnval){ [ 1103.721787] env[67899]: value = "session[5298859c-589e-10f3-4f70-bf47a7ea371b]5232b882-61cc-6ddf-64c9-34e8ffb82129" [ 1103.721787] env[67899]: _type = "Task" [ 1103.721787] env[67899]: } to complete. 
{{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1103.737805] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-c90bdb57-6703-4d5e-a52f-aee9c0fb7dc8 tempest-FloatingIPsAssociationNegativeTestJSON-531317357 tempest-FloatingIPsAssociationNegativeTestJSON-531317357-project-member] [instance: 793d6f98-ed1b-4a78-bcd5-cb796441d64b] Preparing fetch location {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1103.738042] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-c90bdb57-6703-4d5e-a52f-aee9c0fb7dc8 tempest-FloatingIPsAssociationNegativeTestJSON-531317357 tempest-FloatingIPsAssociationNegativeTestJSON-531317357-project-member] Creating directory with path [datastore1] vmware_temp/035fc6b9-a5a7-4a61-89d4-b052780e04f4/c655a05a-4a40-4b3f-b609-3ba8116ad90f {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1103.738256] env[67899]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-dbe66e80-1e6b-423e-9e02-54eb03d046f9 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1103.750203] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-c90bdb57-6703-4d5e-a52f-aee9c0fb7dc8 tempest-FloatingIPsAssociationNegativeTestJSON-531317357 tempest-FloatingIPsAssociationNegativeTestJSON-531317357-project-member] Created directory with path [datastore1] vmware_temp/035fc6b9-a5a7-4a61-89d4-b052780e04f4/c655a05a-4a40-4b3f-b609-3ba8116ad90f {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1103.750388] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-c90bdb57-6703-4d5e-a52f-aee9c0fb7dc8 tempest-FloatingIPsAssociationNegativeTestJSON-531317357 tempest-FloatingIPsAssociationNegativeTestJSON-531317357-project-member] [instance: 793d6f98-ed1b-4a78-bcd5-cb796441d64b] Fetch image to [datastore1] vmware_temp/035fc6b9-a5a7-4a61-89d4-b052780e04f4/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1103.750558] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-c90bdb57-6703-4d5e-a52f-aee9c0fb7dc8 tempest-FloatingIPsAssociationNegativeTestJSON-531317357 tempest-FloatingIPsAssociationNegativeTestJSON-531317357-project-member] [instance: 793d6f98-ed1b-4a78-bcd5-cb796441d64b] Downloading image file data c655a05a-4a40-4b3f-b609-3ba8116ad90f to [datastore1] vmware_temp/035fc6b9-a5a7-4a61-89d4-b052780e04f4/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk on the data store datastore1 {{(pid=67899) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1103.751304] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f35c98c3-a568-4084-a4a8-f6db0a763fc1 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1103.757823] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c2515fdc-fc47-40f7-ad0b-761901d17060 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1103.766818] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-af33ba92-3430-4a75-9d85-887dab4972aa {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 
1103.799285] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7d43b33e-ce57-4dfc-a1e7-674596ee0c03 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1103.808511] env[67899]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-b0a16c65-7d5c-4910-a903-e18a7d272250 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1103.829181] env[67899]: DEBUG nova.virt.vmwareapi.images [None req-c90bdb57-6703-4d5e-a52f-aee9c0fb7dc8 tempest-FloatingIPsAssociationNegativeTestJSON-531317357 tempest-FloatingIPsAssociationNegativeTestJSON-531317357-project-member] [instance: 793d6f98-ed1b-4a78-bcd5-cb796441d64b] Downloading image file data c655a05a-4a40-4b3f-b609-3ba8116ad90f to the data store datastore1 {{(pid=67899) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1103.860037] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-4613260e-d1e4-4cb4-8d4e-d3da4a59a597 tempest-ServersTestMultiNic-1718995190 tempest-ServersTestMultiNic-1718995190-project-member] [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] Unregistered the VM {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1103.860311] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-4613260e-d1e4-4cb4-8d4e-d3da4a59a597 tempest-ServersTestMultiNic-1718995190 tempest-ServersTestMultiNic-1718995190-project-member] [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] Deleting contents of the VM from datastore datastore1 {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1103.860514] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-4613260e-d1e4-4cb4-8d4e-d3da4a59a597 tempest-ServersTestMultiNic-1718995190 tempest-ServersTestMultiNic-1718995190-project-member] Deleting the datastore file [datastore1] c29ae4c5-cc93-480c-8d60-96f6acba4346 {{(pid=67899) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1103.860875] env[67899]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-abccacc5-97ee-4fbd-b2f6-71f09c096176 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1103.867410] env[67899]: DEBUG oslo_vmware.api [None req-4613260e-d1e4-4cb4-8d4e-d3da4a59a597 tempest-ServersTestMultiNic-1718995190 tempest-ServersTestMultiNic-1718995190-project-member] Waiting for the task: (returnval){ [ 1103.867410] env[67899]: value = "task-3467920" [ 1103.867410] env[67899]: _type = "Task" [ 1103.867410] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1103.876821] env[67899]: DEBUG oslo_vmware.api [None req-4613260e-d1e4-4cb4-8d4e-d3da4a59a597 tempest-ServersTestMultiNic-1718995190 tempest-ServersTestMultiNic-1718995190-project-member] Task: {'id': task-3467920, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1103.880926] env[67899]: DEBUG oslo_vmware.rw_handles [None req-c90bdb57-6703-4d5e-a52f-aee9c0fb7dc8 tempest-FloatingIPsAssociationNegativeTestJSON-531317357 tempest-FloatingIPsAssociationNegativeTestJSON-531317357-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/035fc6b9-a5a7-4a61-89d4-b052780e04f4/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67899) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1103.940778] env[67899]: DEBUG oslo_vmware.rw_handles [None req-c90bdb57-6703-4d5e-a52f-aee9c0fb7dc8 tempest-FloatingIPsAssociationNegativeTestJSON-531317357 tempest-FloatingIPsAssociationNegativeTestJSON-531317357-project-member] Completed reading data from the image iterator. {{(pid=67899) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1103.940978] env[67899]: DEBUG oslo_vmware.rw_handles [None req-c90bdb57-6703-4d5e-a52f-aee9c0fb7dc8 tempest-FloatingIPsAssociationNegativeTestJSON-531317357 tempest-FloatingIPsAssociationNegativeTestJSON-531317357-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/035fc6b9-a5a7-4a61-89d4-b052780e04f4/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67899) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1104.379081] env[67899]: DEBUG oslo_vmware.api [None req-4613260e-d1e4-4cb4-8d4e-d3da4a59a597 tempest-ServersTestMultiNic-1718995190 tempest-ServersTestMultiNic-1718995190-project-member] Task: {'id': task-3467920, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.084976} completed successfully. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1104.379437] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-4613260e-d1e4-4cb4-8d4e-d3da4a59a597 tempest-ServersTestMultiNic-1718995190 tempest-ServersTestMultiNic-1718995190-project-member] Deleted the datastore file {{(pid=67899) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1104.379713] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-4613260e-d1e4-4cb4-8d4e-d3da4a59a597 tempest-ServersTestMultiNic-1718995190 tempest-ServersTestMultiNic-1718995190-project-member] [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] Deleted contents of the VM from datastore datastore1 {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1104.379819] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-4613260e-d1e4-4cb4-8d4e-d3da4a59a597 tempest-ServersTestMultiNic-1718995190 tempest-ServersTestMultiNic-1718995190-project-member] [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] Instance destroyed {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1104.379969] env[67899]: INFO nova.compute.manager [None req-4613260e-d1e4-4cb4-8d4e-d3da4a59a597 tempest-ServersTestMultiNic-1718995190 tempest-ServersTestMultiNic-1718995190-project-member] [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] Took 0.68 seconds to destroy the instance on the hypervisor. 
[ 1104.382088] env[67899]: DEBUG nova.compute.claims [None req-4613260e-d1e4-4cb4-8d4e-d3da4a59a597 tempest-ServersTestMultiNic-1718995190 tempest-ServersTestMultiNic-1718995190-project-member] [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] Aborting claim: {{(pid=67899) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1104.383161] env[67899]: DEBUG oslo_concurrency.lockutils [None req-4613260e-d1e4-4cb4-8d4e-d3da4a59a597 tempest-ServersTestMultiNic-1718995190 tempest-ServersTestMultiNic-1718995190-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1104.383161] env[67899]: DEBUG oslo_concurrency.lockutils [None req-4613260e-d1e4-4cb4-8d4e-d3da4a59a597 tempest-ServersTestMultiNic-1718995190 tempest-ServersTestMultiNic-1718995190-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1104.756035] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a2417240-f36c-4689-bf8d-0188aba7e36d {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1104.762726] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c0940ce1-c961-45bf-b554-7457125b322d {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1104.801287] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d5a4ed8d-27ae-4242-a337-02dd28865312 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1104.808699] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-62a8abb4-f992-417a-abb3-dbb02e3ab29b {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1104.822465] env[67899]: DEBUG nova.compute.provider_tree [None req-4613260e-d1e4-4cb4-8d4e-d3da4a59a597 tempest-ServersTestMultiNic-1718995190 tempest-ServersTestMultiNic-1718995190-project-member] Inventory has not changed in ProviderTree for provider: fffa0b42-f65d-4394-a98c-0df038b9ed4b {{(pid=67899) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1104.831225] env[67899]: DEBUG nova.scheduler.client.report [None req-4613260e-d1e4-4cb4-8d4e-d3da4a59a597 tempest-ServersTestMultiNic-1718995190 tempest-ServersTestMultiNic-1718995190-project-member] Inventory has not changed for provider fffa0b42-f65d-4394-a98c-0df038b9ed4b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67899) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1104.846995] env[67899]: DEBUG oslo_concurrency.lockutils [None req-4613260e-d1e4-4cb4-8d4e-d3da4a59a597 tempest-ServersTestMultiNic-1718995190 
tempest-ServersTestMultiNic-1718995190-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.464s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1104.847558] env[67899]: ERROR nova.compute.manager [None req-4613260e-d1e4-4cb4-8d4e-d3da4a59a597 tempest-ServersTestMultiNic-1718995190 tempest-ServersTestMultiNic-1718995190-project-member] [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1104.847558] env[67899]: Faults: ['InvalidArgument'] [ 1104.847558] env[67899]: ERROR nova.compute.manager [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] Traceback (most recent call last): [ 1104.847558] env[67899]: ERROR nova.compute.manager [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1104.847558] env[67899]: ERROR nova.compute.manager [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] self.driver.spawn(context, instance, image_meta, [ 1104.847558] env[67899]: ERROR nova.compute.manager [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1104.847558] env[67899]: ERROR nova.compute.manager [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1104.847558] env[67899]: ERROR nova.compute.manager [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1104.847558] env[67899]: ERROR nova.compute.manager [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] self._fetch_image_if_missing(context, vi) [ 1104.847558] env[67899]: ERROR nova.compute.manager [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1104.847558] env[67899]: ERROR nova.compute.manager [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] image_cache(vi, tmp_image_ds_loc) [ 1104.847558] env[67899]: ERROR nova.compute.manager [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1104.847558] env[67899]: ERROR nova.compute.manager [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] vm_util.copy_virtual_disk( [ 1104.847558] env[67899]: ERROR nova.compute.manager [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1104.847558] env[67899]: ERROR nova.compute.manager [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] session._wait_for_task(vmdk_copy_task) [ 1104.847558] env[67899]: ERROR nova.compute.manager [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1104.847558] env[67899]: ERROR nova.compute.manager [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] return self.wait_for_task(task_ref) [ 1104.847558] env[67899]: ERROR nova.compute.manager [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1104.847558] env[67899]: ERROR nova.compute.manager [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] return evt.wait() [ 1104.847558] env[67899]: ERROR nova.compute.manager [instance: 
c29ae4c5-cc93-480c-8d60-96f6acba4346] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1104.847558] env[67899]: ERROR nova.compute.manager [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] result = hub.switch() [ 1104.847558] env[67899]: ERROR nova.compute.manager [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1104.847558] env[67899]: ERROR nova.compute.manager [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] return self.greenlet.switch() [ 1104.847558] env[67899]: ERROR nova.compute.manager [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1104.847558] env[67899]: ERROR nova.compute.manager [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] self.f(*self.args, **self.kw) [ 1104.847558] env[67899]: ERROR nova.compute.manager [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1104.847558] env[67899]: ERROR nova.compute.manager [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] raise exceptions.translate_fault(task_info.error) [ 1104.847558] env[67899]: ERROR nova.compute.manager [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1104.847558] env[67899]: ERROR nova.compute.manager [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] Faults: ['InvalidArgument'] [ 1104.847558] env[67899]: ERROR nova.compute.manager [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] [ 1104.848398] env[67899]: DEBUG nova.compute.utils [None req-4613260e-d1e4-4cb4-8d4e-d3da4a59a597 tempest-ServersTestMultiNic-1718995190 tempest-ServersTestMultiNic-1718995190-project-member] [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] VimFaultException {{(pid=67899) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1104.849643] env[67899]: DEBUG nova.compute.manager [None req-4613260e-d1e4-4cb4-8d4e-d3da4a59a597 tempest-ServersTestMultiNic-1718995190 tempest-ServersTestMultiNic-1718995190-project-member] [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] Build of instance c29ae4c5-cc93-480c-8d60-96f6acba4346 was re-scheduled: A specified parameter was not correct: fileType [ 1104.849643] env[67899]: Faults: ['InvalidArgument'] {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1104.850007] env[67899]: DEBUG nova.compute.manager [None req-4613260e-d1e4-4cb4-8d4e-d3da4a59a597 tempest-ServersTestMultiNic-1718995190 tempest-ServersTestMultiNic-1718995190-project-member] [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] Unplugging VIFs for instance {{(pid=67899) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1104.850185] env[67899]: DEBUG nova.compute.manager [None req-4613260e-d1e4-4cb4-8d4e-d3da4a59a597 tempest-ServersTestMultiNic-1718995190 tempest-ServersTestMultiNic-1718995190-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67899) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1104.850348] env[67899]: DEBUG nova.compute.manager [None req-4613260e-d1e4-4cb4-8d4e-d3da4a59a597 tempest-ServersTestMultiNic-1718995190 tempest-ServersTestMultiNic-1718995190-project-member] [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] Deallocating network for instance {{(pid=67899) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1104.850510] env[67899]: DEBUG nova.network.neutron [None req-4613260e-d1e4-4cb4-8d4e-d3da4a59a597 tempest-ServersTestMultiNic-1718995190 tempest-ServersTestMultiNic-1718995190-project-member] [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] deallocate_for_instance() {{(pid=67899) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1105.755917] env[67899]: DEBUG nova.network.neutron [None req-4613260e-d1e4-4cb4-8d4e-d3da4a59a597 tempest-ServersTestMultiNic-1718995190 tempest-ServersTestMultiNic-1718995190-project-member] [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] Updating instance_info_cache with network_info: [] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1105.768851] env[67899]: INFO nova.compute.manager [None req-4613260e-d1e4-4cb4-8d4e-d3da4a59a597 tempest-ServersTestMultiNic-1718995190 tempest-ServersTestMultiNic-1718995190-project-member] [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] Took 0.92 seconds to deallocate network for instance. [ 1105.889015] env[67899]: INFO nova.scheduler.client.report [None req-4613260e-d1e4-4cb4-8d4e-d3da4a59a597 tempest-ServersTestMultiNic-1718995190 tempest-ServersTestMultiNic-1718995190-project-member] Deleted allocations for instance c29ae4c5-cc93-480c-8d60-96f6acba4346 [ 1105.918700] env[67899]: DEBUG oslo_concurrency.lockutils [None req-4613260e-d1e4-4cb4-8d4e-d3da4a59a597 tempest-ServersTestMultiNic-1718995190 tempest-ServersTestMultiNic-1718995190-project-member] Lock "c29ae4c5-cc93-480c-8d60-96f6acba4346" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 480.977s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1105.920062] env[67899]: DEBUG oslo_concurrency.lockutils [None req-f8edce37-2ed9-44dd-a043-c9d900e4638c tempest-ServersTestMultiNic-1718995190 tempest-ServersTestMultiNic-1718995190-project-member] Lock "c29ae4c5-cc93-480c-8d60-96f6acba4346" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 284.542s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1105.920316] env[67899]: DEBUG oslo_concurrency.lockutils [None req-f8edce37-2ed9-44dd-a043-c9d900e4638c tempest-ServersTestMultiNic-1718995190 tempest-ServersTestMultiNic-1718995190-project-member] Acquiring lock "c29ae4c5-cc93-480c-8d60-96f6acba4346-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1105.920521] env[67899]: DEBUG oslo_concurrency.lockutils [None req-f8edce37-2ed9-44dd-a043-c9d900e4638c tempest-ServersTestMultiNic-1718995190 tempest-ServersTestMultiNic-1718995190-project-member] Lock "c29ae4c5-cc93-480c-8d60-96f6acba4346-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67899) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1105.920684] env[67899]: DEBUG oslo_concurrency.lockutils [None req-f8edce37-2ed9-44dd-a043-c9d900e4638c tempest-ServersTestMultiNic-1718995190 tempest-ServersTestMultiNic-1718995190-project-member] Lock "c29ae4c5-cc93-480c-8d60-96f6acba4346-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1105.923177] env[67899]: INFO nova.compute.manager [None req-f8edce37-2ed9-44dd-a043-c9d900e4638c tempest-ServersTestMultiNic-1718995190 tempest-ServersTestMultiNic-1718995190-project-member] [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] Terminating instance [ 1105.925673] env[67899]: DEBUG nova.compute.manager [None req-f8edce37-2ed9-44dd-a043-c9d900e4638c tempest-ServersTestMultiNic-1718995190 tempest-ServersTestMultiNic-1718995190-project-member] [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] Start destroying the instance on the hypervisor. {{(pid=67899) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1105.925866] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-f8edce37-2ed9-44dd-a043-c9d900e4638c tempest-ServersTestMultiNic-1718995190 tempest-ServersTestMultiNic-1718995190-project-member] [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] Destroying instance {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1105.926133] env[67899]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-bcfb4b0a-d153-4790-b0f7-6faa9906d83d {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1105.936715] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d3c95b2e-ffba-4e1a-b4ec-f891eca062b9 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1105.951505] env[67899]: DEBUG nova.compute.manager [None req-240bd615-613d-43bb-8393-08313ba6e663 tempest-ServerActionsTestJSON-1261190421 tempest-ServerActionsTestJSON-1261190421-project-member] [instance: cce79170-e329-4d7a-ab2d-fa6605068897] Starting instance... {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1105.974701] env[67899]: WARNING nova.virt.vmwareapi.vmops [None req-f8edce37-2ed9-44dd-a043-c9d900e4638c tempest-ServersTestMultiNic-1718995190 tempest-ServersTestMultiNic-1718995190-project-member] [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance c29ae4c5-cc93-480c-8d60-96f6acba4346 could not be found. [ 1105.974913] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-f8edce37-2ed9-44dd-a043-c9d900e4638c tempest-ServersTestMultiNic-1718995190 tempest-ServersTestMultiNic-1718995190-project-member] [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] Instance destroyed {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1105.975405] env[67899]: INFO nova.compute.manager [None req-f8edce37-2ed9-44dd-a043-c9d900e4638c tempest-ServersTestMultiNic-1718995190 tempest-ServersTestMultiNic-1718995190-project-member] [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] Took 0.05 seconds to destroy the instance on the hypervisor. 
[ 1105.975766] env[67899]: DEBUG oslo.service.loopingcall [None req-f8edce37-2ed9-44dd-a043-c9d900e4638c tempest-ServersTestMultiNic-1718995190 tempest-ServersTestMultiNic-1718995190-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=67899) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1105.976042] env[67899]: DEBUG nova.compute.manager [-] [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] Deallocating network for instance {{(pid=67899) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1105.976123] env[67899]: DEBUG nova.network.neutron [-] [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] deallocate_for_instance() {{(pid=67899) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1105.990594] env[67899]: DEBUG nova.compute.manager [None req-240bd615-613d-43bb-8393-08313ba6e663 tempest-ServerActionsTestJSON-1261190421 tempest-ServerActionsTestJSON-1261190421-project-member] [instance: cce79170-e329-4d7a-ab2d-fa6605068897] Instance disappeared before build. {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1105.999969] env[67899]: DEBUG nova.network.neutron [-] [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] Updating instance_info_cache with network_info: [] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1106.010730] env[67899]: INFO nova.compute.manager [-] [instance: c29ae4c5-cc93-480c-8d60-96f6acba4346] Took 0.03 seconds to deallocate network for instance. [ 1106.016403] env[67899]: DEBUG oslo_concurrency.lockutils [None req-240bd615-613d-43bb-8393-08313ba6e663 tempest-ServerActionsTestJSON-1261190421 tempest-ServerActionsTestJSON-1261190421-project-member] Lock "cce79170-e329-4d7a-ab2d-fa6605068897" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 244.949s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1106.030820] env[67899]: DEBUG nova.compute.manager [None req-b670e61a-c4b6-468d-bb53-a643f6c18318 tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 4bd3cb98-1745-4c1a-8670-9849f70eb554] Starting instance... {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1106.063624] env[67899]: DEBUG nova.compute.manager [None req-b670e61a-c4b6-468d-bb53-a643f6c18318 tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 4bd3cb98-1745-4c1a-8670-9849f70eb554] Instance disappeared before build. {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1106.089136] env[67899]: DEBUG oslo_concurrency.lockutils [None req-b670e61a-c4b6-468d-bb53-a643f6c18318 tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Lock "4bd3cb98-1745-4c1a-8670-9849f70eb554" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 242.676s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1106.099328] env[67899]: DEBUG nova.compute.manager [None req-de06d2c0-f51d-4ed2-a5ca-e6993e42b706 tempest-ServerTagsTestJSON-4885625 tempest-ServerTagsTestJSON-4885625-project-member] [instance: 0bde0bc7-8f34-4941-85f0-44fe5c67e398] Starting instance... 
{{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1106.124893] env[67899]: DEBUG oslo_concurrency.lockutils [None req-f8edce37-2ed9-44dd-a043-c9d900e4638c tempest-ServersTestMultiNic-1718995190 tempest-ServersTestMultiNic-1718995190-project-member] Lock "c29ae4c5-cc93-480c-8d60-96f6acba4346" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.205s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1106.128321] env[67899]: DEBUG nova.compute.manager [None req-de06d2c0-f51d-4ed2-a5ca-e6993e42b706 tempest-ServerTagsTestJSON-4885625 tempest-ServerTagsTestJSON-4885625-project-member] [instance: 0bde0bc7-8f34-4941-85f0-44fe5c67e398] Instance disappeared before build. {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1106.149010] env[67899]: DEBUG oslo_concurrency.lockutils [None req-de06d2c0-f51d-4ed2-a5ca-e6993e42b706 tempest-ServerTagsTestJSON-4885625 tempest-ServerTagsTestJSON-4885625-project-member] Lock "0bde0bc7-8f34-4941-85f0-44fe5c67e398" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 240.532s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1106.158782] env[67899]: DEBUG nova.compute.manager [None req-a6798107-f863-4122-88f3-719cf462c07a tempest-VolumesAdminNegativeTest-1718927231 tempest-VolumesAdminNegativeTest-1718927231-project-member] [instance: b79e6007-10ac-4afe-a666-edef64685b22] Starting instance... {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1106.183266] env[67899]: DEBUG nova.compute.manager [None req-a6798107-f863-4122-88f3-719cf462c07a tempest-VolumesAdminNegativeTest-1718927231 tempest-VolumesAdminNegativeTest-1718927231-project-member] [instance: b79e6007-10ac-4afe-a666-edef64685b22] Instance disappeared before build. {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1106.204960] env[67899]: DEBUG oslo_concurrency.lockutils [None req-a6798107-f863-4122-88f3-719cf462c07a tempest-VolumesAdminNegativeTest-1718927231 tempest-VolumesAdminNegativeTest-1718927231-project-member] Lock "b79e6007-10ac-4afe-a666-edef64685b22" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 237.580s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1106.218140] env[67899]: DEBUG nova.compute.manager [None req-7ec9887f-5049-4efc-a5b7-b56947cc8fb8 tempest-AttachVolumeNegativeTest-1960469577 tempest-AttachVolumeNegativeTest-1960469577-project-member] [instance: b9143ce6-0592-4cff-a2a1-64874734b214] Starting instance... {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1106.243700] env[67899]: DEBUG nova.compute.manager [None req-7ec9887f-5049-4efc-a5b7-b56947cc8fb8 tempest-AttachVolumeNegativeTest-1960469577 tempest-AttachVolumeNegativeTest-1960469577-project-member] [instance: b9143ce6-0592-4cff-a2a1-64874734b214] Instance disappeared before build. 
{{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1106.268703] env[67899]: DEBUG oslo_concurrency.lockutils [None req-7ec9887f-5049-4efc-a5b7-b56947cc8fb8 tempest-AttachVolumeNegativeTest-1960469577 tempest-AttachVolumeNegativeTest-1960469577-project-member] Lock "b9143ce6-0592-4cff-a2a1-64874734b214" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 233.803s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1106.278204] env[67899]: DEBUG nova.compute.manager [None req-66ac858c-b6a2-41dc-8464-b4d137e56bbf tempest-ServersTestJSON-1082368249 tempest-ServersTestJSON-1082368249-project-member] [instance: 868ae015-d365-4a42-8f5d-72faa796fa37] Starting instance... {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1106.303693] env[67899]: DEBUG nova.compute.manager [None req-66ac858c-b6a2-41dc-8464-b4d137e56bbf tempest-ServersTestJSON-1082368249 tempest-ServersTestJSON-1082368249-project-member] [instance: 868ae015-d365-4a42-8f5d-72faa796fa37] Instance disappeared before build. {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1106.323092] env[67899]: DEBUG oslo_concurrency.lockutils [None req-66ac858c-b6a2-41dc-8464-b4d137e56bbf tempest-ServersTestJSON-1082368249 tempest-ServersTestJSON-1082368249-project-member] Lock "868ae015-d365-4a42-8f5d-72faa796fa37" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 219.764s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1106.332034] env[67899]: DEBUG nova.compute.manager [None req-bd38433a-6510-4fab-ad5c-04a8ac1f2888 tempest-ServerRescueTestJSONUnderV235-832255538 tempest-ServerRescueTestJSONUnderV235-832255538-project-member] [instance: a31cf212-7d4e-4f1c-b494-6b9739b2ef95] Starting instance... {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1106.354500] env[67899]: DEBUG nova.compute.manager [None req-bd38433a-6510-4fab-ad5c-04a8ac1f2888 tempest-ServerRescueTestJSONUnderV235-832255538 tempest-ServerRescueTestJSONUnderV235-832255538-project-member] [instance: a31cf212-7d4e-4f1c-b494-6b9739b2ef95] Instance disappeared before build. {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1106.377627] env[67899]: DEBUG oslo_concurrency.lockutils [None req-bd38433a-6510-4fab-ad5c-04a8ac1f2888 tempest-ServerRescueTestJSONUnderV235-832255538 tempest-ServerRescueTestJSONUnderV235-832255538-project-member] Lock "a31cf212-7d4e-4f1c-b494-6b9739b2ef95" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 218.698s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1106.389119] env[67899]: DEBUG nova.compute.manager [None req-f87ab63a-06ea-4a52-b5fe-661a784a2b15 tempest-AttachVolumeShelveTestJSON-10446956 tempest-AttachVolumeShelveTestJSON-10446956-project-member] [instance: cb9d29cb-20ee-4875-b993-49cafed344d4] Starting instance... 
{{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1106.416105] env[67899]: DEBUG nova.compute.manager [None req-f87ab63a-06ea-4a52-b5fe-661a784a2b15 tempest-AttachVolumeShelveTestJSON-10446956 tempest-AttachVolumeShelveTestJSON-10446956-project-member] [instance: cb9d29cb-20ee-4875-b993-49cafed344d4] Instance disappeared before build. {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1106.438976] env[67899]: DEBUG oslo_concurrency.lockutils [None req-f87ab63a-06ea-4a52-b5fe-661a784a2b15 tempest-AttachVolumeShelveTestJSON-10446956 tempest-AttachVolumeShelveTestJSON-10446956-project-member] Lock "cb9d29cb-20ee-4875-b993-49cafed344d4" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 209.836s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1106.447603] env[67899]: DEBUG nova.compute.manager [None req-f35050ea-7d61-4666-8771-22e4ef0303e0 tempest-ServersNegativeTestMultiTenantJSON-1576151238 tempest-ServersNegativeTestMultiTenantJSON-1576151238-project-member] [instance: eb285233-ef68-4426-827f-3320abe98cac] Starting instance... {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1106.471954] env[67899]: DEBUG nova.compute.manager [None req-f35050ea-7d61-4666-8771-22e4ef0303e0 tempest-ServersNegativeTestMultiTenantJSON-1576151238 tempest-ServersNegativeTestMultiTenantJSON-1576151238-project-member] [instance: eb285233-ef68-4426-827f-3320abe98cac] Instance disappeared before build. {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1106.492754] env[67899]: DEBUG oslo_concurrency.lockutils [None req-f35050ea-7d61-4666-8771-22e4ef0303e0 tempest-ServersNegativeTestMultiTenantJSON-1576151238 tempest-ServersNegativeTestMultiTenantJSON-1576151238-project-member] Lock "eb285233-ef68-4426-827f-3320abe98cac" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 206.176s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1106.500704] env[67899]: DEBUG nova.compute.manager [None req-817f6f80-1cb7-455c-81df-97b9e82bf3f4 tempest-ServerAddressesNegativeTestJSON-374254023 tempest-ServerAddressesNegativeTestJSON-374254023-project-member] [instance: 6c4977f7-c53d-4c96-9028-86d7561f0d0d] Starting instance... 
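Annotation: before this build proceeds, the resource tracker takes the "compute_resources" lock and claims VCPU/RAM/disk against the provider inventory reported just below (VCPU total=48 with allocation_ratio=4.0, MEMORY_MB total=196590 with reserved=512). Placement-style effective capacity per resource class is (total - reserved) * allocation_ratio; a small worked example with those numbers (a sketch, not the Placement code itself):

    def schedulable_capacity(inventory):
        # Effective capacity per resource class: (total - reserved) * allocation_ratio.
        return {rc: (inv["total"] - inv["reserved"]) * inv["allocation_ratio"]
                for rc, inv in inventory.items()}

    inventory = {
        "VCPU": {"total": 48, "reserved": 0, "allocation_ratio": 4.0},
        "MEMORY_MB": {"total": 196590, "reserved": 512, "allocation_ratio": 1.0},
        "DISK_GB": {"total": 400, "reserved": 0, "allocation_ratio": 1.0},
    }
    print(schedulable_capacity(inventory))
    # {'VCPU': 192.0, 'MEMORY_MB': 196078.0, 'DISK_GB': 400.0}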
{{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1106.548402] env[67899]: DEBUG oslo_concurrency.lockutils [None req-817f6f80-1cb7-455c-81df-97b9e82bf3f4 tempest-ServerAddressesNegativeTestJSON-374254023 tempest-ServerAddressesNegativeTestJSON-374254023-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1106.548656] env[67899]: DEBUG oslo_concurrency.lockutils [None req-817f6f80-1cb7-455c-81df-97b9e82bf3f4 tempest-ServerAddressesNegativeTestJSON-374254023 tempest-ServerAddressesNegativeTestJSON-374254023-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1106.550124] env[67899]: INFO nova.compute.claims [None req-817f6f80-1cb7-455c-81df-97b9e82bf3f4 tempest-ServerAddressesNegativeTestJSON-374254023 tempest-ServerAddressesNegativeTestJSON-374254023-project-member] [instance: 6c4977f7-c53d-4c96-9028-86d7561f0d0d] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1106.909850] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3689b808-c043-4c77-8b3b-a204f4545f64 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1106.917735] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cbbb5373-6e52-4a29-ad84-e61e4103dcd0 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1106.947074] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8db6dffa-d590-4dc5-a740-a495d894bbb7 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1106.954288] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-88aa665c-117c-45a8-8548-184a052a4fbd {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1106.966917] env[67899]: DEBUG nova.compute.provider_tree [None req-817f6f80-1cb7-455c-81df-97b9e82bf3f4 tempest-ServerAddressesNegativeTestJSON-374254023 tempest-ServerAddressesNegativeTestJSON-374254023-project-member] Inventory has not changed in ProviderTree for provider: fffa0b42-f65d-4394-a98c-0df038b9ed4b {{(pid=67899) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1106.975405] env[67899]: DEBUG nova.scheduler.client.report [None req-817f6f80-1cb7-455c-81df-97b9e82bf3f4 tempest-ServerAddressesNegativeTestJSON-374254023 tempest-ServerAddressesNegativeTestJSON-374254023-project-member] Inventory has not changed for provider fffa0b42-f65d-4394-a98c-0df038b9ed4b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67899) set_inventory_for_provider 
/opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1106.990402] env[67899]: DEBUG oslo_concurrency.lockutils [None req-817f6f80-1cb7-455c-81df-97b9e82bf3f4 tempest-ServerAddressesNegativeTestJSON-374254023 tempest-ServerAddressesNegativeTestJSON-374254023-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.441s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1106.990402] env[67899]: DEBUG nova.compute.manager [None req-817f6f80-1cb7-455c-81df-97b9e82bf3f4 tempest-ServerAddressesNegativeTestJSON-374254023 tempest-ServerAddressesNegativeTestJSON-374254023-project-member] [instance: 6c4977f7-c53d-4c96-9028-86d7561f0d0d] Start building networks asynchronously for instance. {{(pid=67899) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1107.024080] env[67899]: DEBUG nova.compute.utils [None req-817f6f80-1cb7-455c-81df-97b9e82bf3f4 tempest-ServerAddressesNegativeTestJSON-374254023 tempest-ServerAddressesNegativeTestJSON-374254023-project-member] Using /dev/sd instead of None {{(pid=67899) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1107.025117] env[67899]: DEBUG nova.compute.manager [None req-817f6f80-1cb7-455c-81df-97b9e82bf3f4 tempest-ServerAddressesNegativeTestJSON-374254023 tempest-ServerAddressesNegativeTestJSON-374254023-project-member] [instance: 6c4977f7-c53d-4c96-9028-86d7561f0d0d] Allocating IP information in the background. {{(pid=67899) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1107.025292] env[67899]: DEBUG nova.network.neutron [None req-817f6f80-1cb7-455c-81df-97b9e82bf3f4 tempest-ServerAddressesNegativeTestJSON-374254023 tempest-ServerAddressesNegativeTestJSON-374254023-project-member] [instance: 6c4977f7-c53d-4c96-9028-86d7561f0d0d] allocate_for_instance() {{(pid=67899) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1107.033872] env[67899]: DEBUG nova.compute.manager [None req-817f6f80-1cb7-455c-81df-97b9e82bf3f4 tempest-ServerAddressesNegativeTestJSON-374254023 tempest-ServerAddressesNegativeTestJSON-374254023-project-member] [instance: 6c4977f7-c53d-4c96-9028-86d7561f0d0d] Start building block device mappings for instance. {{(pid=67899) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1107.082824] env[67899]: DEBUG nova.policy [None req-817f6f80-1cb7-455c-81df-97b9e82bf3f4 tempest-ServerAddressesNegativeTestJSON-374254023 tempest-ServerAddressesNegativeTestJSON-374254023-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '71ad1a1e89c3445fad1e9ba32079866a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '59dc0a4aca434585a911e3a8db2368ed', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67899) authorize /opt/stack/nova/nova/policy.py:203}} [ 1107.096690] env[67899]: DEBUG nova.compute.manager [None req-817f6f80-1cb7-455c-81df-97b9e82bf3f4 tempest-ServerAddressesNegativeTestJSON-374254023 tempest-ServerAddressesNegativeTestJSON-374254023-project-member] [instance: 6c4977f7-c53d-4c96-9028-86d7561f0d0d] Start spawning the instance on the hypervisor. 
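Annotation: the build overlaps work at this point: "Allocating IP information in the background" runs allocate_for_instance() concurrently while block device mappings are built, and the spawn then blocks on the network result. Nova does this with eventlet greenthreads; a thread-pool sketch of the same overlap (hypothetical helper names):

    from concurrent.futures import ThreadPoolExecutor

    def build_resources(allocate_network, build_block_devices, spawn):
        with ThreadPoolExecutor(max_workers=1) as pool:
            # Start Neutron port allocation in the background...
            nwinfo = pool.submit(allocate_network)
            # ...build block device mappings while it runs...
            bdms = build_block_devices()
            # ...then block on the network result before spawning on the hypervisor.
            spawn(network_info=nwinfo.result(), block_device_info=bdms)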
{{(pid=67899) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1107.124101] env[67899]: DEBUG nova.virt.hardware [None req-817f6f80-1cb7-455c-81df-97b9e82bf3f4 tempest-ServerAddressesNegativeTestJSON-374254023 tempest-ServerAddressesNegativeTestJSON-374254023-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T22:07:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T22:07:13Z,direct_url=,disk_format='vmdk',id=c655a05a-4a40-4b3f-b609-3ba8116ad90f,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c1aaa2970e964d7b86557399120d12c1',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T22:07:14Z,virtual_size=,visibility=), allow threads: False {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1107.124383] env[67899]: DEBUG nova.virt.hardware [None req-817f6f80-1cb7-455c-81df-97b9e82bf3f4 tempest-ServerAddressesNegativeTestJSON-374254023 tempest-ServerAddressesNegativeTestJSON-374254023-project-member] Flavor limits 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1107.124514] env[67899]: DEBUG nova.virt.hardware [None req-817f6f80-1cb7-455c-81df-97b9e82bf3f4 tempest-ServerAddressesNegativeTestJSON-374254023 tempest-ServerAddressesNegativeTestJSON-374254023-project-member] Image limits 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1107.124697] env[67899]: DEBUG nova.virt.hardware [None req-817f6f80-1cb7-455c-81df-97b9e82bf3f4 tempest-ServerAddressesNegativeTestJSON-374254023 tempest-ServerAddressesNegativeTestJSON-374254023-project-member] Flavor pref 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1107.124841] env[67899]: DEBUG nova.virt.hardware [None req-817f6f80-1cb7-455c-81df-97b9e82bf3f4 tempest-ServerAddressesNegativeTestJSON-374254023 tempest-ServerAddressesNegativeTestJSON-374254023-project-member] Image pref 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1107.124984] env[67899]: DEBUG nova.virt.hardware [None req-817f6f80-1cb7-455c-81df-97b9e82bf3f4 tempest-ServerAddressesNegativeTestJSON-374254023 tempest-ServerAddressesNegativeTestJSON-374254023-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1107.125598] env[67899]: DEBUG nova.virt.hardware [None req-817f6f80-1cb7-455c-81df-97b9e82bf3f4 tempest-ServerAddressesNegativeTestJSON-374254023 tempest-ServerAddressesNegativeTestJSON-374254023-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1107.125790] env[67899]: DEBUG nova.virt.hardware [None req-817f6f80-1cb7-455c-81df-97b9e82bf3f4 tempest-ServerAddressesNegativeTestJSON-374254023 tempest-ServerAddressesNegativeTestJSON-374254023-project-member] Build topologies for 1 vcpu(s) 1:1:1 
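Annotation: the hardware negotiation above reduces to enumerating (sockets, cores, threads) triples whose product equals the flavor's vCPU count, bounded by the flavor/image limits (65536 apiece here, i.e. effectively unbounded). For the 1-vCPU m1.nano flavor only 1:1:1 qualifies, matching the "Got 1 possible topologies" result below. An illustrative enumerator (not Nova's exact algorithm):

    import itertools

    def possible_topologies(vcpus, max_sockets, max_cores, max_threads):
        # All factorizations sockets * cores * threads == vcpus within the limits.
        return [(s, c, t)
                for s, c, t in itertools.product(
                    range(1, min(vcpus, max_sockets) + 1),
                    range(1, min(vcpus, max_cores) + 1),
                    range(1, min(vcpus, max_threads) + 1))
                if s * c * t == vcpus]

    print(possible_topologies(1, 65536, 65536, 65536))  # [(1, 1, 1)]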
{{(pid=67899) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1107.125957] env[67899]: DEBUG nova.virt.hardware [None req-817f6f80-1cb7-455c-81df-97b9e82bf3f4 tempest-ServerAddressesNegativeTestJSON-374254023 tempest-ServerAddressesNegativeTestJSON-374254023-project-member] Got 1 possible topologies {{(pid=67899) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1107.126137] env[67899]: DEBUG nova.virt.hardware [None req-817f6f80-1cb7-455c-81df-97b9e82bf3f4 tempest-ServerAddressesNegativeTestJSON-374254023 tempest-ServerAddressesNegativeTestJSON-374254023-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1107.126306] env[67899]: DEBUG nova.virt.hardware [None req-817f6f80-1cb7-455c-81df-97b9e82bf3f4 tempest-ServerAddressesNegativeTestJSON-374254023 tempest-ServerAddressesNegativeTestJSON-374254023-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1107.127300] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c7b222a4-0e9f-4580-903f-df5e3fa70fd4 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1107.135836] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8d445899-00ef-4666-a3de-66d6ea859ca8 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1107.440941] env[67899]: DEBUG nova.network.neutron [None req-817f6f80-1cb7-455c-81df-97b9e82bf3f4 tempest-ServerAddressesNegativeTestJSON-374254023 tempest-ServerAddressesNegativeTestJSON-374254023-project-member] [instance: 6c4977f7-c53d-4c96-9028-86d7561f0d0d] Successfully created port: a5589e22-a642-4bd5-bc88-8caf84a1b679 {{(pid=67899) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1108.083919] env[67899]: DEBUG nova.network.neutron [None req-817f6f80-1cb7-455c-81df-97b9e82bf3f4 tempest-ServerAddressesNegativeTestJSON-374254023 tempest-ServerAddressesNegativeTestJSON-374254023-project-member] [instance: 6c4977f7-c53d-4c96-9028-86d7561f0d0d] Successfully updated port: a5589e22-a642-4bd5-bc88-8caf84a1b679 {{(pid=67899) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1108.098475] env[67899]: DEBUG oslo_concurrency.lockutils [None req-817f6f80-1cb7-455c-81df-97b9e82bf3f4 tempest-ServerAddressesNegativeTestJSON-374254023 tempest-ServerAddressesNegativeTestJSON-374254023-project-member] Acquiring lock "refresh_cache-6c4977f7-c53d-4c96-9028-86d7561f0d0d" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1108.098652] env[67899]: DEBUG oslo_concurrency.lockutils [None req-817f6f80-1cb7-455c-81df-97b9e82bf3f4 tempest-ServerAddressesNegativeTestJSON-374254023 tempest-ServerAddressesNegativeTestJSON-374254023-project-member] Acquired lock "refresh_cache-6c4977f7-c53d-4c96-9028-86d7561f0d0d" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1108.098803] env[67899]: DEBUG nova.network.neutron [None req-817f6f80-1cb7-455c-81df-97b9e82bf3f4 tempest-ServerAddressesNegativeTestJSON-374254023 tempest-ServerAddressesNegativeTestJSON-374254023-project-member] [instance: 
6c4977f7-c53d-4c96-9028-86d7561f0d0d] Building network info cache for instance {{(pid=67899) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1108.141016] env[67899]: DEBUG nova.network.neutron [None req-817f6f80-1cb7-455c-81df-97b9e82bf3f4 tempest-ServerAddressesNegativeTestJSON-374254023 tempest-ServerAddressesNegativeTestJSON-374254023-project-member] [instance: 6c4977f7-c53d-4c96-9028-86d7561f0d0d] Instance cache missing network info. {{(pid=67899) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1108.365599] env[67899]: DEBUG nova.network.neutron [None req-817f6f80-1cb7-455c-81df-97b9e82bf3f4 tempest-ServerAddressesNegativeTestJSON-374254023 tempest-ServerAddressesNegativeTestJSON-374254023-project-member] [instance: 6c4977f7-c53d-4c96-9028-86d7561f0d0d] Updating instance_info_cache with network_info: [{"id": "a5589e22-a642-4bd5-bc88-8caf84a1b679", "address": "fa:16:3e:be:74:f1", "network": {"id": "6ea17e40-f928-4b50-8bf9-ad7a1e1906b5", "bridge": "br-int", "label": "tempest-ServerAddressesNegativeTestJSON-1625421386-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "59dc0a4aca434585a911e3a8db2368ed", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ccc0e97b-b21d-4557-a4d4-fd7e8f973368", "external-id": "nsx-vlan-transportzone-380", "segmentation_id": 380, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa5589e22-a6", "ovs_interfaceid": "a5589e22-a642-4bd5-bc88-8caf84a1b679", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1108.379836] env[67899]: DEBUG oslo_concurrency.lockutils [None req-817f6f80-1cb7-455c-81df-97b9e82bf3f4 tempest-ServerAddressesNegativeTestJSON-374254023 tempest-ServerAddressesNegativeTestJSON-374254023-project-member] Releasing lock "refresh_cache-6c4977f7-c53d-4c96-9028-86d7561f0d0d" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1108.380208] env[67899]: DEBUG nova.compute.manager [None req-817f6f80-1cb7-455c-81df-97b9e82bf3f4 tempest-ServerAddressesNegativeTestJSON-374254023 tempest-ServerAddressesNegativeTestJSON-374254023-project-member] [instance: 6c4977f7-c53d-4c96-9028-86d7561f0d0d] Instance network_info: |[{"id": "a5589e22-a642-4bd5-bc88-8caf84a1b679", "address": "fa:16:3e:be:74:f1", "network": {"id": "6ea17e40-f928-4b50-8bf9-ad7a1e1906b5", "bridge": "br-int", "label": "tempest-ServerAddressesNegativeTestJSON-1625421386-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "59dc0a4aca434585a911e3a8db2368ed", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": 
"ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ccc0e97b-b21d-4557-a4d4-fd7e8f973368", "external-id": "nsx-vlan-transportzone-380", "segmentation_id": 380, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa5589e22-a6", "ovs_interfaceid": "a5589e22-a642-4bd5-bc88-8caf84a1b679", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67899) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1108.380671] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-817f6f80-1cb7-455c-81df-97b9e82bf3f4 tempest-ServerAddressesNegativeTestJSON-374254023 tempest-ServerAddressesNegativeTestJSON-374254023-project-member] [instance: 6c4977f7-c53d-4c96-9028-86d7561f0d0d] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:be:74:f1', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'ccc0e97b-b21d-4557-a4d4-fd7e8f973368', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'a5589e22-a642-4bd5-bc88-8caf84a1b679', 'vif_model': 'vmxnet3'}] {{(pid=67899) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1108.388441] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [None req-817f6f80-1cb7-455c-81df-97b9e82bf3f4 tempest-ServerAddressesNegativeTestJSON-374254023 tempest-ServerAddressesNegativeTestJSON-374254023-project-member] Creating folder: Project (59dc0a4aca434585a911e3a8db2368ed). Parent ref: group-v692900. {{(pid=67899) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1108.389057] env[67899]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-0a7eef03-0082-43b7-b3ba-009bc9e70276 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1108.400940] env[67899]: INFO nova.virt.vmwareapi.vm_util [None req-817f6f80-1cb7-455c-81df-97b9e82bf3f4 tempest-ServerAddressesNegativeTestJSON-374254023 tempest-ServerAddressesNegativeTestJSON-374254023-project-member] Created folder: Project (59dc0a4aca434585a911e3a8db2368ed) in parent group-v692900. [ 1108.401161] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [None req-817f6f80-1cb7-455c-81df-97b9e82bf3f4 tempest-ServerAddressesNegativeTestJSON-374254023 tempest-ServerAddressesNegativeTestJSON-374254023-project-member] Creating folder: Instances. Parent ref: group-v692966. {{(pid=67899) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1108.401402] env[67899]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-1f772729-2687-43ba-b107-685029f555c9 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1108.410489] env[67899]: INFO nova.virt.vmwareapi.vm_util [None req-817f6f80-1cb7-455c-81df-97b9e82bf3f4 tempest-ServerAddressesNegativeTestJSON-374254023 tempest-ServerAddressesNegativeTestJSON-374254023-project-member] Created folder: Instances in parent group-v692966. [ 1108.410756] env[67899]: DEBUG oslo.service.loopingcall [None req-817f6f80-1cb7-455c-81df-97b9e82bf3f4 tempest-ServerAddressesNegativeTestJSON-374254023 tempest-ServerAddressesNegativeTestJSON-374254023-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=67899) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1108.411034] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 6c4977f7-c53d-4c96-9028-86d7561f0d0d] Creating VM on the ESX host {{(pid=67899) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1108.411198] env[67899]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-1650ae28-c685-4437-b6c1-7246f25c58e2 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1108.430017] env[67899]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1108.430017] env[67899]: value = "task-3467923" [ 1108.430017] env[67899]: _type = "Task" [ 1108.430017] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1108.437646] env[67899]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467923, 'name': CreateVM_Task} progress is 0%. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1108.940208] env[67899]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467923, 'name': CreateVM_Task, 'duration_secs': 0.281812} completed successfully. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1108.940394] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 6c4977f7-c53d-4c96-9028-86d7561f0d0d] Created VM on the ESX host {{(pid=67899) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1108.941043] env[67899]: DEBUG oslo_concurrency.lockutils [None req-817f6f80-1cb7-455c-81df-97b9e82bf3f4 tempest-ServerAddressesNegativeTestJSON-374254023 tempest-ServerAddressesNegativeTestJSON-374254023-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1108.941206] env[67899]: DEBUG oslo_concurrency.lockutils [None req-817f6f80-1cb7-455c-81df-97b9e82bf3f4 tempest-ServerAddressesNegativeTestJSON-374254023 tempest-ServerAddressesNegativeTestJSON-374254023-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1108.941523] env[67899]: DEBUG oslo_concurrency.lockutils [None req-817f6f80-1cb7-455c-81df-97b9e82bf3f4 tempest-ServerAddressesNegativeTestJSON-374254023 tempest-ServerAddressesNegativeTestJSON-374254023-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1108.941771] env[67899]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-2a0192f7-2d49-4445-abce-e17f1af3ac9b {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1108.946158] env[67899]: DEBUG oslo_vmware.api [None req-817f6f80-1cb7-455c-81df-97b9e82bf3f4 tempest-ServerAddressesNegativeTestJSON-374254023 tempest-ServerAddressesNegativeTestJSON-374254023-project-member] Waiting for the task: (returnval){ [ 1108.946158] env[67899]: value = "session[5298859c-589e-10f3-4f70-bf47a7ea371b]521397a2-afc9-e621-15ac-79d9d7c8b0b0" [ 1108.946158] env[67899]: _type = "Task" 
[ 1108.946158] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1108.953462] env[67899]: DEBUG oslo_vmware.api [None req-817f6f80-1cb7-455c-81df-97b9e82bf3f4 tempest-ServerAddressesNegativeTestJSON-374254023 tempest-ServerAddressesNegativeTestJSON-374254023-project-member] Task: {'id': session[5298859c-589e-10f3-4f70-bf47a7ea371b]521397a2-afc9-e621-15ac-79d9d7c8b0b0, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1109.305348] env[67899]: DEBUG nova.compute.manager [req-18f9052e-45fd-4f97-9620-87303870da6f req-deb71de7-f820-43aa-b789-2906c08c2445 service nova] [instance: 6c4977f7-c53d-4c96-9028-86d7561f0d0d] Received event network-vif-plugged-a5589e22-a642-4bd5-bc88-8caf84a1b679 {{(pid=67899) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1109.305622] env[67899]: DEBUG oslo_concurrency.lockutils [req-18f9052e-45fd-4f97-9620-87303870da6f req-deb71de7-f820-43aa-b789-2906c08c2445 service nova] Acquiring lock "6c4977f7-c53d-4c96-9028-86d7561f0d0d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1109.305832] env[67899]: DEBUG oslo_concurrency.lockutils [req-18f9052e-45fd-4f97-9620-87303870da6f req-deb71de7-f820-43aa-b789-2906c08c2445 service nova] Lock "6c4977f7-c53d-4c96-9028-86d7561f0d0d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1109.305918] env[67899]: DEBUG oslo_concurrency.lockutils [req-18f9052e-45fd-4f97-9620-87303870da6f req-deb71de7-f820-43aa-b789-2906c08c2445 service nova] Lock "6c4977f7-c53d-4c96-9028-86d7561f0d0d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1109.306108] env[67899]: DEBUG nova.compute.manager [req-18f9052e-45fd-4f97-9620-87303870da6f req-deb71de7-f820-43aa-b789-2906c08c2445 service nova] [instance: 6c4977f7-c53d-4c96-9028-86d7561f0d0d] No waiting events found dispatching network-vif-plugged-a5589e22-a642-4bd5-bc88-8caf84a1b679 {{(pid=67899) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1109.306273] env[67899]: WARNING nova.compute.manager [req-18f9052e-45fd-4f97-9620-87303870da6f req-deb71de7-f820-43aa-b789-2906c08c2445 service nova] [instance: 6c4977f7-c53d-4c96-9028-86d7561f0d0d] Received unexpected event network-vif-plugged-a5589e22-a642-4bd5-bc88-8caf84a1b679 for instance with vm_state building and task_state spawning. [ 1109.306429] env[67899]: DEBUG nova.compute.manager [req-18f9052e-45fd-4f97-9620-87303870da6f req-deb71de7-f820-43aa-b789-2906c08c2445 service nova] [instance: 6c4977f7-c53d-4c96-9028-86d7561f0d0d] Received event network-changed-a5589e22-a642-4bd5-bc88-8caf84a1b679 {{(pid=67899) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1109.306579] env[67899]: DEBUG nova.compute.manager [req-18f9052e-45fd-4f97-9620-87303870da6f req-deb71de7-f820-43aa-b789-2906c08c2445 service nova] [instance: 6c4977f7-c53d-4c96-9028-86d7561f0d0d] Refreshing instance network info cache due to event network-changed-a5589e22-a642-4bd5-bc88-8caf84a1b679. 
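Annotation: "network-vif-plugged" arrives from Neutron as an external instance event; the compute manager pops a matching registered waiter if the build is blocked on that event, and otherwise logs the "Received unexpected event" WARNING seen above (this build was not waiting, since the port was already active). A hypothetical sketch of that dispatch pattern:

    import threading

    def pop_instance_event(waiting, instance_uuid, event_name):
        # waiting: {instance_uuid: {event_name: threading.Event}} of registered waiters.
        waiter = waiting.get(instance_uuid, {}).pop(event_name, None)
        if waiter is None:
            print(f"Received unexpected event {event_name} for {instance_uuid}")
            return
        waiter.set()  # wake the build thread blocked on this event

    waiting = {"6c4977f7": {"network-vif-plugged-a5589e22": threading.Event()}}
    pop_instance_event(waiting, "6c4977f7", "network-vif-plugged-a5589e22")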
{{(pid=67899) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1109.306755] env[67899]: DEBUG oslo_concurrency.lockutils [req-18f9052e-45fd-4f97-9620-87303870da6f req-deb71de7-f820-43aa-b789-2906c08c2445 service nova] Acquiring lock "refresh_cache-6c4977f7-c53d-4c96-9028-86d7561f0d0d" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1109.306886] env[67899]: DEBUG oslo_concurrency.lockutils [req-18f9052e-45fd-4f97-9620-87303870da6f req-deb71de7-f820-43aa-b789-2906c08c2445 service nova] Acquired lock "refresh_cache-6c4977f7-c53d-4c96-9028-86d7561f0d0d" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1109.307047] env[67899]: DEBUG nova.network.neutron [req-18f9052e-45fd-4f97-9620-87303870da6f req-deb71de7-f820-43aa-b789-2906c08c2445 service nova] [instance: 6c4977f7-c53d-4c96-9028-86d7561f0d0d] Refreshing network info cache for port a5589e22-a642-4bd5-bc88-8caf84a1b679 {{(pid=67899) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1109.456728] env[67899]: DEBUG oslo_concurrency.lockutils [None req-817f6f80-1cb7-455c-81df-97b9e82bf3f4 tempest-ServerAddressesNegativeTestJSON-374254023 tempest-ServerAddressesNegativeTestJSON-374254023-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1109.457274] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-817f6f80-1cb7-455c-81df-97b9e82bf3f4 tempest-ServerAddressesNegativeTestJSON-374254023 tempest-ServerAddressesNegativeTestJSON-374254023-project-member] [instance: 6c4977f7-c53d-4c96-9028-86d7561f0d0d] Processing image c655a05a-4a40-4b3f-b609-3ba8116ad90f {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1109.457274] env[67899]: DEBUG oslo_concurrency.lockutils [None req-817f6f80-1cb7-455c-81df-97b9e82bf3f4 tempest-ServerAddressesNegativeTestJSON-374254023 tempest-ServerAddressesNegativeTestJSON-374254023-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1109.566559] env[67899]: DEBUG nova.network.neutron [req-18f9052e-45fd-4f97-9620-87303870da6f req-deb71de7-f820-43aa-b789-2906c08c2445 service nova] [instance: 6c4977f7-c53d-4c96-9028-86d7561f0d0d] Updated VIF entry in instance network info cache for port a5589e22-a642-4bd5-bc88-8caf84a1b679. 
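Annotation: the refreshed cache entry (dumped in full below) is a list of VIF dicts; most of what a reader needs from it is the MAC, the fixed IPs, and the NSX segmentation ID. A small extraction helper over that structure (field paths taken from the dump, otherwise illustrative):

    def summarize_vif(vif):
        # Walk network -> subnets -> ips to collect the fixed addresses.
        ips = [ip["address"]
               for subnet in vif["network"]["subnets"]
               for ip in subnet["ips"]]
        return vif["address"], ips, vif["details"].get("segmentation_id")

    vif = {"address": "fa:16:3e:be:74:f1",
           "network": {"subnets": [{"ips": [{"address": "192.168.128.10"}]}]},
           "details": {"segmentation_id": 380}}
    print(summarize_vif(vif))  # ('fa:16:3e:be:74:f1', ['192.168.128.10'], 380)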
{{(pid=67899) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1109.566949] env[67899]: DEBUG nova.network.neutron [req-18f9052e-45fd-4f97-9620-87303870da6f req-deb71de7-f820-43aa-b789-2906c08c2445 service nova] [instance: 6c4977f7-c53d-4c96-9028-86d7561f0d0d] Updating instance_info_cache with network_info: [{"id": "a5589e22-a642-4bd5-bc88-8caf84a1b679", "address": "fa:16:3e:be:74:f1", "network": {"id": "6ea17e40-f928-4b50-8bf9-ad7a1e1906b5", "bridge": "br-int", "label": "tempest-ServerAddressesNegativeTestJSON-1625421386-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "59dc0a4aca434585a911e3a8db2368ed", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ccc0e97b-b21d-4557-a4d4-fd7e8f973368", "external-id": "nsx-vlan-transportzone-380", "segmentation_id": 380, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa5589e22-a6", "ovs_interfaceid": "a5589e22-a642-4bd5-bc88-8caf84a1b679", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1109.577598] env[67899]: DEBUG oslo_concurrency.lockutils [req-18f9052e-45fd-4f97-9620-87303870da6f req-deb71de7-f820-43aa-b789-2906c08c2445 service nova] Releasing lock "refresh_cache-6c4977f7-c53d-4c96-9028-86d7561f0d0d" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1114.113840] env[67899]: DEBUG oslo_concurrency.lockutils [None req-5252463b-7de6-4981-a61b-250956fe0a1c tempest-ServerAddressesNegativeTestJSON-374254023 tempest-ServerAddressesNegativeTestJSON-374254023-project-member] Acquiring lock "6c4977f7-c53d-4c96-9028-86d7561f0d0d" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1151.447020] env[67899]: WARNING oslo_vmware.rw_handles [None req-c90bdb57-6703-4d5e-a52f-aee9c0fb7dc8 tempest-FloatingIPsAssociationNegativeTestJSON-531317357 tempest-FloatingIPsAssociationNegativeTestJSON-531317357-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1151.447020] env[67899]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1151.447020] env[67899]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1151.447020] env[67899]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1151.447020] env[67899]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1151.447020] env[67899]: ERROR oslo_vmware.rw_handles response.begin() [ 1151.447020] env[67899]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1151.447020] env[67899]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1151.447020] 
env[67899]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1151.447020] env[67899]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1151.447020] env[67899]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1151.447020] env[67899]: ERROR oslo_vmware.rw_handles [ 1151.447743] env[67899]: DEBUG nova.virt.vmwareapi.images [None req-c90bdb57-6703-4d5e-a52f-aee9c0fb7dc8 tempest-FloatingIPsAssociationNegativeTestJSON-531317357 tempest-FloatingIPsAssociationNegativeTestJSON-531317357-project-member] [instance: 793d6f98-ed1b-4a78-bcd5-cb796441d64b] Downloaded image file data c655a05a-4a40-4b3f-b609-3ba8116ad90f to vmware_temp/035fc6b9-a5a7-4a61-89d4-b052780e04f4/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk on the data store datastore1 {{(pid=67899) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1151.449516] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-c90bdb57-6703-4d5e-a52f-aee9c0fb7dc8 tempest-FloatingIPsAssociationNegativeTestJSON-531317357 tempest-FloatingIPsAssociationNegativeTestJSON-531317357-project-member] [instance: 793d6f98-ed1b-4a78-bcd5-cb796441d64b] Caching image {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1151.449761] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [None req-c90bdb57-6703-4d5e-a52f-aee9c0fb7dc8 tempest-FloatingIPsAssociationNegativeTestJSON-531317357 tempest-FloatingIPsAssociationNegativeTestJSON-531317357-project-member] Copying Virtual Disk [datastore1] vmware_temp/035fc6b9-a5a7-4a61-89d4-b052780e04f4/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk to [datastore1] vmware_temp/035fc6b9-a5a7-4a61-89d4-b052780e04f4/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk {{(pid=67899) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1151.450078] env[67899]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-bbb850fa-6eb4-4726-8152-476f060b0e43 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1151.459563] env[67899]: DEBUG oslo_vmware.api [None req-c90bdb57-6703-4d5e-a52f-aee9c0fb7dc8 tempest-FloatingIPsAssociationNegativeTestJSON-531317357 tempest-FloatingIPsAssociationNegativeTestJSON-531317357-project-member] Waiting for the task: (returnval){ [ 1151.459563] env[67899]: value = "task-3467924" [ 1151.459563] env[67899]: _type = "Task" [ 1151.459563] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1151.467482] env[67899]: DEBUG oslo_vmware.api [None req-c90bdb57-6703-4d5e-a52f-aee9c0fb7dc8 tempest-FloatingIPsAssociationNegativeTestJSON-531317357 tempest-FloatingIPsAssociationNegativeTestJSON-531317357-project-member] Task: {'id': task-3467924, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1151.970152] env[67899]: DEBUG oslo_vmware.exceptions [None req-c90bdb57-6703-4d5e-a52f-aee9c0fb7dc8 tempest-FloatingIPsAssociationNegativeTestJSON-531317357 tempest-FloatingIPsAssociationNegativeTestJSON-531317357-project-member] Fault InvalidArgument not matched. 
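Annotation: when the CopyVirtualDisk_Task fails, oslo.vmware tries to map the named fault to a specific exception class; "Fault InvalidArgument not matched" means no dedicated class is registered, so a generic VimFaultException carrying Faults: ['InvalidArgument'] is raised into Nova's spawn path in the traceback that follows. A sketch of that registry lookup (in the spirit of get_fault_class, not a copy of it):

    class VimFaultException(Exception):
        def __init__(self, fault_list, message):
            super().__init__(message)
            self.fault_list = fault_list

    _FAULT_CLASSES = {}  # faults with dedicated classes would be registered here

    def translate_fault(fault_name, message):
        cls = _FAULT_CLASSES.get(fault_name)
        if cls is None:
            # "Fault <name> not matched" -> fall back to the generic exception.
            return VimFaultException([fault_name], message)
        return cls(message)

    # e.g. raise translate_fault("InvalidArgument",
    #                            "A specified parameter was not correct: fileType")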
{{(pid=67899) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1151.970386] env[67899]: DEBUG oslo_concurrency.lockutils [None req-c90bdb57-6703-4d5e-a52f-aee9c0fb7dc8 tempest-FloatingIPsAssociationNegativeTestJSON-531317357 tempest-FloatingIPsAssociationNegativeTestJSON-531317357-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1151.970935] env[67899]: ERROR nova.compute.manager [None req-c90bdb57-6703-4d5e-a52f-aee9c0fb7dc8 tempest-FloatingIPsAssociationNegativeTestJSON-531317357 tempest-FloatingIPsAssociationNegativeTestJSON-531317357-project-member] [instance: 793d6f98-ed1b-4a78-bcd5-cb796441d64b] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1151.970935] env[67899]: Faults: ['InvalidArgument'] [ 1151.970935] env[67899]: ERROR nova.compute.manager [instance: 793d6f98-ed1b-4a78-bcd5-cb796441d64b] Traceback (most recent call last): [ 1151.970935] env[67899]: ERROR nova.compute.manager [instance: 793d6f98-ed1b-4a78-bcd5-cb796441d64b] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1151.970935] env[67899]: ERROR nova.compute.manager [instance: 793d6f98-ed1b-4a78-bcd5-cb796441d64b] yield resources [ 1151.970935] env[67899]: ERROR nova.compute.manager [instance: 793d6f98-ed1b-4a78-bcd5-cb796441d64b] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1151.970935] env[67899]: ERROR nova.compute.manager [instance: 793d6f98-ed1b-4a78-bcd5-cb796441d64b] self.driver.spawn(context, instance, image_meta, [ 1151.970935] env[67899]: ERROR nova.compute.manager [instance: 793d6f98-ed1b-4a78-bcd5-cb796441d64b] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1151.970935] env[67899]: ERROR nova.compute.manager [instance: 793d6f98-ed1b-4a78-bcd5-cb796441d64b] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1151.970935] env[67899]: ERROR nova.compute.manager [instance: 793d6f98-ed1b-4a78-bcd5-cb796441d64b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1151.970935] env[67899]: ERROR nova.compute.manager [instance: 793d6f98-ed1b-4a78-bcd5-cb796441d64b] self._fetch_image_if_missing(context, vi) [ 1151.970935] env[67899]: ERROR nova.compute.manager [instance: 793d6f98-ed1b-4a78-bcd5-cb796441d64b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1151.970935] env[67899]: ERROR nova.compute.manager [instance: 793d6f98-ed1b-4a78-bcd5-cb796441d64b] image_cache(vi, tmp_image_ds_loc) [ 1151.970935] env[67899]: ERROR nova.compute.manager [instance: 793d6f98-ed1b-4a78-bcd5-cb796441d64b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1151.970935] env[67899]: ERROR nova.compute.manager [instance: 793d6f98-ed1b-4a78-bcd5-cb796441d64b] vm_util.copy_virtual_disk( [ 1151.970935] env[67899]: ERROR nova.compute.manager [instance: 793d6f98-ed1b-4a78-bcd5-cb796441d64b] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1151.970935] env[67899]: ERROR nova.compute.manager [instance: 793d6f98-ed1b-4a78-bcd5-cb796441d64b] session._wait_for_task(vmdk_copy_task) [ 1151.970935] env[67899]: ERROR nova.compute.manager [instance: 
793d6f98-ed1b-4a78-bcd5-cb796441d64b] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1151.970935] env[67899]: ERROR nova.compute.manager [instance: 793d6f98-ed1b-4a78-bcd5-cb796441d64b] return self.wait_for_task(task_ref) [ 1151.970935] env[67899]: ERROR nova.compute.manager [instance: 793d6f98-ed1b-4a78-bcd5-cb796441d64b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1151.970935] env[67899]: ERROR nova.compute.manager [instance: 793d6f98-ed1b-4a78-bcd5-cb796441d64b] return evt.wait() [ 1151.970935] env[67899]: ERROR nova.compute.manager [instance: 793d6f98-ed1b-4a78-bcd5-cb796441d64b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1151.970935] env[67899]: ERROR nova.compute.manager [instance: 793d6f98-ed1b-4a78-bcd5-cb796441d64b] result = hub.switch() [ 1151.970935] env[67899]: ERROR nova.compute.manager [instance: 793d6f98-ed1b-4a78-bcd5-cb796441d64b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1151.970935] env[67899]: ERROR nova.compute.manager [instance: 793d6f98-ed1b-4a78-bcd5-cb796441d64b] return self.greenlet.switch() [ 1151.970935] env[67899]: ERROR nova.compute.manager [instance: 793d6f98-ed1b-4a78-bcd5-cb796441d64b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1151.970935] env[67899]: ERROR nova.compute.manager [instance: 793d6f98-ed1b-4a78-bcd5-cb796441d64b] self.f(*self.args, **self.kw) [ 1151.970935] env[67899]: ERROR nova.compute.manager [instance: 793d6f98-ed1b-4a78-bcd5-cb796441d64b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1151.970935] env[67899]: ERROR nova.compute.manager [instance: 793d6f98-ed1b-4a78-bcd5-cb796441d64b] raise exceptions.translate_fault(task_info.error) [ 1151.970935] env[67899]: ERROR nova.compute.manager [instance: 793d6f98-ed1b-4a78-bcd5-cb796441d64b] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1151.970935] env[67899]: ERROR nova.compute.manager [instance: 793d6f98-ed1b-4a78-bcd5-cb796441d64b] Faults: ['InvalidArgument'] [ 1151.970935] env[67899]: ERROR nova.compute.manager [instance: 793d6f98-ed1b-4a78-bcd5-cb796441d64b] [ 1151.971963] env[67899]: INFO nova.compute.manager [None req-c90bdb57-6703-4d5e-a52f-aee9c0fb7dc8 tempest-FloatingIPsAssociationNegativeTestJSON-531317357 tempest-FloatingIPsAssociationNegativeTestJSON-531317357-project-member] [instance: 793d6f98-ed1b-4a78-bcd5-cb796441d64b] Terminating instance [ 1151.973393] env[67899]: DEBUG oslo_concurrency.lockutils [None req-6aafabe7-8144-45c2-a7ff-cc06a3819f06 tempest-ServersAdmin275Test-1560987778 tempest-ServersAdmin275Test-1560987778-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1151.973393] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-6aafabe7-8144-45c2-a7ff-cc06a3819f06 tempest-ServersAdmin275Test-1560987778 tempest-ServersAdmin275Test-1560987778-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1151.973393] env[67899]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory 
with opID=oslo.vmware-ea11c753-79ef-4c63-9ebc-972e7a44420b {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1151.975584] env[67899]: DEBUG nova.compute.manager [None req-c90bdb57-6703-4d5e-a52f-aee9c0fb7dc8 tempest-FloatingIPsAssociationNegativeTestJSON-531317357 tempest-FloatingIPsAssociationNegativeTestJSON-531317357-project-member] [instance: 793d6f98-ed1b-4a78-bcd5-cb796441d64b] Start destroying the instance on the hypervisor. {{(pid=67899) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1151.975767] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-c90bdb57-6703-4d5e-a52f-aee9c0fb7dc8 tempest-FloatingIPsAssociationNegativeTestJSON-531317357 tempest-FloatingIPsAssociationNegativeTestJSON-531317357-project-member] [instance: 793d6f98-ed1b-4a78-bcd5-cb796441d64b] Destroying instance {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1151.976513] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a8159cf8-f8c1-4c7e-b62c-c28f23a02a01 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1151.982902] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-c90bdb57-6703-4d5e-a52f-aee9c0fb7dc8 tempest-FloatingIPsAssociationNegativeTestJSON-531317357 tempest-FloatingIPsAssociationNegativeTestJSON-531317357-project-member] [instance: 793d6f98-ed1b-4a78-bcd5-cb796441d64b] Unregistering the VM {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1151.983131] env[67899]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-e66193db-c70a-4a3b-a106-477cee20f800 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1151.985282] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-6aafabe7-8144-45c2-a7ff-cc06a3819f06 tempest-ServersAdmin275Test-1560987778 tempest-ServersAdmin275Test-1560987778-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1151.985449] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-6aafabe7-8144-45c2-a7ff-cc06a3819f06 tempest-ServersAdmin275Test-1560987778 tempest-ServersAdmin275Test-1560987778-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67899) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1151.986369] env[67899]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-09d186c3-c304-450e-a1b5-57e1c569dc82 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1151.990918] env[67899]: DEBUG oslo_vmware.api [None req-6aafabe7-8144-45c2-a7ff-cc06a3819f06 tempest-ServersAdmin275Test-1560987778 tempest-ServersAdmin275Test-1560987778-project-member] Waiting for the task: (returnval){ [ 1151.990918] env[67899]: value = "session[5298859c-589e-10f3-4f70-bf47a7ea371b]52110775-12c9-0e8f-32ae-82bc55dd147f" [ 1151.990918] env[67899]: _type = "Task" [ 1151.990918] env[67899]: } to complete. 
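Annotation: meanwhile another build (the ServersAdmin275Test request) has acquired the lock on the cached vmdk path and is preparing its own fetch. That cache lock serializes downloaders: only one request pulls the image into a vmware_temp directory and copies it into devstack-image-cache_base; later holders find the cache populated. The shape of that flow as a hedged sketch (helper names are hypothetical):

    def fetch_image_if_missing(cache_lock, cache_populated, download_to_temp,
                               copy_into_cache):
        # First holder downloads and populates the cache; later holders find
        # the cached vmdk already present and skip straight to using it.
        with cache_lock:
            if not cache_populated():
                tmp_location = download_to_temp()  # vmware_temp/<uuid>/.../tmp-sparse.vmdk
                copy_into_cache(tmp_location)      # devstack-image-cache_base/<image-id>/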
{{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1151.996106] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1152.002035] env[67899]: DEBUG oslo_vmware.api [None req-6aafabe7-8144-45c2-a7ff-cc06a3819f06 tempest-ServersAdmin275Test-1560987778 tempest-ServersAdmin275Test-1560987778-project-member] Task: {'id': session[5298859c-589e-10f3-4f70-bf47a7ea371b]52110775-12c9-0e8f-32ae-82bc55dd147f, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1152.051963] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-c90bdb57-6703-4d5e-a52f-aee9c0fb7dc8 tempest-FloatingIPsAssociationNegativeTestJSON-531317357 tempest-FloatingIPsAssociationNegativeTestJSON-531317357-project-member] [instance: 793d6f98-ed1b-4a78-bcd5-cb796441d64b] Unregistered the VM {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1152.052212] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-c90bdb57-6703-4d5e-a52f-aee9c0fb7dc8 tempest-FloatingIPsAssociationNegativeTestJSON-531317357 tempest-FloatingIPsAssociationNegativeTestJSON-531317357-project-member] [instance: 793d6f98-ed1b-4a78-bcd5-cb796441d64b] Deleting contents of the VM from datastore datastore1 {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1152.052397] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-c90bdb57-6703-4d5e-a52f-aee9c0fb7dc8 tempest-FloatingIPsAssociationNegativeTestJSON-531317357 tempest-FloatingIPsAssociationNegativeTestJSON-531317357-project-member] Deleting the datastore file [datastore1] 793d6f98-ed1b-4a78-bcd5-cb796441d64b {{(pid=67899) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1152.052652] env[67899]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-d90facff-2e53-41e7-9b49-05d32db451f1 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1152.058971] env[67899]: DEBUG oslo_vmware.api [None req-c90bdb57-6703-4d5e-a52f-aee9c0fb7dc8 tempest-FloatingIPsAssociationNegativeTestJSON-531317357 tempest-FloatingIPsAssociationNegativeTestJSON-531317357-project-member] Waiting for the task: (returnval){ [ 1152.058971] env[67899]: value = "task-3467926" [ 1152.058971] env[67899]: _type = "Task" [ 1152.058971] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1152.066539] env[67899]: DEBUG oslo_vmware.api [None req-c90bdb57-6703-4d5e-a52f-aee9c0fb7dc8 tempest-FloatingIPsAssociationNegativeTestJSON-531317357 tempest-FloatingIPsAssociationNegativeTestJSON-531317357-project-member] Task: {'id': task-3467926, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1152.500726] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-6aafabe7-8144-45c2-a7ff-cc06a3819f06 tempest-ServersAdmin275Test-1560987778 tempest-ServersAdmin275Test-1560987778-project-member] [instance: e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf] Preparing fetch location {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1152.501032] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-6aafabe7-8144-45c2-a7ff-cc06a3819f06 tempest-ServersAdmin275Test-1560987778 tempest-ServersAdmin275Test-1560987778-project-member] Creating directory with path [datastore1] vmware_temp/72696996-e193-4217-bcb3-4d0f5a70c1cf/c655a05a-4a40-4b3f-b609-3ba8116ad90f {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1152.501226] env[67899]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-4f5678d4-c719-4358-9b98-8c2d3c723365 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1152.511830] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-6aafabe7-8144-45c2-a7ff-cc06a3819f06 tempest-ServersAdmin275Test-1560987778 tempest-ServersAdmin275Test-1560987778-project-member] Created directory with path [datastore1] vmware_temp/72696996-e193-4217-bcb3-4d0f5a70c1cf/c655a05a-4a40-4b3f-b609-3ba8116ad90f {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1152.512015] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-6aafabe7-8144-45c2-a7ff-cc06a3819f06 tempest-ServersAdmin275Test-1560987778 tempest-ServersAdmin275Test-1560987778-project-member] [instance: e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf] Fetch image to [datastore1] vmware_temp/72696996-e193-4217-bcb3-4d0f5a70c1cf/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1152.512202] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-6aafabe7-8144-45c2-a7ff-cc06a3819f06 tempest-ServersAdmin275Test-1560987778 tempest-ServersAdmin275Test-1560987778-project-member] [instance: e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf] Downloading image file data c655a05a-4a40-4b3f-b609-3ba8116ad90f to [datastore1] vmware_temp/72696996-e193-4217-bcb3-4d0f5a70c1cf/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk on the data store datastore1 {{(pid=67899) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1152.512883] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-35535173-5636-4bc6-89d5-e63c1eef7426 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1152.519014] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5d8ce2f1-a521-4d9b-a7ee-83c9c87f0188 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1152.529234] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d9c218a3-73de-4809-92e8-d3e49a0b9eea {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1152.563549] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3100c3fb-4abc-4cde-ad9f-f600b95dfaac {{(pid=67899) 
request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1152.570747] env[67899]: DEBUG oslo_vmware.api [None req-c90bdb57-6703-4d5e-a52f-aee9c0fb7dc8 tempest-FloatingIPsAssociationNegativeTestJSON-531317357 tempest-FloatingIPsAssociationNegativeTestJSON-531317357-project-member] Task: {'id': task-3467926, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.073154} completed successfully. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1152.572171] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-c90bdb57-6703-4d5e-a52f-aee9c0fb7dc8 tempest-FloatingIPsAssociationNegativeTestJSON-531317357 tempest-FloatingIPsAssociationNegativeTestJSON-531317357-project-member] Deleted the datastore file {{(pid=67899) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1152.572366] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-c90bdb57-6703-4d5e-a52f-aee9c0fb7dc8 tempest-FloatingIPsAssociationNegativeTestJSON-531317357 tempest-FloatingIPsAssociationNegativeTestJSON-531317357-project-member] [instance: 793d6f98-ed1b-4a78-bcd5-cb796441d64b] Deleted contents of the VM from datastore datastore1 {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1152.572536] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-c90bdb57-6703-4d5e-a52f-aee9c0fb7dc8 tempest-FloatingIPsAssociationNegativeTestJSON-531317357 tempest-FloatingIPsAssociationNegativeTestJSON-531317357-project-member] [instance: 793d6f98-ed1b-4a78-bcd5-cb796441d64b] Instance destroyed {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1152.572704] env[67899]: INFO nova.compute.manager [None req-c90bdb57-6703-4d5e-a52f-aee9c0fb7dc8 tempest-FloatingIPsAssociationNegativeTestJSON-531317357 tempest-FloatingIPsAssociationNegativeTestJSON-531317357-project-member] [instance: 793d6f98-ed1b-4a78-bcd5-cb796441d64b] Took 0.60 seconds to destroy the instance on the hypervisor. 
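The records around task-3467926 above show oslo.vmware's task-wait pattern: the caller submits DeleteDatastoreFile_Task, then polls the task object (progress 0% at 1152.066, completed at 1152.570) before continuing. A minimal sketch of that poll loop follows; it is an illustration only, with fetch_task_info as a hypothetical stand-in for the PropertyCollector reads the library actually performs:

    import time

    def wait_for_task(fetch_task_info, interval=0.5):
        # Poll a vCenter-style task until it reaches a terminal state.
        while True:
            info = fetch_task_info()  # hypothetical: returns {'state': ..., 'error': ...}
            if info['state'] == 'success':
                return info
            if info['state'] == 'error':
                # oslo.vmware raises a translated fault (e.g. VimFaultException) here.
                raise RuntimeError(info.get('error', 'task failed'))
            time.sleep(interval)  # the polls above are roughly 0.5s apart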
[ 1152.574450] env[67899]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-0a70aa49-c920-49e5-ba88-0f3f7ad9c583 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1152.576330] env[67899]: DEBUG nova.compute.claims [None req-c90bdb57-6703-4d5e-a52f-aee9c0fb7dc8 tempest-FloatingIPsAssociationNegativeTestJSON-531317357 tempest-FloatingIPsAssociationNegativeTestJSON-531317357-project-member] [instance: 793d6f98-ed1b-4a78-bcd5-cb796441d64b] Aborting claim: {{(pid=67899) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1152.576504] env[67899]: DEBUG oslo_concurrency.lockutils [None req-c90bdb57-6703-4d5e-a52f-aee9c0fb7dc8 tempest-FloatingIPsAssociationNegativeTestJSON-531317357 tempest-FloatingIPsAssociationNegativeTestJSON-531317357-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1152.576712] env[67899]: DEBUG oslo_concurrency.lockutils [None req-c90bdb57-6703-4d5e-a52f-aee9c0fb7dc8 tempest-FloatingIPsAssociationNegativeTestJSON-531317357 tempest-FloatingIPsAssociationNegativeTestJSON-531317357-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1152.597568] env[67899]: DEBUG nova.virt.vmwareapi.images [None req-6aafabe7-8144-45c2-a7ff-cc06a3819f06 tempest-ServersAdmin275Test-1560987778 tempest-ServersAdmin275Test-1560987778-project-member] [instance: e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf] Downloading image file data c655a05a-4a40-4b3f-b609-3ba8116ad90f to the data store datastore1 {{(pid=67899) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1152.663197] env[67899]: DEBUG oslo_vmware.rw_handles [None req-6aafabe7-8144-45c2-a7ff-cc06a3819f06 tempest-ServersAdmin275Test-1560987778 tempest-ServersAdmin275Test-1560987778-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/72696996-e193-4217-bcb3-4d0f5a70c1cf/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67899) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1152.724831] env[67899]: DEBUG oslo_vmware.rw_handles [None req-6aafabe7-8144-45c2-a7ff-cc06a3819f06 tempest-ServersAdmin275Test-1560987778 tempest-ServersAdmin275Test-1560987778-project-member] Completed reading data from the image iterator. {{(pid=67899) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1152.725022] env[67899]: DEBUG oslo_vmware.rw_handles [None req-6aafabe7-8144-45c2-a7ff-cc06a3819f06 tempest-ServersAdmin275Test-1560987778 tempest-ServersAdmin275Test-1560987778-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/72696996-e193-4217-bcb3-4d0f5a70c1cf/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67899) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1153.075991] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ae04d29c-2597-4112-93c8-521061f4a48f {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1153.083634] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d2947ba8-9425-47a2-8b5a-981ec34f36d5 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1153.114250] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a93b39ba-c7e3-4cdb-b8ea-0443569693ba {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1153.121260] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-48572a35-846f-44f0-865d-5b04f3ac7dd9 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1153.134287] env[67899]: DEBUG nova.compute.provider_tree [None req-c90bdb57-6703-4d5e-a52f-aee9c0fb7dc8 tempest-FloatingIPsAssociationNegativeTestJSON-531317357 tempest-FloatingIPsAssociationNegativeTestJSON-531317357-project-member] Inventory has not changed in ProviderTree for provider: fffa0b42-f65d-4394-a98c-0df038b9ed4b {{(pid=67899) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1153.142429] env[67899]: DEBUG nova.scheduler.client.report [None req-c90bdb57-6703-4d5e-a52f-aee9c0fb7dc8 tempest-FloatingIPsAssociationNegativeTestJSON-531317357 tempest-FloatingIPsAssociationNegativeTestJSON-531317357-project-member] Inventory has not changed for provider fffa0b42-f65d-4394-a98c-0df038b9ed4b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67899) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1153.158576] env[67899]: DEBUG oslo_concurrency.lockutils [None req-c90bdb57-6703-4d5e-a52f-aee9c0fb7dc8 tempest-FloatingIPsAssociationNegativeTestJSON-531317357 tempest-FloatingIPsAssociationNegativeTestJSON-531317357-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.582s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1153.159213] env[67899]: ERROR nova.compute.manager [None req-c90bdb57-6703-4d5e-a52f-aee9c0fb7dc8 tempest-FloatingIPsAssociationNegativeTestJSON-531317357 tempest-FloatingIPsAssociationNegativeTestJSON-531317357-project-member] [instance: 793d6f98-ed1b-4a78-bcd5-cb796441d64b] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1153.159213] env[67899]: Faults: ['InvalidArgument'] [ 1153.159213] env[67899]: ERROR nova.compute.manager [instance: 793d6f98-ed1b-4a78-bcd5-cb796441d64b] Traceback (most recent call last): [ 1153.159213] env[67899]: ERROR nova.compute.manager [instance: 
793d6f98-ed1b-4a78-bcd5-cb796441d64b] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1153.159213] env[67899]: ERROR nova.compute.manager [instance: 793d6f98-ed1b-4a78-bcd5-cb796441d64b] self.driver.spawn(context, instance, image_meta, [ 1153.159213] env[67899]: ERROR nova.compute.manager [instance: 793d6f98-ed1b-4a78-bcd5-cb796441d64b] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1153.159213] env[67899]: ERROR nova.compute.manager [instance: 793d6f98-ed1b-4a78-bcd5-cb796441d64b] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1153.159213] env[67899]: ERROR nova.compute.manager [instance: 793d6f98-ed1b-4a78-bcd5-cb796441d64b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1153.159213] env[67899]: ERROR nova.compute.manager [instance: 793d6f98-ed1b-4a78-bcd5-cb796441d64b] self._fetch_image_if_missing(context, vi) [ 1153.159213] env[67899]: ERROR nova.compute.manager [instance: 793d6f98-ed1b-4a78-bcd5-cb796441d64b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1153.159213] env[67899]: ERROR nova.compute.manager [instance: 793d6f98-ed1b-4a78-bcd5-cb796441d64b] image_cache(vi, tmp_image_ds_loc) [ 1153.159213] env[67899]: ERROR nova.compute.manager [instance: 793d6f98-ed1b-4a78-bcd5-cb796441d64b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1153.159213] env[67899]: ERROR nova.compute.manager [instance: 793d6f98-ed1b-4a78-bcd5-cb796441d64b] vm_util.copy_virtual_disk( [ 1153.159213] env[67899]: ERROR nova.compute.manager [instance: 793d6f98-ed1b-4a78-bcd5-cb796441d64b] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1153.159213] env[67899]: ERROR nova.compute.manager [instance: 793d6f98-ed1b-4a78-bcd5-cb796441d64b] session._wait_for_task(vmdk_copy_task) [ 1153.159213] env[67899]: ERROR nova.compute.manager [instance: 793d6f98-ed1b-4a78-bcd5-cb796441d64b] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1153.159213] env[67899]: ERROR nova.compute.manager [instance: 793d6f98-ed1b-4a78-bcd5-cb796441d64b] return self.wait_for_task(task_ref) [ 1153.159213] env[67899]: ERROR nova.compute.manager [instance: 793d6f98-ed1b-4a78-bcd5-cb796441d64b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1153.159213] env[67899]: ERROR nova.compute.manager [instance: 793d6f98-ed1b-4a78-bcd5-cb796441d64b] return evt.wait() [ 1153.159213] env[67899]: ERROR nova.compute.manager [instance: 793d6f98-ed1b-4a78-bcd5-cb796441d64b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1153.159213] env[67899]: ERROR nova.compute.manager [instance: 793d6f98-ed1b-4a78-bcd5-cb796441d64b] result = hub.switch() [ 1153.159213] env[67899]: ERROR nova.compute.manager [instance: 793d6f98-ed1b-4a78-bcd5-cb796441d64b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1153.159213] env[67899]: ERROR nova.compute.manager [instance: 793d6f98-ed1b-4a78-bcd5-cb796441d64b] return self.greenlet.switch() [ 1153.159213] env[67899]: ERROR nova.compute.manager [instance: 793d6f98-ed1b-4a78-bcd5-cb796441d64b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1153.159213] env[67899]: ERROR nova.compute.manager [instance: 793d6f98-ed1b-4a78-bcd5-cb796441d64b] 
self.f(*self.args, **self.kw) [ 1153.159213] env[67899]: ERROR nova.compute.manager [instance: 793d6f98-ed1b-4a78-bcd5-cb796441d64b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1153.159213] env[67899]: ERROR nova.compute.manager [instance: 793d6f98-ed1b-4a78-bcd5-cb796441d64b] raise exceptions.translate_fault(task_info.error) [ 1153.159213] env[67899]: ERROR nova.compute.manager [instance: 793d6f98-ed1b-4a78-bcd5-cb796441d64b] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1153.159213] env[67899]: ERROR nova.compute.manager [instance: 793d6f98-ed1b-4a78-bcd5-cb796441d64b] Faults: ['InvalidArgument'] [ 1153.159213] env[67899]: ERROR nova.compute.manager [instance: 793d6f98-ed1b-4a78-bcd5-cb796441d64b] [ 1153.160295] env[67899]: DEBUG nova.compute.utils [None req-c90bdb57-6703-4d5e-a52f-aee9c0fb7dc8 tempest-FloatingIPsAssociationNegativeTestJSON-531317357 tempest-FloatingIPsAssociationNegativeTestJSON-531317357-project-member] [instance: 793d6f98-ed1b-4a78-bcd5-cb796441d64b] VimFaultException {{(pid=67899) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1153.161196] env[67899]: DEBUG nova.compute.manager [None req-c90bdb57-6703-4d5e-a52f-aee9c0fb7dc8 tempest-FloatingIPsAssociationNegativeTestJSON-531317357 tempest-FloatingIPsAssociationNegativeTestJSON-531317357-project-member] [instance: 793d6f98-ed1b-4a78-bcd5-cb796441d64b] Build of instance 793d6f98-ed1b-4a78-bcd5-cb796441d64b was re-scheduled: A specified parameter was not correct: fileType [ 1153.161196] env[67899]: Faults: ['InvalidArgument'] {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1153.161565] env[67899]: DEBUG nova.compute.manager [None req-c90bdb57-6703-4d5e-a52f-aee9c0fb7dc8 tempest-FloatingIPsAssociationNegativeTestJSON-531317357 tempest-FloatingIPsAssociationNegativeTestJSON-531317357-project-member] [instance: 793d6f98-ed1b-4a78-bcd5-cb796441d64b] Unplugging VIFs for instance {{(pid=67899) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1153.161735] env[67899]: DEBUG nova.compute.manager [None req-c90bdb57-6703-4d5e-a52f-aee9c0fb7dc8 tempest-FloatingIPsAssociationNegativeTestJSON-531317357 tempest-FloatingIPsAssociationNegativeTestJSON-531317357-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67899) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1153.161904] env[67899]: DEBUG nova.compute.manager [None req-c90bdb57-6703-4d5e-a52f-aee9c0fb7dc8 tempest-FloatingIPsAssociationNegativeTestJSON-531317357 tempest-FloatingIPsAssociationNegativeTestJSON-531317357-project-member] [instance: 793d6f98-ed1b-4a78-bcd5-cb796441d64b] Deallocating network for instance {{(pid=67899) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1153.162078] env[67899]: DEBUG nova.network.neutron [None req-c90bdb57-6703-4d5e-a52f-aee9c0fb7dc8 tempest-FloatingIPsAssociationNegativeTestJSON-531317357 tempest-FloatingIPsAssociationNegativeTestJSON-531317357-project-member] [instance: 793d6f98-ed1b-4a78-bcd5-cb796441d64b] deallocate_for_instance() {{(pid=67899) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1153.550506] env[67899]: DEBUG nova.network.neutron [None req-c90bdb57-6703-4d5e-a52f-aee9c0fb7dc8 tempest-FloatingIPsAssociationNegativeTestJSON-531317357 tempest-FloatingIPsAssociationNegativeTestJSON-531317357-project-member] [instance: 793d6f98-ed1b-4a78-bcd5-cb796441d64b] Updating instance_info_cache with network_info: [] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1153.561591] env[67899]: INFO nova.compute.manager [None req-c90bdb57-6703-4d5e-a52f-aee9c0fb7dc8 tempest-FloatingIPsAssociationNegativeTestJSON-531317357 tempest-FloatingIPsAssociationNegativeTestJSON-531317357-project-member] [instance: 793d6f98-ed1b-4a78-bcd5-cb796441d64b] Took 0.40 seconds to deallocate network for instance. [ 1153.656735] env[67899]: INFO nova.scheduler.client.report [None req-c90bdb57-6703-4d5e-a52f-aee9c0fb7dc8 tempest-FloatingIPsAssociationNegativeTestJSON-531317357 tempest-FloatingIPsAssociationNegativeTestJSON-531317357-project-member] Deleted allocations for instance 793d6f98-ed1b-4a78-bcd5-cb796441d64b [ 1153.678610] env[67899]: DEBUG oslo_concurrency.lockutils [None req-c90bdb57-6703-4d5e-a52f-aee9c0fb7dc8 tempest-FloatingIPsAssociationNegativeTestJSON-531317357 tempest-FloatingIPsAssociationNegativeTestJSON-531317357-project-member] Lock "793d6f98-ed1b-4a78-bcd5-cb796441d64b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 523.667s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1153.679830] env[67899]: DEBUG oslo_concurrency.lockutils [None req-d7e85cda-30d5-4fef-98a3-bf4a55ca3836 tempest-FloatingIPsAssociationNegativeTestJSON-531317357 tempest-FloatingIPsAssociationNegativeTestJSON-531317357-project-member] Lock "793d6f98-ed1b-4a78-bcd5-cb796441d64b" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 324.830s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1153.680084] env[67899]: DEBUG oslo_concurrency.lockutils [None req-d7e85cda-30d5-4fef-98a3-bf4a55ca3836 tempest-FloatingIPsAssociationNegativeTestJSON-531317357 tempest-FloatingIPsAssociationNegativeTestJSON-531317357-project-member] Acquiring lock "793d6f98-ed1b-4a78-bcd5-cb796441d64b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1153.680299] env[67899]: DEBUG oslo_concurrency.lockutils [None req-d7e85cda-30d5-4fef-98a3-bf4a55ca3836
tempest-FloatingIPsAssociationNegativeTestJSON-531317357 tempest-FloatingIPsAssociationNegativeTestJSON-531317357-project-member] Lock "793d6f98-ed1b-4a78-bcd5-cb796441d64b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1153.680532] env[67899]: DEBUG oslo_concurrency.lockutils [None req-d7e85cda-30d5-4fef-98a3-bf4a55ca3836 tempest-FloatingIPsAssociationNegativeTestJSON-531317357 tempest-FloatingIPsAssociationNegativeTestJSON-531317357-project-member] Lock "793d6f98-ed1b-4a78-bcd5-cb796441d64b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1153.682558] env[67899]: INFO nova.compute.manager [None req-d7e85cda-30d5-4fef-98a3-bf4a55ca3836 tempest-FloatingIPsAssociationNegativeTestJSON-531317357 tempest-FloatingIPsAssociationNegativeTestJSON-531317357-project-member] [instance: 793d6f98-ed1b-4a78-bcd5-cb796441d64b] Terminating instance [ 1153.684278] env[67899]: DEBUG nova.compute.manager [None req-d7e85cda-30d5-4fef-98a3-bf4a55ca3836 tempest-FloatingIPsAssociationNegativeTestJSON-531317357 tempest-FloatingIPsAssociationNegativeTestJSON-531317357-project-member] [instance: 793d6f98-ed1b-4a78-bcd5-cb796441d64b] Start destroying the instance on the hypervisor. {{(pid=67899) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1153.684473] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-d7e85cda-30d5-4fef-98a3-bf4a55ca3836 tempest-FloatingIPsAssociationNegativeTestJSON-531317357 tempest-FloatingIPsAssociationNegativeTestJSON-531317357-project-member] [instance: 793d6f98-ed1b-4a78-bcd5-cb796441d64b] Destroying instance {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1153.684953] env[67899]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-b06c9083-d7a1-4b68-9516-442177bcc344 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1153.694160] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bd305ada-ba49-4852-9460-608fe0f2f55f {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1153.706977] env[67899]: DEBUG nova.compute.manager [None req-d00a6b81-c8df-494b-a8a0-988649b40bae tempest-ImagesOneServerTestJSON-886727362 tempest-ImagesOneServerTestJSON-886727362-project-member] [instance: ec826735-4cc4-4847-8750-c5480e62134a] Starting instance... {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1153.729083] env[67899]: WARNING nova.virt.vmwareapi.vmops [None req-d7e85cda-30d5-4fef-98a3-bf4a55ca3836 tempest-FloatingIPsAssociationNegativeTestJSON-531317357 tempest-FloatingIPsAssociationNegativeTestJSON-531317357-project-member] [instance: 793d6f98-ed1b-4a78-bcd5-cb796441d64b] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 793d6f98-ed1b-4a78-bcd5-cb796441d64b could not be found.
[ 1153.729308] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-d7e85cda-30d5-4fef-98a3-bf4a55ca3836 tempest-FloatingIPsAssociationNegativeTestJSON-531317357 tempest-FloatingIPsAssociationNegativeTestJSON-531317357-project-member] [instance: 793d6f98-ed1b-4a78-bcd5-cb796441d64b] Instance destroyed {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1153.729487] env[67899]: INFO nova.compute.manager [None req-d7e85cda-30d5-4fef-98a3-bf4a55ca3836 tempest-FloatingIPsAssociationNegativeTestJSON-531317357 tempest-FloatingIPsAssociationNegativeTestJSON-531317357-project-member] [instance: 793d6f98-ed1b-4a78-bcd5-cb796441d64b] Took 0.05 seconds to destroy the instance on the hypervisor. [ 1153.729727] env[67899]: DEBUG oslo.service.loopingcall [None req-d7e85cda-30d5-4fef-98a3-bf4a55ca3836 tempest-FloatingIPsAssociationNegativeTestJSON-531317357 tempest-FloatingIPsAssociationNegativeTestJSON-531317357-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=67899) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1153.729941] env[67899]: DEBUG nova.compute.manager [-] [instance: 793d6f98-ed1b-4a78-bcd5-cb796441d64b] Deallocating network for instance {{(pid=67899) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1153.730053] env[67899]: DEBUG nova.network.neutron [-] [instance: 793d6f98-ed1b-4a78-bcd5-cb796441d64b] deallocate_for_instance() {{(pid=67899) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1153.756399] env[67899]: DEBUG nova.network.neutron [-] [instance: 793d6f98-ed1b-4a78-bcd5-cb796441d64b] Updating instance_info_cache with network_info: [] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1153.757940] env[67899]: DEBUG oslo_concurrency.lockutils [None req-d00a6b81-c8df-494b-a8a0-988649b40bae tempest-ImagesOneServerTestJSON-886727362 tempest-ImagesOneServerTestJSON-886727362-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1153.758197] env[67899]: DEBUG oslo_concurrency.lockutils [None req-d00a6b81-c8df-494b-a8a0-988649b40bae tempest-ImagesOneServerTestJSON-886727362 tempest-ImagesOneServerTestJSON-886727362-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1153.759621] env[67899]: INFO nova.compute.claims [None req-d00a6b81-c8df-494b-a8a0-988649b40bae tempest-ImagesOneServerTestJSON-886727362 tempest-ImagesOneServerTestJSON-886727362-project-member] [instance: ec826735-4cc4-4847-8750-c5480e62134a] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1153.764011] env[67899]: INFO nova.compute.manager [-] [instance: 793d6f98-ed1b-4a78-bcd5-cb796441d64b] Took 0.03 seconds to deallocate network for instance.
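Note how the terminate path above survives the instance already being gone: vmops logs the InstanceNotFound as a WARNING, still reports "Instance destroyed", and network deallocation proceeds regardless. A sketch of that not-found-tolerant teardown, using illustrative names rather than Nova's actual call signatures:

    class InstanceNotFound(Exception):
        pass

    def destroy(instance_id, find_vm, delete_vm, deallocate_network):
        # Tear down a VM, treating "already gone" as success (idempotent delete).
        try:
            delete_vm(find_vm(instance_id))
        except InstanceNotFound:
            pass  # matches the WARNING above: missing on the backend is not fatal
        deallocate_network(instance_id)  # network is released either way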
[ 1153.857166] env[67899]: DEBUG oslo_concurrency.lockutils [None req-d7e85cda-30d5-4fef-98a3-bf4a55ca3836 tempest-FloatingIPsAssociationNegativeTestJSON-531317357 tempest-FloatingIPsAssociationNegativeTestJSON-531317357-project-member] Lock "793d6f98-ed1b-4a78-bcd5-cb796441d64b" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.176s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1154.104210] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d14aa611-74cc-4582-8ff2-abe1ca8eb1ce {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1154.112129] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8b0d166f-0523-4e20-887a-270b1ea05c8a {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1154.141301] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-24b16862-0df4-406e-b66b-5ffec70827ac {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1154.148016] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d652300f-7ac2-4eb7-8562-8fa98a8f36d6 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1154.160635] env[67899]: DEBUG nova.compute.provider_tree [None req-d00a6b81-c8df-494b-a8a0-988649b40bae tempest-ImagesOneServerTestJSON-886727362 tempest-ImagesOneServerTestJSON-886727362-project-member] Inventory has not changed in ProviderTree for provider: fffa0b42-f65d-4394-a98c-0df038b9ed4b {{(pid=67899) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1154.169282] env[67899]: DEBUG nova.scheduler.client.report [None req-d00a6b81-c8df-494b-a8a0-988649b40bae tempest-ImagesOneServerTestJSON-886727362 tempest-ImagesOneServerTestJSON-886727362-project-member] Inventory has not changed for provider fffa0b42-f65d-4394-a98c-0df038b9ed4b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67899) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1154.183807] env[67899]: DEBUG oslo_concurrency.lockutils [None req-d00a6b81-c8df-494b-a8a0-988649b40bae tempest-ImagesOneServerTestJSON-886727362 tempest-ImagesOneServerTestJSON-886727362-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.425s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1154.184314] env[67899]: DEBUG nova.compute.manager [None req-d00a6b81-c8df-494b-a8a0-988649b40bae tempest-ImagesOneServerTestJSON-886727362 tempest-ImagesOneServerTestJSON-886727362-project-member] [instance: ec826735-4cc4-4847-8750-c5480e62134a] Start building networks asynchronously for instance.
{{(pid=67899) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1154.217075] env[67899]: DEBUG nova.compute.utils [None req-d00a6b81-c8df-494b-a8a0-988649b40bae tempest-ImagesOneServerTestJSON-886727362 tempest-ImagesOneServerTestJSON-886727362-project-member] Using /dev/sd instead of None {{(pid=67899) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1154.218759] env[67899]: DEBUG nova.compute.manager [None req-d00a6b81-c8df-494b-a8a0-988649b40bae tempest-ImagesOneServerTestJSON-886727362 tempest-ImagesOneServerTestJSON-886727362-project-member] [instance: ec826735-4cc4-4847-8750-c5480e62134a] Allocating IP information in the background. {{(pid=67899) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1154.218759] env[67899]: DEBUG nova.network.neutron [None req-d00a6b81-c8df-494b-a8a0-988649b40bae tempest-ImagesOneServerTestJSON-886727362 tempest-ImagesOneServerTestJSON-886727362-project-member] [instance: ec826735-4cc4-4847-8750-c5480e62134a] allocate_for_instance() {{(pid=67899) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1154.227589] env[67899]: DEBUG nova.compute.manager [None req-d00a6b81-c8df-494b-a8a0-988649b40bae tempest-ImagesOneServerTestJSON-886727362 tempest-ImagesOneServerTestJSON-886727362-project-member] [instance: ec826735-4cc4-4847-8750-c5480e62134a] Start building block device mappings for instance. {{(pid=67899) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1154.296298] env[67899]: DEBUG nova.compute.manager [None req-d00a6b81-c8df-494b-a8a0-988649b40bae tempest-ImagesOneServerTestJSON-886727362 tempest-ImagesOneServerTestJSON-886727362-project-member] [instance: ec826735-4cc4-4847-8750-c5480e62134a] Start spawning the instance on the hypervisor. 
{{(pid=67899) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1154.319437] env[67899]: DEBUG nova.policy [None req-d00a6b81-c8df-494b-a8a0-988649b40bae tempest-ImagesOneServerTestJSON-886727362 tempest-ImagesOneServerTestJSON-886727362-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'aa0b7ae2baa0492ca3df7a355556fd6f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '26b28c3471e244298122a1e83eb9e4e4', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67899) authorize /opt/stack/nova/nova/policy.py:203}} [ 1154.322727] env[67899]: DEBUG nova.virt.hardware [None req-d00a6b81-c8df-494b-a8a0-988649b40bae tempest-ImagesOneServerTestJSON-886727362 tempest-ImagesOneServerTestJSON-886727362-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T22:07:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T22:07:13Z,direct_url=,disk_format='vmdk',id=c655a05a-4a40-4b3f-b609-3ba8116ad90f,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c1aaa2970e964d7b86557399120d12c1',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T22:07:14Z,virtual_size=,visibility=), allow threads: False {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1154.322959] env[67899]: DEBUG nova.virt.hardware [None req-d00a6b81-c8df-494b-a8a0-988649b40bae tempest-ImagesOneServerTestJSON-886727362 tempest-ImagesOneServerTestJSON-886727362-project-member] Flavor limits 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1154.323131] env[67899]: DEBUG nova.virt.hardware [None req-d00a6b81-c8df-494b-a8a0-988649b40bae tempest-ImagesOneServerTestJSON-886727362 tempest-ImagesOneServerTestJSON-886727362-project-member] Image limits 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1154.323330] env[67899]: DEBUG nova.virt.hardware [None req-d00a6b81-c8df-494b-a8a0-988649b40bae tempest-ImagesOneServerTestJSON-886727362 tempest-ImagesOneServerTestJSON-886727362-project-member] Flavor pref 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1154.323472] env[67899]: DEBUG nova.virt.hardware [None req-d00a6b81-c8df-494b-a8a0-988649b40bae tempest-ImagesOneServerTestJSON-886727362 tempest-ImagesOneServerTestJSON-886727362-project-member] Image pref 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1154.323612] env[67899]: DEBUG nova.virt.hardware [None req-d00a6b81-c8df-494b-a8a0-988649b40bae tempest-ImagesOneServerTestJSON-886727362 tempest-ImagesOneServerTestJSON-886727362-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1154.323812] env[67899]: DEBUG 
nova.virt.hardware [None req-d00a6b81-c8df-494b-a8a0-988649b40bae tempest-ImagesOneServerTestJSON-886727362 tempest-ImagesOneServerTestJSON-886727362-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1154.323968] env[67899]: DEBUG nova.virt.hardware [None req-d00a6b81-c8df-494b-a8a0-988649b40bae tempest-ImagesOneServerTestJSON-886727362 tempest-ImagesOneServerTestJSON-886727362-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67899) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1154.324149] env[67899]: DEBUG nova.virt.hardware [None req-d00a6b81-c8df-494b-a8a0-988649b40bae tempest-ImagesOneServerTestJSON-886727362 tempest-ImagesOneServerTestJSON-886727362-project-member] Got 1 possible topologies {{(pid=67899) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1154.324312] env[67899]: DEBUG nova.virt.hardware [None req-d00a6b81-c8df-494b-a8a0-988649b40bae tempest-ImagesOneServerTestJSON-886727362 tempest-ImagesOneServerTestJSON-886727362-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1154.324482] env[67899]: DEBUG nova.virt.hardware [None req-d00a6b81-c8df-494b-a8a0-988649b40bae tempest-ImagesOneServerTestJSON-886727362 tempest-ImagesOneServerTestJSON-886727362-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1154.325657] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bf92cd32-5334-4363-b5df-b820b9753dfc {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1154.333692] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-62c172c4-e8c8-4a20-b8af-4dd8d48e6d10 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1154.726276] env[67899]: DEBUG nova.network.neutron [None req-d00a6b81-c8df-494b-a8a0-988649b40bae tempest-ImagesOneServerTestJSON-886727362 tempest-ImagesOneServerTestJSON-886727362-project-member] [instance: ec826735-4cc4-4847-8750-c5480e62134a] Successfully created port: 05e08822-3676-4006-8a17-50e942df7e19 {{(pid=67899) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1155.643346] env[67899]: DEBUG nova.compute.manager [req-4e0cc8a1-dfb7-4396-965b-bc3ba1e267b7 req-4ffc22cc-d362-4ba4-8447-60cdbcbc327d service nova] [instance: ec826735-4cc4-4847-8750-c5480e62134a] Received event network-vif-plugged-05e08822-3676-4006-8a17-50e942df7e19 {{(pid=67899) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1155.643531] env[67899]: DEBUG oslo_concurrency.lockutils [req-4e0cc8a1-dfb7-4396-965b-bc3ba1e267b7 req-4ffc22cc-d362-4ba4-8447-60cdbcbc327d service nova] Acquiring lock "ec826735-4cc4-4847-8750-c5480e62134a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1155.643746] env[67899]: DEBUG oslo_concurrency.lockutils [req-4e0cc8a1-dfb7-4396-965b-bc3ba1e267b7
req-4ffc22cc-d362-4ba4-8447-60cdbcbc327d service nova] Lock "ec826735-4cc4-4847-8750-c5480e62134a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1155.643945] env[67899]: DEBUG oslo_concurrency.lockutils [req-4e0cc8a1-dfb7-4396-965b-bc3ba1e267b7 req-4ffc22cc-d362-4ba4-8447-60cdbcbc327d service nova] Lock "ec826735-4cc4-4847-8750-c5480e62134a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1155.644087] env[67899]: DEBUG nova.compute.manager [req-4e0cc8a1-dfb7-4396-965b-bc3ba1e267b7 req-4ffc22cc-d362-4ba4-8447-60cdbcbc327d service nova] [instance: ec826735-4cc4-4847-8750-c5480e62134a] No waiting events found dispatching network-vif-plugged-05e08822-3676-4006-8a17-50e942df7e19 {{(pid=67899) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1155.644265] env[67899]: WARNING nova.compute.manager [req-4e0cc8a1-dfb7-4396-965b-bc3ba1e267b7 req-4ffc22cc-d362-4ba4-8447-60cdbcbc327d service nova] [instance: ec826735-4cc4-4847-8750-c5480e62134a] Received unexpected event network-vif-plugged-05e08822-3676-4006-8a17-50e942df7e19 for instance with vm_state building and task_state spawning. [ 1155.645524] env[67899]: DEBUG nova.network.neutron [None req-d00a6b81-c8df-494b-a8a0-988649b40bae tempest-ImagesOneServerTestJSON-886727362 tempest-ImagesOneServerTestJSON-886727362-project-member] [instance: ec826735-4cc4-4847-8750-c5480e62134a] Successfully updated port: 05e08822-3676-4006-8a17-50e942df7e19 {{(pid=67899) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1155.656472] env[67899]: DEBUG oslo_concurrency.lockutils [None req-d00a6b81-c8df-494b-a8a0-988649b40bae tempest-ImagesOneServerTestJSON-886727362 tempest-ImagesOneServerTestJSON-886727362-project-member] Acquiring lock "refresh_cache-ec826735-4cc4-4847-8750-c5480e62134a" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1155.656619] env[67899]: DEBUG oslo_concurrency.lockutils [None req-d00a6b81-c8df-494b-a8a0-988649b40bae tempest-ImagesOneServerTestJSON-886727362 tempest-ImagesOneServerTestJSON-886727362-project-member] Acquired lock "refresh_cache-ec826735-4cc4-4847-8750-c5480e62134a" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1155.656768] env[67899]: DEBUG nova.network.neutron [None req-d00a6b81-c8df-494b-a8a0-988649b40bae tempest-ImagesOneServerTestJSON-886727362 tempest-ImagesOneServerTestJSON-886727362-project-member] [instance: ec826735-4cc4-4847-8750-c5480e62134a] Building network info cache for instance {{(pid=67899) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1155.710927] env[67899]: DEBUG nova.network.neutron [None req-d00a6b81-c8df-494b-a8a0-988649b40bae tempest-ImagesOneServerTestJSON-886727362 tempest-ImagesOneServerTestJSON-886727362-project-member] [instance: ec826735-4cc4-4847-8750-c5480e62134a] Instance cache missing network info.
{{(pid=67899) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1155.935337] env[67899]: DEBUG nova.network.neutron [None req-d00a6b81-c8df-494b-a8a0-988649b40bae tempest-ImagesOneServerTestJSON-886727362 tempest-ImagesOneServerTestJSON-886727362-project-member] [instance: ec826735-4cc4-4847-8750-c5480e62134a] Updating instance_info_cache with network_info: [{"id": "05e08822-3676-4006-8a17-50e942df7e19", "address": "fa:16:3e:fb:36:e2", "network": {"id": "200fac0c-e6f9-4cf8-aee3-b71b7f847e2f", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-17981855-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "26b28c3471e244298122a1e83eb9e4e4", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f3a80436-f7a9-431a-acec-aca3d76e3f9b", "external-id": "cl2-zone-339", "segmentation_id": 339, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap05e08822-36", "ovs_interfaceid": "05e08822-3676-4006-8a17-50e942df7e19", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1155.949930] env[67899]: DEBUG oslo_concurrency.lockutils [None req-d00a6b81-c8df-494b-a8a0-988649b40bae tempest-ImagesOneServerTestJSON-886727362 tempest-ImagesOneServerTestJSON-886727362-project-member] Releasing lock "refresh_cache-ec826735-4cc4-4847-8750-c5480e62134a" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1155.950244] env[67899]: DEBUG nova.compute.manager [None req-d00a6b81-c8df-494b-a8a0-988649b40bae tempest-ImagesOneServerTestJSON-886727362 tempest-ImagesOneServerTestJSON-886727362-project-member] [instance: ec826735-4cc4-4847-8750-c5480e62134a] Instance network_info: |[{"id": "05e08822-3676-4006-8a17-50e942df7e19", "address": "fa:16:3e:fb:36:e2", "network": {"id": "200fac0c-e6f9-4cf8-aee3-b71b7f847e2f", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-17981855-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "26b28c3471e244298122a1e83eb9e4e4", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f3a80436-f7a9-431a-acec-aca3d76e3f9b", "external-id": "cl2-zone-339", "segmentation_id": 339, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap05e08822-36", "ovs_interfaceid": "05e08822-3676-4006-8a17-50e942df7e19", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67899) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} 
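The network_info blob in the cache-update record above is ordinary JSON, so the fields Nova cares about (port id, MAC, device name, fixed IPs) can be pulled out directly. A reader's sketch over a trimmed copy of that payload, not Nova code:

    import json

    raw = '''[{"id": "05e08822-3676-4006-8a17-50e942df7e19",
               "address": "fa:16:3e:fb:36:e2",
               "devname": "tap05e08822-36",
               "network": {"subnets": [{"cidr": "192.168.128.0/28",
                                        "ips": [{"address": "192.168.128.3"}]}]}}]'''

    for vif in json.loads(raw):
        fixed_ips = [ip["address"]
                     for subnet in vif["network"]["subnets"]
                     for ip in subnet["ips"]]
        print(vif["id"], vif["address"], vif["devname"], fixed_ips)
    # -> 05e08822-3676-4006-8a17-50e942df7e19 fa:16:3e:fb:36:e2 tap05e08822-36 ['192.168.128.3']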
[ 1155.950630] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-d00a6b81-c8df-494b-a8a0-988649b40bae tempest-ImagesOneServerTestJSON-886727362 tempest-ImagesOneServerTestJSON-886727362-project-member] [instance: ec826735-4cc4-4847-8750-c5480e62134a] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:fb:36:e2', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'f3a80436-f7a9-431a-acec-aca3d76e3f9b', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '05e08822-3676-4006-8a17-50e942df7e19', 'vif_model': 'vmxnet3'}] {{(pid=67899) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1155.958239] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [None req-d00a6b81-c8df-494b-a8a0-988649b40bae tempest-ImagesOneServerTestJSON-886727362 tempest-ImagesOneServerTestJSON-886727362-project-member] Creating folder: Project (26b28c3471e244298122a1e83eb9e4e4). Parent ref: group-v692900. {{(pid=67899) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1155.958776] env[67899]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-6a2ce0d6-f4bd-459c-a099-e3e6e56b1864 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1155.969988] env[67899]: INFO nova.virt.vmwareapi.vm_util [None req-d00a6b81-c8df-494b-a8a0-988649b40bae tempest-ImagesOneServerTestJSON-886727362 tempest-ImagesOneServerTestJSON-886727362-project-member] Created folder: Project (26b28c3471e244298122a1e83eb9e4e4) in parent group-v692900. [ 1155.970180] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [None req-d00a6b81-c8df-494b-a8a0-988649b40bae tempest-ImagesOneServerTestJSON-886727362 tempest-ImagesOneServerTestJSON-886727362-project-member] Creating folder: Instances. Parent ref: group-v692969. {{(pid=67899) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1155.970399] env[67899]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-d192329f-75d9-4960-af56-4e529b5206b1 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1155.978897] env[67899]: INFO nova.virt.vmwareapi.vm_util [None req-d00a6b81-c8df-494b-a8a0-988649b40bae tempest-ImagesOneServerTestJSON-886727362 tempest-ImagesOneServerTestJSON-886727362-project-member] Created folder: Instances in parent group-v692969. [ 1155.979136] env[67899]: DEBUG oslo.service.loopingcall [None req-d00a6b81-c8df-494b-a8a0-988649b40bae tempest-ImagesOneServerTestJSON-886727362 tempest-ImagesOneServerTestJSON-886727362-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67899) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1155.979313] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: ec826735-4cc4-4847-8750-c5480e62134a] Creating VM on the ESX host {{(pid=67899) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1155.979500] env[67899]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-42b2b950-9dae-4243-886f-00598712f84e {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1155.998414] env[67899]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1155.998414] env[67899]: value = "task-3467929" [ 1155.998414] env[67899]: _type = "Task" [ 1155.998414] env[67899]: } to complete. 
{{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1156.005527] env[67899]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467929, 'name': CreateVM_Task} progress is 0%. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1156.507288] env[67899]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467929, 'name': CreateVM_Task, 'duration_secs': 0.279411} completed successfully. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1156.507403] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: ec826735-4cc4-4847-8750-c5480e62134a] Created VM on the ESX host {{(pid=67899) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1156.508094] env[67899]: DEBUG oslo_concurrency.lockutils [None req-d00a6b81-c8df-494b-a8a0-988649b40bae tempest-ImagesOneServerTestJSON-886727362 tempest-ImagesOneServerTestJSON-886727362-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1156.508264] env[67899]: DEBUG oslo_concurrency.lockutils [None req-d00a6b81-c8df-494b-a8a0-988649b40bae tempest-ImagesOneServerTestJSON-886727362 tempest-ImagesOneServerTestJSON-886727362-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1156.508568] env[67899]: DEBUG oslo_concurrency.lockutils [None req-d00a6b81-c8df-494b-a8a0-988649b40bae tempest-ImagesOneServerTestJSON-886727362 tempest-ImagesOneServerTestJSON-886727362-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1156.508800] env[67899]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-6573701e-1edb-4c61-95f5-f12e64834017 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1156.512834] env[67899]: DEBUG oslo_vmware.api [None req-d00a6b81-c8df-494b-a8a0-988649b40bae tempest-ImagesOneServerTestJSON-886727362 tempest-ImagesOneServerTestJSON-886727362-project-member] Waiting for the task: (returnval){ [ 1156.512834] env[67899]: value = "session[5298859c-589e-10f3-4f70-bf47a7ea371b]52f5efb1-101f-83b3-83f4-73462f24764d" [ 1156.512834] env[67899]: _type = "Task" [ 1156.512834] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1156.519719] env[67899]: DEBUG oslo_vmware.api [None req-d00a6b81-c8df-494b-a8a0-988649b40bae tempest-ImagesOneServerTestJSON-886727362 tempest-ImagesOneServerTestJSON-886727362-project-member] Task: {'id': session[5298859c-589e-10f3-4f70-bf47a7ea371b]52f5efb1-101f-83b3-83f4-73462f24764d, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1157.023014] env[67899]: DEBUG oslo_concurrency.lockutils [None req-d00a6b81-c8df-494b-a8a0-988649b40bae tempest-ImagesOneServerTestJSON-886727362 tempest-ImagesOneServerTestJSON-886727362-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1157.023379] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-d00a6b81-c8df-494b-a8a0-988649b40bae tempest-ImagesOneServerTestJSON-886727362 tempest-ImagesOneServerTestJSON-886727362-project-member] [instance: ec826735-4cc4-4847-8750-c5480e62134a] Processing image c655a05a-4a40-4b3f-b609-3ba8116ad90f {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1157.023495] env[67899]: DEBUG oslo_concurrency.lockutils [None req-d00a6b81-c8df-494b-a8a0-988649b40bae tempest-ImagesOneServerTestJSON-886727362 tempest-ImagesOneServerTestJSON-886727362-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1157.670130] env[67899]: DEBUG nova.compute.manager [req-be6bbfa3-ec2a-4fbb-9e5a-22ecdd814a52 req-38742573-a3ac-480b-9d51-0496c52f7709 service nova] [instance: ec826735-4cc4-4847-8750-c5480e62134a] Received event network-changed-05e08822-3676-4006-8a17-50e942df7e19 {{(pid=67899) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1157.670350] env[67899]: DEBUG nova.compute.manager [req-be6bbfa3-ec2a-4fbb-9e5a-22ecdd814a52 req-38742573-a3ac-480b-9d51-0496c52f7709 service nova] [instance: ec826735-4cc4-4847-8750-c5480e62134a] Refreshing instance network info cache due to event network-changed-05e08822-3676-4006-8a17-50e942df7e19. {{(pid=67899) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1157.670546] env[67899]: DEBUG oslo_concurrency.lockutils [req-be6bbfa3-ec2a-4fbb-9e5a-22ecdd814a52 req-38742573-a3ac-480b-9d51-0496c52f7709 service nova] Acquiring lock "refresh_cache-ec826735-4cc4-4847-8750-c5480e62134a" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1157.670687] env[67899]: DEBUG oslo_concurrency.lockutils [req-be6bbfa3-ec2a-4fbb-9e5a-22ecdd814a52 req-38742573-a3ac-480b-9d51-0496c52f7709 service nova] Acquired lock "refresh_cache-ec826735-4cc4-4847-8750-c5480e62134a" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1157.670845] env[67899]: DEBUG nova.network.neutron [req-be6bbfa3-ec2a-4fbb-9e5a-22ecdd814a52 req-38742573-a3ac-480b-9d51-0496c52f7709 service nova] [instance: ec826735-4cc4-4847-8750-c5480e62134a] Refreshing network info cache for port 05e08822-3676-4006-8a17-50e942df7e19 {{(pid=67899) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1157.951949] env[67899]: DEBUG nova.network.neutron [req-be6bbfa3-ec2a-4fbb-9e5a-22ecdd814a52 req-38742573-a3ac-480b-9d51-0496c52f7709 service nova] [instance: ec826735-4cc4-4847-8750-c5480e62134a] Updated VIF entry in instance network info cache for port 05e08822-3676-4006-8a17-50e942df7e19. 
{{(pid=67899) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1157.952338] env[67899]: DEBUG nova.network.neutron [req-be6bbfa3-ec2a-4fbb-9e5a-22ecdd814a52 req-38742573-a3ac-480b-9d51-0496c52f7709 service nova] [instance: ec826735-4cc4-4847-8750-c5480e62134a] Updating instance_info_cache with network_info: [{"id": "05e08822-3676-4006-8a17-50e942df7e19", "address": "fa:16:3e:fb:36:e2", "network": {"id": "200fac0c-e6f9-4cf8-aee3-b71b7f847e2f", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-17981855-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "26b28c3471e244298122a1e83eb9e4e4", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f3a80436-f7a9-431a-acec-aca3d76e3f9b", "external-id": "cl2-zone-339", "segmentation_id": 339, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap05e08822-36", "ovs_interfaceid": "05e08822-3676-4006-8a17-50e942df7e19", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1157.961896] env[67899]: DEBUG oslo_concurrency.lockutils [req-be6bbfa3-ec2a-4fbb-9e5a-22ecdd814a52 req-38742573-a3ac-480b-9d51-0496c52f7709 service nova] Releasing lock "refresh_cache-ec826735-4cc4-4847-8750-c5480e62134a" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1158.999049] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1158.999049] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1158.999049] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Cleaning up deleted instances with incomplete migration {{(pid=67899) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11236}} [ 1160.006663] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1160.006953] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1160.995585] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task 
ComputeManager._heal_instance_info_cache {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1160.995760] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Starting heal instance info cache {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1160.995914] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Rebuilding the list of instances to heal {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1161.017075] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1161.017383] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 4458efe7-18d4-4cfb-b131-e09d36124d68] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1161.017383] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1161.017513] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 8d2a9e20-82d3-44cf-a725-59804debe1cc] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1161.017644] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1161.017801] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 37ab08db-50ab-4c30-9e18-05007c5d1c27] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1161.017926] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1161.018060] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1161.018185] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 6c4977f7-c53d-4c96-9028-86d7561f0d0d] Skipping network cache update for instance because it is Building. 
{{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1161.018295] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: ec826735-4cc4-4847-8750-c5480e62134a] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1161.018415] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Didn't find any instances for network info cache update. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1161.018861] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1161.956655] env[67899]: DEBUG oslo_concurrency.lockutils [None req-4776f33b-d108-4de6-b4ac-5203e1e14f45 tempest-ImagesOneServerTestJSON-886727362 tempest-ImagesOneServerTestJSON-886727362-project-member] Acquiring lock "ec826735-4cc4-4847-8750-c5480e62134a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1161.995911] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1161.996109] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=67899) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1162.991836] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1163.015336] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1163.015579] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager.update_available_resource {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1163.030840] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1163.031766] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1163.031766] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1163.031766] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67899) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1163.032480] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7a51c379-bf81-41bd-9076-2a31fc37969e {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1163.041617] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-28cf7266-d528-410b-bc0e-2990665c7e41 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1163.056992] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-73dc45fa-c7b8-4bb6-afa7-8eb885d4028d {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1163.063524] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3688b1c4-c3bd-4ee0-b179-c032fb601431 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1163.097434] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad 
None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180939MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=67899) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1163.097434] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1163.097434] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1163.268933] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1163.269116] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 4458efe7-18d4-4cfb-b131-e09d36124d68 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1163.269248] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1163.269372] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 8d2a9e20-82d3-44cf-a725-59804debe1cc actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1163.269498] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance bb97988e-9f7f-4e4f-9904-fc560d0912ee actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1163.269617] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 37ab08db-50ab-4c30-9e18-05007c5d1c27 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1163.269732] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1163.269849] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance b9282eeb-09db-4138-a1f0-9e03828021b8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1163.269964] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 6c4977f7-c53d-4c96-9028-86d7561f0d0d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1163.270112] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance ec826735-4cc4-4847-8750-c5480e62134a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1163.288404] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance c7ad553b-2149-4211-aee3-057ea83069f5 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1163.299024] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance b9df90e1-da9a-47c3-8920-84f20ef5c588 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1163.314809] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 9b4ec0f5-35d7-4ba9-bc46-47cd2a73219c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1163.328856] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance ce0c59ed-7bb2-49cc-a158-dda0da4f88cf has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1163.340211] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance db21b229-2664-4947-96c8-c1e92f97917e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1163.350896] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 94ebdda8-5b9c-4ffa-be45-571ec9ba9f81 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1163.361770] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 928c018d-ec75-42c6-8e55-e38bb5947bcf has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1163.371641] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 04bee4b3-88b9-4f8c-b5d7-3955a158a2d5 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1163.387616] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance a43ea307-5b84-4c8c-9f28-255980bfd51a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1163.401084] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 6fda2654-4579-4b9a-a97c-97e0128fff14 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1163.411314] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 97ec7119-2dc8-49ac-921f-b28d04ffd056 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1163.444356] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance f03010e7-fd45-4959-b6fb-4c7b3fc833c5 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1163.461722] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 3526174d-17e3-4a54-92dc-0556334ce315 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1163.474306] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance cf23465f-b46c-4360-8949-2af3b9ba44c9 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1163.487640] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 2d202778-6d31-4d2f-b249-60925737da42 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1163.501840] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 5ead1ba5-49a8-41a8-b984-cb5408683a25 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1163.513749] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 8e66e2d5-aa60-474f-b77f-4a477e2d0f8e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1163.514027] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67899) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1163.514183] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67899) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1163.537278] env[67899]: DEBUG nova.scheduler.client.report [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Refreshing inventories for resource provider fffa0b42-f65d-4394-a98c-0df038b9ed4b {{(pid=67899) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} [ 1163.554336] env[67899]: DEBUG nova.scheduler.client.report [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Updating ProviderTree inventory for provider fffa0b42-f65d-4394-a98c-0df038b9ed4b from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67899) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} [ 1163.554533] env[67899]: DEBUG nova.compute.provider_tree [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Updating inventory in ProviderTree for provider fffa0b42-f65d-4394-a98c-0df038b9ed4b with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67899) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 1163.566381] env[67899]: DEBUG nova.scheduler.client.report [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Refreshing aggregate associations for resource provider fffa0b42-f65d-4394-a98c-0df038b9ed4b, aggregates: None {{(pid=67899) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} [ 1163.584568] env[67899]: DEBUG nova.scheduler.client.report [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Refreshing trait associations for resource provider fffa0b42-f65d-4394-a98c-0df038b9ed4b, traits: COMPUTE_NODE,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SAME_HOST_COLD_MIGRATE,COMPUTE_IMAGE_TYPE_VMDK,COMPUTE_IMAGE_TYPE_ISO {{(pid=67899) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} [ 1163.912152] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-555c98ce-894a-4bc6-a5eb-445a471d6dab {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1163.919542] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-3ac19918-1efa-47d3-b3e4-e8ad734b3648 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1163.949765] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7f30f606-fc12-4417-810b-6d223501f75a {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1163.957433] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7bc8f04c-82ae-42d6-9a88-97092f3bf38e {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1163.970560] env[67899]: DEBUG nova.compute.provider_tree [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Inventory has not changed in ProviderTree for provider: fffa0b42-f65d-4394-a98c-0df038b9ed4b {{(pid=67899) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1163.979566] env[67899]: DEBUG nova.scheduler.client.report [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Inventory has not changed for provider fffa0b42-f65d-4394-a98c-0df038b9ed4b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67899) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1163.993356] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67899) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1163.993720] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.897s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1164.974256] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1164.996071] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1164.996342] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Cleaning up deleted instances {{(pid=67899) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11198}} [ 1165.009576] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] There are 0 instances to clean {{(pid=67899) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11207}} [ 1168.347908] env[67899]: DEBUG oslo_concurrency.lockutils [None req-e1b9224c-7aca-4d45-b950-f391f0b228c2 tempest-ServersNegativeTestJSON-421907278 
tempest-ServersNegativeTestJSON-421907278-project-member] Acquiring lock "7a82e877-8a39-4684-8b75-711b7bedddac" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1168.348223] env[67899]: DEBUG oslo_concurrency.lockutils [None req-e1b9224c-7aca-4d45-b950-f391f0b228c2 tempest-ServersNegativeTestJSON-421907278 tempest-ServersNegativeTestJSON-421907278-project-member] Lock "7a82e877-8a39-4684-8b75-711b7bedddac" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1174.976160] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._sync_power_states {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1174.998266] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Getting list of instances from cluster (obj){ [ 1174.998266] env[67899]: value = "domain-c8" [ 1174.998266] env[67899]: _type = "ClusterComputeResource" [ 1174.998266] env[67899]: } {{(pid=67899) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 1174.999560] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c92d89d4-6e7f-4445-a6e5-94beacb5173a {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1175.017020] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Got total of 10 instances {{(pid=67899) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 1175.017209] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Triggering sync for uuid e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf {{(pid=67899) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1175.017400] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Triggering sync for uuid 4458efe7-18d4-4cfb-b131-e09d36124d68 {{(pid=67899) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1175.017558] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Triggering sync for uuid 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee {{(pid=67899) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1175.017714] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Triggering sync for uuid 8d2a9e20-82d3-44cf-a725-59804debe1cc {{(pid=67899) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1175.017894] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Triggering sync for uuid bb97988e-9f7f-4e4f-9904-fc560d0912ee {{(pid=67899) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1175.018029] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Triggering sync for uuid 37ab08db-50ab-4c30-9e18-05007c5d1c27 {{(pid=67899) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1175.018183] env[67899]: DEBUG nova.compute.manager 
[None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Triggering sync for uuid 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4 {{(pid=67899) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1175.018331] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Triggering sync for uuid b9282eeb-09db-4138-a1f0-9e03828021b8 {{(pid=67899) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1175.018478] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Triggering sync for uuid 6c4977f7-c53d-4c96-9028-86d7561f0d0d {{(pid=67899) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1175.018758] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Triggering sync for uuid ec826735-4cc4-4847-8750-c5480e62134a {{(pid=67899) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1175.018938] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Acquiring lock "e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1175.019184] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Acquiring lock "4458efe7-18d4-4cfb-b131-e09d36124d68" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1175.019382] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Acquiring lock "35b19ccb-4996-47a7-b1a7-6ffc9dd867ee" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1175.019577] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Acquiring lock "8d2a9e20-82d3-44cf-a725-59804debe1cc" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1175.019770] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Acquiring lock "bb97988e-9f7f-4e4f-9904-fc560d0912ee" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1175.019958] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Acquiring lock "37ab08db-50ab-4c30-9e18-05007c5d1c27" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1175.020621] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Acquiring lock "4c50d280-8833-4c3f-8571-8ffa5c7ea2a4" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=67899) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1175.020621] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Acquiring lock "b9282eeb-09db-4138-a1f0-9e03828021b8" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1175.020621] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Acquiring lock "6c4977f7-c53d-4c96-9028-86d7561f0d0d" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1175.020789] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Acquiring lock "ec826735-4cc4-4847-8750-c5480e62134a" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1201.467058] env[67899]: WARNING oslo_vmware.rw_handles [None req-6aafabe7-8144-45c2-a7ff-cc06a3819f06 tempest-ServersAdmin275Test-1560987778 tempest-ServersAdmin275Test-1560987778-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1201.467058] env[67899]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1201.467058] env[67899]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1201.467058] env[67899]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1201.467058] env[67899]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1201.467058] env[67899]: ERROR oslo_vmware.rw_handles response.begin() [ 1201.467058] env[67899]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1201.467058] env[67899]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1201.467058] env[67899]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1201.467058] env[67899]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1201.467058] env[67899]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1201.467058] env[67899]: ERROR oslo_vmware.rw_handles [ 1201.467779] env[67899]: DEBUG nova.virt.vmwareapi.images [None req-6aafabe7-8144-45c2-a7ff-cc06a3819f06 tempest-ServersAdmin275Test-1560987778 tempest-ServersAdmin275Test-1560987778-project-member] [instance: e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf] Downloaded image file data c655a05a-4a40-4b3f-b609-3ba8116ad90f to vmware_temp/72696996-e193-4217-bcb3-4d0f5a70c1cf/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk on the data store datastore1 {{(pid=67899) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1201.469886] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-6aafabe7-8144-45c2-a7ff-cc06a3819f06 tempest-ServersAdmin275Test-1560987778 tempest-ServersAdmin275Test-1560987778-project-member] [instance: e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf] Caching image {{(pid=67899) 
_fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1201.470210] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [None req-6aafabe7-8144-45c2-a7ff-cc06a3819f06 tempest-ServersAdmin275Test-1560987778 tempest-ServersAdmin275Test-1560987778-project-member] Copying Virtual Disk [datastore1] vmware_temp/72696996-e193-4217-bcb3-4d0f5a70c1cf/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk to [datastore1] vmware_temp/72696996-e193-4217-bcb3-4d0f5a70c1cf/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk {{(pid=67899) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1201.470556] env[67899]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-383d85b2-ba55-4bd8-95f0-f02017060677 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1201.478205] env[67899]: DEBUG oslo_vmware.api [None req-6aafabe7-8144-45c2-a7ff-cc06a3819f06 tempest-ServersAdmin275Test-1560987778 tempest-ServersAdmin275Test-1560987778-project-member] Waiting for the task: (returnval){ [ 1201.478205] env[67899]: value = "task-3467930" [ 1201.478205] env[67899]: _type = "Task" [ 1201.478205] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1201.486294] env[67899]: DEBUG oslo_vmware.api [None req-6aafabe7-8144-45c2-a7ff-cc06a3819f06 tempest-ServersAdmin275Test-1560987778 tempest-ServersAdmin275Test-1560987778-project-member] Task: {'id': task-3467930, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1201.988713] env[67899]: DEBUG oslo_vmware.exceptions [None req-6aafabe7-8144-45c2-a7ff-cc06a3819f06 tempest-ServersAdmin275Test-1560987778 tempest-ServersAdmin275Test-1560987778-project-member] Fault InvalidArgument not matched. 
{{(pid=67899) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1201.988944] env[67899]: DEBUG oslo_concurrency.lockutils [None req-6aafabe7-8144-45c2-a7ff-cc06a3819f06 tempest-ServersAdmin275Test-1560987778 tempest-ServersAdmin275Test-1560987778-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1201.989507] env[67899]: ERROR nova.compute.manager [None req-6aafabe7-8144-45c2-a7ff-cc06a3819f06 tempest-ServersAdmin275Test-1560987778 tempest-ServersAdmin275Test-1560987778-project-member] [instance: e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1201.989507] env[67899]: Faults: ['InvalidArgument'] [ 1201.989507] env[67899]: ERROR nova.compute.manager [instance: e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf] Traceback (most recent call last): [ 1201.989507] env[67899]: ERROR nova.compute.manager [instance: e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1201.989507] env[67899]: ERROR nova.compute.manager [instance: e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf] yield resources [ 1201.989507] env[67899]: ERROR nova.compute.manager [instance: e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1201.989507] env[67899]: ERROR nova.compute.manager [instance: e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf] self.driver.spawn(context, instance, image_meta, [ 1201.989507] env[67899]: ERROR nova.compute.manager [instance: e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1201.989507] env[67899]: ERROR nova.compute.manager [instance: e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1201.989507] env[67899]: ERROR nova.compute.manager [instance: e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1201.989507] env[67899]: ERROR nova.compute.manager [instance: e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf] self._fetch_image_if_missing(context, vi) [ 1201.989507] env[67899]: ERROR nova.compute.manager [instance: e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1201.989507] env[67899]: ERROR nova.compute.manager [instance: e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf] image_cache(vi, tmp_image_ds_loc) [ 1201.989507] env[67899]: ERROR nova.compute.manager [instance: e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1201.989507] env[67899]: ERROR nova.compute.manager [instance: e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf] vm_util.copy_virtual_disk( [ 1201.989507] env[67899]: ERROR nova.compute.manager [instance: e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1201.989507] env[67899]: ERROR nova.compute.manager [instance: e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf] session._wait_for_task(vmdk_copy_task) [ 1201.989507] env[67899]: ERROR nova.compute.manager [instance: e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", 
line 157, in _wait_for_task [ 1201.989507] env[67899]: ERROR nova.compute.manager [instance: e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf] return self.wait_for_task(task_ref) [ 1201.989507] env[67899]: ERROR nova.compute.manager [instance: e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1201.989507] env[67899]: ERROR nova.compute.manager [instance: e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf] return evt.wait() [ 1201.989507] env[67899]: ERROR nova.compute.manager [instance: e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1201.989507] env[67899]: ERROR nova.compute.manager [instance: e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf] result = hub.switch() [ 1201.989507] env[67899]: ERROR nova.compute.manager [instance: e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1201.989507] env[67899]: ERROR nova.compute.manager [instance: e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf] return self.greenlet.switch() [ 1201.989507] env[67899]: ERROR nova.compute.manager [instance: e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1201.989507] env[67899]: ERROR nova.compute.manager [instance: e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf] self.f(*self.args, **self.kw) [ 1201.989507] env[67899]: ERROR nova.compute.manager [instance: e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1201.989507] env[67899]: ERROR nova.compute.manager [instance: e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf] raise exceptions.translate_fault(task_info.error) [ 1201.989507] env[67899]: ERROR nova.compute.manager [instance: e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1201.989507] env[67899]: ERROR nova.compute.manager [instance: e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf] Faults: ['InvalidArgument'] [ 1201.989507] env[67899]: ERROR nova.compute.manager [instance: e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf] [ 1201.990665] env[67899]: INFO nova.compute.manager [None req-6aafabe7-8144-45c2-a7ff-cc06a3819f06 tempest-ServersAdmin275Test-1560987778 tempest-ServersAdmin275Test-1560987778-project-member] [instance: e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf] Terminating instance [ 1201.991305] env[67899]: DEBUG oslo_concurrency.lockutils [None req-e5513ee2-6d9e-4ef6-b3a5-69d04367ec0c tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1201.991526] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-e5513ee2-6d9e-4ef6-b3a5-69d04367ec0c tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1201.991755] env[67899]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-640f1132-b584-409d-90f4-ce0cbae15def {{(pid=67899) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1201.993785] env[67899]: DEBUG oslo_concurrency.lockutils [None req-6aafabe7-8144-45c2-a7ff-cc06a3819f06 tempest-ServersAdmin275Test-1560987778 tempest-ServersAdmin275Test-1560987778-project-member] Acquiring lock "refresh_cache-e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1201.993946] env[67899]: DEBUG oslo_concurrency.lockutils [None req-6aafabe7-8144-45c2-a7ff-cc06a3819f06 tempest-ServersAdmin275Test-1560987778 tempest-ServersAdmin275Test-1560987778-project-member] Acquired lock "refresh_cache-e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1201.994132] env[67899]: DEBUG nova.network.neutron [None req-6aafabe7-8144-45c2-a7ff-cc06a3819f06 tempest-ServersAdmin275Test-1560987778 tempest-ServersAdmin275Test-1560987778-project-member] [instance: e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf] Building network info cache for instance {{(pid=67899) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1202.001046] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-e5513ee2-6d9e-4ef6-b3a5-69d04367ec0c tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1202.001046] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-e5513ee2-6d9e-4ef6-b3a5-69d04367ec0c tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67899) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1202.001711] env[67899]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-2be9af1c-67a5-4365-aea1-756c358baf27 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1202.009219] env[67899]: DEBUG oslo_vmware.api [None req-e5513ee2-6d9e-4ef6-b3a5-69d04367ec0c tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Waiting for the task: (returnval){ [ 1202.009219] env[67899]: value = "session[5298859c-589e-10f3-4f70-bf47a7ea371b]5272908f-908d-5499-9dd1-242c225424fe" [ 1202.009219] env[67899]: _type = "Task" [ 1202.009219] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1202.016404] env[67899]: DEBUG oslo_vmware.api [None req-e5513ee2-6d9e-4ef6-b3a5-69d04367ec0c tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Task: {'id': session[5298859c-589e-10f3-4f70-bf47a7ea371b]5272908f-908d-5499-9dd1-242c225424fe, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1202.024641] env[67899]: DEBUG nova.network.neutron [None req-6aafabe7-8144-45c2-a7ff-cc06a3819f06 tempest-ServersAdmin275Test-1560987778 tempest-ServersAdmin275Test-1560987778-project-member] [instance: e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf] Instance cache missing network info. 
{{(pid=67899) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1202.082459] env[67899]: DEBUG nova.network.neutron [None req-6aafabe7-8144-45c2-a7ff-cc06a3819f06 tempest-ServersAdmin275Test-1560987778 tempest-ServersAdmin275Test-1560987778-project-member] [instance: e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf] Updating instance_info_cache with network_info: [] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1202.091320] env[67899]: DEBUG oslo_concurrency.lockutils [None req-6aafabe7-8144-45c2-a7ff-cc06a3819f06 tempest-ServersAdmin275Test-1560987778 tempest-ServersAdmin275Test-1560987778-project-member] Releasing lock "refresh_cache-e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1202.091693] env[67899]: DEBUG nova.compute.manager [None req-6aafabe7-8144-45c2-a7ff-cc06a3819f06 tempest-ServersAdmin275Test-1560987778 tempest-ServersAdmin275Test-1560987778-project-member] [instance: e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf] Start destroying the instance on the hypervisor. {{(pid=67899) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1202.091883] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-6aafabe7-8144-45c2-a7ff-cc06a3819f06 tempest-ServersAdmin275Test-1560987778 tempest-ServersAdmin275Test-1560987778-project-member] [instance: e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf] Destroying instance {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1202.092919] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6ca6fa90-627c-4f72-936a-4ec09cc06aca {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1202.100263] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-6aafabe7-8144-45c2-a7ff-cc06a3819f06 tempest-ServersAdmin275Test-1560987778 tempest-ServersAdmin275Test-1560987778-project-member] [instance: e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf] Unregistering the VM {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1202.100477] env[67899]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-fdaff1c9-e158-48db-8583-97cbe7fba36c {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1202.130180] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-6aafabe7-8144-45c2-a7ff-cc06a3819f06 tempest-ServersAdmin275Test-1560987778 tempest-ServersAdmin275Test-1560987778-project-member] [instance: e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf] Unregistered the VM {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1202.130355] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-6aafabe7-8144-45c2-a7ff-cc06a3819f06 tempest-ServersAdmin275Test-1560987778 tempest-ServersAdmin275Test-1560987778-project-member] [instance: e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf] Deleting contents of the VM from datastore datastore1 {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1202.130507] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-6aafabe7-8144-45c2-a7ff-cc06a3819f06 tempest-ServersAdmin275Test-1560987778 tempest-ServersAdmin275Test-1560987778-project-member] Deleting the datastore file [datastore1] e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf {{(pid=67899) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} 
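The paired "Acquiring lock / Lock ... acquired ... waited Ns / Lock ... 'released' ... held Ns" triplets that bracket the refresh_cache and compute_resources operations above come from a context-managed lock wrapper. Below is a minimal stdlib-only sketch of that logging pattern, not the oslo.concurrency source: the _LOCKS registry, the lock names, and the caller strings are invented for illustration, while the message shapes mirror the lockutils lines in this log.

import contextlib
import logging
import threading
import time

logging.basicConfig(level=logging.DEBUG, format="%(message)s")
LOG = logging.getLogger("lock_demo")

_LOCKS: dict[str, threading.Lock] = {}  # hypothetical process-local registry


@contextlib.contextmanager
def lock(name: str, caller: str):
    """Log the same acquire/wait/hold lifecycle the lockutils lines show."""
    lk = _LOCKS.setdefault(name, threading.Lock())
    LOG.debug('Acquiring lock "%s" by "%s"', name, caller)
    t0 = time.monotonic()
    with lk:
        waited = time.monotonic() - t0
        LOG.debug('Lock "%s" acquired by "%s" :: waited %.3fs', name, caller, waited)
        t1 = time.monotonic()
        try:
            yield
        finally:
            held = time.monotonic() - t1
            LOG.debug('Lock "%s" "released" by "%s" :: held %.3fs', name, caller, held)


if __name__ == "__main__":
    with lock("refresh_cache-e1401ec0", "demo.refresh"):
        time.sleep(0.01)  # stand-in for rebuilding the network info cache

The serialized sections in the log (image cache processing, resource tracker updates, per-instance terminate/build locks) all follow this shape, which is why every operation leaves an acquire line, an acquired line with the wait time, and a release line with the hold time.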
[ 1202.130756] env[67899]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-18e2f33d-547a-4abe-95da-697237498c5d {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1202.136881] env[67899]: DEBUG oslo_vmware.api [None req-6aafabe7-8144-45c2-a7ff-cc06a3819f06 tempest-ServersAdmin275Test-1560987778 tempest-ServersAdmin275Test-1560987778-project-member] Waiting for the task: (returnval){ [ 1202.136881] env[67899]: value = "task-3467932" [ 1202.136881] env[67899]: _type = "Task" [ 1202.136881] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1202.144790] env[67899]: DEBUG oslo_vmware.api [None req-6aafabe7-8144-45c2-a7ff-cc06a3819f06 tempest-ServersAdmin275Test-1560987778 tempest-ServersAdmin275Test-1560987778-project-member] Task: {'id': task-3467932, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1202.520240] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-e5513ee2-6d9e-4ef6-b3a5-69d04367ec0c tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: 4458efe7-18d4-4cfb-b131-e09d36124d68] Preparing fetch location {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1202.520519] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-e5513ee2-6d9e-4ef6-b3a5-69d04367ec0c tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Creating directory with path [datastore1] vmware_temp/2dbd524b-0833-421e-9d37-5501cc123edb/c655a05a-4a40-4b3f-b609-3ba8116ad90f {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1202.520732] env[67899]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-a5301db8-0394-4f97-b754-48540a7429ae {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1202.532246] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-e5513ee2-6d9e-4ef6-b3a5-69d04367ec0c tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Created directory with path [datastore1] vmware_temp/2dbd524b-0833-421e-9d37-5501cc123edb/c655a05a-4a40-4b3f-b609-3ba8116ad90f {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1202.532442] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-e5513ee2-6d9e-4ef6-b3a5-69d04367ec0c tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: 4458efe7-18d4-4cfb-b131-e09d36124d68] Fetch image to [datastore1] vmware_temp/2dbd524b-0833-421e-9d37-5501cc123edb/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1202.532625] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-e5513ee2-6d9e-4ef6-b3a5-69d04367ec0c tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: 4458efe7-18d4-4cfb-b131-e09d36124d68] Downloading image file data c655a05a-4a40-4b3f-b609-3ba8116ad90f to [datastore1] vmware_temp/2dbd524b-0833-421e-9d37-5501cc123edb/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk on the data store datastore1 {{(pid=67899) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 
1202.533587] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2f790f12-c554-413f-8f6e-ff7cf88d6121 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1202.541020] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b2130c9e-1eb8-452f-b61a-1afc74796992 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1202.550040] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e46a73ed-b796-4ff4-9b6e-b9d315588e05 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1202.580503] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bd11d744-df6e-427b-b7e7-3a7ff69c1108 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1202.586286] env[67899]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-31d0dfcd-61fb-43f4-817c-2f0f0bf3efd1 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1202.606509] env[67899]: DEBUG nova.virt.vmwareapi.images [None req-e5513ee2-6d9e-4ef6-b3a5-69d04367ec0c tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: 4458efe7-18d4-4cfb-b131-e09d36124d68] Downloading image file data c655a05a-4a40-4b3f-b609-3ba8116ad90f to the data store datastore1 {{(pid=67899) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1202.646362] env[67899]: DEBUG oslo_vmware.api [None req-6aafabe7-8144-45c2-a7ff-cc06a3819f06 tempest-ServersAdmin275Test-1560987778 tempest-ServersAdmin275Test-1560987778-project-member] Task: {'id': task-3467932, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.045239} completed successfully. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1202.648249] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-6aafabe7-8144-45c2-a7ff-cc06a3819f06 tempest-ServersAdmin275Test-1560987778 tempest-ServersAdmin275Test-1560987778-project-member] Deleted the datastore file {{(pid=67899) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1202.648433] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-6aafabe7-8144-45c2-a7ff-cc06a3819f06 tempest-ServersAdmin275Test-1560987778 tempest-ServersAdmin275Test-1560987778-project-member] [instance: e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf] Deleted contents of the VM from datastore datastore1 {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1202.648622] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-6aafabe7-8144-45c2-a7ff-cc06a3819f06 tempest-ServersAdmin275Test-1560987778 tempest-ServersAdmin275Test-1560987778-project-member] [instance: e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf] Instance destroyed {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1202.648767] env[67899]: INFO nova.compute.manager [None req-6aafabe7-8144-45c2-a7ff-cc06a3819f06 tempest-ServersAdmin275Test-1560987778 tempest-ServersAdmin275Test-1560987778-project-member] [instance: e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf] Took 0.56 seconds to destroy the instance on the hypervisor. 
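The DeleteDatastoreFile_Task entries around this point show the polling pattern behind oslo.vmware's wait_for_task/_poll_task log lines: vCenter returns a task reference immediately, and the client polls its TaskInfo until the task reaches a terminal state. A minimal illustrative sketch of that loop, not the actual oslo.vmware code (get_task_info is a stand-in callable and the attribute names are assumptions):

    import time

    def wait_for_task(task_ref, get_task_info, poll_interval=0.5):
        """Poll a vCenter task until it succeeds or surfaces its fault."""
        while True:
            info = get_task_info(task_ref)  # e.g. one property-collector round trip
            if info.state == 'success':
                return info.result
            if info.state == 'error':
                # Corresponds to the VimFaultException translation seen later in this log.
                raise RuntimeError(info.error)
            # 'queued' or 'running': report progress and poll again,
            # matching the "progress is 0%" entries in the log.
            print('Task %s progress is %s%%.' % (task_ref, info.progress or 0))
            time.sleep(poll_interval)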
[ 1202.649008] env[67899]: DEBUG oslo.service.loopingcall [None req-6aafabe7-8144-45c2-a7ff-cc06a3819f06 tempest-ServersAdmin275Test-1560987778 tempest-ServersAdmin275Test-1560987778-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=67899) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1202.649383] env[67899]: DEBUG nova.compute.manager [-] [instance: e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf] Skipping network deallocation for instance since networking was not requested. {{(pid=67899) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1202.652771] env[67899]: DEBUG nova.compute.claims [None req-6aafabe7-8144-45c2-a7ff-cc06a3819f06 tempest-ServersAdmin275Test-1560987778 tempest-ServersAdmin275Test-1560987778-project-member] [instance: e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf] Aborting claim: {{(pid=67899) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1202.652995] env[67899]: DEBUG oslo_concurrency.lockutils [None req-6aafabe7-8144-45c2-a7ff-cc06a3819f06 tempest-ServersAdmin275Test-1560987778 tempest-ServersAdmin275Test-1560987778-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1202.653184] env[67899]: DEBUG oslo_concurrency.lockutils [None req-6aafabe7-8144-45c2-a7ff-cc06a3819f06 tempest-ServersAdmin275Test-1560987778 tempest-ServersAdmin275Test-1560987778-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1202.661429] env[67899]: DEBUG oslo_vmware.rw_handles [None req-e5513ee2-6d9e-4ef6-b3a5-69d04367ec0c tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/2dbd524b-0833-421e-9d37-5501cc123edb/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67899) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1202.723434] env[67899]: DEBUG oslo_vmware.rw_handles [None req-e5513ee2-6d9e-4ef6-b3a5-69d04367ec0c tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Completed reading data from the image iterator. {{(pid=67899) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1202.723434] env[67899]: DEBUG oslo_vmware.rw_handles [None req-e5513ee2-6d9e-4ef6-b3a5-69d04367ec0c tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/2dbd524b-0833-421e-9d37-5501cc123edb/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1.
{{(pid=67899) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1203.035178] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-60a42ee4-57b0-4582-8879-d788869f71e9 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1203.042616] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8a09d311-1f97-4743-85a1-b0ed443b3d85 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1203.072291] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-93f9da0e-ac6f-4a79-bde8-6d58a62fefa3 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1203.078767] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a70023ea-1810-4b68-814f-42428a7966e9 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1203.091232] env[67899]: DEBUG nova.compute.provider_tree [None req-6aafabe7-8144-45c2-a7ff-cc06a3819f06 tempest-ServersAdmin275Test-1560987778 tempest-ServersAdmin275Test-1560987778-project-member] Inventory has not changed in ProviderTree for provider: fffa0b42-f65d-4394-a98c-0df038b9ed4b {{(pid=67899) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1203.099694] env[67899]: DEBUG nova.scheduler.client.report [None req-6aafabe7-8144-45c2-a7ff-cc06a3819f06 tempest-ServersAdmin275Test-1560987778 tempest-ServersAdmin275Test-1560987778-project-member] Inventory has not changed for provider fffa0b42-f65d-4394-a98c-0df038b9ed4b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67899) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1203.115012] env[67899]: DEBUG oslo_concurrency.lockutils [None req-6aafabe7-8144-45c2-a7ff-cc06a3819f06 tempest-ServersAdmin275Test-1560987778 tempest-ServersAdmin275Test-1560987778-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.462s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1203.115524] env[67899]: ERROR nova.compute.manager [None req-6aafabe7-8144-45c2-a7ff-cc06a3819f06 tempest-ServersAdmin275Test-1560987778 tempest-ServersAdmin275Test-1560987778-project-member] [instance: e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1203.115524] env[67899]: Faults: ['InvalidArgument'] [ 1203.115524] env[67899]: ERROR nova.compute.manager [instance: e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf] Traceback (most recent call last): [ 1203.115524] env[67899]: ERROR nova.compute.manager [instance: e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1203.115524] env[67899]: ERROR nova.compute.manager 
[instance: e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf] self.driver.spawn(context, instance, image_meta, [ 1203.115524] env[67899]: ERROR nova.compute.manager [instance: e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1203.115524] env[67899]: ERROR nova.compute.manager [instance: e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1203.115524] env[67899]: ERROR nova.compute.manager [instance: e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1203.115524] env[67899]: ERROR nova.compute.manager [instance: e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf] self._fetch_image_if_missing(context, vi) [ 1203.115524] env[67899]: ERROR nova.compute.manager [instance: e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1203.115524] env[67899]: ERROR nova.compute.manager [instance: e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf] image_cache(vi, tmp_image_ds_loc) [ 1203.115524] env[67899]: ERROR nova.compute.manager [instance: e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1203.115524] env[67899]: ERROR nova.compute.manager [instance: e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf] vm_util.copy_virtual_disk( [ 1203.115524] env[67899]: ERROR nova.compute.manager [instance: e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1203.115524] env[67899]: ERROR nova.compute.manager [instance: e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf] session._wait_for_task(vmdk_copy_task) [ 1203.115524] env[67899]: ERROR nova.compute.manager [instance: e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1203.115524] env[67899]: ERROR nova.compute.manager [instance: e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf] return self.wait_for_task(task_ref) [ 1203.115524] env[67899]: ERROR nova.compute.manager [instance: e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1203.115524] env[67899]: ERROR nova.compute.manager [instance: e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf] return evt.wait() [ 1203.115524] env[67899]: ERROR nova.compute.manager [instance: e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1203.115524] env[67899]: ERROR nova.compute.manager [instance: e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf] result = hub.switch() [ 1203.115524] env[67899]: ERROR nova.compute.manager [instance: e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1203.115524] env[67899]: ERROR nova.compute.manager [instance: e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf] return self.greenlet.switch() [ 1203.115524] env[67899]: ERROR nova.compute.manager [instance: e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1203.115524] env[67899]: ERROR nova.compute.manager [instance: e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf] self.f(*self.args, **self.kw) [ 1203.115524] env[67899]: ERROR nova.compute.manager [instance: e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1203.115524] env[67899]: ERROR nova.compute.manager [instance: e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf] raise exceptions.translate_fault(task_info.error) [ 1203.115524] env[67899]: ERROR nova.compute.manager [instance: e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1203.115524] env[67899]: ERROR nova.compute.manager [instance: e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf] Faults: ['InvalidArgument'] [ 1203.115524] env[67899]: ERROR nova.compute.manager [instance: e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf] [ 1203.116406] env[67899]: DEBUG nova.compute.utils [None req-6aafabe7-8144-45c2-a7ff-cc06a3819f06 tempest-ServersAdmin275Test-1560987778 tempest-ServersAdmin275Test-1560987778-project-member] [instance: e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf] VimFaultException {{(pid=67899) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1203.117574] env[67899]: DEBUG nova.compute.manager [None req-6aafabe7-8144-45c2-a7ff-cc06a3819f06 tempest-ServersAdmin275Test-1560987778 tempest-ServersAdmin275Test-1560987778-project-member] [instance: e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf] Build of instance e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf was re-scheduled: A specified parameter was not correct: fileType [ 1203.117574] env[67899]: Faults: ['InvalidArgument'] {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1203.117941] env[67899]: DEBUG nova.compute.manager [None req-6aafabe7-8144-45c2-a7ff-cc06a3819f06 tempest-ServersAdmin275Test-1560987778 tempest-ServersAdmin275Test-1560987778-project-member] [instance: e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf] Unplugging VIFs for instance {{(pid=67899) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1203.118188] env[67899]: DEBUG oslo_concurrency.lockutils [None req-6aafabe7-8144-45c2-a7ff-cc06a3819f06 tempest-ServersAdmin275Test-1560987778 tempest-ServersAdmin275Test-1560987778-project-member] Acquiring lock "refresh_cache-e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1203.118348] env[67899]: DEBUG oslo_concurrency.lockutils [None req-6aafabe7-8144-45c2-a7ff-cc06a3819f06 tempest-ServersAdmin275Test-1560987778 tempest-ServersAdmin275Test-1560987778-project-member] Acquired lock "refresh_cache-e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1203.118512] env[67899]: DEBUG nova.network.neutron [None req-6aafabe7-8144-45c2-a7ff-cc06a3819f06 tempest-ServersAdmin275Test-1560987778 tempest-ServersAdmin275Test-1560987778-project-member] [instance: e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf] Building network info cache for instance {{(pid=67899) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1203.141853] env[67899]: DEBUG nova.network.neutron [None req-6aafabe7-8144-45c2-a7ff-cc06a3819f06 tempest-ServersAdmin275Test-1560987778 tempest-ServersAdmin275Test-1560987778-project-member] [instance: e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf] Instance cache missing network info. 
{{(pid=67899) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1203.199383] env[67899]: DEBUG nova.network.neutron [None req-6aafabe7-8144-45c2-a7ff-cc06a3819f06 tempest-ServersAdmin275Test-1560987778 tempest-ServersAdmin275Test-1560987778-project-member] [instance: e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf] Updating instance_info_cache with network_info: [] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1203.208205] env[67899]: DEBUG oslo_concurrency.lockutils [None req-6aafabe7-8144-45c2-a7ff-cc06a3819f06 tempest-ServersAdmin275Test-1560987778 tempest-ServersAdmin275Test-1560987778-project-member] Releasing lock "refresh_cache-e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1203.208405] env[67899]: DEBUG nova.compute.manager [None req-6aafabe7-8144-45c2-a7ff-cc06a3819f06 tempest-ServersAdmin275Test-1560987778 tempest-ServersAdmin275Test-1560987778-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=67899) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1203.208579] env[67899]: DEBUG nova.compute.manager [None req-6aafabe7-8144-45c2-a7ff-cc06a3819f06 tempest-ServersAdmin275Test-1560987778 tempest-ServersAdmin275Test-1560987778-project-member] [instance: e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf] Skipping network deallocation for instance since networking was not requested. {{(pid=67899) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1203.289748] env[67899]: INFO nova.scheduler.client.report [None req-6aafabe7-8144-45c2-a7ff-cc06a3819f06 tempest-ServersAdmin275Test-1560987778 tempest-ServersAdmin275Test-1560987778-project-member] Deleted allocations for instance e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf [ 1203.307265] env[67899]: DEBUG oslo_concurrency.lockutils [None req-6aafabe7-8144-45c2-a7ff-cc06a3819f06 tempest-ServersAdmin275Test-1560987778 tempest-ServersAdmin275Test-1560987778-project-member] Lock "e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 563.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1203.308349] env[67899]: DEBUG oslo_concurrency.lockutils [None req-1286ff4f-9afc-4e0a-adf5-531abb93692f tempest-ServersAdmin275Test-1560987778 tempest-ServersAdmin275Test-1560987778-project-member] Lock "e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 363.926s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1203.308567] env[67899]: DEBUG oslo_concurrency.lockutils [None req-1286ff4f-9afc-4e0a-adf5-531abb93692f tempest-ServersAdmin275Test-1560987778 tempest-ServersAdmin275Test-1560987778-project-member] Acquiring lock "e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1203.308770] env[67899]: DEBUG oslo_concurrency.lockutils [None req-1286ff4f-9afc-4e0a-adf5-531abb93692f tempest-ServersAdmin275Test-1560987778 tempest-ServersAdmin275Test-1560987778-project-member] Lock
"e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1203.308933] env[67899]: DEBUG oslo_concurrency.lockutils [None req-1286ff4f-9afc-4e0a-adf5-531abb93692f tempest-ServersAdmin275Test-1560987778 tempest-ServersAdmin275Test-1560987778-project-member] Lock "e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1203.310757] env[67899]: INFO nova.compute.manager [None req-1286ff4f-9afc-4e0a-adf5-531abb93692f tempest-ServersAdmin275Test-1560987778 tempest-ServersAdmin275Test-1560987778-project-member] [instance: e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf] Terminating instance [ 1203.312314] env[67899]: DEBUG oslo_concurrency.lockutils [None req-1286ff4f-9afc-4e0a-adf5-531abb93692f tempest-ServersAdmin275Test-1560987778 tempest-ServersAdmin275Test-1560987778-project-member] Acquiring lock "refresh_cache-e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1203.312466] env[67899]: DEBUG oslo_concurrency.lockutils [None req-1286ff4f-9afc-4e0a-adf5-531abb93692f tempest-ServersAdmin275Test-1560987778 tempest-ServersAdmin275Test-1560987778-project-member] Acquired lock "refresh_cache-e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1203.312628] env[67899]: DEBUG nova.network.neutron [None req-1286ff4f-9afc-4e0a-adf5-531abb93692f tempest-ServersAdmin275Test-1560987778 tempest-ServersAdmin275Test-1560987778-project-member] [instance: e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf] Building network info cache for instance {{(pid=67899) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1203.320449] env[67899]: DEBUG nova.compute.manager [None req-95565c18-9c93-4bdb-95f2-e13f15fb26fd tempest-ServerShowV247Test-219304978 tempest-ServerShowV247Test-219304978-project-member] [instance: c7ad553b-2149-4211-aee3-057ea83069f5] Starting instance... {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1203.337792] env[67899]: DEBUG nova.network.neutron [None req-1286ff4f-9afc-4e0a-adf5-531abb93692f tempest-ServersAdmin275Test-1560987778 tempest-ServersAdmin275Test-1560987778-project-member] [instance: e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf] Instance cache missing network info. 
{{(pid=67899) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1203.368727] env[67899]: DEBUG oslo_concurrency.lockutils [None req-95565c18-9c93-4bdb-95f2-e13f15fb26fd tempest-ServerShowV247Test-219304978 tempest-ServerShowV247Test-219304978-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1203.368964] env[67899]: DEBUG oslo_concurrency.lockutils [None req-95565c18-9c93-4bdb-95f2-e13f15fb26fd tempest-ServerShowV247Test-219304978 tempest-ServerShowV247Test-219304978-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1203.370337] env[67899]: INFO nova.compute.claims [None req-95565c18-9c93-4bdb-95f2-e13f15fb26fd tempest-ServerShowV247Test-219304978 tempest-ServerShowV247Test-219304978-project-member] [instance: c7ad553b-2149-4211-aee3-057ea83069f5] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1203.397880] env[67899]: DEBUG nova.network.neutron [None req-1286ff4f-9afc-4e0a-adf5-531abb93692f tempest-ServersAdmin275Test-1560987778 tempest-ServersAdmin275Test-1560987778-project-member] [instance: e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf] Updating instance_info_cache with network_info: [] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1203.407260] env[67899]: DEBUG oslo_concurrency.lockutils [None req-1286ff4f-9afc-4e0a-adf5-531abb93692f tempest-ServersAdmin275Test-1560987778 tempest-ServersAdmin275Test-1560987778-project-member] Releasing lock "refresh_cache-e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1203.407627] env[67899]: DEBUG nova.compute.manager [None req-1286ff4f-9afc-4e0a-adf5-531abb93692f tempest-ServersAdmin275Test-1560987778 tempest-ServersAdmin275Test-1560987778-project-member] [instance: e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf] Start destroying the instance on the hypervisor. 
{{(pid=67899) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1203.407809] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-1286ff4f-9afc-4e0a-adf5-531abb93692f tempest-ServersAdmin275Test-1560987778 tempest-ServersAdmin275Test-1560987778-project-member] [instance: e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf] Destroying instance {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1203.408292] env[67899]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-0a0431fb-94d5-453a-a313-1e360cf57dc7 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1203.417566] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6c0350c6-ed56-4233-abca-a1cd436ba78e {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1203.447182] env[67899]: WARNING nova.virt.vmwareapi.vmops [None req-1286ff4f-9afc-4e0a-adf5-531abb93692f tempest-ServersAdmin275Test-1560987778 tempest-ServersAdmin275Test-1560987778-project-member] [instance: e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf could not be found. [ 1203.447386] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-1286ff4f-9afc-4e0a-adf5-531abb93692f tempest-ServersAdmin275Test-1560987778 tempest-ServersAdmin275Test-1560987778-project-member] [instance: e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf] Instance destroyed {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1203.447556] env[67899]: INFO nova.compute.manager [None req-1286ff4f-9afc-4e0a-adf5-531abb93692f tempest-ServersAdmin275Test-1560987778 tempest-ServersAdmin275Test-1560987778-project-member] [instance: e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1203.447790] env[67899]: DEBUG oslo.service.loopingcall [None req-1286ff4f-9afc-4e0a-adf5-531abb93692f tempest-ServersAdmin275Test-1560987778 tempest-ServersAdmin275Test-1560987778-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=67899) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1203.450230] env[67899]: DEBUG nova.compute.manager [-] [instance: e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf] Deallocating network for instance {{(pid=67899) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1203.450331] env[67899]: DEBUG nova.network.neutron [-] [instance: e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf] deallocate_for_instance() {{(pid=67899) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1203.471034] env[67899]: DEBUG nova.network.neutron [-] [instance: e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf] Instance cache missing network info. {{(pid=67899) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1203.480596] env[67899]: DEBUG nova.network.neutron [-] [instance: e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf] Updating instance_info_cache with network_info: [] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1203.491106] env[67899]: INFO nova.compute.manager [-] [instance: e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf] Took 0.04 seconds to deallocate network for instance.
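The WARNING/"Instance destroyed" pair above shows why a terminate that races with an already-failed build still succeeds: when the backend lookup raises InstanceNotFound, the driver logs it and treats the instance as already destroyed, so network and claim cleanup can continue. A rough sketch of that shape (hypothetical helper names, not the actual vmops implementation):

    class InstanceNotFound(Exception):
        """Raised when no VM with the given UUID exists on the backend."""

    def destroy(uuid, lookup_vm, destroy_vm, log):
        try:
            vm_ref = lookup_vm(uuid)   # e.g. a SearchIndex.FindAllByUuid lookup
            destroy_vm(vm_ref)         # unregister the VM and delete its files
        except InstanceNotFound as exc:
            # Matches the log: warn but fall through, so the caller can
            # still deallocate networking and drop the resource claim.
            log('Instance does not exist on backend: %s' % exc)
        log('Instance destroyed')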
[ 1203.593464] env[67899]: DEBUG oslo_concurrency.lockutils [None req-1286ff4f-9afc-4e0a-adf5-531abb93692f tempest-ServersAdmin275Test-1560987778 tempest-ServersAdmin275Test-1560987778-project-member] Lock "e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.285s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1203.595130] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 28.576s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1203.595305] env[67899]: INFO nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf] During sync_power_state the instance has a pending task (deleting). Skip. [ 1203.595483] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "e1401ec0-00f4-42fc-b8e5-a3b45fae3bcf" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.001s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1203.847094] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7a396591-7357-4413-89cf-812042dd7371 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1203.854432] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-75117ec4-d06a-4d97-9178-8db042bc3a11 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1203.884248] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-16ed2f46-e8c8-4022-b2db-1f4ba262a4b6 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1203.891629] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-10522d4f-343f-4808-91bd-b5754933de03 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1203.904415] env[67899]: DEBUG nova.compute.provider_tree [None req-95565c18-9c93-4bdb-95f2-e13f15fb26fd tempest-ServerShowV247Test-219304978 tempest-ServerShowV247Test-219304978-project-member] Inventory has not changed in ProviderTree for provider: fffa0b42-f65d-4394-a98c-0df038b9ed4b {{(pid=67899) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1203.913130] env[67899]: DEBUG nova.scheduler.client.report [None req-95565c18-9c93-4bdb-95f2-e13f15fb26fd tempest-ServerShowV247Test-219304978 tempest-ServerShowV247Test-219304978-project-member] Inventory has not changed for provider fffa0b42-f65d-4394-a98c-0df038b9ed4b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit':
94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67899) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1203.926600] env[67899]: DEBUG oslo_concurrency.lockutils [None req-95565c18-9c93-4bdb-95f2-e13f15fb26fd tempest-ServerShowV247Test-219304978 tempest-ServerShowV247Test-219304978-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.558s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1203.927909] env[67899]: DEBUG nova.compute.manager [None req-95565c18-9c93-4bdb-95f2-e13f15fb26fd tempest-ServerShowV247Test-219304978 tempest-ServerShowV247Test-219304978-project-member] [instance: c7ad553b-2149-4211-aee3-057ea83069f5] Start building networks asynchronously for instance. {{(pid=67899) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1203.962449] env[67899]: DEBUG nova.compute.utils [None req-95565c18-9c93-4bdb-95f2-e13f15fb26fd tempest-ServerShowV247Test-219304978 tempest-ServerShowV247Test-219304978-project-member] Using /dev/sd instead of None {{(pid=67899) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1203.963975] env[67899]: DEBUG nova.compute.manager [None req-95565c18-9c93-4bdb-95f2-e13f15fb26fd tempest-ServerShowV247Test-219304978 tempest-ServerShowV247Test-219304978-project-member] [instance: c7ad553b-2149-4211-aee3-057ea83069f5] Not allocating networking since 'none' was specified. {{(pid=67899) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 1203.972602] env[67899]: DEBUG nova.compute.manager [None req-95565c18-9c93-4bdb-95f2-e13f15fb26fd tempest-ServerShowV247Test-219304978 tempest-ServerShowV247Test-219304978-project-member] [instance: c7ad553b-2149-4211-aee3-057ea83069f5] Start building block device mappings for instance. {{(pid=67899) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1204.033074] env[67899]: DEBUG nova.compute.manager [None req-95565c18-9c93-4bdb-95f2-e13f15fb26fd tempest-ServerShowV247Test-219304978 tempest-ServerShowV247Test-219304978-project-member] [instance: c7ad553b-2149-4211-aee3-057ea83069f5] Start spawning the instance on the hypervisor. 
{{(pid=67899) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1204.059014] env[67899]: DEBUG nova.virt.hardware [None req-95565c18-9c93-4bdb-95f2-e13f15fb26fd tempest-ServerShowV247Test-219304978 tempest-ServerShowV247Test-219304978-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T22:07:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T22:07:13Z,direct_url=<?>,disk_format='vmdk',id=c655a05a-4a40-4b3f-b609-3ba8116ad90f,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c1aaa2970e964d7b86557399120d12c1',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2025-04-16T22:07:14Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1204.059280] env[67899]: DEBUG nova.virt.hardware [None req-95565c18-9c93-4bdb-95f2-e13f15fb26fd tempest-ServerShowV247Test-219304978 tempest-ServerShowV247Test-219304978-project-member] Flavor limits 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1204.059435] env[67899]: DEBUG nova.virt.hardware [None req-95565c18-9c93-4bdb-95f2-e13f15fb26fd tempest-ServerShowV247Test-219304978 tempest-ServerShowV247Test-219304978-project-member] Image limits 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1204.059612] env[67899]: DEBUG nova.virt.hardware [None req-95565c18-9c93-4bdb-95f2-e13f15fb26fd tempest-ServerShowV247Test-219304978 tempest-ServerShowV247Test-219304978-project-member] Flavor pref 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1204.059755] env[67899]: DEBUG nova.virt.hardware [None req-95565c18-9c93-4bdb-95f2-e13f15fb26fd tempest-ServerShowV247Test-219304978 tempest-ServerShowV247Test-219304978-project-member] Image pref 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1204.059898] env[67899]: DEBUG nova.virt.hardware [None req-95565c18-9c93-4bdb-95f2-e13f15fb26fd tempest-ServerShowV247Test-219304978 tempest-ServerShowV247Test-219304978-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1204.060114] env[67899]: DEBUG nova.virt.hardware [None req-95565c18-9c93-4bdb-95f2-e13f15fb26fd tempest-ServerShowV247Test-219304978 tempest-ServerShowV247Test-219304978-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1204.060272] env[67899]: DEBUG nova.virt.hardware [None req-95565c18-9c93-4bdb-95f2-e13f15fb26fd tempest-ServerShowV247Test-219304978 tempest-ServerShowV247Test-219304978-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67899) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1204.060435] env[67899]: DEBUG nova.virt.hardware [None req-95565c18-9c93-4bdb-95f2-e13f15fb26fd
tempest-ServerShowV247Test-219304978 tempest-ServerShowV247Test-219304978-project-member] Got 1 possible topologies {{(pid=67899) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1204.060594] env[67899]: DEBUG nova.virt.hardware [None req-95565c18-9c93-4bdb-95f2-e13f15fb26fd tempest-ServerShowV247Test-219304978 tempest-ServerShowV247Test-219304978-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1204.060761] env[67899]: DEBUG nova.virt.hardware [None req-95565c18-9c93-4bdb-95f2-e13f15fb26fd tempest-ServerShowV247Test-219304978 tempest-ServerShowV247Test-219304978-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1204.061636] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-78805af1-f51b-4783-9190-f4e901e5383d {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1204.069820] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-651b858c-07e5-44d8-a9ff-cc865a373e24 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1204.083526] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-95565c18-9c93-4bdb-95f2-e13f15fb26fd tempest-ServerShowV247Test-219304978 tempest-ServerShowV247Test-219304978-project-member] [instance: c7ad553b-2149-4211-aee3-057ea83069f5] Instance VIF info [] {{(pid=67899) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1204.088950] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [None req-95565c18-9c93-4bdb-95f2-e13f15fb26fd tempest-ServerShowV247Test-219304978 tempest-ServerShowV247Test-219304978-project-member] Creating folder: Project (567f878cd0894b9b9331a9b1d47eb541). Parent ref: group-v692900. {{(pid=67899) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1204.089218] env[67899]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-a194f4de-dfa6-4b54-a4eb-7f3268770b29 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1204.097954] env[67899]: INFO nova.virt.vmwareapi.vm_util [None req-95565c18-9c93-4bdb-95f2-e13f15fb26fd tempest-ServerShowV247Test-219304978 tempest-ServerShowV247Test-219304978-project-member] Created folder: Project (567f878cd0894b9b9331a9b1d47eb541) in parent group-v692900. [ 1204.098144] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [None req-95565c18-9c93-4bdb-95f2-e13f15fb26fd tempest-ServerShowV247Test-219304978 tempest-ServerShowV247Test-219304978-project-member] Creating folder: Instances. Parent ref: group-v692972. {{(pid=67899) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1204.098467] env[67899]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-453e3944-78f8-4f28-91ca-46ee572e655c {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1204.106265] env[67899]: INFO nova.virt.vmwareapi.vm_util [None req-95565c18-9c93-4bdb-95f2-e13f15fb26fd tempest-ServerShowV247Test-219304978 tempest-ServerShowV247Test-219304978-project-member] Created folder: Instances in parent group-v692972. 
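The two Folder.CreateFolder invocations above build the per-tenant layout the driver keeps in vCenter: a "Project (<id>)" folder under the OpenStack root, then an "Instances" child that receives the VM. A compressed sketch of that idea (find_child and create_child are invented stand-ins for the underlying vSphere calls; the real vm_util folder code also has to cope with DuplicateName races):

    def ensure_folder(session, parent_ref, name):
        """Return the named child folder, creating it if absent."""
        child = session.find_child(parent_ref, name)
        if child is None:
            child = session.create_child(parent_ref, name)  # Folder.CreateFolder
        return child

    def instances_folder(session, root_ref, project_id):
        # Mirrors the log: Project (<id>) folder first, then Instances beneath it.
        project = ensure_folder(session, root_ref, 'Project (%s)' % project_id)
        return ensure_folder(session, project, 'Instances')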
[ 1204.106447] env[67899]: DEBUG oslo.service.loopingcall [None req-95565c18-9c93-4bdb-95f2-e13f15fb26fd tempest-ServerShowV247Test-219304978 tempest-ServerShowV247Test-219304978-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67899) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1204.106618] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: c7ad553b-2149-4211-aee3-057ea83069f5] Creating VM on the ESX host {{(pid=67899) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1204.106798] env[67899]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-183ec695-6d41-4fbd-980f-1857b2060671 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1204.121687] env[67899]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1204.121687] env[67899]: value = "task-3467935" [ 1204.121687] env[67899]: _type = "Task" [ 1204.121687] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1204.128487] env[67899]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467935, 'name': CreateVM_Task} progress is 0%. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1204.632390] env[67899]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467935, 'name': CreateVM_Task, 'duration_secs': 0.242449} completed successfully. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1204.632695] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: c7ad553b-2149-4211-aee3-057ea83069f5] Created VM on the ESX host {{(pid=67899) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1204.632984] env[67899]: DEBUG oslo_concurrency.lockutils [None req-95565c18-9c93-4bdb-95f2-e13f15fb26fd tempest-ServerShowV247Test-219304978 tempest-ServerShowV247Test-219304978-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1204.633178] env[67899]: DEBUG oslo_concurrency.lockutils [None req-95565c18-9c93-4bdb-95f2-e13f15fb26fd tempest-ServerShowV247Test-219304978 tempest-ServerShowV247Test-219304978-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1204.633528] env[67899]: DEBUG oslo_concurrency.lockutils [None req-95565c18-9c93-4bdb-95f2-e13f15fb26fd tempest-ServerShowV247Test-219304978 tempest-ServerShowV247Test-219304978-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1204.633810] env[67899]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-8960482a-7392-4730-95f1-4ec85f262294 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1204.638254] env[67899]: DEBUG oslo_vmware.api [None req-95565c18-9c93-4bdb-95f2-e13f15fb26fd tempest-ServerShowV247Test-219304978 tempest-ServerShowV247Test-219304978-project-member] Waiting for the task: (returnval){ 
[ 1204.638254] env[67899]: value = "session[5298859c-589e-10f3-4f70-bf47a7ea371b]529447ec-8d61-e959-be77-1cae4104d740" [ 1204.638254] env[67899]: _type = "Task" [ 1204.638254] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1204.646049] env[67899]: DEBUG oslo_vmware.api [None req-95565c18-9c93-4bdb-95f2-e13f15fb26fd tempest-ServerShowV247Test-219304978 tempest-ServerShowV247Test-219304978-project-member] Task: {'id': session[5298859c-589e-10f3-4f70-bf47a7ea371b]529447ec-8d61-e959-be77-1cae4104d740, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1205.148870] env[67899]: DEBUG oslo_concurrency.lockutils [None req-95565c18-9c93-4bdb-95f2-e13f15fb26fd tempest-ServerShowV247Test-219304978 tempest-ServerShowV247Test-219304978-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1205.149156] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-95565c18-9c93-4bdb-95f2-e13f15fb26fd tempest-ServerShowV247Test-219304978 tempest-ServerShowV247Test-219304978-project-member] [instance: c7ad553b-2149-4211-aee3-057ea83069f5] Processing image c655a05a-4a40-4b3f-b609-3ba8116ad90f {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1205.149369] env[67899]: DEBUG oslo_concurrency.lockutils [None req-95565c18-9c93-4bdb-95f2-e13f15fb26fd tempest-ServerShowV247Test-219304978 tempest-ServerShowV247Test-219304978-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1208.662365] env[67899]: DEBUG oslo_concurrency.lockutils [None req-d53afc97-050f-4980-aec2-ea7bdb003d01 tempest-ServerShowV247Test-219304978 tempest-ServerShowV247Test-219304978-project-member] Acquiring lock "c7ad553b-2149-4211-aee3-057ea83069f5" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1219.035715] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1219.996943] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1220.997339] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1220.997689] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Starting heal instance info cache {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1220.997689] env[67899]:
DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Rebuilding the list of instances to heal {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1221.027765] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 4458efe7-18d4-4cfb-b131-e09d36124d68] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1221.027911] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1221.028054] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 8d2a9e20-82d3-44cf-a725-59804debe1cc] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1221.028182] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1221.028308] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 37ab08db-50ab-4c30-9e18-05007c5d1c27] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1221.028448] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1221.028582] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1221.028702] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 6c4977f7-c53d-4c96-9028-86d7561f0d0d] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1221.028821] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: ec826735-4cc4-4847-8750-c5480e62134a] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1221.028938] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: c7ad553b-2149-4211-aee3-057ea83069f5] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1221.029073] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Didn't find any instances for network info cache update. 
{{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1221.029571] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1221.996818] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1221.996818] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1221.996974] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=67899) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1222.834097] env[67899]: DEBUG oslo_concurrency.lockutils [None req-2236b4c1-ddc4-48e2-b219-bf8992f19847 tempest-InstanceActionsV221TestJSON-802466451 tempest-InstanceActionsV221TestJSON-802466451-project-member] Acquiring lock "dc7bf2b7-631d-4933-92db-1679ad823379" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1222.834433] env[67899]: DEBUG oslo_concurrency.lockutils [None req-2236b4c1-ddc4-48e2-b219-bf8992f19847 tempest-InstanceActionsV221TestJSON-802466451 tempest-InstanceActionsV221TestJSON-802466451-project-member] Lock "dc7bf2b7-631d-4933-92db-1679ad823379" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1222.997698] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1223.854739] env[67899]: DEBUG oslo_concurrency.lockutils [None req-93cc0406-1f51-47a1-9589-0404788b4e93 tempest-MigrationsAdminTest-102068815 tempest-MigrationsAdminTest-102068815-project-member] Acquiring lock "8a157747-34e2-48f7-bf21-d17810122954" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1223.855054] env[67899]: DEBUG oslo_concurrency.lockutils [None req-93cc0406-1f51-47a1-9589-0404788b4e93 tempest-MigrationsAdminTest-102068815 tempest-MigrationsAdminTest-102068815-project-member] Lock "8a157747-34e2-48f7-bf21-d17810122954" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1223.995847] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic
task ComputeManager._poll_unconfirmed_resizes {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1224.997055] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager.update_available_resource {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1225.008662] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1225.008903] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1225.009076] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1225.009232] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67899) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1225.010356] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-24acc2a1-8446-4132-af4b-3f0887e2326b {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1225.019178] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2ff1298d-291b-4776-9eb1-78c6224b1a50 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1225.032897] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0693db28-ae41-4223-9013-be21fd8db66f {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1225.039461] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-935d5298-046f-4b3a-bbcb-45460e2656f2 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1225.068584] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180938MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=67899) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1225.068746] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67899) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1225.068934] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1225.145255] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 4458efe7-18d4-4cfb-b131-e09d36124d68 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1225.145448] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1225.145542] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 8d2a9e20-82d3-44cf-a725-59804debe1cc actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1225.145661] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance bb97988e-9f7f-4e4f-9904-fc560d0912ee actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1225.145806] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 37ab08db-50ab-4c30-9e18-05007c5d1c27 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1225.145890] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1225.146474] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance b9282eeb-09db-4138-a1f0-9e03828021b8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1225.146474] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 6c4977f7-c53d-4c96-9028-86d7561f0d0d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1225.146474] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance ec826735-4cc4-4847-8750-c5480e62134a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1225.146474] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance c7ad553b-2149-4211-aee3-057ea83069f5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1225.157480] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance db21b229-2664-4947-96c8-c1e92f97917e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1225.169120] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 94ebdda8-5b9c-4ffa-be45-571ec9ba9f81 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1225.179440] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 928c018d-ec75-42c6-8e55-e38bb5947bcf has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1225.189528] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 04bee4b3-88b9-4f8c-b5d7-3955a158a2d5 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1225.199870] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance a43ea307-5b84-4c8c-9f28-255980bfd51a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1225.210257] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 6fda2654-4579-4b9a-a97c-97e0128fff14 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1225.220831] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 97ec7119-2dc8-49ac-921f-b28d04ffd056 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1225.230068] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance f03010e7-fd45-4959-b6fb-4c7b3fc833c5 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1225.239267] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 3526174d-17e3-4a54-92dc-0556334ce315 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1225.251026] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance cf23465f-b46c-4360-8949-2af3b9ba44c9 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1225.259487] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 2d202778-6d31-4d2f-b249-60925737da42 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1225.269554] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 5ead1ba5-49a8-41a8-b984-cb5408683a25 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1225.278904] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 8e66e2d5-aa60-474f-b77f-4a477e2d0f8e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1225.305327] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 7a82e877-8a39-4684-8b75-711b7bedddac has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1225.315638] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance dc7bf2b7-631d-4933-92db-1679ad823379 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1225.325292] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 8a157747-34e2-48f7-bf21-d17810122954 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1225.325529] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67899) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1225.325673] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67899) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1225.609281] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7fb14e14-2985-4921-8326-0a3b7fcc05d1 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1225.616947] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7cf7df19-b0f5-48bc-b6a8-9e2ca13dcf98 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1225.646081] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8c7c69a4-7afb-4712-bb0d-d5f5b5a7f2f7 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1225.653545] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-29e2705c-ce50-4484-9993-1db59fee17fc {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1225.666886] env[67899]: DEBUG nova.compute.provider_tree [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Inventory has not changed in ProviderTree for provider: fffa0b42-f65d-4394-a98c-0df038b9ed4b {{(pid=67899) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1225.676348] env[67899]: DEBUG nova.scheduler.client.report [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Inventory has not changed for provider fffa0b42-f65d-4394-a98c-0df038b9ed4b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67899) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1225.690779] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67899) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1225.690950] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.622s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1230.694209] env[67899]: DEBUG oslo_concurrency.lockutils [None 
req-4f3b59ba-f200-4682-b584-0c61035dfcc2 tempest-SecurityGroupsTestJSON-311274219 tempest-SecurityGroupsTestJSON-311274219-project-member] Acquiring lock "03684169-e2c8-4cf5-8e79-b118725927f1" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1230.694557] env[67899]: DEBUG oslo_concurrency.lockutils [None req-4f3b59ba-f200-4682-b584-0c61035dfcc2 tempest-SecurityGroupsTestJSON-311274219 tempest-SecurityGroupsTestJSON-311274219-project-member] Lock "03684169-e2c8-4cf5-8e79-b118725927f1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1239.334548] env[67899]: DEBUG oslo_concurrency.lockutils [None req-65de0df3-895b-4fbd-8d87-17ee086072d5 tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Acquiring lock "49c65e6c-9e16-40f5-9754-fe81681f9714" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1239.334854] env[67899]: DEBUG oslo_concurrency.lockutils [None req-65de0df3-895b-4fbd-8d87-17ee086072d5 tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Lock "49c65e6c-9e16-40f5-9754-fe81681f9714" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1241.110904] env[67899]: DEBUG oslo_concurrency.lockutils [None req-e46202ba-29e3-4bfd-9e7d-4f47bc76deae tempest-DeleteServersAdminTestJSON-329788213 tempest-DeleteServersAdminTestJSON-329788213-project-member] Acquiring lock "ac2f9cf9-f573-4f21-aeb4-6cea5c94f843" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1241.111242] env[67899]: DEBUG oslo_concurrency.lockutils [None req-e46202ba-29e3-4bfd-9e7d-4f47bc76deae tempest-DeleteServersAdminTestJSON-329788213 tempest-DeleteServersAdminTestJSON-329788213-project-member] Lock "ac2f9cf9-f573-4f21-aeb4-6cea5c94f843" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1244.296850] env[67899]: DEBUG oslo_concurrency.lockutils [None req-82dbbae0-b072-4fbd-860f-b156dd250541 tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] Acquiring lock "5bb22bfa-4f1f-42a8-a7e3-5e806c70ae45" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1244.297157] env[67899]: DEBUG oslo_concurrency.lockutils [None req-82dbbae0-b072-4fbd-860f-b156dd250541 tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] Lock "5bb22bfa-4f1f-42a8-a7e3-5e806c70ae45" acquired by 
"nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1249.084958] env[67899]: WARNING oslo_vmware.rw_handles [None req-e5513ee2-6d9e-4ef6-b3a5-69d04367ec0c tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1249.084958] env[67899]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1249.084958] env[67899]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1249.084958] env[67899]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1249.084958] env[67899]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1249.084958] env[67899]: ERROR oslo_vmware.rw_handles response.begin() [ 1249.084958] env[67899]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1249.084958] env[67899]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1249.084958] env[67899]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1249.084958] env[67899]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1249.084958] env[67899]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1249.084958] env[67899]: ERROR oslo_vmware.rw_handles [ 1249.085839] env[67899]: DEBUG nova.virt.vmwareapi.images [None req-e5513ee2-6d9e-4ef6-b3a5-69d04367ec0c tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: 4458efe7-18d4-4cfb-b131-e09d36124d68] Downloaded image file data c655a05a-4a40-4b3f-b609-3ba8116ad90f to vmware_temp/2dbd524b-0833-421e-9d37-5501cc123edb/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk on the data store datastore1 {{(pid=67899) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1249.087766] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-e5513ee2-6d9e-4ef6-b3a5-69d04367ec0c tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: 4458efe7-18d4-4cfb-b131-e09d36124d68] Caching image {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1249.088059] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [None req-e5513ee2-6d9e-4ef6-b3a5-69d04367ec0c tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Copying Virtual Disk [datastore1] vmware_temp/2dbd524b-0833-421e-9d37-5501cc123edb/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk to [datastore1] vmware_temp/2dbd524b-0833-421e-9d37-5501cc123edb/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk {{(pid=67899) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1249.088373] env[67899]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-35dde69f-5a3b-4b85-84ee-e68ee9cf1538 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1249.096654] env[67899]: DEBUG oslo_vmware.api [None req-e5513ee2-6d9e-4ef6-b3a5-69d04367ec0c 
tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Waiting for the task: (returnval){ [ 1249.096654] env[67899]: value = "task-3467936" [ 1249.096654] env[67899]: _type = "Task" [ 1249.096654] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1249.108631] env[67899]: DEBUG oslo_vmware.api [None req-e5513ee2-6d9e-4ef6-b3a5-69d04367ec0c tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Task: {'id': task-3467936, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1249.610804] env[67899]: DEBUG oslo_vmware.exceptions [None req-e5513ee2-6d9e-4ef6-b3a5-69d04367ec0c tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Fault InvalidArgument not matched. {{(pid=67899) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1249.611109] env[67899]: DEBUG oslo_concurrency.lockutils [None req-e5513ee2-6d9e-4ef6-b3a5-69d04367ec0c tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1249.611672] env[67899]: ERROR nova.compute.manager [None req-e5513ee2-6d9e-4ef6-b3a5-69d04367ec0c tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: 4458efe7-18d4-4cfb-b131-e09d36124d68] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1249.611672] env[67899]: Faults: ['InvalidArgument'] [ 1249.611672] env[67899]: ERROR nova.compute.manager [instance: 4458efe7-18d4-4cfb-b131-e09d36124d68] Traceback (most recent call last): [ 1249.611672] env[67899]: ERROR nova.compute.manager [instance: 4458efe7-18d4-4cfb-b131-e09d36124d68] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1249.611672] env[67899]: ERROR nova.compute.manager [instance: 4458efe7-18d4-4cfb-b131-e09d36124d68] yield resources [ 1249.611672] env[67899]: ERROR nova.compute.manager [instance: 4458efe7-18d4-4cfb-b131-e09d36124d68] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1249.611672] env[67899]: ERROR nova.compute.manager [instance: 4458efe7-18d4-4cfb-b131-e09d36124d68] self.driver.spawn(context, instance, image_meta, [ 1249.611672] env[67899]: ERROR nova.compute.manager [instance: 4458efe7-18d4-4cfb-b131-e09d36124d68] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1249.611672] env[67899]: ERROR nova.compute.manager [instance: 4458efe7-18d4-4cfb-b131-e09d36124d68] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1249.611672] env[67899]: ERROR nova.compute.manager [instance: 4458efe7-18d4-4cfb-b131-e09d36124d68] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1249.611672] env[67899]: ERROR nova.compute.manager [instance: 4458efe7-18d4-4cfb-b131-e09d36124d68] self._fetch_image_if_missing(context, vi) [ 1249.611672] env[67899]: ERROR nova.compute.manager [instance: 4458efe7-18d4-4cfb-b131-e09d36124d68] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in 
_fetch_image_if_missing [ 1249.611672] env[67899]: ERROR nova.compute.manager [instance: 4458efe7-18d4-4cfb-b131-e09d36124d68] image_cache(vi, tmp_image_ds_loc) [ 1249.611672] env[67899]: ERROR nova.compute.manager [instance: 4458efe7-18d4-4cfb-b131-e09d36124d68] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1249.611672] env[67899]: ERROR nova.compute.manager [instance: 4458efe7-18d4-4cfb-b131-e09d36124d68] vm_util.copy_virtual_disk( [ 1249.611672] env[67899]: ERROR nova.compute.manager [instance: 4458efe7-18d4-4cfb-b131-e09d36124d68] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1249.611672] env[67899]: ERROR nova.compute.manager [instance: 4458efe7-18d4-4cfb-b131-e09d36124d68] session._wait_for_task(vmdk_copy_task) [ 1249.611672] env[67899]: ERROR nova.compute.manager [instance: 4458efe7-18d4-4cfb-b131-e09d36124d68] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1249.611672] env[67899]: ERROR nova.compute.manager [instance: 4458efe7-18d4-4cfb-b131-e09d36124d68] return self.wait_for_task(task_ref) [ 1249.611672] env[67899]: ERROR nova.compute.manager [instance: 4458efe7-18d4-4cfb-b131-e09d36124d68] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1249.611672] env[67899]: ERROR nova.compute.manager [instance: 4458efe7-18d4-4cfb-b131-e09d36124d68] return evt.wait() [ 1249.611672] env[67899]: ERROR nova.compute.manager [instance: 4458efe7-18d4-4cfb-b131-e09d36124d68] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1249.611672] env[67899]: ERROR nova.compute.manager [instance: 4458efe7-18d4-4cfb-b131-e09d36124d68] result = hub.switch() [ 1249.611672] env[67899]: ERROR nova.compute.manager [instance: 4458efe7-18d4-4cfb-b131-e09d36124d68] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1249.611672] env[67899]: ERROR nova.compute.manager [instance: 4458efe7-18d4-4cfb-b131-e09d36124d68] return self.greenlet.switch() [ 1249.611672] env[67899]: ERROR nova.compute.manager [instance: 4458efe7-18d4-4cfb-b131-e09d36124d68] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1249.611672] env[67899]: ERROR nova.compute.manager [instance: 4458efe7-18d4-4cfb-b131-e09d36124d68] self.f(*self.args, **self.kw) [ 1249.611672] env[67899]: ERROR nova.compute.manager [instance: 4458efe7-18d4-4cfb-b131-e09d36124d68] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1249.611672] env[67899]: ERROR nova.compute.manager [instance: 4458efe7-18d4-4cfb-b131-e09d36124d68] raise exceptions.translate_fault(task_info.error) [ 1249.611672] env[67899]: ERROR nova.compute.manager [instance: 4458efe7-18d4-4cfb-b131-e09d36124d68] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1249.611672] env[67899]: ERROR nova.compute.manager [instance: 4458efe7-18d4-4cfb-b131-e09d36124d68] Faults: ['InvalidArgument'] [ 1249.611672] env[67899]: ERROR nova.compute.manager [instance: 4458efe7-18d4-4cfb-b131-e09d36124d68] [ 1249.612635] env[67899]: INFO nova.compute.manager [None req-e5513ee2-6d9e-4ef6-b3a5-69d04367ec0c tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: 4458efe7-18d4-4cfb-b131-e09d36124d68] Terminating instance [ 1249.613536] env[67899]: DEBUG 
oslo_concurrency.lockutils [None req-b344734b-8a62-4885-9e7d-af38d446ab1b tempest-ServersWithSpecificFlavorTestJSON-1910448084 tempest-ServersWithSpecificFlavorTestJSON-1910448084-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1249.613878] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-b344734b-8a62-4885-9e7d-af38d446ab1b tempest-ServersWithSpecificFlavorTestJSON-1910448084 tempest-ServersWithSpecificFlavorTestJSON-1910448084-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1249.614044] env[67899]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-21e67d58-4b3d-4e82-b178-c497ae0d93ec {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1249.618637] env[67899]: DEBUG nova.compute.manager [None req-e5513ee2-6d9e-4ef6-b3a5-69d04367ec0c tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: 4458efe7-18d4-4cfb-b131-e09d36124d68] Start destroying the instance on the hypervisor. {{(pid=67899) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1249.618852] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-e5513ee2-6d9e-4ef6-b3a5-69d04367ec0c tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: 4458efe7-18d4-4cfb-b131-e09d36124d68] Destroying instance {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1249.619623] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-12fc7808-d62e-42b5-97cb-f71a6eb4f6bd {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1249.629189] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-e5513ee2-6d9e-4ef6-b3a5-69d04367ec0c tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: 4458efe7-18d4-4cfb-b131-e09d36124d68] Unregistering the VM {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1249.629423] env[67899]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-f191f1ab-d627-43ff-9b82-aed4bb0b2426 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1249.631651] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-b344734b-8a62-4885-9e7d-af38d446ab1b tempest-ServersWithSpecificFlavorTestJSON-1910448084 tempest-ServersWithSpecificFlavorTestJSON-1910448084-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1249.631824] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-b344734b-8a62-4885-9e7d-af38d446ab1b tempest-ServersWithSpecificFlavorTestJSON-1910448084 tempest-ServersWithSpecificFlavorTestJSON-1910448084-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=67899) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1249.633094] env[67899]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-dd528b40-8652-48f0-8828-5635e2c17a77 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1249.637837] env[67899]: DEBUG oslo_vmware.api [None req-b344734b-8a62-4885-9e7d-af38d446ab1b tempest-ServersWithSpecificFlavorTestJSON-1910448084 tempest-ServersWithSpecificFlavorTestJSON-1910448084-project-member] Waiting for the task: (returnval){ [ 1249.637837] env[67899]: value = "session[5298859c-589e-10f3-4f70-bf47a7ea371b]527d35f9-d0bf-7c64-85c6-7de237480f42" [ 1249.637837] env[67899]: _type = "Task" [ 1249.637837] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1249.646447] env[67899]: DEBUG oslo_vmware.api [None req-b344734b-8a62-4885-9e7d-af38d446ab1b tempest-ServersWithSpecificFlavorTestJSON-1910448084 tempest-ServersWithSpecificFlavorTestJSON-1910448084-project-member] Task: {'id': session[5298859c-589e-10f3-4f70-bf47a7ea371b]527d35f9-d0bf-7c64-85c6-7de237480f42, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1249.700061] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-e5513ee2-6d9e-4ef6-b3a5-69d04367ec0c tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: 4458efe7-18d4-4cfb-b131-e09d36124d68] Unregistered the VM {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1249.700545] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-e5513ee2-6d9e-4ef6-b3a5-69d04367ec0c tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: 4458efe7-18d4-4cfb-b131-e09d36124d68] Deleting contents of the VM from datastore datastore1 {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1249.700798] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-e5513ee2-6d9e-4ef6-b3a5-69d04367ec0c tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Deleting the datastore file [datastore1] 4458efe7-18d4-4cfb-b131-e09d36124d68 {{(pid=67899) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1249.701159] env[67899]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-831d0cdb-e687-4403-b900-7d8d031fa95b {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1249.707380] env[67899]: DEBUG oslo_vmware.api [None req-e5513ee2-6d9e-4ef6-b3a5-69d04367ec0c tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Waiting for the task: (returnval){ [ 1249.707380] env[67899]: value = "task-3467938" [ 1249.707380] env[67899]: _type = "Task" [ 1249.707380] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1249.716413] env[67899]: DEBUG oslo_vmware.api [None req-e5513ee2-6d9e-4ef6-b3a5-69d04367ec0c tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Task: {'id': task-3467938, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1250.149299] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-b344734b-8a62-4885-9e7d-af38d446ab1b tempest-ServersWithSpecificFlavorTestJSON-1910448084 tempest-ServersWithSpecificFlavorTestJSON-1910448084-project-member] [instance: 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee] Preparing fetch location {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1250.149598] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-b344734b-8a62-4885-9e7d-af38d446ab1b tempest-ServersWithSpecificFlavorTestJSON-1910448084 tempest-ServersWithSpecificFlavorTestJSON-1910448084-project-member] Creating directory with path [datastore1] vmware_temp/74b3c9a9-66b5-4ca2-8a36-2d56e90de334/c655a05a-4a40-4b3f-b609-3ba8116ad90f {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1250.149804] env[67899]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-a2cc4939-7eab-4df3-942d-a2b32d59010b {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1250.167485] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-b344734b-8a62-4885-9e7d-af38d446ab1b tempest-ServersWithSpecificFlavorTestJSON-1910448084 tempest-ServersWithSpecificFlavorTestJSON-1910448084-project-member] Created directory with path [datastore1] vmware_temp/74b3c9a9-66b5-4ca2-8a36-2d56e90de334/c655a05a-4a40-4b3f-b609-3ba8116ad90f {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1250.167745] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-b344734b-8a62-4885-9e7d-af38d446ab1b tempest-ServersWithSpecificFlavorTestJSON-1910448084 tempest-ServersWithSpecificFlavorTestJSON-1910448084-project-member] [instance: 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee] Fetch image to [datastore1] vmware_temp/74b3c9a9-66b5-4ca2-8a36-2d56e90de334/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1250.167895] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-b344734b-8a62-4885-9e7d-af38d446ab1b tempest-ServersWithSpecificFlavorTestJSON-1910448084 tempest-ServersWithSpecificFlavorTestJSON-1910448084-project-member] [instance: 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee] Downloading image file data c655a05a-4a40-4b3f-b609-3ba8116ad90f to [datastore1] vmware_temp/74b3c9a9-66b5-4ca2-8a36-2d56e90de334/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk on the data store datastore1 {{(pid=67899) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1250.168738] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-311ca2c1-b1e8-4453-992e-cfe32790aed7 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1250.175706] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eb183377-8fe2-49ad-8534-e93eb9f5c29f {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1250.185071] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dd319a4c-e725-44a6-a04f-82c28c40368b {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1250.218872] env[67899]: DEBUG oslo_vmware.service 
[-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b2976fd9-fd33-4d42-a333-100c4ec97ed3 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1250.227381] env[67899]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-a0598662-71cd-46f0-9013-e6ea20c7343a {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1250.229071] env[67899]: DEBUG oslo_vmware.api [None req-e5513ee2-6d9e-4ef6-b3a5-69d04367ec0c tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Task: {'id': task-3467938, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.086376} completed successfully. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1250.229314] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-e5513ee2-6d9e-4ef6-b3a5-69d04367ec0c tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Deleted the datastore file {{(pid=67899) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1250.229492] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-e5513ee2-6d9e-4ef6-b3a5-69d04367ec0c tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: 4458efe7-18d4-4cfb-b131-e09d36124d68] Deleted contents of the VM from datastore datastore1 {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1250.229663] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-e5513ee2-6d9e-4ef6-b3a5-69d04367ec0c tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: 4458efe7-18d4-4cfb-b131-e09d36124d68] Instance destroyed {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1250.229830] env[67899]: INFO nova.compute.manager [None req-e5513ee2-6d9e-4ef6-b3a5-69d04367ec0c tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: 4458efe7-18d4-4cfb-b131-e09d36124d68] Took 0.61 seconds to destroy the instance on the hypervisor. 
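[editor's note] The tracebacks and "Waiting for the task: (returnval){...}" records above all pass through the same polling machinery: oslo.vmware's wait_for_task runs _poll_task on a fixed interval (via its looping-call helper, yielding to eventlet between polls) until the vCenter task leaves the queued/running states, then returns the result on success or raises the fault translated from task_info.error on failure (here VimFaultException, Faults: ['InvalidArgument'], for the bad 'fileType' parameter). The sketch below is a simplified, self-contained reconstruction of that control flow, not the library source; LoopingCallDone, poll, and get_task_info are stand-in names.

    import time

    class LoopingCallDone(Exception):
        """Stand-in for the sentinel oslo.vmware raises to end its poll loop."""
        def __init__(self, retvalue=None):
            self.retvalue = retvalue

    def poll(get_task_info):
        info = get_task_info()                 # one RetrievePropertiesEx round trip
        if info['state'] == 'success':
            raise LoopingCallDone(info)        # stops the loop, carries the result
        if info['state'] == 'error':
            raise RuntimeError(info['error'])  # real code: exceptions.translate_fault()
        # 'queued' / 'running': return and get polled again

    def wait_for_task(get_task_info, interval=0.5):
        while True:
            try:
                poll(get_task_info)
            except LoopingCallDone as done:
                return done.retvalue
            time.sleep(interval)               # real code yields via eventlet, as the
                                               # hub.switch() frames in the traceback show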
[ 1250.231853] env[67899]: DEBUG nova.compute.claims [None req-e5513ee2-6d9e-4ef6-b3a5-69d04367ec0c tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: 4458efe7-18d4-4cfb-b131-e09d36124d68] Aborting claim: {{(pid=67899) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1250.232033] env[67899]: DEBUG oslo_concurrency.lockutils [None req-e5513ee2-6d9e-4ef6-b3a5-69d04367ec0c tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1250.232249] env[67899]: DEBUG oslo_concurrency.lockutils [None req-e5513ee2-6d9e-4ef6-b3a5-69d04367ec0c tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1250.255455] env[67899]: DEBUG nova.virt.vmwareapi.images [None req-b344734b-8a62-4885-9e7d-af38d446ab1b tempest-ServersWithSpecificFlavorTestJSON-1910448084 tempest-ServersWithSpecificFlavorTestJSON-1910448084-project-member] [instance: 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee] Downloading image file data c655a05a-4a40-4b3f-b609-3ba8116ad90f to the data store datastore1 {{(pid=67899) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1250.556234] env[67899]: DEBUG oslo_vmware.rw_handles [None req-b344734b-8a62-4885-9e7d-af38d446ab1b tempest-ServersWithSpecificFlavorTestJSON-1910448084 tempest-ServersWithSpecificFlavorTestJSON-1910448084-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/74b3c9a9-66b5-4ca2-8a36-2d56e90de334/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67899) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1250.620388] env[67899]: DEBUG oslo_vmware.rw_handles [None req-b344734b-8a62-4885-9e7d-af38d446ab1b tempest-ServersWithSpecificFlavorTestJSON-1910448084 tempest-ServersWithSpecificFlavorTestJSON-1910448084-project-member] Completed reading data from the image iterator. {{(pid=67899) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1250.620587] env[67899]: DEBUG oslo_vmware.rw_handles [None req-b344734b-8a62-4885-9e7d-af38d446ab1b tempest-ServersWithSpecificFlavorTestJSON-1910448084 tempest-ServersWithSpecificFlavorTestJSON-1910448084-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/74b3c9a9-66b5-4ca2-8a36-2d56e90de334/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67899) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1250.698309] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2d63dcf3-2fa9-4c8c-acd2-ed27c232c4da {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1250.705408] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c2b3f606-3eaa-472f-947a-f19d777d0cdf {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1250.737432] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8e6d5712-39b8-4468-be6d-849697ef33f0 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1250.744315] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-550eccf4-448d-414b-9874-3a9d7f58887d {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1250.757020] env[67899]: DEBUG nova.compute.provider_tree [None req-e5513ee2-6d9e-4ef6-b3a5-69d04367ec0c tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Inventory has not changed in ProviderTree for provider: fffa0b42-f65d-4394-a98c-0df038b9ed4b {{(pid=67899) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1250.766400] env[67899]: DEBUG nova.scheduler.client.report [None req-e5513ee2-6d9e-4ef6-b3a5-69d04367ec0c tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Inventory has not changed for provider fffa0b42-f65d-4394-a98c-0df038b9ed4b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67899) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1250.783301] env[67899]: DEBUG oslo_concurrency.lockutils [None req-e5513ee2-6d9e-4ef6-b3a5-69d04367ec0c tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.551s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1250.783835] env[67899]: ERROR nova.compute.manager [None req-e5513ee2-6d9e-4ef6-b3a5-69d04367ec0c tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: 4458efe7-18d4-4cfb-b131-e09d36124d68] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1250.783835] env[67899]: Faults: ['InvalidArgument'] [ 1250.783835] env[67899]: ERROR nova.compute.manager [instance: 4458efe7-18d4-4cfb-b131-e09d36124d68] Traceback (most recent call last): [ 1250.783835] env[67899]: ERROR nova.compute.manager [instance: 4458efe7-18d4-4cfb-b131-e09d36124d68] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1250.783835] env[67899]: ERROR 
nova.compute.manager [instance: 4458efe7-18d4-4cfb-b131-e09d36124d68] self.driver.spawn(context, instance, image_meta, [ 1250.783835] env[67899]: ERROR nova.compute.manager [instance: 4458efe7-18d4-4cfb-b131-e09d36124d68] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1250.783835] env[67899]: ERROR nova.compute.manager [instance: 4458efe7-18d4-4cfb-b131-e09d36124d68] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1250.783835] env[67899]: ERROR nova.compute.manager [instance: 4458efe7-18d4-4cfb-b131-e09d36124d68] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1250.783835] env[67899]: ERROR nova.compute.manager [instance: 4458efe7-18d4-4cfb-b131-e09d36124d68] self._fetch_image_if_missing(context, vi) [ 1250.783835] env[67899]: ERROR nova.compute.manager [instance: 4458efe7-18d4-4cfb-b131-e09d36124d68] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1250.783835] env[67899]: ERROR nova.compute.manager [instance: 4458efe7-18d4-4cfb-b131-e09d36124d68] image_cache(vi, tmp_image_ds_loc) [ 1250.783835] env[67899]: ERROR nova.compute.manager [instance: 4458efe7-18d4-4cfb-b131-e09d36124d68] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1250.783835] env[67899]: ERROR nova.compute.manager [instance: 4458efe7-18d4-4cfb-b131-e09d36124d68] vm_util.copy_virtual_disk( [ 1250.783835] env[67899]: ERROR nova.compute.manager [instance: 4458efe7-18d4-4cfb-b131-e09d36124d68] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1250.783835] env[67899]: ERROR nova.compute.manager [instance: 4458efe7-18d4-4cfb-b131-e09d36124d68] session._wait_for_task(vmdk_copy_task) [ 1250.783835] env[67899]: ERROR nova.compute.manager [instance: 4458efe7-18d4-4cfb-b131-e09d36124d68] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1250.783835] env[67899]: ERROR nova.compute.manager [instance: 4458efe7-18d4-4cfb-b131-e09d36124d68] return self.wait_for_task(task_ref) [ 1250.783835] env[67899]: ERROR nova.compute.manager [instance: 4458efe7-18d4-4cfb-b131-e09d36124d68] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1250.783835] env[67899]: ERROR nova.compute.manager [instance: 4458efe7-18d4-4cfb-b131-e09d36124d68] return evt.wait() [ 1250.783835] env[67899]: ERROR nova.compute.manager [instance: 4458efe7-18d4-4cfb-b131-e09d36124d68] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1250.783835] env[67899]: ERROR nova.compute.manager [instance: 4458efe7-18d4-4cfb-b131-e09d36124d68] result = hub.switch() [ 1250.783835] env[67899]: ERROR nova.compute.manager [instance: 4458efe7-18d4-4cfb-b131-e09d36124d68] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1250.783835] env[67899]: ERROR nova.compute.manager [instance: 4458efe7-18d4-4cfb-b131-e09d36124d68] return self.greenlet.switch() [ 1250.783835] env[67899]: ERROR nova.compute.manager [instance: 4458efe7-18d4-4cfb-b131-e09d36124d68] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1250.783835] env[67899]: ERROR nova.compute.manager [instance: 4458efe7-18d4-4cfb-b131-e09d36124d68] self.f(*self.args, **self.kw) [ 1250.783835] env[67899]: ERROR nova.compute.manager [instance: 4458efe7-18d4-4cfb-b131-e09d36124d68] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1250.783835] env[67899]: ERROR nova.compute.manager [instance: 4458efe7-18d4-4cfb-b131-e09d36124d68] raise exceptions.translate_fault(task_info.error) [ 1250.783835] env[67899]: ERROR nova.compute.manager [instance: 4458efe7-18d4-4cfb-b131-e09d36124d68] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1250.783835] env[67899]: ERROR nova.compute.manager [instance: 4458efe7-18d4-4cfb-b131-e09d36124d68] Faults: ['InvalidArgument'] [ 1250.783835] env[67899]: ERROR nova.compute.manager [instance: 4458efe7-18d4-4cfb-b131-e09d36124d68] [ 1250.784726] env[67899]: DEBUG nova.compute.utils [None req-e5513ee2-6d9e-4ef6-b3a5-69d04367ec0c tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: 4458efe7-18d4-4cfb-b131-e09d36124d68] VimFaultException {{(pid=67899) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1250.786043] env[67899]: DEBUG nova.compute.manager [None req-e5513ee2-6d9e-4ef6-b3a5-69d04367ec0c tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: 4458efe7-18d4-4cfb-b131-e09d36124d68] Build of instance 4458efe7-18d4-4cfb-b131-e09d36124d68 was re-scheduled: A specified parameter was not correct: fileType [ 1250.786043] env[67899]: Faults: ['InvalidArgument'] {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1250.786465] env[67899]: DEBUG nova.compute.manager [None req-e5513ee2-6d9e-4ef6-b3a5-69d04367ec0c tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: 4458efe7-18d4-4cfb-b131-e09d36124d68] Unplugging VIFs for instance {{(pid=67899) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1250.786608] env[67899]: DEBUG nova.compute.manager [None req-e5513ee2-6d9e-4ef6-b3a5-69d04367ec0c tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67899) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1250.786784] env[67899]: DEBUG nova.compute.manager [None req-e5513ee2-6d9e-4ef6-b3a5-69d04367ec0c tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: 4458efe7-18d4-4cfb-b131-e09d36124d68] Deallocating network for instance {{(pid=67899) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1250.786948] env[67899]: DEBUG nova.network.neutron [None req-e5513ee2-6d9e-4ef6-b3a5-69d04367ec0c tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: 4458efe7-18d4-4cfb-b131-e09d36124d68] deallocate_for_instance() {{(pid=67899) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1251.149141] env[67899]: DEBUG nova.network.neutron [None req-e5513ee2-6d9e-4ef6-b3a5-69d04367ec0c tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: 4458efe7-18d4-4cfb-b131-e09d36124d68] Updating instance_info_cache with network_info: [] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1251.162173] env[67899]: INFO nova.compute.manager [None req-e5513ee2-6d9e-4ef6-b3a5-69d04367ec0c tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: 4458efe7-18d4-4cfb-b131-e09d36124d68] Took 0.37 seconds to deallocate network for instance. [ 1251.257084] env[67899]: INFO nova.scheduler.client.report [None req-e5513ee2-6d9e-4ef6-b3a5-69d04367ec0c tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Deleted allocations for instance 4458efe7-18d4-4cfb-b131-e09d36124d68 [ 1251.282135] env[67899]: DEBUG oslo_concurrency.lockutils [None req-e5513ee2-6d9e-4ef6-b3a5-69d04367ec0c tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Lock "4458efe7-18d4-4cfb-b131-e09d36124d68" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 605.724s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1251.283119] env[67899]: DEBUG oslo_concurrency.lockutils [None req-e1a745fb-fb91-4883-a69a-2732d312f7cd tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Lock "4458efe7-18d4-4cfb-b131-e09d36124d68" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 406.814s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1251.283355] env[67899]: DEBUG oslo_concurrency.lockutils [None req-e1a745fb-fb91-4883-a69a-2732d312f7cd tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Acquiring lock "4458efe7-18d4-4cfb-b131-e09d36124d68-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1251.283559] env[67899]: DEBUG oslo_concurrency.lockutils [None req-e1a745fb-fb91-4883-a69a-2732d312f7cd tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Lock "4458efe7-18d4-4cfb-b131-e09d36124d68-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67899) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1251.283727] env[67899]: DEBUG oslo_concurrency.lockutils [None req-e1a745fb-fb91-4883-a69a-2732d312f7cd tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Lock "4458efe7-18d4-4cfb-b131-e09d36124d68-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1251.285751] env[67899]: INFO nova.compute.manager [None req-e1a745fb-fb91-4883-a69a-2732d312f7cd tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: 4458efe7-18d4-4cfb-b131-e09d36124d68] Terminating instance [ 1251.287431] env[67899]: DEBUG nova.compute.manager [None req-e1a745fb-fb91-4883-a69a-2732d312f7cd tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: 4458efe7-18d4-4cfb-b131-e09d36124d68] Start destroying the instance on the hypervisor. {{(pid=67899) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1251.287628] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-e1a745fb-fb91-4883-a69a-2732d312f7cd tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: 4458efe7-18d4-4cfb-b131-e09d36124d68] Destroying instance {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1251.288112] env[67899]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-9ebbe249-b913-4ce1-a28b-154087495385 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1251.297088] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3c0fc38c-69dd-458e-85a6-0089fc7b2e2d {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1251.307735] env[67899]: DEBUG nova.compute.manager [None req-d5b8e164-f48b-45b4-a730-66d8dd108eb7 tempest-ServerShowV247Test-219304978 tempest-ServerShowV247Test-219304978-project-member] [instance: b9df90e1-da9a-47c3-8920-84f20ef5c588] Starting instance... {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1251.328356] env[67899]: WARNING nova.virt.vmwareapi.vmops [None req-e1a745fb-fb91-4883-a69a-2732d312f7cd tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: 4458efe7-18d4-4cfb-b131-e09d36124d68] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 4458efe7-18d4-4cfb-b131-e09d36124d68 could not be found. [ 1251.328553] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-e1a745fb-fb91-4883-a69a-2732d312f7cd tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: 4458efe7-18d4-4cfb-b131-e09d36124d68] Instance destroyed {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1251.328731] env[67899]: INFO nova.compute.manager [None req-e1a745fb-fb91-4883-a69a-2732d312f7cd tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: 4458efe7-18d4-4cfb-b131-e09d36124d68] Took 0.04 seconds to destroy the instance on the hypervisor.
[ 1251.328975] env[67899]: DEBUG oslo.service.loopingcall [None req-e1a745fb-fb91-4883-a69a-2732d312f7cd tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=67899) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1251.329219] env[67899]: DEBUG nova.compute.manager [-] [instance: 4458efe7-18d4-4cfb-b131-e09d36124d68] Deallocating network for instance {{(pid=67899) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1251.329316] env[67899]: DEBUG nova.network.neutron [-] [instance: 4458efe7-18d4-4cfb-b131-e09d36124d68] deallocate_for_instance() {{(pid=67899) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1251.331535] env[67899]: DEBUG nova.compute.manager [None req-d5b8e164-f48b-45b4-a730-66d8dd108eb7 tempest-ServerShowV247Test-219304978 tempest-ServerShowV247Test-219304978-project-member] [instance: b9df90e1-da9a-47c3-8920-84f20ef5c588] Instance disappeared before build. {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1251.351928] env[67899]: DEBUG oslo_concurrency.lockutils [None req-d5b8e164-f48b-45b4-a730-66d8dd108eb7 tempest-ServerShowV247Test-219304978 tempest-ServerShowV247Test-219304978-project-member] Lock "b9df90e1-da9a-47c3-8920-84f20ef5c588" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 238.355s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1251.353254] env[67899]: DEBUG nova.network.neutron [-] [instance: 4458efe7-18d4-4cfb-b131-e09d36124d68] Updating instance_info_cache with network_info: [] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1251.360962] env[67899]: INFO nova.compute.manager [-] [instance: 4458efe7-18d4-4cfb-b131-e09d36124d68] Took 0.03 seconds to deallocate network for instance. [ 1251.361270] env[67899]: DEBUG nova.compute.manager [None req-faf2a278-e187-4285-9418-771dae793d05 tempest-ServersTestMultiNic-1718995190 tempest-ServersTestMultiNic-1718995190-project-member] [instance: 9b4ec0f5-35d7-4ba9-bc46-47cd2a73219c] Starting instance... {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1251.381895] env[67899]: DEBUG nova.compute.manager [None req-faf2a278-e187-4285-9418-771dae793d05 tempest-ServersTestMultiNic-1718995190 tempest-ServersTestMultiNic-1718995190-project-member] [instance: 9b4ec0f5-35d7-4ba9-bc46-47cd2a73219c] Instance disappeared before build.
{{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1251.401754] env[67899]: DEBUG oslo_concurrency.lockutils [None req-faf2a278-e187-4285-9418-771dae793d05 tempest-ServersTestMultiNic-1718995190 tempest-ServersTestMultiNic-1718995190-project-member] Lock "9b4ec0f5-35d7-4ba9-bc46-47cd2a73219c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 231.471s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1251.413225] env[67899]: DEBUG nova.compute.manager [None req-9561e73b-0abb-4ff8-958b-bc925c7916af tempest-MigrationsAdminTest-102068815 tempest-MigrationsAdminTest-102068815-project-member] [instance: ce0c59ed-7bb2-49cc-a158-dda0da4f88cf] Starting instance... {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1251.447835] env[67899]: DEBUG nova.compute.manager [None req-9561e73b-0abb-4ff8-958b-bc925c7916af tempest-MigrationsAdminTest-102068815 tempest-MigrationsAdminTest-102068815-project-member] [instance: ce0c59ed-7bb2-49cc-a158-dda0da4f88cf] Instance disappeared before build. {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1251.466887] env[67899]: DEBUG oslo_concurrency.lockutils [None req-e1a745fb-fb91-4883-a69a-2732d312f7cd tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Lock "4458efe7-18d4-4cfb-b131-e09d36124d68" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.184s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1251.468433] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "4458efe7-18d4-4cfb-b131-e09d36124d68" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 76.449s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1251.468623] env[67899]: INFO nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 4458efe7-18d4-4cfb-b131-e09d36124d68] During sync_power_state the instance has a pending task (deleting). Skip. [ 1251.468696] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "4458efe7-18d4-4cfb-b131-e09d36124d68" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1251.476231] env[67899]: DEBUG oslo_concurrency.lockutils [None req-9561e73b-0abb-4ff8-958b-bc925c7916af tempest-MigrationsAdminTest-102068815 tempest-MigrationsAdminTest-102068815-project-member] Lock "ce0c59ed-7bb2-49cc-a158-dda0da4f88cf" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 224.383s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1251.484913] env[67899]: DEBUG nova.compute.manager [None req-06314e14-3ed8-4bad-9823-95f7a4342101 tempest-SecurityGroupsTestJSON-311274219 tempest-SecurityGroupsTestJSON-311274219-project-member] [instance: db21b229-2664-4947-96c8-c1e92f97917e] Starting instance...
{{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1251.506409] env[67899]: DEBUG nova.compute.manager [None req-06314e14-3ed8-4bad-9823-95f7a4342101 tempest-SecurityGroupsTestJSON-311274219 tempest-SecurityGroupsTestJSON-311274219-project-member] [instance: db21b229-2664-4947-96c8-c1e92f97917e] Instance disappeared before build. {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1251.528051] env[67899]: DEBUG oslo_concurrency.lockutils [None req-06314e14-3ed8-4bad-9823-95f7a4342101 tempest-SecurityGroupsTestJSON-311274219 tempest-SecurityGroupsTestJSON-311274219-project-member] Lock "db21b229-2664-4947-96c8-c1e92f97917e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 218.833s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1251.536311] env[67899]: DEBUG nova.compute.manager [None req-90e84af9-2c10-4820-98ac-ee806bc146c3 tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: 94ebdda8-5b9c-4ffa-be45-571ec9ba9f81] Starting instance... {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1251.560050] env[67899]: DEBUG nova.compute.manager [None req-90e84af9-2c10-4820-98ac-ee806bc146c3 tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: 94ebdda8-5b9c-4ffa-be45-571ec9ba9f81] Instance disappeared before build. {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1251.581048] env[67899]: DEBUG oslo_concurrency.lockutils [None req-90e84af9-2c10-4820-98ac-ee806bc146c3 tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Lock "94ebdda8-5b9c-4ffa-be45-571ec9ba9f81" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 209.665s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1251.589477] env[67899]: DEBUG nova.compute.manager [None req-ded5771e-a4f5-4a81-924b-bdb96277cb6f tempest-DeleteServersAdminTestJSON-329788213 tempest-DeleteServersAdminTestJSON-329788213-project-member] [instance: 928c018d-ec75-42c6-8e55-e38bb5947bcf] Starting instance... {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1251.614881] env[67899]: DEBUG nova.compute.manager [None req-ded5771e-a4f5-4a81-924b-bdb96277cb6f tempest-DeleteServersAdminTestJSON-329788213 tempest-DeleteServersAdminTestJSON-329788213-project-member] [instance: 928c018d-ec75-42c6-8e55-e38bb5947bcf] Instance disappeared before build.
{{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1251.634880] env[67899]: DEBUG oslo_concurrency.lockutils [None req-ded5771e-a4f5-4a81-924b-bdb96277cb6f tempest-DeleteServersAdminTestJSON-329788213 tempest-DeleteServersAdminTestJSON-329788213-project-member] Lock "928c018d-ec75-42c6-8e55-e38bb5947bcf" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 206.612s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1251.643905] env[67899]: DEBUG nova.compute.manager [None req-1f3aed5d-5130-4849-8586-e54f4f6d3927 tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] [instance: 04bee4b3-88b9-4f8c-b5d7-3955a158a2d5] Starting instance... {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1251.667780] env[67899]: DEBUG nova.compute.manager [None req-1f3aed5d-5130-4849-8586-e54f4f6d3927 tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] [instance: 04bee4b3-88b9-4f8c-b5d7-3955a158a2d5] Instance disappeared before build. {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1251.688858] env[67899]: DEBUG oslo_concurrency.lockutils [None req-1f3aed5d-5130-4849-8586-e54f4f6d3927 tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] Lock "04bee4b3-88b9-4f8c-b5d7-3955a158a2d5" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 203.768s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1251.697428] env[67899]: DEBUG nova.compute.manager [None req-bf08a2ad-94a1-4a30-a63d-7b81c98afb6a tempest-ServerActionsTestOtherA-1954250680 tempest-ServerActionsTestOtherA-1954250680-project-member] [instance: a43ea307-5b84-4c8c-9f28-255980bfd51a] Starting instance... {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1251.724088] env[67899]: DEBUG nova.compute.manager [None req-bf08a2ad-94a1-4a30-a63d-7b81c98afb6a tempest-ServerActionsTestOtherA-1954250680 tempest-ServerActionsTestOtherA-1954250680-project-member] [instance: a43ea307-5b84-4c8c-9f28-255980bfd51a] Instance disappeared before build. {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1251.745192] env[67899]: DEBUG oslo_concurrency.lockutils [None req-bf08a2ad-94a1-4a30-a63d-7b81c98afb6a tempest-ServerActionsTestOtherA-1954250680 tempest-ServerActionsTestOtherA-1954250680-project-member] Lock "a43ea307-5b84-4c8c-9f28-255980bfd51a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 199.600s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1251.754119] env[67899]: DEBUG nova.compute.manager [None req-b4663fb3-1872-4a4f-8f96-67cc37309cba tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 6fda2654-4579-4b9a-a97c-97e0128fff14] Starting instance...
{{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1251.805810] env[67899]: DEBUG oslo_concurrency.lockutils [None req-b4663fb3-1872-4a4f-8f96-67cc37309cba tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1251.806127] env[67899]: DEBUG oslo_concurrency.lockutils [None req-b4663fb3-1872-4a4f-8f96-67cc37309cba tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1251.807702] env[67899]: INFO nova.compute.claims [None req-b4663fb3-1872-4a4f-8f96-67cc37309cba tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 6fda2654-4579-4b9a-a97c-97e0128fff14] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1252.134530] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-412cc54b-da78-4cc8-a420-0e57c581d0b3 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1252.143455] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8e12d246-3328-4afb-a075-ca02d6e80995 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1252.175449] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6cec2b21-09b2-47b6-a58f-8177acb875a5 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1252.182844] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-be8ded1d-8772-45ba-a2c5-d133cd17b4f0 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1252.195958] env[67899]: DEBUG nova.compute.provider_tree [None req-b4663fb3-1872-4a4f-8f96-67cc37309cba tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Inventory has not changed in ProviderTree for provider: fffa0b42-f65d-4394-a98c-0df038b9ed4b {{(pid=67899) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1252.207263] env[67899]: DEBUG nova.scheduler.client.report [None req-b4663fb3-1872-4a4f-8f96-67cc37309cba tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Inventory has not changed for provider fffa0b42-f65d-4394-a98c-0df038b9ed4b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67899) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1252.220747] env[67899]: DEBUG oslo_concurrency.lockutils [None req-b4663fb3-1872-4a4f-8f96-67cc37309cba tempest-ImagesTestJSON-610999726 
tempest-ImagesTestJSON-610999726-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.415s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1252.221435] env[67899]: DEBUG nova.compute.manager [None req-b4663fb3-1872-4a4f-8f96-67cc37309cba tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 6fda2654-4579-4b9a-a97c-97e0128fff14] Start building networks asynchronously for instance. {{(pid=67899) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1252.254057] env[67899]: DEBUG nova.compute.utils [None req-b4663fb3-1872-4a4f-8f96-67cc37309cba tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Using /dev/sd instead of None {{(pid=67899) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1252.256709] env[67899]: DEBUG nova.compute.manager [None req-b4663fb3-1872-4a4f-8f96-67cc37309cba tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 6fda2654-4579-4b9a-a97c-97e0128fff14] Allocating IP information in the background. {{(pid=67899) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1252.256951] env[67899]: DEBUG nova.network.neutron [None req-b4663fb3-1872-4a4f-8f96-67cc37309cba tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 6fda2654-4579-4b9a-a97c-97e0128fff14] allocate_for_instance() {{(pid=67899) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1252.265211] env[67899]: DEBUG nova.compute.manager [None req-b4663fb3-1872-4a4f-8f96-67cc37309cba tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 6fda2654-4579-4b9a-a97c-97e0128fff14] Start building block device mappings for instance. {{(pid=67899) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1252.313457] env[67899]: DEBUG nova.policy [None req-b4663fb3-1872-4a4f-8f96-67cc37309cba tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '8f6e68af5f7147f9a8080d720a834a56', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'a6ddbe6f15c6436197b1b073170d78cf', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67899) authorize /opt/stack/nova/nova/policy.py:203}} [ 1252.334330] env[67899]: DEBUG nova.compute.manager [None req-b4663fb3-1872-4a4f-8f96-67cc37309cba tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 6fda2654-4579-4b9a-a97c-97e0128fff14] Start spawning the instance on the hypervisor. 
{{(pid=67899) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1252.359781] env[67899]: DEBUG nova.virt.hardware [None req-b4663fb3-1872-4a4f-8f96-67cc37309cba tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T22:07:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T22:07:13Z,direct_url=<?>,disk_format='vmdk',id=c655a05a-4a40-4b3f-b609-3ba8116ad90f,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c1aaa2970e964d7b86557399120d12c1',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2025-04-16T22:07:14Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1252.360061] env[67899]: DEBUG nova.virt.hardware [None req-b4663fb3-1872-4a4f-8f96-67cc37309cba tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Flavor limits 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1252.360209] env[67899]: DEBUG nova.virt.hardware [None req-b4663fb3-1872-4a4f-8f96-67cc37309cba tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Image limits 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1252.360387] env[67899]: DEBUG nova.virt.hardware [None req-b4663fb3-1872-4a4f-8f96-67cc37309cba tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Flavor pref 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1252.360559] env[67899]: DEBUG nova.virt.hardware [None req-b4663fb3-1872-4a4f-8f96-67cc37309cba tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Image pref 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1252.360675] env[67899]: DEBUG nova.virt.hardware [None req-b4663fb3-1872-4a4f-8f96-67cc37309cba tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1252.360876] env[67899]: DEBUG nova.virt.hardware [None req-b4663fb3-1872-4a4f-8f96-67cc37309cba tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1252.361043] env[67899]: DEBUG nova.virt.hardware [None req-b4663fb3-1872-4a4f-8f96-67cc37309cba tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67899) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1252.361214] env[67899]: DEBUG nova.virt.hardware [None req-b4663fb3-1872-4a4f-8f96-67cc37309cba tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Got 1
possible topologies {{(pid=67899) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1252.361371] env[67899]: DEBUG nova.virt.hardware [None req-b4663fb3-1872-4a4f-8f96-67cc37309cba tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1252.361537] env[67899]: DEBUG nova.virt.hardware [None req-b4663fb3-1872-4a4f-8f96-67cc37309cba tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1252.362499] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8164e7d4-f1ad-4252-afde-e49d83389d37 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1252.370654] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b852d6a3-42a7-41bc-b2ee-d5adfc3ab7d8 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1252.631722] env[67899]: DEBUG nova.network.neutron [None req-b4663fb3-1872-4a4f-8f96-67cc37309cba tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 6fda2654-4579-4b9a-a97c-97e0128fff14] Successfully created port: 6fe8fb78-a839-49ba-81dd-3db937c90b2c {{(pid=67899) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1253.296405] env[67899]: DEBUG nova.compute.manager [req-5c09da8a-81ef-4977-ba05-d70c484581d4 req-b228578d-0819-498d-8429-3d119cd3a241 service nova] [instance: 6fda2654-4579-4b9a-a97c-97e0128fff14] Received event network-vif-plugged-6fe8fb78-a839-49ba-81dd-3db937c90b2c {{(pid=67899) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1253.296840] env[67899]: DEBUG oslo_concurrency.lockutils [req-5c09da8a-81ef-4977-ba05-d70c484581d4 req-b228578d-0819-498d-8429-3d119cd3a241 service nova] Acquiring lock "6fda2654-4579-4b9a-a97c-97e0128fff14-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1253.296990] env[67899]: DEBUG oslo_concurrency.lockutils [req-5c09da8a-81ef-4977-ba05-d70c484581d4 req-b228578d-0819-498d-8429-3d119cd3a241 service nova] Lock "6fda2654-4579-4b9a-a97c-97e0128fff14-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1253.297250] env[67899]: DEBUG oslo_concurrency.lockutils [req-5c09da8a-81ef-4977-ba05-d70c484581d4 req-b228578d-0819-498d-8429-3d119cd3a241 service nova] Lock "6fda2654-4579-4b9a-a97c-97e0128fff14-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1253.297450] env[67899]: DEBUG nova.compute.manager [req-5c09da8a-81ef-4977-ba05-d70c484581d4 req-b228578d-0819-498d-8429-3d119cd3a241 service nova] [instance: 6fda2654-4579-4b9a-a97c-97e0128fff14] No waiting events found dispatching
network-vif-plugged-6fe8fb78-a839-49ba-81dd-3db937c90b2c {{(pid=67899) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1253.297674] env[67899]: WARNING nova.compute.manager [req-5c09da8a-81ef-4977-ba05-d70c484581d4 req-b228578d-0819-498d-8429-3d119cd3a241 service nova] [instance: 6fda2654-4579-4b9a-a97c-97e0128fff14] Received unexpected event network-vif-plugged-6fe8fb78-a839-49ba-81dd-3db937c90b2c for instance with vm_state building and task_state spawning. [ 1253.367213] env[67899]: DEBUG nova.network.neutron [None req-b4663fb3-1872-4a4f-8f96-67cc37309cba tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 6fda2654-4579-4b9a-a97c-97e0128fff14] Successfully updated port: 6fe8fb78-a839-49ba-81dd-3db937c90b2c {{(pid=67899) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1253.380566] env[67899]: DEBUG oslo_concurrency.lockutils [None req-b4663fb3-1872-4a4f-8f96-67cc37309cba tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Acquiring lock "refresh_cache-6fda2654-4579-4b9a-a97c-97e0128fff14" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1253.380891] env[67899]: DEBUG oslo_concurrency.lockutils [None req-b4663fb3-1872-4a4f-8f96-67cc37309cba tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Acquired lock "refresh_cache-6fda2654-4579-4b9a-a97c-97e0128fff14" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1253.380891] env[67899]: DEBUG nova.network.neutron [None req-b4663fb3-1872-4a4f-8f96-67cc37309cba tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 6fda2654-4579-4b9a-a97c-97e0128fff14] Building network info cache for instance {{(pid=67899) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1253.435623] env[67899]: DEBUG nova.network.neutron [None req-b4663fb3-1872-4a4f-8f96-67cc37309cba tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 6fda2654-4579-4b9a-a97c-97e0128fff14] Instance cache missing network info. 
{{(pid=67899) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1253.853294] env[67899]: DEBUG nova.network.neutron [None req-b4663fb3-1872-4a4f-8f96-67cc37309cba tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 6fda2654-4579-4b9a-a97c-97e0128fff14] Updating instance_info_cache with network_info: [{"id": "6fe8fb78-a839-49ba-81dd-3db937c90b2c", "address": "fa:16:3e:6b:0e:e6", "network": {"id": "857be8e0-b3fa-4836-87d8-37b0af1d0354", "bridge": "br-int", "label": "tempest-ImagesTestJSON-566779850-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "a6ddbe6f15c6436197b1b073170d78cf", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "19598cc1-e105-4565-906a-09dde75e3fbe", "external-id": "nsx-vlan-transportzone-371", "segmentation_id": 371, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap6fe8fb78-a8", "ovs_interfaceid": "6fe8fb78-a839-49ba-81dd-3db937c90b2c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1253.871979] env[67899]: DEBUG oslo_concurrency.lockutils [None req-b4663fb3-1872-4a4f-8f96-67cc37309cba tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Releasing lock "refresh_cache-6fda2654-4579-4b9a-a97c-97e0128fff14" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1253.871979] env[67899]: DEBUG nova.compute.manager [None req-b4663fb3-1872-4a4f-8f96-67cc37309cba tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 6fda2654-4579-4b9a-a97c-97e0128fff14] Instance network_info: |[{"id": "6fe8fb78-a839-49ba-81dd-3db937c90b2c", "address": "fa:16:3e:6b:0e:e6", "network": {"id": "857be8e0-b3fa-4836-87d8-37b0af1d0354", "bridge": "br-int", "label": "tempest-ImagesTestJSON-566779850-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "a6ddbe6f15c6436197b1b073170d78cf", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "19598cc1-e105-4565-906a-09dde75e3fbe", "external-id": "nsx-vlan-transportzone-371", "segmentation_id": 371, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap6fe8fb78-a8", "ovs_interfaceid": "6fe8fb78-a839-49ba-81dd-3db937c90b2c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67899) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1253.871979] env[67899]: DEBUG 
nova.virt.vmwareapi.vmops [None req-b4663fb3-1872-4a4f-8f96-67cc37309cba tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 6fda2654-4579-4b9a-a97c-97e0128fff14] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:6b:0e:e6', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '19598cc1-e105-4565-906a-09dde75e3fbe', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '6fe8fb78-a839-49ba-81dd-3db937c90b2c', 'vif_model': 'vmxnet3'}] {{(pid=67899) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1253.879098] env[67899]: DEBUG oslo.service.loopingcall [None req-b4663fb3-1872-4a4f-8f96-67cc37309cba tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67899) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1253.880031] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 6fda2654-4579-4b9a-a97c-97e0128fff14] Creating VM on the ESX host {{(pid=67899) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1253.880031] env[67899]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-a0ee8add-46ab-4247-a64d-674ed79f5aed {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1253.901878] env[67899]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1253.901878] env[67899]: value = "task-3467939" [ 1253.901878] env[67899]: _type = "Task" [ 1253.901878] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1253.910222] env[67899]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467939, 'name': CreateVM_Task} progress is 0%. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1254.411945] env[67899]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467939, 'name': CreateVM_Task, 'duration_secs': 0.307806} completed successfully. 
{{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1254.412255] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 6fda2654-4579-4b9a-a97c-97e0128fff14] Created VM on the ESX host {{(pid=67899) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1254.420693] env[67899]: DEBUG oslo_concurrency.lockutils [None req-b4663fb3-1872-4a4f-8f96-67cc37309cba tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1254.420889] env[67899]: DEBUG oslo_concurrency.lockutils [None req-b4663fb3-1872-4a4f-8f96-67cc37309cba tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1254.421300] env[67899]: DEBUG oslo_concurrency.lockutils [None req-b4663fb3-1872-4a4f-8f96-67cc37309cba tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1254.421842] env[67899]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-3609a727-fe60-40a7-96de-ddafc2e319e8 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1254.426575] env[67899]: DEBUG oslo_vmware.api [None req-b4663fb3-1872-4a4f-8f96-67cc37309cba tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Waiting for the task: (returnval){ [ 1254.426575] env[67899]: value = "session[5298859c-589e-10f3-4f70-bf47a7ea371b]52dacfac-63b9-c13b-fdb9-b942c0c4285f" [ 1254.426575] env[67899]: _type = "Task" [ 1254.426575] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1254.438632] env[67899]: DEBUG oslo_vmware.api [None req-b4663fb3-1872-4a4f-8f96-67cc37309cba tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Task: {'id': session[5298859c-589e-10f3-4f70-bf47a7ea371b]52dacfac-63b9-c13b-fdb9-b942c0c4285f, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1254.936584] env[67899]: DEBUG oslo_concurrency.lockutils [None req-b4663fb3-1872-4a4f-8f96-67cc37309cba tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1254.936818] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-b4663fb3-1872-4a4f-8f96-67cc37309cba tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 6fda2654-4579-4b9a-a97c-97e0128fff14] Processing image c655a05a-4a40-4b3f-b609-3ba8116ad90f {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1254.937917] env[67899]: DEBUG oslo_concurrency.lockutils [None req-b4663fb3-1872-4a4f-8f96-67cc37309cba tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1255.349418] env[67899]: DEBUG nova.compute.manager [req-de0affd0-1f00-4416-8163-273cbd5904ec req-adb7aa34-9734-43c3-9fe8-e653dbc716b7 service nova] [instance: 6fda2654-4579-4b9a-a97c-97e0128fff14] Received event network-changed-6fe8fb78-a839-49ba-81dd-3db937c90b2c {{(pid=67899) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1255.349615] env[67899]: DEBUG nova.compute.manager [req-de0affd0-1f00-4416-8163-273cbd5904ec req-adb7aa34-9734-43c3-9fe8-e653dbc716b7 service nova] [instance: 6fda2654-4579-4b9a-a97c-97e0128fff14] Refreshing instance network info cache due to event network-changed-6fe8fb78-a839-49ba-81dd-3db937c90b2c. {{(pid=67899) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1255.349828] env[67899]: DEBUG oslo_concurrency.lockutils [req-de0affd0-1f00-4416-8163-273cbd5904ec req-adb7aa34-9734-43c3-9fe8-e653dbc716b7 service nova] Acquiring lock "refresh_cache-6fda2654-4579-4b9a-a97c-97e0128fff14" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1255.349967] env[67899]: DEBUG oslo_concurrency.lockutils [req-de0affd0-1f00-4416-8163-273cbd5904ec req-adb7aa34-9734-43c3-9fe8-e653dbc716b7 service nova] Acquired lock "refresh_cache-6fda2654-4579-4b9a-a97c-97e0128fff14" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1255.350141] env[67899]: DEBUG nova.network.neutron [req-de0affd0-1f00-4416-8163-273cbd5904ec req-adb7aa34-9734-43c3-9fe8-e653dbc716b7 service nova] [instance: 6fda2654-4579-4b9a-a97c-97e0128fff14] Refreshing network info cache for port 6fe8fb78-a839-49ba-81dd-3db937c90b2c {{(pid=67899) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1255.654082] env[67899]: DEBUG nova.network.neutron [req-de0affd0-1f00-4416-8163-273cbd5904ec req-adb7aa34-9734-43c3-9fe8-e653dbc716b7 service nova] [instance: 6fda2654-4579-4b9a-a97c-97e0128fff14] Updated VIF entry in instance network info cache for port 6fe8fb78-a839-49ba-81dd-3db937c90b2c. 
{{(pid=67899) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1255.654513] env[67899]: DEBUG nova.network.neutron [req-de0affd0-1f00-4416-8163-273cbd5904ec req-adb7aa34-9734-43c3-9fe8-e653dbc716b7 service nova] [instance: 6fda2654-4579-4b9a-a97c-97e0128fff14] Updating instance_info_cache with network_info: [{"id": "6fe8fb78-a839-49ba-81dd-3db937c90b2c", "address": "fa:16:3e:6b:0e:e6", "network": {"id": "857be8e0-b3fa-4836-87d8-37b0af1d0354", "bridge": "br-int", "label": "tempest-ImagesTestJSON-566779850-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "a6ddbe6f15c6436197b1b073170d78cf", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "19598cc1-e105-4565-906a-09dde75e3fbe", "external-id": "nsx-vlan-transportzone-371", "segmentation_id": 371, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap6fe8fb78-a8", "ovs_interfaceid": "6fe8fb78-a839-49ba-81dd-3db937c90b2c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1255.663898] env[67899]: DEBUG oslo_concurrency.lockutils [req-de0affd0-1f00-4416-8163-273cbd5904ec req-adb7aa34-9734-43c3-9fe8-e653dbc716b7 service nova] Releasing lock "refresh_cache-6fda2654-4579-4b9a-a97c-97e0128fff14" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1257.456781] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3abc834b-6c2f-4d2e-a4b5-b5e436cdb5dc tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Acquiring lock "6fda2654-4579-4b9a-a97c-97e0128fff14" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1259.948488] env[67899]: DEBUG oslo_concurrency.lockutils [None req-0ad992bc-0784-48e6-aae6-156834e2439f tempest-ListImageFiltersTestJSON-1482543128 tempest-ListImageFiltersTestJSON-1482543128-project-member] Acquiring lock "3a077713-f7a2-4a61-bb17-987af6a52c4a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1259.948831] env[67899]: DEBUG oslo_concurrency.lockutils [None req-0ad992bc-0784-48e6-aae6-156834e2439f tempest-ListImageFiltersTestJSON-1482543128 tempest-ListImageFiltersTestJSON-1482543128-project-member] Lock "3a077713-f7a2-4a61-bb17-987af6a52c4a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1260.158481] env[67899]: DEBUG oslo_concurrency.lockutils [None req-67bbdfd5-5385-4e85-b8cb-1f97b40f5bad tempest-ListImageFiltersTestJSON-1482543128 tempest-ListImageFiltersTestJSON-1482543128-project-member] Acquiring lock
"53a54716-a3cd-4234-977d-0c82370025d1" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1260.159210] env[67899]: DEBUG oslo_concurrency.lockutils [None req-67bbdfd5-5385-4e85-b8cb-1f97b40f5bad tempest-ListImageFiltersTestJSON-1482543128 tempest-ListImageFiltersTestJSON-1482543128-project-member] Lock "53a54716-a3cd-4234-977d-0c82370025d1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1272.634284] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3149d882-2ab5-4849-a2dc-744547851dfa tempest-ServerActionsTestOtherB-1017232890 tempest-ServerActionsTestOtherB-1017232890-project-member] Acquiring lock "5d8b3009-ba5e-4f29-81fc-6d389ec30808" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1272.634596] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3149d882-2ab5-4849-a2dc-744547851dfa tempest-ServerActionsTestOtherB-1017232890 tempest-ServerActionsTestOtherB-1017232890-project-member] Lock "5d8b3009-ba5e-4f29-81fc-6d389ec30808" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1280.686140] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1280.996812] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1280.997062] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Starting heal instance info cache {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1280.997206] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Rebuilding the list of instances to heal {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1281.022778] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1281.022943] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 8d2a9e20-82d3-44cf-a725-59804debe1cc] Skipping network cache update for instance because it is Building. 
{{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1281.023087] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1281.023216] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 37ab08db-50ab-4c30-9e18-05007c5d1c27] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1281.023340] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1281.023459] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1281.023576] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 6c4977f7-c53d-4c96-9028-86d7561f0d0d] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1281.023693] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: ec826735-4cc4-4847-8750-c5480e62134a] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1281.023811] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: c7ad553b-2149-4211-aee3-057ea83069f5] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1281.023925] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 6fda2654-4579-4b9a-a97c-97e0128fff14] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1281.024054] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Didn't find any instances for network info cache update. 
{{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1281.024505] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1281.996794] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1281.997167] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1282.991815] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1283.012835] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1283.996285] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1283.996462] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=67899) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1285.996594] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1286.435975] env[67899]: DEBUG oslo_concurrency.lockutils [None req-d9ba9f59-d540-457f-98ff-d4f3cc4ec481 tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] Acquiring lock "a292a68e-deff-465b-81f0-727e75c2e212" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1286.436275] env[67899]: DEBUG oslo_concurrency.lockutils [None req-d9ba9f59-d540-457f-98ff-d4f3cc4ec481 tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] Lock "a292a68e-deff-465b-81f0-727e75c2e212" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1286.996338] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager.update_available_resource {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1287.008170] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1287.008481] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1287.008658] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1287.008806] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67899) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1287.009913] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-753ee5c9-5f29-4311-a6dc-c2c639deef72 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1287.019226] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-80eebf08-b669-483d-afb7-10ad40cc5dde {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1287.032852] env[67899]: DEBUG
oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2807bfc0-b87c-471d-bf61-7c44d72fbf44 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1287.038913] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1e260488-95cf-4fcb-abb2-d7f16f09d9e5 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1287.067238] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180921MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=67899) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1287.067374] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1287.067558] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1287.146063] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1287.146063] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 8d2a9e20-82d3-44cf-a725-59804debe1cc actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1287.146184] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance bb97988e-9f7f-4e4f-9904-fc560d0912ee actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1287.146315] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 37ab08db-50ab-4c30-9e18-05007c5d1c27 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1287.146494] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1287.146615] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance b9282eeb-09db-4138-a1f0-9e03828021b8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1287.146780] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 6c4977f7-c53d-4c96-9028-86d7561f0d0d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1287.146968] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance ec826735-4cc4-4847-8750-c5480e62134a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1287.147127] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance c7ad553b-2149-4211-aee3-057ea83069f5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1287.147281] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 6fda2654-4579-4b9a-a97c-97e0128fff14 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1287.159664] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 5ead1ba5-49a8-41a8-b984-cb5408683a25 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1287.170463] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 8e66e2d5-aa60-474f-b77f-4a477e2d0f8e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1287.180371] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 7a82e877-8a39-4684-8b75-711b7bedddac has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1287.191572] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance dc7bf2b7-631d-4933-92db-1679ad823379 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1287.200362] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 8a157747-34e2-48f7-bf21-d17810122954 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1287.210010] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 03684169-e2c8-4cf5-8e79-b118725927f1 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1287.218563] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 49c65e6c-9e16-40f5-9754-fe81681f9714 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1287.227021] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance ac2f9cf9-f573-4f21-aeb4-6cea5c94f843 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1287.235804] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 5bb22bfa-4f1f-42a8-a7e3-5e806c70ae45 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1287.244500] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 3a077713-f7a2-4a61-bb17-987af6a52c4a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1287.253636] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 53a54716-a3cd-4234-977d-0c82370025d1 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1287.263643] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 5d8b3009-ba5e-4f29-81fc-6d389ec30808 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1287.273564] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance a292a68e-deff-465b-81f0-727e75c2e212 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1287.273804] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67899) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1287.273948] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67899) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1287.535451] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0e347475-a456-4334-833d-b807223667aa {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1287.543842] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6d0b2425-7397-474a-9495-892a498d2d70 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1287.573905] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0774793d-cfca-432f-a160-397a81ef8801 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1287.581252] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5f7b0a47-96fc-425c-bab4-185d7652dbf8 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1287.594214] env[67899]: DEBUG nova.compute.provider_tree [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Inventory has not changed in ProviderTree for provider: 
fffa0b42-f65d-4394-a98c-0df038b9ed4b {{(pid=67899) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1287.602723] env[67899]: DEBUG nova.scheduler.client.report [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Inventory has not changed for provider fffa0b42-f65d-4394-a98c-0df038b9ed4b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67899) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1287.619147] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67899) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1287.619368] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.552s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1297.078118] env[67899]: WARNING oslo_vmware.rw_handles [None req-b344734b-8a62-4885-9e7d-af38d446ab1b tempest-ServersWithSpecificFlavorTestJSON-1910448084 tempest-ServersWithSpecificFlavorTestJSON-1910448084-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1297.078118] env[67899]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1297.078118] env[67899]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1297.078118] env[67899]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1297.078118] env[67899]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1297.078118] env[67899]: ERROR oslo_vmware.rw_handles response.begin() [ 1297.078118] env[67899]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1297.078118] env[67899]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1297.078118] env[67899]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1297.078118] env[67899]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1297.078118] env[67899]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1297.078118] env[67899]: ERROR oslo_vmware.rw_handles [ 1297.078670] env[67899]: DEBUG nova.virt.vmwareapi.images [None req-b344734b-8a62-4885-9e7d-af38d446ab1b tempest-ServersWithSpecificFlavorTestJSON-1910448084 tempest-ServersWithSpecificFlavorTestJSON-1910448084-project-member] [instance: 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee] Downloaded image file data c655a05a-4a40-4b3f-b609-3ba8116ad90f to vmware_temp/74b3c9a9-66b5-4ca2-8a36-2d56e90de334/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk on the data store datastore1 {{(pid=67899) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1297.080651] 
env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-b344734b-8a62-4885-9e7d-af38d446ab1b tempest-ServersWithSpecificFlavorTestJSON-1910448084 tempest-ServersWithSpecificFlavorTestJSON-1910448084-project-member] [instance: 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee] Caching image {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1297.080914] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [None req-b344734b-8a62-4885-9e7d-af38d446ab1b tempest-ServersWithSpecificFlavorTestJSON-1910448084 tempest-ServersWithSpecificFlavorTestJSON-1910448084-project-member] Copying Virtual Disk [datastore1] vmware_temp/74b3c9a9-66b5-4ca2-8a36-2d56e90de334/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk to [datastore1] vmware_temp/74b3c9a9-66b5-4ca2-8a36-2d56e90de334/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk {{(pid=67899) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1297.081267] env[67899]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-cdd57418-d0cb-4939-a22f-a83f524b0b2f {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1297.089549] env[67899]: DEBUG oslo_vmware.api [None req-b344734b-8a62-4885-9e7d-af38d446ab1b tempest-ServersWithSpecificFlavorTestJSON-1910448084 tempest-ServersWithSpecificFlavorTestJSON-1910448084-project-member] Waiting for the task: (returnval){ [ 1297.089549] env[67899]: value = "task-3467940" [ 1297.089549] env[67899]: _type = "Task" [ 1297.089549] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1297.101548] env[67899]: DEBUG oslo_vmware.api [None req-b344734b-8a62-4885-9e7d-af38d446ab1b tempest-ServersWithSpecificFlavorTestJSON-1910448084 tempest-ServersWithSpecificFlavorTestJSON-1910448084-project-member] Task: {'id': task-3467940, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1297.603591] env[67899]: DEBUG oslo_vmware.exceptions [None req-b344734b-8a62-4885-9e7d-af38d446ab1b tempest-ServersWithSpecificFlavorTestJSON-1910448084 tempest-ServersWithSpecificFlavorTestJSON-1910448084-project-member] Fault InvalidArgument not matched. 
{{(pid=67899) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1297.604066] env[67899]: DEBUG oslo_concurrency.lockutils [None req-b344734b-8a62-4885-9e7d-af38d446ab1b tempest-ServersWithSpecificFlavorTestJSON-1910448084 tempest-ServersWithSpecificFlavorTestJSON-1910448084-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1297.604685] env[67899]: ERROR nova.compute.manager [None req-b344734b-8a62-4885-9e7d-af38d446ab1b tempest-ServersWithSpecificFlavorTestJSON-1910448084 tempest-ServersWithSpecificFlavorTestJSON-1910448084-project-member] [instance: 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1297.604685] env[67899]: Faults: ['InvalidArgument'] [ 1297.604685] env[67899]: ERROR nova.compute.manager [instance: 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee] Traceback (most recent call last): [ 1297.604685] env[67899]: ERROR nova.compute.manager [instance: 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1297.604685] env[67899]: ERROR nova.compute.manager [instance: 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee] yield resources [ 1297.604685] env[67899]: ERROR nova.compute.manager [instance: 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1297.604685] env[67899]: ERROR nova.compute.manager [instance: 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee] self.driver.spawn(context, instance, image_meta, [ 1297.604685] env[67899]: ERROR nova.compute.manager [instance: 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1297.604685] env[67899]: ERROR nova.compute.manager [instance: 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1297.604685] env[67899]: ERROR nova.compute.manager [instance: 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1297.604685] env[67899]: ERROR nova.compute.manager [instance: 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee] self._fetch_image_if_missing(context, vi) [ 1297.604685] env[67899]: ERROR nova.compute.manager [instance: 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1297.604685] env[67899]: ERROR nova.compute.manager [instance: 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee] image_cache(vi, tmp_image_ds_loc) [ 1297.604685] env[67899]: ERROR nova.compute.manager [instance: 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1297.604685] env[67899]: ERROR nova.compute.manager [instance: 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee] vm_util.copy_virtual_disk( [ 1297.604685] env[67899]: ERROR nova.compute.manager [instance: 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1297.604685] env[67899]: ERROR nova.compute.manager [instance: 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee] session._wait_for_task(vmdk_copy_task) [ 1297.604685] env[67899]: ERROR nova.compute.manager [instance: 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee] 
File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1297.604685] env[67899]: ERROR nova.compute.manager [instance: 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee] return self.wait_for_task(task_ref) [ 1297.604685] env[67899]: ERROR nova.compute.manager [instance: 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1297.604685] env[67899]: ERROR nova.compute.manager [instance: 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee] return evt.wait() [ 1297.604685] env[67899]: ERROR nova.compute.manager [instance: 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1297.604685] env[67899]: ERROR nova.compute.manager [instance: 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee] result = hub.switch() [ 1297.604685] env[67899]: ERROR nova.compute.manager [instance: 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1297.604685] env[67899]: ERROR nova.compute.manager [instance: 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee] return self.greenlet.switch() [ 1297.604685] env[67899]: ERROR nova.compute.manager [instance: 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1297.604685] env[67899]: ERROR nova.compute.manager [instance: 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee] self.f(*self.args, **self.kw) [ 1297.604685] env[67899]: ERROR nova.compute.manager [instance: 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1297.604685] env[67899]: ERROR nova.compute.manager [instance: 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee] raise exceptions.translate_fault(task_info.error) [ 1297.604685] env[67899]: ERROR nova.compute.manager [instance: 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1297.604685] env[67899]: ERROR nova.compute.manager [instance: 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee] Faults: ['InvalidArgument'] [ 1297.604685] env[67899]: ERROR nova.compute.manager [instance: 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee] [ 1297.605419] env[67899]: INFO nova.compute.manager [None req-b344734b-8a62-4885-9e7d-af38d446ab1b tempest-ServersWithSpecificFlavorTestJSON-1910448084 tempest-ServersWithSpecificFlavorTestJSON-1910448084-project-member] [instance: 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee] Terminating instance [ 1297.606810] env[67899]: DEBUG oslo_concurrency.lockutils [None req-7b088029-0146-4c76-a894-7f015ede2f7e tempest-ServersTestFqdnHostnames-61628670 tempest-ServersTestFqdnHostnames-61628670-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1297.607037] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-7b088029-0146-4c76-a894-7f015ede2f7e tempest-ServersTestFqdnHostnames-61628670 tempest-ServersTestFqdnHostnames-61628670-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1297.607425] env[67899]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with 
opID=oslo.vmware-6dbdf080-0888-4a50-bd38-412124b5ff8c {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1297.610886] env[67899]: DEBUG nova.compute.manager [None req-b344734b-8a62-4885-9e7d-af38d446ab1b tempest-ServersWithSpecificFlavorTestJSON-1910448084 tempest-ServersWithSpecificFlavorTestJSON-1910448084-project-member] [instance: 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee] Start destroying the instance on the hypervisor. {{(pid=67899) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1297.611042] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-b344734b-8a62-4885-9e7d-af38d446ab1b tempest-ServersWithSpecificFlavorTestJSON-1910448084 tempest-ServersWithSpecificFlavorTestJSON-1910448084-project-member] [instance: 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee] Destroying instance {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1297.611823] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-81f8274e-4a00-4070-a6e4-8dca938f666f {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1297.618920] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-b344734b-8a62-4885-9e7d-af38d446ab1b tempest-ServersWithSpecificFlavorTestJSON-1910448084 tempest-ServersWithSpecificFlavorTestJSON-1910448084-project-member] [instance: 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee] Unregistering the VM {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1297.619175] env[67899]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-0782d9b6-e515-48d6-afc1-754bde569193 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1297.621532] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-7b088029-0146-4c76-a894-7f015ede2f7e tempest-ServersTestFqdnHostnames-61628670 tempest-ServersTestFqdnHostnames-61628670-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1297.621710] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-7b088029-0146-4c76-a894-7f015ede2f7e tempest-ServersTestFqdnHostnames-61628670 tempest-ServersTestFqdnHostnames-61628670-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67899) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1297.622668] env[67899]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-ef0bbc61-98c0-4b16-b67b-f055883b8600 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1297.627901] env[67899]: DEBUG oslo_vmware.api [None req-7b088029-0146-4c76-a894-7f015ede2f7e tempest-ServersTestFqdnHostnames-61628670 tempest-ServersTestFqdnHostnames-61628670-project-member] Waiting for the task: (returnval){ [ 1297.627901] env[67899]: value = "session[5298859c-589e-10f3-4f70-bf47a7ea371b]523893fa-95d2-0661-5850-3699909d06c7" [ 1297.627901] env[67899]: _type = "Task" [ 1297.627901] env[67899]: } to complete. 
{{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1297.641012] env[67899]: DEBUG oslo_vmware.api [None req-7b088029-0146-4c76-a894-7f015ede2f7e tempest-ServersTestFqdnHostnames-61628670 tempest-ServersTestFqdnHostnames-61628670-project-member] Task: {'id': session[5298859c-589e-10f3-4f70-bf47a7ea371b]523893fa-95d2-0661-5850-3699909d06c7, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1297.703598] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-b344734b-8a62-4885-9e7d-af38d446ab1b tempest-ServersWithSpecificFlavorTestJSON-1910448084 tempest-ServersWithSpecificFlavorTestJSON-1910448084-project-member] [instance: 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee] Unregistered the VM {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1297.704045] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-b344734b-8a62-4885-9e7d-af38d446ab1b tempest-ServersWithSpecificFlavorTestJSON-1910448084 tempest-ServersWithSpecificFlavorTestJSON-1910448084-project-member] [instance: 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee] Deleting contents of the VM from datastore datastore1 {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1297.704045] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-b344734b-8a62-4885-9e7d-af38d446ab1b tempest-ServersWithSpecificFlavorTestJSON-1910448084 tempest-ServersWithSpecificFlavorTestJSON-1910448084-project-member] Deleting the datastore file [datastore1] 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee {{(pid=67899) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1297.704265] env[67899]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-b33f39a8-c63b-4070-82d2-fd17ad4ee023 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1297.710727] env[67899]: DEBUG oslo_vmware.api [None req-b344734b-8a62-4885-9e7d-af38d446ab1b tempest-ServersWithSpecificFlavorTestJSON-1910448084 tempest-ServersWithSpecificFlavorTestJSON-1910448084-project-member] Waiting for the task: (returnval){ [ 1297.710727] env[67899]: value = "task-3467942" [ 1297.710727] env[67899]: _type = "Task" [ 1297.710727] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1297.719371] env[67899]: DEBUG oslo_vmware.api [None req-b344734b-8a62-4885-9e7d-af38d446ab1b tempest-ServersWithSpecificFlavorTestJSON-1910448084 tempest-ServersWithSpecificFlavorTestJSON-1910448084-project-member] Task: {'id': task-3467942, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1298.138915] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-7b088029-0146-4c76-a894-7f015ede2f7e tempest-ServersTestFqdnHostnames-61628670 tempest-ServersTestFqdnHostnames-61628670-project-member] [instance: 8d2a9e20-82d3-44cf-a725-59804debe1cc] Preparing fetch location {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1298.139186] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-7b088029-0146-4c76-a894-7f015ede2f7e tempest-ServersTestFqdnHostnames-61628670 tempest-ServersTestFqdnHostnames-61628670-project-member] Creating directory with path [datastore1] vmware_temp/1dbb10db-09fc-4410-ace1-9cd4be0517d6/c655a05a-4a40-4b3f-b609-3ba8116ad90f {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1298.139439] env[67899]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-fd7affda-7ce4-4dc0-8596-518f2573495d {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1298.151176] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-7b088029-0146-4c76-a894-7f015ede2f7e tempest-ServersTestFqdnHostnames-61628670 tempest-ServersTestFqdnHostnames-61628670-project-member] Created directory with path [datastore1] vmware_temp/1dbb10db-09fc-4410-ace1-9cd4be0517d6/c655a05a-4a40-4b3f-b609-3ba8116ad90f {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1298.151375] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-7b088029-0146-4c76-a894-7f015ede2f7e tempest-ServersTestFqdnHostnames-61628670 tempest-ServersTestFqdnHostnames-61628670-project-member] [instance: 8d2a9e20-82d3-44cf-a725-59804debe1cc] Fetch image to [datastore1] vmware_temp/1dbb10db-09fc-4410-ace1-9cd4be0517d6/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1298.151575] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-7b088029-0146-4c76-a894-7f015ede2f7e tempest-ServersTestFqdnHostnames-61628670 tempest-ServersTestFqdnHostnames-61628670-project-member] [instance: 8d2a9e20-82d3-44cf-a725-59804debe1cc] Downloading image file data c655a05a-4a40-4b3f-b609-3ba8116ad90f to [datastore1] vmware_temp/1dbb10db-09fc-4410-ace1-9cd4be0517d6/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk on the data store datastore1 {{(pid=67899) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1298.152328] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7a90a323-b93b-4969-95fd-d91922c8c1a7 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1298.159397] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7e1936f5-4e0e-49a4-bd1f-30895dca4205 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1298.168305] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c89993c8-4514-4a1f-a2af-3c747f3f341a {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1298.199410] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-38377a1b-37a6-43ca-bee3-e63cd9989be6 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1298.205189] env[67899]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-941c42e7-5606-4b68-8b45-86bca60eb978 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1298.218769] env[67899]: DEBUG oslo_vmware.api [None req-b344734b-8a62-4885-9e7d-af38d446ab1b tempest-ServersWithSpecificFlavorTestJSON-1910448084 tempest-ServersWithSpecificFlavorTestJSON-1910448084-project-member] Task: {'id': task-3467942, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.075933} completed successfully. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1298.219017] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-b344734b-8a62-4885-9e7d-af38d446ab1b tempest-ServersWithSpecificFlavorTestJSON-1910448084 tempest-ServersWithSpecificFlavorTestJSON-1910448084-project-member] Deleted the datastore file {{(pid=67899) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1298.219205] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-b344734b-8a62-4885-9e7d-af38d446ab1b tempest-ServersWithSpecificFlavorTestJSON-1910448084 tempest-ServersWithSpecificFlavorTestJSON-1910448084-project-member] [instance: 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee] Deleted contents of the VM from datastore datastore1 {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1298.219375] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-b344734b-8a62-4885-9e7d-af38d446ab1b tempest-ServersWithSpecificFlavorTestJSON-1910448084 tempest-ServersWithSpecificFlavorTestJSON-1910448084-project-member] [instance: 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee] Instance destroyed {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1298.219543] env[67899]: INFO nova.compute.manager [None req-b344734b-8a62-4885-9e7d-af38d446ab1b tempest-ServersWithSpecificFlavorTestJSON-1910448084 tempest-ServersWithSpecificFlavorTestJSON-1910448084-project-member] [instance: 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee] Took 0.61 seconds to destroy the instance on the hypervisor. 
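
The CopyVirtualDisk_Task and DeleteDatastoreFile_Task records above all follow the same poll-and-wait shape: oslo.vmware submits the vCenter task, then re-reads its state on an interval (each read producing one "Task: {...} progress is 0%." record) until the task either succeeds or its fault is raised. A minimal plain-Python sketch of that loop, with a stubbed task object standing in for the real vSphere API (the names wait_for_task and fake_task_info are illustrative, not oslo.vmware's):

    import time

    class TaskFault(Exception):
        """Stand-in for oslo_vmware.exceptions.VimFaultException."""

    def wait_for_task(get_task_info, interval=0.5):
        # Poll until the task leaves its running state; each iteration
        # corresponds to one "Task: {...} progress is N%." record above.
        while True:
            info = get_task_info()
            print("Task: %s progress is %d%%." % (info["id"], info["progress"]))
            if info["state"] == "success":
                return info.get("result")
            if info["state"] == "error":
                # oslo.vmware raises exceptions.translate_fault(task_info.error)
                # at this point; that is where the VimFaultException originates.
                raise TaskFault(info["error"])
            time.sleep(interval)

    # Stubbed task that completes on the third poll.
    polls = {"n": 0}
    def fake_task_info():
        polls["n"] += 1
        done = polls["n"] >= 3
        return {"id": "task-3467942", "state": "success" if done else "running",
                "progress": 100 if done else 0, "result": None}

    wait_for_task(fake_task_info, interval=0.1)

The spawn failure above is this loop taking the error branch: CopyVirtualDisk_Task reported the InvalidArgument fault for fileType, and the translated exception propagated up through vm_util.copy_virtual_disk into the build path.
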
[ 1298.221636] env[67899]: DEBUG nova.compute.claims [None req-b344734b-8a62-4885-9e7d-af38d446ab1b tempest-ServersWithSpecificFlavorTestJSON-1910448084 tempest-ServersWithSpecificFlavorTestJSON-1910448084-project-member] [instance: 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee] Aborting claim: {{(pid=67899) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1298.221806] env[67899]: DEBUG oslo_concurrency.lockutils [None req-b344734b-8a62-4885-9e7d-af38d446ab1b tempest-ServersWithSpecificFlavorTestJSON-1910448084 tempest-ServersWithSpecificFlavorTestJSON-1910448084-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1298.222025] env[67899]: DEBUG oslo_concurrency.lockutils [None req-b344734b-8a62-4885-9e7d-af38d446ab1b tempest-ServersWithSpecificFlavorTestJSON-1910448084 tempest-ServersWithSpecificFlavorTestJSON-1910448084-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1298.230652] env[67899]: DEBUG nova.virt.vmwareapi.images [None req-7b088029-0146-4c76-a894-7f015ede2f7e tempest-ServersTestFqdnHostnames-61628670 tempest-ServersTestFqdnHostnames-61628670-project-member] [instance: 8d2a9e20-82d3-44cf-a725-59804debe1cc] Downloading image file data c655a05a-4a40-4b3f-b609-3ba8116ad90f to the data store datastore1 {{(pid=67899) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1298.411991] env[67899]: DEBUG oslo_vmware.rw_handles [None req-7b088029-0146-4c76-a894-7f015ede2f7e tempest-ServersTestFqdnHostnames-61628670 tempest-ServersTestFqdnHostnames-61628670-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/1dbb10db-09fc-4410-ace1-9cd4be0517d6/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67899) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1298.473388] env[67899]: DEBUG oslo_vmware.rw_handles [None req-7b088029-0146-4c76-a894-7f015ede2f7e tempest-ServersTestFqdnHostnames-61628670 tempest-ServersTestFqdnHostnames-61628670-project-member] Completed reading data from the image iterator. {{(pid=67899) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1298.473579] env[67899]: DEBUG oslo_vmware.rw_handles [None req-7b088029-0146-4c76-a894-7f015ede2f7e tempest-ServersTestFqdnHostnames-61628670 tempest-ServersTestFqdnHostnames-61628670-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/1dbb10db-09fc-4410-ace1-9cd4be0517d6/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67899) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1298.579113] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ee233892-08b9-4c6f-a62e-ffe8b26553a9 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1298.586909] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-298866e6-2834-4ae2-8471-04643d964aea {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1298.616370] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-13152700-d2de-493f-9532-cdcec3af5286 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1298.623469] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f755d010-6192-4981-8df2-63730e25b051 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1298.636456] env[67899]: DEBUG nova.compute.provider_tree [None req-b344734b-8a62-4885-9e7d-af38d446ab1b tempest-ServersWithSpecificFlavorTestJSON-1910448084 tempest-ServersWithSpecificFlavorTestJSON-1910448084-project-member] Inventory has not changed in ProviderTree for provider: fffa0b42-f65d-4394-a98c-0df038b9ed4b {{(pid=67899) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1298.644562] env[67899]: DEBUG nova.scheduler.client.report [None req-b344734b-8a62-4885-9e7d-af38d446ab1b tempest-ServersWithSpecificFlavorTestJSON-1910448084 tempest-ServersWithSpecificFlavorTestJSON-1910448084-project-member] Inventory has not changed for provider fffa0b42-f65d-4394-a98c-0df038b9ed4b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67899) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1298.663091] env[67899]: DEBUG oslo_concurrency.lockutils [None req-b344734b-8a62-4885-9e7d-af38d446ab1b tempest-ServersWithSpecificFlavorTestJSON-1910448084 tempest-ServersWithSpecificFlavorTestJSON-1910448084-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.441s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1298.663693] env[67899]: ERROR nova.compute.manager [None req-b344734b-8a62-4885-9e7d-af38d446ab1b tempest-ServersWithSpecificFlavorTestJSON-1910448084 tempest-ServersWithSpecificFlavorTestJSON-1910448084-project-member] [instance: 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1298.663693] env[67899]: Faults: ['InvalidArgument'] [ 1298.663693] env[67899]: ERROR nova.compute.manager [instance: 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee] Traceback (most recent call last): [ 1298.663693] env[67899]: ERROR nova.compute.manager [instance: 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee] File 
"/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1298.663693] env[67899]: ERROR nova.compute.manager [instance: 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee] self.driver.spawn(context, instance, image_meta, [ 1298.663693] env[67899]: ERROR nova.compute.manager [instance: 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1298.663693] env[67899]: ERROR nova.compute.manager [instance: 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1298.663693] env[67899]: ERROR nova.compute.manager [instance: 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1298.663693] env[67899]: ERROR nova.compute.manager [instance: 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee] self._fetch_image_if_missing(context, vi) [ 1298.663693] env[67899]: ERROR nova.compute.manager [instance: 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1298.663693] env[67899]: ERROR nova.compute.manager [instance: 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee] image_cache(vi, tmp_image_ds_loc) [ 1298.663693] env[67899]: ERROR nova.compute.manager [instance: 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1298.663693] env[67899]: ERROR nova.compute.manager [instance: 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee] vm_util.copy_virtual_disk( [ 1298.663693] env[67899]: ERROR nova.compute.manager [instance: 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1298.663693] env[67899]: ERROR nova.compute.manager [instance: 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee] session._wait_for_task(vmdk_copy_task) [ 1298.663693] env[67899]: ERROR nova.compute.manager [instance: 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1298.663693] env[67899]: ERROR nova.compute.manager [instance: 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee] return self.wait_for_task(task_ref) [ 1298.663693] env[67899]: ERROR nova.compute.manager [instance: 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1298.663693] env[67899]: ERROR nova.compute.manager [instance: 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee] return evt.wait() [ 1298.663693] env[67899]: ERROR nova.compute.manager [instance: 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1298.663693] env[67899]: ERROR nova.compute.manager [instance: 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee] result = hub.switch() [ 1298.663693] env[67899]: ERROR nova.compute.manager [instance: 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1298.663693] env[67899]: ERROR nova.compute.manager [instance: 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee] return self.greenlet.switch() [ 1298.663693] env[67899]: ERROR nova.compute.manager [instance: 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1298.663693] env[67899]: ERROR nova.compute.manager [instance: 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee] self.f(*self.args, **self.kw) [ 1298.663693] 
env[67899]: ERROR nova.compute.manager [instance: 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1298.663693] env[67899]: ERROR nova.compute.manager [instance: 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee] raise exceptions.translate_fault(task_info.error) [ 1298.663693] env[67899]: ERROR nova.compute.manager [instance: 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1298.663693] env[67899]: ERROR nova.compute.manager [instance: 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee] Faults: ['InvalidArgument'] [ 1298.663693] env[67899]: ERROR nova.compute.manager [instance: 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee] [ 1298.664489] env[67899]: DEBUG nova.compute.utils [None req-b344734b-8a62-4885-9e7d-af38d446ab1b tempest-ServersWithSpecificFlavorTestJSON-1910448084 tempest-ServersWithSpecificFlavorTestJSON-1910448084-project-member] [instance: 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee] VimFaultException {{(pid=67899) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1298.666110] env[67899]: DEBUG nova.compute.manager [None req-b344734b-8a62-4885-9e7d-af38d446ab1b tempest-ServersWithSpecificFlavorTestJSON-1910448084 tempest-ServersWithSpecificFlavorTestJSON-1910448084-project-member] [instance: 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee] Build of instance 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee was re-scheduled: A specified parameter was not correct: fileType [ 1298.666110] env[67899]: Faults: ['InvalidArgument'] {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1298.666504] env[67899]: DEBUG nova.compute.manager [None req-b344734b-8a62-4885-9e7d-af38d446ab1b tempest-ServersWithSpecificFlavorTestJSON-1910448084 tempest-ServersWithSpecificFlavorTestJSON-1910448084-project-member] [instance: 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee] Unplugging VIFs for instance {{(pid=67899) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1298.666706] env[67899]: DEBUG nova.compute.manager [None req-b344734b-8a62-4885-9e7d-af38d446ab1b tempest-ServersWithSpecificFlavorTestJSON-1910448084 tempest-ServersWithSpecificFlavorTestJSON-1910448084-project-member] Virt driver does not provide unplug_vifs method, so it is not possible to determine if VIFs should be unplugged. 
{{(pid=67899) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1298.666915] env[67899]: DEBUG nova.compute.manager [None req-b344734b-8a62-4885-9e7d-af38d446ab1b tempest-ServersWithSpecificFlavorTestJSON-1910448084 tempest-ServersWithSpecificFlavorTestJSON-1910448084-project-member] [instance: 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee] Deallocating network for instance {{(pid=67899) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1298.667181] env[67899]: DEBUG nova.network.neutron [None req-b344734b-8a62-4885-9e7d-af38d446ab1b tempest-ServersWithSpecificFlavorTestJSON-1910448084 tempest-ServersWithSpecificFlavorTestJSON-1910448084-project-member] [instance: 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee] deallocate_for_instance() {{(pid=67899) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1298.992035] env[67899]: DEBUG nova.network.neutron [None req-b344734b-8a62-4885-9e7d-af38d446ab1b tempest-ServersWithSpecificFlavorTestJSON-1910448084 tempest-ServersWithSpecificFlavorTestJSON-1910448084-project-member] [instance: 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee] Updating instance_info_cache with network_info: [] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1299.003538] env[67899]: INFO nova.compute.manager [None req-b344734b-8a62-4885-9e7d-af38d446ab1b tempest-ServersWithSpecificFlavorTestJSON-1910448084 tempest-ServersWithSpecificFlavorTestJSON-1910448084-project-member] [instance: 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee] Took 0.34 seconds to deallocate network for instance. [ 1299.103990] env[67899]: INFO nova.scheduler.client.report [None req-b344734b-8a62-4885-9e7d-af38d446ab1b tempest-ServersWithSpecificFlavorTestJSON-1910448084 tempest-ServersWithSpecificFlavorTestJSON-1910448084-project-member] Deleted allocations for instance 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee [ 1299.124024] env[67899]: DEBUG oslo_concurrency.lockutils [None req-b344734b-8a62-4885-9e7d-af38d446ab1b tempest-ServersWithSpecificFlavorTestJSON-1910448084 tempest-ServersWithSpecificFlavorTestJSON-1910448084-project-member] Lock "35b19ccb-4996-47a7-b1a7-6ffc9dd867ee" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 652.764s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1299.124892] env[67899]: DEBUG oslo_concurrency.lockutils [None req-07d725c6-cec6-431b-a0d6-d14f435b98e0 tempest-ServersWithSpecificFlavorTestJSON-1910448084 tempest-ServersWithSpecificFlavorTestJSON-1910448084-project-member] Lock "35b19ccb-4996-47a7-b1a7-6ffc9dd867ee" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 455.606s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1299.125116] env[67899]: DEBUG oslo_concurrency.lockutils [None req-07d725c6-cec6-431b-a0d6-d14f435b98e0 tempest-ServersWithSpecificFlavorTestJSON-1910448084 tempest-ServersWithSpecificFlavorTestJSON-1910448084-project-member] Acquiring lock "35b19ccb-4996-47a7-b1a7-6ffc9dd867ee-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1299.125315] env[67899]: DEBUG oslo_concurrency.lockutils [None req-07d725c6-cec6-431b-a0d6-d14f435b98e0 tempest-ServersWithSpecificFlavorTestJSON-1910448084 
tempest-ServersWithSpecificFlavorTestJSON-1910448084-project-member] Lock "35b19ccb-4996-47a7-b1a7-6ffc9dd867ee-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1299.125935] env[67899]: DEBUG oslo_concurrency.lockutils [None req-07d725c6-cec6-431b-a0d6-d14f435b98e0 tempest-ServersWithSpecificFlavorTestJSON-1910448084 tempest-ServersWithSpecificFlavorTestJSON-1910448084-project-member] Lock "35b19ccb-4996-47a7-b1a7-6ffc9dd867ee-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1299.127467] env[67899]: INFO nova.compute.manager [None req-07d725c6-cec6-431b-a0d6-d14f435b98e0 tempest-ServersWithSpecificFlavorTestJSON-1910448084 tempest-ServersWithSpecificFlavorTestJSON-1910448084-project-member] [instance: 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee] Terminating instance [ 1299.129256] env[67899]: DEBUG nova.compute.manager [None req-07d725c6-cec6-431b-a0d6-d14f435b98e0 tempest-ServersWithSpecificFlavorTestJSON-1910448084 tempest-ServersWithSpecificFlavorTestJSON-1910448084-project-member] [instance: 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee] Start destroying the instance on the hypervisor. {{(pid=67899) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1299.129428] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-07d725c6-cec6-431b-a0d6-d14f435b98e0 tempest-ServersWithSpecificFlavorTestJSON-1910448084 tempest-ServersWithSpecificFlavorTestJSON-1910448084-project-member] [instance: 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee] Destroying instance {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1299.129956] env[67899]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-6fad982b-e8b7-4617-baeb-28642a591b7b {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1299.140234] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6a1f39e7-0749-48c4-8909-65e7eec59cd5 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1299.152338] env[67899]: DEBUG nova.compute.manager [None req-5f2bc5d9-fe94-4b02-bd3a-588599063fca tempest-ListServerFiltersTestJSON-144845862 tempest-ListServerFiltersTestJSON-144845862-project-member] [instance: 97ec7119-2dc8-49ac-921f-b28d04ffd056] Starting instance... {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1299.173289] env[67899]: WARNING nova.virt.vmwareapi.vmops [None req-07d725c6-cec6-431b-a0d6-d14f435b98e0 tempest-ServersWithSpecificFlavorTestJSON-1910448084 tempest-ServersWithSpecificFlavorTestJSON-1910448084-project-member] [instance: 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee could not be found. 
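
The lockutils records that bracket this flow ('Acquiring lock ... by ...', 'acquired ... :: waited 455.606s', '"released" ... :: held 652.764s') all come from one pattern: a process-local named lock is looked up in a registry, the time spent waiting to acquire it is logged, and the time it was held is logged on release. A rough threading-based analogue of that pattern (simplified and with an abbreviated message format; this is not the oslo.concurrency implementation):

    import threading
    import time
    from contextlib import contextmanager

    _locks = {}                        # name -> Lock, like lockutils' registry
    _registry_guard = threading.Lock()

    @contextmanager
    def named_lock(name, caller):
        with _registry_guard:
            lock = _locks.setdefault(name, threading.Lock())
        print('Acquiring lock "%s" by "%s"' % (name, caller))
        start = time.monotonic()
        lock.acquire()
        print('Lock "%s" acquired by "%s" :: waited %.3fs'
              % (name, caller, time.monotonic() - start))
        held_from = time.monotonic()
        try:
            yield
        finally:
            lock.release()
            print('Lock "%s" "released" by "%s" :: held %.3fs'
                  % (name, caller, time.monotonic() - held_from))

    with named_lock("compute_resources", "example.caller"):
        time.sleep(0.05)               # the critical section being timed

Read this way, the 455.606s wait above is unremarkable: the terminate request was queued on the same per-instance UUID lock that the build path held for 652.764s, so the long wait reflects serialization on one instance, not a stall in the lock machinery itself.
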
[ 1299.174026] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-07d725c6-cec6-431b-a0d6-d14f435b98e0 tempest-ServersWithSpecificFlavorTestJSON-1910448084 tempest-ServersWithSpecificFlavorTestJSON-1910448084-project-member] [instance: 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee] Instance destroyed {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1299.174026] env[67899]: INFO nova.compute.manager [None req-07d725c6-cec6-431b-a0d6-d14f435b98e0 tempest-ServersWithSpecificFlavorTestJSON-1910448084 tempest-ServersWithSpecificFlavorTestJSON-1910448084-project-member] [instance: 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1299.174026] env[67899]: DEBUG oslo.service.loopingcall [None req-07d725c6-cec6-431b-a0d6-d14f435b98e0 tempest-ServersWithSpecificFlavorTestJSON-1910448084 tempest-ServersWithSpecificFlavorTestJSON-1910448084-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=67899) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1299.174202] env[67899]: DEBUG nova.compute.manager [-] [instance: 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee] Deallocating network for instance {{(pid=67899) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1299.174255] env[67899]: DEBUG nova.network.neutron [-] [instance: 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee] deallocate_for_instance() {{(pid=67899) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1299.181202] env[67899]: DEBUG nova.compute.manager [None req-5f2bc5d9-fe94-4b02-bd3a-588599063fca tempest-ListServerFiltersTestJSON-144845862 tempest-ListServerFiltersTestJSON-144845862-project-member] [instance: 97ec7119-2dc8-49ac-921f-b28d04ffd056] Instance disappeared before build. {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1299.199201] env[67899]: DEBUG nova.network.neutron [-] [instance: 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee] Updating instance_info_cache with network_info: [] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1299.204139] env[67899]: DEBUG oslo_concurrency.lockutils [None req-5f2bc5d9-fe94-4b02-bd3a-588599063fca tempest-ListServerFiltersTestJSON-144845862 tempest-ListServerFiltersTestJSON-144845862-project-member] Lock "97ec7119-2dc8-49ac-921f-b28d04ffd056" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 230.226s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1299.209240] env[67899]: INFO nova.compute.manager [-] [instance: 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee] Took 0.03 seconds to deallocate network for instance. [ 1299.213953] env[67899]: DEBUG nova.compute.manager [None req-58e55a74-0ae7-4975-a668-f0c164e6d586 tempest-ListServerFiltersTestJSON-144845862 tempest-ListServerFiltersTestJSON-144845862-project-member] [instance: f03010e7-fd45-4959-b6fb-4c7b3fc833c5] Starting instance... {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1299.237755] env[67899]: DEBUG nova.compute.manager [None req-58e55a74-0ae7-4975-a668-f0c164e6d586 tempest-ListServerFiltersTestJSON-144845862 tempest-ListServerFiltersTestJSON-144845862-project-member] [instance: f03010e7-fd45-4959-b6fb-4c7b3fc833c5] Instance disappeared before build. 
{{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1299.265327] env[67899]: DEBUG oslo_concurrency.lockutils [None req-58e55a74-0ae7-4975-a668-f0c164e6d586 tempest-ListServerFiltersTestJSON-144845862 tempest-ListServerFiltersTestJSON-144845862-project-member] Lock "f03010e7-fd45-4959-b6fb-4c7b3fc833c5" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 229.779s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1299.274524] env[67899]: DEBUG nova.compute.manager [None req-9c89c833-8745-4a62-a174-5899420c4e70 tempest-ListServerFiltersTestJSON-144845862 tempest-ListServerFiltersTestJSON-144845862-project-member] [instance: 3526174d-17e3-4a54-92dc-0556334ce315] Starting instance... {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1299.307602] env[67899]: DEBUG nova.compute.manager [None req-9c89c833-8745-4a62-a174-5899420c4e70 tempest-ListServerFiltersTestJSON-144845862 tempest-ListServerFiltersTestJSON-144845862-project-member] [instance: 3526174d-17e3-4a54-92dc-0556334ce315] Instance disappeared before build. {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1299.321828] env[67899]: DEBUG oslo_concurrency.lockutils [None req-07d725c6-cec6-431b-a0d6-d14f435b98e0 tempest-ServersWithSpecificFlavorTestJSON-1910448084 tempest-ServersWithSpecificFlavorTestJSON-1910448084-project-member] Lock "35b19ccb-4996-47a7-b1a7-6ffc9dd867ee" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.197s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1299.322987] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "35b19ccb-4996-47a7-b1a7-6ffc9dd867ee" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 124.304s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1299.323577] env[67899]: INFO nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 35b19ccb-4996-47a7-b1a7-6ffc9dd867ee] During sync_power_state the instance has a pending task (deleting). Skip. 
[ 1299.323794] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "35b19ccb-4996-47a7-b1a7-6ffc9dd867ee" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.001s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1299.330907] env[67899]: DEBUG oslo_concurrency.lockutils [None req-9c89c833-8745-4a62-a174-5899420c4e70 tempest-ListServerFiltersTestJSON-144845862 tempest-ListServerFiltersTestJSON-144845862-project-member] Lock "3526174d-17e3-4a54-92dc-0556334ce315" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 229.002s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1299.340652] env[67899]: DEBUG nova.compute.manager [None req-c5ad700b-c912-48e9-ad1b-7ebfc9667984 tempest-AttachVolumeNegativeTest-1960469577 tempest-AttachVolumeNegativeTest-1960469577-project-member] [instance: cf23465f-b46c-4360-8949-2af3b9ba44c9] Starting instance... {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1299.365514] env[67899]: DEBUG nova.compute.manager [None req-c5ad700b-c912-48e9-ad1b-7ebfc9667984 tempest-AttachVolumeNegativeTest-1960469577 tempest-AttachVolumeNegativeTest-1960469577-project-member] [instance: cf23465f-b46c-4360-8949-2af3b9ba44c9] Instance disappeared before build. {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1299.387532] env[67899]: DEBUG oslo_concurrency.lockutils [None req-c5ad700b-c912-48e9-ad1b-7ebfc9667984 tempest-AttachVolumeNegativeTest-1960469577 tempest-AttachVolumeNegativeTest-1960469577-project-member] Lock "cf23465f-b46c-4360-8949-2af3b9ba44c9" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 226.891s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1299.396529] env[67899]: DEBUG nova.compute.manager [None req-73aa364e-3e8b-42af-87c1-1be08a2292ea tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] [instance: 2d202778-6d31-4d2f-b249-60925737da42] Starting instance... {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1299.423212] env[67899]: DEBUG nova.compute.manager [None req-73aa364e-3e8b-42af-87c1-1be08a2292ea tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] [instance: 2d202778-6d31-4d2f-b249-60925737da42] Instance disappeared before build. 
{{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1299.449263] env[67899]: DEBUG oslo_concurrency.lockutils [None req-73aa364e-3e8b-42af-87c1-1be08a2292ea tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] Lock "2d202778-6d31-4d2f-b249-60925737da42" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 209.321s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1299.459346] env[67899]: DEBUG nova.compute.manager [None req-f67ac1f9-ebb2-48ce-af7a-e9c6130d953f tempest-ServerActionsV293TestJSON-1827485530 tempest-ServerActionsV293TestJSON-1827485530-project-member] [instance: 5ead1ba5-49a8-41a8-b984-cb5408683a25] Starting instance... {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1299.487654] env[67899]: DEBUG nova.compute.manager [None req-f67ac1f9-ebb2-48ce-af7a-e9c6130d953f tempest-ServerActionsV293TestJSON-1827485530 tempest-ServerActionsV293TestJSON-1827485530-project-member] [instance: 5ead1ba5-49a8-41a8-b984-cb5408683a25] Instance disappeared before build. {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1299.508665] env[67899]: DEBUG oslo_concurrency.lockutils [None req-f67ac1f9-ebb2-48ce-af7a-e9c6130d953f tempest-ServerActionsV293TestJSON-1827485530 tempest-ServerActionsV293TestJSON-1827485530-project-member] Lock "5ead1ba5-49a8-41a8-b984-cb5408683a25" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 205.870s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1299.517609] env[67899]: DEBUG nova.compute.manager [None req-c755e67f-ad2e-43b3-b059-3c50cb0e4fec tempest-AttachVolumeShelveTestJSON-10446956 tempest-AttachVolumeShelveTestJSON-10446956-project-member] [instance: 8e66e2d5-aa60-474f-b77f-4a477e2d0f8e] Starting instance... {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1299.540669] env[67899]: DEBUG nova.compute.manager [None req-c755e67f-ad2e-43b3-b059-3c50cb0e4fec tempest-AttachVolumeShelveTestJSON-10446956 tempest-AttachVolumeShelveTestJSON-10446956-project-member] [instance: 8e66e2d5-aa60-474f-b77f-4a477e2d0f8e] Instance disappeared before build. {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1299.559978] env[67899]: DEBUG oslo_concurrency.lockutils [None req-c755e67f-ad2e-43b3-b059-3c50cb0e4fec tempest-AttachVolumeShelveTestJSON-10446956 tempest-AttachVolumeShelveTestJSON-10446956-project-member] Lock "8e66e2d5-aa60-474f-b77f-4a477e2d0f8e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 204.597s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1299.567798] env[67899]: DEBUG nova.compute.manager [None req-e1b9224c-7aca-4d45-b950-f391f0b228c2 tempest-ServersNegativeTestJSON-421907278 tempest-ServersNegativeTestJSON-421907278-project-member] [instance: 7a82e877-8a39-4684-8b75-711b7bedddac] Starting instance... 
{{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1299.623476] env[67899]: DEBUG oslo_concurrency.lockutils [None req-e1b9224c-7aca-4d45-b950-f391f0b228c2 tempest-ServersNegativeTestJSON-421907278 tempest-ServersNegativeTestJSON-421907278-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1299.623731] env[67899]: DEBUG oslo_concurrency.lockutils [None req-e1b9224c-7aca-4d45-b950-f391f0b228c2 tempest-ServersNegativeTestJSON-421907278 tempest-ServersNegativeTestJSON-421907278-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1299.625223] env[67899]: INFO nova.compute.claims [None req-e1b9224c-7aca-4d45-b950-f391f0b228c2 tempest-ServersNegativeTestJSON-421907278 tempest-ServersNegativeTestJSON-421907278-project-member] [instance: 7a82e877-8a39-4684-8b75-711b7bedddac] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1299.896545] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fc3160cb-9c6e-42c2-957f-6e3f9469e4dc {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1299.904176] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dca763a4-8eaa-4cf2-98df-e04857ea7568 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1299.934798] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f438fa18-f3c6-4422-8011-ee355d1a36ca {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1299.942134] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ad052e50-5343-41a8-b0de-9f83e005d441 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1299.954838] env[67899]: DEBUG nova.compute.provider_tree [None req-e1b9224c-7aca-4d45-b950-f391f0b228c2 tempest-ServersNegativeTestJSON-421907278 tempest-ServersNegativeTestJSON-421907278-project-member] Inventory has not changed in ProviderTree for provider: fffa0b42-f65d-4394-a98c-0df038b9ed4b {{(pid=67899) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1299.963261] env[67899]: DEBUG nova.scheduler.client.report [None req-e1b9224c-7aca-4d45-b950-f391f0b228c2 tempest-ServersNegativeTestJSON-421907278 tempest-ServersNegativeTestJSON-421907278-project-member] Inventory has not changed for provider fffa0b42-f65d-4394-a98c-0df038b9ed4b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67899) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1299.978809] env[67899]: DEBUG oslo_concurrency.lockutils [None 
req-e1b9224c-7aca-4d45-b950-f391f0b228c2 tempest-ServersNegativeTestJSON-421907278 tempest-ServersNegativeTestJSON-421907278-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.355s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1299.979300] env[67899]: DEBUG nova.compute.manager [None req-e1b9224c-7aca-4d45-b950-f391f0b228c2 tempest-ServersNegativeTestJSON-421907278 tempest-ServersNegativeTestJSON-421907278-project-member] [instance: 7a82e877-8a39-4684-8b75-711b7bedddac] Start building networks asynchronously for instance. {{(pid=67899) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1300.010812] env[67899]: DEBUG nova.compute.utils [None req-e1b9224c-7aca-4d45-b950-f391f0b228c2 tempest-ServersNegativeTestJSON-421907278 tempest-ServersNegativeTestJSON-421907278-project-member] Using /dev/sd instead of None {{(pid=67899) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1300.012406] env[67899]: DEBUG nova.compute.manager [None req-e1b9224c-7aca-4d45-b950-f391f0b228c2 tempest-ServersNegativeTestJSON-421907278 tempest-ServersNegativeTestJSON-421907278-project-member] [instance: 7a82e877-8a39-4684-8b75-711b7bedddac] Allocating IP information in the background. {{(pid=67899) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1300.012547] env[67899]: DEBUG nova.network.neutron [None req-e1b9224c-7aca-4d45-b950-f391f0b228c2 tempest-ServersNegativeTestJSON-421907278 tempest-ServersNegativeTestJSON-421907278-project-member] [instance: 7a82e877-8a39-4684-8b75-711b7bedddac] allocate_for_instance() {{(pid=67899) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1300.021423] env[67899]: DEBUG nova.compute.manager [None req-e1b9224c-7aca-4d45-b950-f391f0b228c2 tempest-ServersNegativeTestJSON-421907278 tempest-ServersNegativeTestJSON-421907278-project-member] [instance: 7a82e877-8a39-4684-8b75-711b7bedddac] Start building block device mappings for instance. {{(pid=67899) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1300.078505] env[67899]: DEBUG nova.policy [None req-e1b9224c-7aca-4d45-b950-f391f0b228c2 tempest-ServersNegativeTestJSON-421907278 tempest-ServersNegativeTestJSON-421907278-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'cebd2a570bbe45788137b7582da89e93', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '320b366df7f94569bb729c10982ebd90', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67899) authorize /opt/stack/nova/nova/policy.py:203}} [ 1300.086076] env[67899]: DEBUG nova.compute.manager [None req-e1b9224c-7aca-4d45-b950-f391f0b228c2 tempest-ServersNegativeTestJSON-421907278 tempest-ServersNegativeTestJSON-421907278-project-member] [instance: 7a82e877-8a39-4684-8b75-711b7bedddac] Start spawning the instance on the hypervisor. 
{{(pid=67899) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1300.111036] env[67899]: DEBUG nova.virt.hardware [None req-e1b9224c-7aca-4d45-b950-f391f0b228c2 tempest-ServersNegativeTestJSON-421907278 tempest-ServersNegativeTestJSON-421907278-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T22:07:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T22:07:13Z,direct_url=,disk_format='vmdk',id=c655a05a-4a40-4b3f-b609-3ba8116ad90f,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c1aaa2970e964d7b86557399120d12c1',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T22:07:14Z,virtual_size=,visibility=), allow threads: False {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1300.111295] env[67899]: DEBUG nova.virt.hardware [None req-e1b9224c-7aca-4d45-b950-f391f0b228c2 tempest-ServersNegativeTestJSON-421907278 tempest-ServersNegativeTestJSON-421907278-project-member] Flavor limits 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1300.111449] env[67899]: DEBUG nova.virt.hardware [None req-e1b9224c-7aca-4d45-b950-f391f0b228c2 tempest-ServersNegativeTestJSON-421907278 tempest-ServersNegativeTestJSON-421907278-project-member] Image limits 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1300.111623] env[67899]: DEBUG nova.virt.hardware [None req-e1b9224c-7aca-4d45-b950-f391f0b228c2 tempest-ServersNegativeTestJSON-421907278 tempest-ServersNegativeTestJSON-421907278-project-member] Flavor pref 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1300.111764] env[67899]: DEBUG nova.virt.hardware [None req-e1b9224c-7aca-4d45-b950-f391f0b228c2 tempest-ServersNegativeTestJSON-421907278 tempest-ServersNegativeTestJSON-421907278-project-member] Image pref 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1300.111906] env[67899]: DEBUG nova.virt.hardware [None req-e1b9224c-7aca-4d45-b950-f391f0b228c2 tempest-ServersNegativeTestJSON-421907278 tempest-ServersNegativeTestJSON-421907278-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1300.112125] env[67899]: DEBUG nova.virt.hardware [None req-e1b9224c-7aca-4d45-b950-f391f0b228c2 tempest-ServersNegativeTestJSON-421907278 tempest-ServersNegativeTestJSON-421907278-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1300.112316] env[67899]: DEBUG nova.virt.hardware [None req-e1b9224c-7aca-4d45-b950-f391f0b228c2 tempest-ServersNegativeTestJSON-421907278 tempest-ServersNegativeTestJSON-421907278-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67899) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1300.112586] env[67899]: DEBUG 
nova.virt.hardware [None req-e1b9224c-7aca-4d45-b950-f391f0b228c2 tempest-ServersNegativeTestJSON-421907278 tempest-ServersNegativeTestJSON-421907278-project-member] Got 1 possible topologies {{(pid=67899) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1300.112787] env[67899]: DEBUG nova.virt.hardware [None req-e1b9224c-7aca-4d45-b950-f391f0b228c2 tempest-ServersNegativeTestJSON-421907278 tempest-ServersNegativeTestJSON-421907278-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1300.112962] env[67899]: DEBUG nova.virt.hardware [None req-e1b9224c-7aca-4d45-b950-f391f0b228c2 tempest-ServersNegativeTestJSON-421907278 tempest-ServersNegativeTestJSON-421907278-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1300.113817] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8afe7cef-e6bd-4e9d-9fa2-0c10204ce331 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1300.122162] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7940007e-8bfe-4ff1-ad2f-439a3eaf5091 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1300.401282] env[67899]: DEBUG nova.network.neutron [None req-e1b9224c-7aca-4d45-b950-f391f0b228c2 tempest-ServersNegativeTestJSON-421907278 tempest-ServersNegativeTestJSON-421907278-project-member] [instance: 7a82e877-8a39-4684-8b75-711b7bedddac] Successfully created port: 2cbbae14-e717-4433-83e2-4d64d89ccdb5 {{(pid=67899) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1301.041343] env[67899]: DEBUG nova.compute.manager [req-05ecc8f8-7d26-4d81-935c-6439a9f2a951 req-aa0a68e2-ac08-46ad-976a-5ba499c5a330 service nova] [instance: 7a82e877-8a39-4684-8b75-711b7bedddac] Received event network-vif-plugged-2cbbae14-e717-4433-83e2-4d64d89ccdb5 {{(pid=67899) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1301.041732] env[67899]: DEBUG oslo_concurrency.lockutils [req-05ecc8f8-7d26-4d81-935c-6439a9f2a951 req-aa0a68e2-ac08-46ad-976a-5ba499c5a330 service nova] Acquiring lock "7a82e877-8a39-4684-8b75-711b7bedddac-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1301.041998] env[67899]: DEBUG oslo_concurrency.lockutils [req-05ecc8f8-7d26-4d81-935c-6439a9f2a951 req-aa0a68e2-ac08-46ad-976a-5ba499c5a330 service nova] Lock "7a82e877-8a39-4684-8b75-711b7bedddac-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1301.042239] env[67899]: DEBUG oslo_concurrency.lockutils [req-05ecc8f8-7d26-4d81-935c-6439a9f2a951 req-aa0a68e2-ac08-46ad-976a-5ba499c5a330 service nova] Lock "7a82e877-8a39-4684-8b75-711b7bedddac-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1301.042509] env[67899]: DEBUG 
nova.compute.manager [req-05ecc8f8-7d26-4d81-935c-6439a9f2a951 req-aa0a68e2-ac08-46ad-976a-5ba499c5a330 service nova] [instance: 7a82e877-8a39-4684-8b75-711b7bedddac] No waiting events found dispatching network-vif-plugged-2cbbae14-e717-4433-83e2-4d64d89ccdb5 {{(pid=67899) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1301.042734] env[67899]: WARNING nova.compute.manager [req-05ecc8f8-7d26-4d81-935c-6439a9f2a951 req-aa0a68e2-ac08-46ad-976a-5ba499c5a330 service nova] [instance: 7a82e877-8a39-4684-8b75-711b7bedddac] Received unexpected event network-vif-plugged-2cbbae14-e717-4433-83e2-4d64d89ccdb5 for instance with vm_state building and task_state spawning. [ 1301.127440] env[67899]: DEBUG nova.network.neutron [None req-e1b9224c-7aca-4d45-b950-f391f0b228c2 tempest-ServersNegativeTestJSON-421907278 tempest-ServersNegativeTestJSON-421907278-project-member] [instance: 7a82e877-8a39-4684-8b75-711b7bedddac] Successfully updated port: 2cbbae14-e717-4433-83e2-4d64d89ccdb5 {{(pid=67899) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1301.140926] env[67899]: DEBUG oslo_concurrency.lockutils [None req-e1b9224c-7aca-4d45-b950-f391f0b228c2 tempest-ServersNegativeTestJSON-421907278 tempest-ServersNegativeTestJSON-421907278-project-member] Acquiring lock "refresh_cache-7a82e877-8a39-4684-8b75-711b7bedddac" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1301.141098] env[67899]: DEBUG oslo_concurrency.lockutils [None req-e1b9224c-7aca-4d45-b950-f391f0b228c2 tempest-ServersNegativeTestJSON-421907278 tempest-ServersNegativeTestJSON-421907278-project-member] Acquired lock "refresh_cache-7a82e877-8a39-4684-8b75-711b7bedddac" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1301.141253] env[67899]: DEBUG nova.network.neutron [None req-e1b9224c-7aca-4d45-b950-f391f0b228c2 tempest-ServersNegativeTestJSON-421907278 tempest-ServersNegativeTestJSON-421907278-project-member] [instance: 7a82e877-8a39-4684-8b75-711b7bedddac] Building network info cache for instance {{(pid=67899) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1301.182694] env[67899]: DEBUG nova.network.neutron [None req-e1b9224c-7aca-4d45-b950-f391f0b228c2 tempest-ServersNegativeTestJSON-421907278 tempest-ServersNegativeTestJSON-421907278-project-member] [instance: 7a82e877-8a39-4684-8b75-711b7bedddac] Instance cache missing network info. 
{{(pid=67899) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1301.377861] env[67899]: DEBUG nova.network.neutron [None req-e1b9224c-7aca-4d45-b950-f391f0b228c2 tempest-ServersNegativeTestJSON-421907278 tempest-ServersNegativeTestJSON-421907278-project-member] [instance: 7a82e877-8a39-4684-8b75-711b7bedddac] Updating instance_info_cache with network_info: [{"id": "2cbbae14-e717-4433-83e2-4d64d89ccdb5", "address": "fa:16:3e:5d:a0:f5", "network": {"id": "759168f7-c8c8-49ca-bdbd-a3883250a6e2", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-2046370020-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "320b366df7f94569bb729c10982ebd90", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "7ab8d568-adb0-4f3b-b6cc-68413e6546ae", "external-id": "nsx-vlan-transportzone-86", "segmentation_id": 86, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap2cbbae14-e7", "ovs_interfaceid": "2cbbae14-e717-4433-83e2-4d64d89ccdb5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1301.390968] env[67899]: DEBUG oslo_concurrency.lockutils [None req-e1b9224c-7aca-4d45-b950-f391f0b228c2 tempest-ServersNegativeTestJSON-421907278 tempest-ServersNegativeTestJSON-421907278-project-member] Releasing lock "refresh_cache-7a82e877-8a39-4684-8b75-711b7bedddac" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1301.391288] env[67899]: DEBUG nova.compute.manager [None req-e1b9224c-7aca-4d45-b950-f391f0b228c2 tempest-ServersNegativeTestJSON-421907278 tempest-ServersNegativeTestJSON-421907278-project-member] [instance: 7a82e877-8a39-4684-8b75-711b7bedddac] Instance network_info: |[{"id": "2cbbae14-e717-4433-83e2-4d64d89ccdb5", "address": "fa:16:3e:5d:a0:f5", "network": {"id": "759168f7-c8c8-49ca-bdbd-a3883250a6e2", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-2046370020-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "320b366df7f94569bb729c10982ebd90", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "7ab8d568-adb0-4f3b-b6cc-68413e6546ae", "external-id": "nsx-vlan-transportzone-86", "segmentation_id": 86, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap2cbbae14-e7", "ovs_interfaceid": "2cbbae14-e717-4433-83e2-4d64d89ccdb5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67899) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1971}} [ 1301.391676] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-e1b9224c-7aca-4d45-b950-f391f0b228c2 tempest-ServersNegativeTestJSON-421907278 tempest-ServersNegativeTestJSON-421907278-project-member] [instance: 7a82e877-8a39-4684-8b75-711b7bedddac] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:5d:a0:f5', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '7ab8d568-adb0-4f3b-b6cc-68413e6546ae', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '2cbbae14-e717-4433-83e2-4d64d89ccdb5', 'vif_model': 'vmxnet3'}] {{(pid=67899) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1301.399571] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [None req-e1b9224c-7aca-4d45-b950-f391f0b228c2 tempest-ServersNegativeTestJSON-421907278 tempest-ServersNegativeTestJSON-421907278-project-member] Creating folder: Project (320b366df7f94569bb729c10982ebd90). Parent ref: group-v692900. {{(pid=67899) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1301.400129] env[67899]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-1bd8f641-6776-4c51-8dcf-6165d823da0f {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1301.411424] env[67899]: INFO nova.virt.vmwareapi.vm_util [None req-e1b9224c-7aca-4d45-b950-f391f0b228c2 tempest-ServersNegativeTestJSON-421907278 tempest-ServersNegativeTestJSON-421907278-project-member] Created folder: Project (320b366df7f94569bb729c10982ebd90) in parent group-v692900. [ 1301.411674] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [None req-e1b9224c-7aca-4d45-b950-f391f0b228c2 tempest-ServersNegativeTestJSON-421907278 tempest-ServersNegativeTestJSON-421907278-project-member] Creating folder: Instances. Parent ref: group-v692976. {{(pid=67899) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1301.411834] env[67899]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-bc65acff-3832-4a74-b6e1-4718056d8a48 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1301.420364] env[67899]: INFO nova.virt.vmwareapi.vm_util [None req-e1b9224c-7aca-4d45-b950-f391f0b228c2 tempest-ServersNegativeTestJSON-421907278 tempest-ServersNegativeTestJSON-421907278-project-member] Created folder: Instances in parent group-v692976. [ 1301.420548] env[67899]: DEBUG oslo.service.loopingcall [None req-e1b9224c-7aca-4d45-b950-f391f0b228c2 tempest-ServersNegativeTestJSON-421907278 tempest-ServersNegativeTestJSON-421907278-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=67899) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 1301.420729] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 7a82e877-8a39-4684-8b75-711b7bedddac] Creating VM on the ESX host {{(pid=67899) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}}
[ 1301.420918] env[67899]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-b516ed77-85e8-4725-96a9-93a4eae91e1c {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1301.440952] env[67899]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){
[ 1301.440952] env[67899]: value = "task-3467945"
[ 1301.440952] env[67899]: _type = "Task"
[ 1301.440952] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1301.448134] env[67899]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467945, 'name': CreateVM_Task} progress is 0%. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1301.950215] env[67899]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467945, 'name': CreateVM_Task, 'duration_secs': 0.286878} completed successfully. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 1301.950398] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 7a82e877-8a39-4684-8b75-711b7bedddac] Created VM on the ESX host {{(pid=67899) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}}
[ 1301.951066] env[67899]: DEBUG oslo_concurrency.lockutils [None req-e1b9224c-7aca-4d45-b950-f391f0b228c2 tempest-ServersNegativeTestJSON-421907278 tempest-ServersNegativeTestJSON-421907278-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1301.951236] env[67899]: DEBUG oslo_concurrency.lockutils [None req-e1b9224c-7aca-4d45-b950-f391f0b228c2 tempest-ServersNegativeTestJSON-421907278 tempest-ServersNegativeTestJSON-421907278-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1301.951543] env[67899]: DEBUG oslo_concurrency.lockutils [None req-e1b9224c-7aca-4d45-b950-f391f0b228c2 tempest-ServersNegativeTestJSON-421907278 tempest-ServersNegativeTestJSON-421907278-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}}
[ 1301.951794] env[67899]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-70963d51-74f7-4518-9f44-2b90f5fada09 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1301.955974] env[67899]: DEBUG oslo_vmware.api [None req-e1b9224c-7aca-4d45-b950-f391f0b228c2 tempest-ServersNegativeTestJSON-421907278 tempest-ServersNegativeTestJSON-421907278-project-member] Waiting for the task: (returnval){
[ 1301.955974] env[67899]: value = "session[5298859c-589e-10f3-4f70-bf47a7ea371b]5220deb9-3736-ac0d-6b07-10fb9e9d3131"
[ 1301.955974] env[67899]: _type = "Task"
[ 1301.955974] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1301.963123] env[67899]: DEBUG oslo_vmware.api [None req-e1b9224c-7aca-4d45-b950-f391f0b228c2 tempest-ServersNegativeTestJSON-421907278 tempest-ServersNegativeTestJSON-421907278-project-member] Task: {'id': session[5298859c-589e-10f3-4f70-bf47a7ea371b]5220deb9-3736-ac0d-6b07-10fb9e9d3131, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1302.466464] env[67899]: DEBUG oslo_concurrency.lockutils [None req-e1b9224c-7aca-4d45-b950-f391f0b228c2 tempest-ServersNegativeTestJSON-421907278 tempest-ServersNegativeTestJSON-421907278-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1302.466733] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-e1b9224c-7aca-4d45-b950-f391f0b228c2 tempest-ServersNegativeTestJSON-421907278 tempest-ServersNegativeTestJSON-421907278-project-member] [instance: 7a82e877-8a39-4684-8b75-711b7bedddac] Processing image c655a05a-4a40-4b3f-b609-3ba8116ad90f {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}}
[ 1302.466918] env[67899]: DEBUG oslo_concurrency.lockutils [None req-e1b9224c-7aca-4d45-b950-f391f0b228c2 tempest-ServersNegativeTestJSON-421907278 tempest-ServersNegativeTestJSON-421907278-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1303.066501] env[67899]: DEBUG nova.compute.manager [req-2a324b23-f76f-4a13-8b88-1ba224fd887c req-21a3829b-a9b8-46ed-beb8-f3f32bc2112b service nova] [instance: 7a82e877-8a39-4684-8b75-711b7bedddac] Received event network-changed-2cbbae14-e717-4433-83e2-4d64d89ccdb5 {{(pid=67899) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}}
[ 1303.066711] env[67899]: DEBUG nova.compute.manager [req-2a324b23-f76f-4a13-8b88-1ba224fd887c req-21a3829b-a9b8-46ed-beb8-f3f32bc2112b service nova] [instance: 7a82e877-8a39-4684-8b75-711b7bedddac] Refreshing instance network info cache due to event network-changed-2cbbae14-e717-4433-83e2-4d64d89ccdb5.
{{(pid=67899) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1303.066923] env[67899]: DEBUG oslo_concurrency.lockutils [req-2a324b23-f76f-4a13-8b88-1ba224fd887c req-21a3829b-a9b8-46ed-beb8-f3f32bc2112b service nova] Acquiring lock "refresh_cache-7a82e877-8a39-4684-8b75-711b7bedddac" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1303.067077] env[67899]: DEBUG oslo_concurrency.lockutils [req-2a324b23-f76f-4a13-8b88-1ba224fd887c req-21a3829b-a9b8-46ed-beb8-f3f32bc2112b service nova] Acquired lock "refresh_cache-7a82e877-8a39-4684-8b75-711b7bedddac" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1303.067446] env[67899]: DEBUG nova.network.neutron [req-2a324b23-f76f-4a13-8b88-1ba224fd887c req-21a3829b-a9b8-46ed-beb8-f3f32bc2112b service nova] [instance: 7a82e877-8a39-4684-8b75-711b7bedddac] Refreshing network info cache for port 2cbbae14-e717-4433-83e2-4d64d89ccdb5 {{(pid=67899) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1303.382779] env[67899]: DEBUG nova.network.neutron [req-2a324b23-f76f-4a13-8b88-1ba224fd887c req-21a3829b-a9b8-46ed-beb8-f3f32bc2112b service nova] [instance: 7a82e877-8a39-4684-8b75-711b7bedddac] Updated VIF entry in instance network info cache for port 2cbbae14-e717-4433-83e2-4d64d89ccdb5. {{(pid=67899) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1303.383176] env[67899]: DEBUG nova.network.neutron [req-2a324b23-f76f-4a13-8b88-1ba224fd887c req-21a3829b-a9b8-46ed-beb8-f3f32bc2112b service nova] [instance: 7a82e877-8a39-4684-8b75-711b7bedddac] Updating instance_info_cache with network_info: [{"id": "2cbbae14-e717-4433-83e2-4d64d89ccdb5", "address": "fa:16:3e:5d:a0:f5", "network": {"id": "759168f7-c8c8-49ca-bdbd-a3883250a6e2", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-2046370020-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "320b366df7f94569bb729c10982ebd90", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "7ab8d568-adb0-4f3b-b6cc-68413e6546ae", "external-id": "nsx-vlan-transportzone-86", "segmentation_id": 86, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap2cbbae14-e7", "ovs_interfaceid": "2cbbae14-e717-4433-83e2-4d64d89ccdb5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1303.393375] env[67899]: DEBUG oslo_concurrency.lockutils [req-2a324b23-f76f-4a13-8b88-1ba224fd887c req-21a3829b-a9b8-46ed-beb8-f3f32bc2112b service nova] Releasing lock "refresh_cache-7a82e877-8a39-4684-8b75-711b7bedddac" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1340.619062] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67899) 
run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1341.996500] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1341.996777] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Starting heal instance info cache {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1341.996834] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Rebuilding the list of instances to heal {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1342.019735] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 8d2a9e20-82d3-44cf-a725-59804debe1cc] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1342.019892] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1342.020031] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 37ab08db-50ab-4c30-9e18-05007c5d1c27] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1342.020162] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1342.020285] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1342.020419] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 6c4977f7-c53d-4c96-9028-86d7561f0d0d] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1342.020538] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: ec826735-4cc4-4847-8750-c5480e62134a] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1342.020654] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: c7ad553b-2149-4211-aee3-057ea83069f5] Skipping network cache update for instance because it is Building. 
{{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1342.020771] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 6fda2654-4579-4b9a-a97c-97e0128fff14] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1342.020885] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 7a82e877-8a39-4684-8b75-711b7bedddac] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1342.021009] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Didn't find any instances for network info cache update. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1342.996008] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1342.996314] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1343.997932] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1343.998300] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1345.996048] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1345.996048] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=67899) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}}
[ 1346.347079] env[67899]: WARNING oslo_vmware.rw_handles [None req-7b088029-0146-4c76-a894-7f015ede2f7e tempest-ServersTestFqdnHostnames-61628670 tempest-ServersTestFqdnHostnames-61628670-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response
[ 1346.347079] env[67899]: ERROR oslo_vmware.rw_handles Traceback (most recent call last):
[ 1346.347079] env[67899]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close
[ 1346.347079] env[67899]: ERROR oslo_vmware.rw_handles self._conn.getresponse()
[ 1346.347079] env[67899]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse
[ 1346.347079] env[67899]: ERROR oslo_vmware.rw_handles response.begin()
[ 1346.347079] env[67899]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin
[ 1346.347079] env[67899]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status()
[ 1346.347079] env[67899]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status
[ 1346.347079] env[67899]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without"
[ 1346.347079] env[67899]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response
[ 1346.347079] env[67899]: ERROR oslo_vmware.rw_handles
[ 1346.347510] env[67899]: DEBUG nova.virt.vmwareapi.images [None req-7b088029-0146-4c76-a894-7f015ede2f7e tempest-ServersTestFqdnHostnames-61628670 tempest-ServersTestFqdnHostnames-61628670-project-member] [instance: 8d2a9e20-82d3-44cf-a725-59804debe1cc] Downloaded image file data c655a05a-4a40-4b3f-b609-3ba8116ad90f to vmware_temp/1dbb10db-09fc-4410-ace1-9cd4be0517d6/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk on the data store datastore1 {{(pid=67899) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}}
[ 1346.349970] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-7b088029-0146-4c76-a894-7f015ede2f7e tempest-ServersTestFqdnHostnames-61628670 tempest-ServersTestFqdnHostnames-61628670-project-member] [instance: 8d2a9e20-82d3-44cf-a725-59804debe1cc] Caching image {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}}
[ 1346.350289] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [None req-7b088029-0146-4c76-a894-7f015ede2f7e tempest-ServersTestFqdnHostnames-61628670 tempest-ServersTestFqdnHostnames-61628670-project-member] Copying Virtual Disk [datastore1] vmware_temp/1dbb10db-09fc-4410-ace1-9cd4be0517d6/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk to [datastore1] vmware_temp/1dbb10db-09fc-4410-ace1-9cd4be0517d6/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk {{(pid=67899) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}}
[ 1346.350605] env[67899]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-a14cb3f6-5d8c-495b-819b-b402d1a344f5 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1346.358764] env[67899]: DEBUG oslo_vmware.api [None req-7b088029-0146-4c76-a894-7f015ede2f7e tempest-ServersTestFqdnHostnames-61628670 tempest-ServersTestFqdnHostnames-61628670-project-member] Waiting for the task: (returnval){
[ 1346.358764] env[67899]: value = "task-3467946"
[ 1346.358764] env[67899]: _type = "Task"
[ 1346.358764] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1346.366289] env[67899]: DEBUG oslo_vmware.api [None req-7b088029-0146-4c76-a894-7f015ede2f7e tempest-ServersTestFqdnHostnames-61628670 tempest-ServersTestFqdnHostnames-61628670-project-member] Task: {'id': task-3467946, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1346.869318] env[67899]: DEBUG oslo_vmware.exceptions [None req-7b088029-0146-4c76-a894-7f015ede2f7e tempest-ServersTestFqdnHostnames-61628670 tempest-ServersTestFqdnHostnames-61628670-project-member] Fault InvalidArgument not matched. {{(pid=67899) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}}
[ 1346.870692] env[67899]: DEBUG oslo_concurrency.lockutils [None req-7b088029-0146-4c76-a894-7f015ede2f7e tempest-ServersTestFqdnHostnames-61628670 tempest-ServersTestFqdnHostnames-61628670-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1346.870692] env[67899]: ERROR nova.compute.manager [None req-7b088029-0146-4c76-a894-7f015ede2f7e tempest-ServersTestFqdnHostnames-61628670 tempest-ServersTestFqdnHostnames-61628670-project-member] [instance: 8d2a9e20-82d3-44cf-a725-59804debe1cc] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1346.870692] env[67899]: Faults: ['InvalidArgument']
[ 1346.870692] env[67899]: ERROR nova.compute.manager [instance: 8d2a9e20-82d3-44cf-a725-59804debe1cc] Traceback (most recent call last):
[ 1346.870692] env[67899]: ERROR nova.compute.manager [instance: 8d2a9e20-82d3-44cf-a725-59804debe1cc] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources
[ 1346.870692] env[67899]: ERROR nova.compute.manager [instance: 8d2a9e20-82d3-44cf-a725-59804debe1cc] yield resources
[ 1346.870692] env[67899]: ERROR nova.compute.manager [instance: 8d2a9e20-82d3-44cf-a725-59804debe1cc] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 1346.870692] env[67899]: ERROR nova.compute.manager [instance: 8d2a9e20-82d3-44cf-a725-59804debe1cc] self.driver.spawn(context, instance, image_meta,
[ 1346.870692] env[67899]: ERROR nova.compute.manager [instance: 8d2a9e20-82d3-44cf-a725-59804debe1cc] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 1346.870692] env[67899]: ERROR nova.compute.manager [instance: 8d2a9e20-82d3-44cf-a725-59804debe1cc] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 1346.870692] env[67899]: ERROR nova.compute.manager [instance: 8d2a9e20-82d3-44cf-a725-59804debe1cc] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 1346.870692] env[67899]: ERROR nova.compute.manager [instance: 8d2a9e20-82d3-44cf-a725-59804debe1cc] self._fetch_image_if_missing(context, vi)
[ 1346.870692] env[67899]: ERROR nova.compute.manager [instance: 8d2a9e20-82d3-44cf-a725-59804debe1cc] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 1346.870692] env[67899]: ERROR nova.compute.manager [instance: 8d2a9e20-82d3-44cf-a725-59804debe1cc] image_cache(vi, tmp_image_ds_loc)
[ 1346.870692] env[67899]: ERROR nova.compute.manager [instance: 8d2a9e20-82d3-44cf-a725-59804debe1cc] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 1346.870692] env[67899]: ERROR nova.compute.manager [instance: 8d2a9e20-82d3-44cf-a725-59804debe1cc] vm_util.copy_virtual_disk(
[ 1346.870692] env[67899]: ERROR nova.compute.manager [instance: 8d2a9e20-82d3-44cf-a725-59804debe1cc] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 1346.870692] env[67899]: ERROR nova.compute.manager [instance: 8d2a9e20-82d3-44cf-a725-59804debe1cc] session._wait_for_task(vmdk_copy_task)
[ 1346.870692] env[67899]: ERROR nova.compute.manager [instance: 8d2a9e20-82d3-44cf-a725-59804debe1cc] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 1346.870692] env[67899]: ERROR nova.compute.manager [instance: 8d2a9e20-82d3-44cf-a725-59804debe1cc] return self.wait_for_task(task_ref)
[ 1346.870692] env[67899]: ERROR nova.compute.manager [instance: 8d2a9e20-82d3-44cf-a725-59804debe1cc] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 1346.870692] env[67899]: ERROR nova.compute.manager [instance: 8d2a9e20-82d3-44cf-a725-59804debe1cc] return evt.wait()
[ 1346.870692] env[67899]: ERROR nova.compute.manager [instance: 8d2a9e20-82d3-44cf-a725-59804debe1cc] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 1346.870692] env[67899]: ERROR nova.compute.manager [instance: 8d2a9e20-82d3-44cf-a725-59804debe1cc] result = hub.switch()
[ 1346.870692] env[67899]: ERROR nova.compute.manager [instance: 8d2a9e20-82d3-44cf-a725-59804debe1cc] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 1346.870692] env[67899]: ERROR nova.compute.manager [instance: 8d2a9e20-82d3-44cf-a725-59804debe1cc] return self.greenlet.switch()
[ 1346.870692] env[67899]: ERROR nova.compute.manager [instance: 8d2a9e20-82d3-44cf-a725-59804debe1cc] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 1346.870692] env[67899]: ERROR nova.compute.manager [instance: 8d2a9e20-82d3-44cf-a725-59804debe1cc] self.f(*self.args, **self.kw)
[ 1346.870692] env[67899]: ERROR nova.compute.manager [instance: 8d2a9e20-82d3-44cf-a725-59804debe1cc] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 1346.870692] env[67899]: ERROR nova.compute.manager [instance: 8d2a9e20-82d3-44cf-a725-59804debe1cc] raise exceptions.translate_fault(task_info.error)
[ 1346.870692] env[67899]: ERROR nova.compute.manager [instance: 8d2a9e20-82d3-44cf-a725-59804debe1cc] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1346.870692] env[67899]: ERROR nova.compute.manager [instance: 8d2a9e20-82d3-44cf-a725-59804debe1cc] Faults: ['InvalidArgument']
[ 1346.870692] env[67899]: ERROR nova.compute.manager [instance: 8d2a9e20-82d3-44cf-a725-59804debe1cc]
[ 1346.870692] env[67899]: INFO nova.compute.manager [None req-7b088029-0146-4c76-a894-7f015ede2f7e tempest-ServersTestFqdnHostnames-61628670 tempest-ServersTestFqdnHostnames-61628670-project-member] [instance: 8d2a9e20-82d3-44cf-a725-59804debe1cc] Terminating instance
[ 1346.872196] env[67899]: DEBUG oslo_concurrency.lockutils [None req-e7f4b054-51ad-4989-bc09-ccbf6811411e
tempest-ServerShowV257Test-1183244838 tempest-ServerShowV257Test-1183244838-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1346.872397] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-e7f4b054-51ad-4989-bc09-ccbf6811411e tempest-ServerShowV257Test-1183244838 tempest-ServerShowV257Test-1183244838-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1346.873015] env[67899]: DEBUG nova.compute.manager [None req-7b088029-0146-4c76-a894-7f015ede2f7e tempest-ServersTestFqdnHostnames-61628670 tempest-ServersTestFqdnHostnames-61628670-project-member] [instance: 8d2a9e20-82d3-44cf-a725-59804debe1cc] Start destroying the instance on the hypervisor. {{(pid=67899) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1346.873313] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-7b088029-0146-4c76-a894-7f015ede2f7e tempest-ServersTestFqdnHostnames-61628670 tempest-ServersTestFqdnHostnames-61628670-project-member] [instance: 8d2a9e20-82d3-44cf-a725-59804debe1cc] Destroying instance {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1346.873437] env[67899]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-8da779d9-8667-479b-be1e-c8f777021806 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1346.875592] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cba56059-eb32-4381-92cf-350f6c33d173 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1346.883376] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-7b088029-0146-4c76-a894-7f015ede2f7e tempest-ServersTestFqdnHostnames-61628670 tempest-ServersTestFqdnHostnames-61628670-project-member] [instance: 8d2a9e20-82d3-44cf-a725-59804debe1cc] Unregistering the VM {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1346.883582] env[67899]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-2441d8dc-3498-445a-b19e-f4ab9d3a24c7 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1346.885746] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-e7f4b054-51ad-4989-bc09-ccbf6811411e tempest-ServerShowV257Test-1183244838 tempest-ServerShowV257Test-1183244838-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1346.885926] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-e7f4b054-51ad-4989-bc09-ccbf6811411e tempest-ServerShowV257Test-1183244838 tempest-ServerShowV257Test-1183244838-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=67899) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1346.886840] env[67899]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-4f22dfd3-f6af-499a-8dc4-6038fd15a07c {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1346.891769] env[67899]: DEBUG oslo_vmware.api [None req-e7f4b054-51ad-4989-bc09-ccbf6811411e tempest-ServerShowV257Test-1183244838 tempest-ServerShowV257Test-1183244838-project-member] Waiting for the task: (returnval){ [ 1346.891769] env[67899]: value = "session[5298859c-589e-10f3-4f70-bf47a7ea371b]52a0d065-439c-62e5-80be-9410fbdf560f" [ 1346.891769] env[67899]: _type = "Task" [ 1346.891769] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1346.905894] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-e7f4b054-51ad-4989-bc09-ccbf6811411e tempest-ServerShowV257Test-1183244838 tempest-ServerShowV257Test-1183244838-project-member] [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] Preparing fetch location {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1346.906131] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-e7f4b054-51ad-4989-bc09-ccbf6811411e tempest-ServerShowV257Test-1183244838 tempest-ServerShowV257Test-1183244838-project-member] Creating directory with path [datastore1] vmware_temp/13f0e2ab-1eda-4a9e-9f24-748bb61a6111/c655a05a-4a40-4b3f-b609-3ba8116ad90f {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1346.906338] env[67899]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-472c58af-c00a-47c4-8b6c-acb2d46f0533 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1346.916606] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-e7f4b054-51ad-4989-bc09-ccbf6811411e tempest-ServerShowV257Test-1183244838 tempest-ServerShowV257Test-1183244838-project-member] Created directory with path [datastore1] vmware_temp/13f0e2ab-1eda-4a9e-9f24-748bb61a6111/c655a05a-4a40-4b3f-b609-3ba8116ad90f {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1346.916791] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-e7f4b054-51ad-4989-bc09-ccbf6811411e tempest-ServerShowV257Test-1183244838 tempest-ServerShowV257Test-1183244838-project-member] [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] Fetch image to [datastore1] vmware_temp/13f0e2ab-1eda-4a9e-9f24-748bb61a6111/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1346.916964] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-e7f4b054-51ad-4989-bc09-ccbf6811411e tempest-ServerShowV257Test-1183244838 tempest-ServerShowV257Test-1183244838-project-member] [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] Downloading image file data c655a05a-4a40-4b3f-b609-3ba8116ad90f to [datastore1] vmware_temp/13f0e2ab-1eda-4a9e-9f24-748bb61a6111/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk on the data store datastore1 {{(pid=67899) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1346.917715] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-01eaa2d7-a175-4579-9be6-45b5cf6407df {{(pid=67899) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1346.924238] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e4b46dae-16df-46fb-a386-c61d556f6ba9 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1346.933126] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e7d159a4-df28-4a1e-a9dd-1a16c20c2bf7 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1346.964919] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-30192811-5452-4f9d-9136-3d9864b382ff {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1346.967719] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-7b088029-0146-4c76-a894-7f015ede2f7e tempest-ServersTestFqdnHostnames-61628670 tempest-ServersTestFqdnHostnames-61628670-project-member] [instance: 8d2a9e20-82d3-44cf-a725-59804debe1cc] Unregistered the VM {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1346.967925] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-7b088029-0146-4c76-a894-7f015ede2f7e tempest-ServersTestFqdnHostnames-61628670 tempest-ServersTestFqdnHostnames-61628670-project-member] [instance: 8d2a9e20-82d3-44cf-a725-59804debe1cc] Deleting contents of the VM from datastore datastore1 {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1346.968115] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-7b088029-0146-4c76-a894-7f015ede2f7e tempest-ServersTestFqdnHostnames-61628670 tempest-ServersTestFqdnHostnames-61628670-project-member] Deleting the datastore file [datastore1] 8d2a9e20-82d3-44cf-a725-59804debe1cc {{(pid=67899) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1346.968353] env[67899]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-f8cff425-2e27-4f11-af81-6158efd5e24b {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1346.973832] env[67899]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-06f895d8-e266-4487-aa63-18a8b27d2282 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1346.975495] env[67899]: DEBUG oslo_vmware.api [None req-7b088029-0146-4c76-a894-7f015ede2f7e tempest-ServersTestFqdnHostnames-61628670 tempest-ServersTestFqdnHostnames-61628670-project-member] Waiting for the task: (returnval){ [ 1346.975495] env[67899]: value = "task-3467948" [ 1346.975495] env[67899]: _type = "Task" [ 1346.975495] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1346.983144] env[67899]: DEBUG oslo_vmware.api [None req-7b088029-0146-4c76-a894-7f015ede2f7e tempest-ServersTestFqdnHostnames-61628670 tempest-ServersTestFqdnHostnames-61628670-project-member] Task: {'id': task-3467948, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1346.996699] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1347.001823] env[67899]: DEBUG nova.virt.vmwareapi.images [None req-e7f4b054-51ad-4989-bc09-ccbf6811411e tempest-ServerShowV257Test-1183244838 tempest-ServerShowV257Test-1183244838-project-member] [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] Downloading image file data c655a05a-4a40-4b3f-b609-3ba8116ad90f to the data store datastore1 {{(pid=67899) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1347.160521] env[67899]: DEBUG oslo_concurrency.lockutils [None req-e7f4b054-51ad-4989-bc09-ccbf6811411e tempest-ServerShowV257Test-1183244838 tempest-ServerShowV257Test-1183244838-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1347.162134] env[67899]: ERROR nova.compute.manager [None req-e7f4b054-51ad-4989-bc09-ccbf6811411e tempest-ServerShowV257Test-1183244838 tempest-ServerShowV257Test-1183244838-project-member] [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image c655a05a-4a40-4b3f-b609-3ba8116ad90f. [ 1347.162134] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] Traceback (most recent call last): [ 1347.162134] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1347.162134] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1347.162134] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1347.162134] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] result = getattr(controller, method)(*args, **kwargs) [ 1347.162134] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get [ 1347.162134] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] return self._get(image_id) [ 1347.162134] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner [ 1347.162134] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1347.162134] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get [ 1347.162134] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] resp, body = self.http_client.get(url, headers=header) [ 1347.162134] env[67899]: ERROR nova.compute.manager [instance: 
bb97988e-9f7f-4e4f-9904-fc560d0912ee] File "/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 395, in get [ 1347.162134] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] return self.request(url, 'GET', **kwargs) [ 1347.162134] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request [ 1347.162134] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] return self._handle_response(resp) [ 1347.162134] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1347.162134] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] raise exc.from_response(resp, resp.content) [ 1347.162134] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. [ 1347.162134] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] [ 1347.162134] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] During handling of the above exception, another exception occurred: [ 1347.162134] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] [ 1347.162134] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] Traceback (most recent call last): [ 1347.162134] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1347.162134] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] yield resources [ 1347.162134] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1347.162134] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] self.driver.spawn(context, instance, image_meta, [ 1347.162134] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1347.162134] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1347.162134] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1347.162134] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] self._fetch_image_if_missing(context, vi) [ 1347.162134] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1347.162134] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] image_fetch(context, vi, tmp_image_ds_loc) [ 
1347.162134] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1347.162134] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] images.fetch_image( [ 1347.162134] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1347.162134] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] metadata = IMAGE_API.get(context, image_ref) [ 1347.163219] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1347.163219] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] return session.show(context, image_id, [ 1347.163219] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1347.163219] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] _reraise_translated_image_exception(image_id) [ 1347.163219] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1347.163219] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] raise new_exc.with_traceback(exc_trace) [ 1347.163219] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1347.163219] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1347.163219] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1347.163219] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] result = getattr(controller, method)(*args, **kwargs) [ 1347.163219] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get [ 1347.163219] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] return self._get(image_id) [ 1347.163219] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner [ 1347.163219] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1347.163219] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get [ 1347.163219] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] resp, body = self.http_client.get(url, headers=header) [ 1347.163219] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] File "/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 395, in get [ 1347.163219] env[67899]: ERROR 
nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] return self.request(url, 'GET', **kwargs) [ 1347.163219] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request [ 1347.163219] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] return self._handle_response(resp) [ 1347.163219] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1347.163219] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] raise exc.from_response(resp, resp.content) [ 1347.163219] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] nova.exception.ImageNotAuthorized: Not authorized for image c655a05a-4a40-4b3f-b609-3ba8116ad90f. [ 1347.163219] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] [ 1347.163219] env[67899]: INFO nova.compute.manager [None req-e7f4b054-51ad-4989-bc09-ccbf6811411e tempest-ServerShowV257Test-1183244838 tempest-ServerShowV257Test-1183244838-project-member] [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] Terminating instance [ 1347.164624] env[67899]: DEBUG oslo_concurrency.lockutils [None req-9f2dbd61-5a36-44ea-b13c-793c018e3d97 tempest-ServerMetadataNegativeTestJSON-1266469450 tempest-ServerMetadataNegativeTestJSON-1266469450-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1347.164624] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-9f2dbd61-5a36-44ea-b13c-793c018e3d97 tempest-ServerMetadataNegativeTestJSON-1266469450 tempest-ServerMetadataNegativeTestJSON-1266469450-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1347.164835] env[67899]: DEBUG oslo_concurrency.lockutils [None req-e7f4b054-51ad-4989-bc09-ccbf6811411e tempest-ServerShowV257Test-1183244838 tempest-ServerShowV257Test-1183244838-project-member] Acquiring lock "refresh_cache-bb97988e-9f7f-4e4f-9904-fc560d0912ee" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1347.165232] env[67899]: DEBUG oslo_concurrency.lockutils [None req-e7f4b054-51ad-4989-bc09-ccbf6811411e tempest-ServerShowV257Test-1183244838 tempest-ServerShowV257Test-1183244838-project-member] Acquired lock "refresh_cache-bb97988e-9f7f-4e4f-9904-fc560d0912ee" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1347.165232] env[67899]: DEBUG nova.network.neutron [None req-e7f4b054-51ad-4989-bc09-ccbf6811411e tempest-ServerShowV257Test-1183244838 tempest-ServerShowV257Test-1183244838-project-member] [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] Building network info cache for instance {{(pid=67899) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1347.166203] env[67899]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-777d5f02-3244-4140-8d26-ecda4880cf8f {{(pid=67899) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1347.176896] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-9f2dbd61-5a36-44ea-b13c-793c018e3d97 tempest-ServerMetadataNegativeTestJSON-1266469450 tempest-ServerMetadataNegativeTestJSON-1266469450-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1347.177250] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-9f2dbd61-5a36-44ea-b13c-793c018e3d97 tempest-ServerMetadataNegativeTestJSON-1266469450 tempest-ServerMetadataNegativeTestJSON-1266469450-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67899) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1347.178090] env[67899]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-1853f372-e718-4fba-99ab-3a2d216aa2c7 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1347.183242] env[67899]: DEBUG oslo_vmware.api [None req-9f2dbd61-5a36-44ea-b13c-793c018e3d97 tempest-ServerMetadataNegativeTestJSON-1266469450 tempest-ServerMetadataNegativeTestJSON-1266469450-project-member] Waiting for the task: (returnval){ [ 1347.183242] env[67899]: value = "session[5298859c-589e-10f3-4f70-bf47a7ea371b]52fdd4b6-6953-4d59-a481-55623e08c840" [ 1347.183242] env[67899]: _type = "Task" [ 1347.183242] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1347.191013] env[67899]: DEBUG oslo_vmware.api [None req-9f2dbd61-5a36-44ea-b13c-793c018e3d97 tempest-ServerMetadataNegativeTestJSON-1266469450 tempest-ServerMetadataNegativeTestJSON-1266469450-project-member] Task: {'id': session[5298859c-589e-10f3-4f70-bf47a7ea371b]52fdd4b6-6953-4d59-a481-55623e08c840, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1347.194757] env[67899]: DEBUG nova.network.neutron [None req-e7f4b054-51ad-4989-bc09-ccbf6811411e tempest-ServerShowV257Test-1183244838 tempest-ServerShowV257Test-1183244838-project-member] [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] Instance cache missing network info. {{(pid=67899) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1347.261654] env[67899]: DEBUG nova.network.neutron [None req-e7f4b054-51ad-4989-bc09-ccbf6811411e tempest-ServerShowV257Test-1183244838 tempest-ServerShowV257Test-1183244838-project-member] [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] Updating instance_info_cache with network_info: [] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1347.270582] env[67899]: DEBUG oslo_concurrency.lockutils [None req-e7f4b054-51ad-4989-bc09-ccbf6811411e tempest-ServerShowV257Test-1183244838 tempest-ServerShowV257Test-1183244838-project-member] Releasing lock "refresh_cache-bb97988e-9f7f-4e4f-9904-fc560d0912ee" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1347.270963] env[67899]: DEBUG nova.compute.manager [None req-e7f4b054-51ad-4989-bc09-ccbf6811411e tempest-ServerShowV257Test-1183244838 tempest-ServerShowV257Test-1183244838-project-member] [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] Start destroying the instance on the hypervisor. 
{{(pid=67899) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1347.271169] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-e7f4b054-51ad-4989-bc09-ccbf6811411e tempest-ServerShowV257Test-1183244838 tempest-ServerShowV257Test-1183244838-project-member] [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] Destroying instance {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1347.272255] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e750a8cc-8f6c-4c2f-9732-2a80a6c14056 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1347.281702] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-e7f4b054-51ad-4989-bc09-ccbf6811411e tempest-ServerShowV257Test-1183244838 tempest-ServerShowV257Test-1183244838-project-member] [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] Unregistering the VM {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1347.281929] env[67899]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-51dff6df-fed7-4b90-be6f-e844149b950a {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1347.305368] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-e7f4b054-51ad-4989-bc09-ccbf6811411e tempest-ServerShowV257Test-1183244838 tempest-ServerShowV257Test-1183244838-project-member] [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] Unregistered the VM {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1347.305582] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-e7f4b054-51ad-4989-bc09-ccbf6811411e tempest-ServerShowV257Test-1183244838 tempest-ServerShowV257Test-1183244838-project-member] [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] Deleting contents of the VM from datastore datastore1 {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1347.305756] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-e7f4b054-51ad-4989-bc09-ccbf6811411e tempest-ServerShowV257Test-1183244838 tempest-ServerShowV257Test-1183244838-project-member] Deleting the datastore file [datastore1] bb97988e-9f7f-4e4f-9904-fc560d0912ee {{(pid=67899) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1347.306280] env[67899]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-20205746-7d65-457f-91ff-0a1ffb821760 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1347.312652] env[67899]: DEBUG oslo_vmware.api [None req-e7f4b054-51ad-4989-bc09-ccbf6811411e tempest-ServerShowV257Test-1183244838 tempest-ServerShowV257Test-1183244838-project-member] Waiting for the task: (returnval){ [ 1347.312652] env[67899]: value = "task-3467950" [ 1347.312652] env[67899]: _type = "Task" [ 1347.312652] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1347.320184] env[67899]: DEBUG oslo_vmware.api [None req-e7f4b054-51ad-4989-bc09-ccbf6811411e tempest-ServerShowV257Test-1183244838 tempest-ServerShowV257Test-1183244838-project-member] Task: {'id': task-3467950, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1347.485327] env[67899]: DEBUG oslo_vmware.api [None req-7b088029-0146-4c76-a894-7f015ede2f7e tempest-ServersTestFqdnHostnames-61628670 tempest-ServersTestFqdnHostnames-61628670-project-member] Task: {'id': task-3467948, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.068035} completed successfully. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1347.485621] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-7b088029-0146-4c76-a894-7f015ede2f7e tempest-ServersTestFqdnHostnames-61628670 tempest-ServersTestFqdnHostnames-61628670-project-member] Deleted the datastore file {{(pid=67899) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1347.485869] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-7b088029-0146-4c76-a894-7f015ede2f7e tempest-ServersTestFqdnHostnames-61628670 tempest-ServersTestFqdnHostnames-61628670-project-member] [instance: 8d2a9e20-82d3-44cf-a725-59804debe1cc] Deleted contents of the VM from datastore datastore1 {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1347.486130] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-7b088029-0146-4c76-a894-7f015ede2f7e tempest-ServersTestFqdnHostnames-61628670 tempest-ServersTestFqdnHostnames-61628670-project-member] [instance: 8d2a9e20-82d3-44cf-a725-59804debe1cc] Instance destroyed {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1347.486323] env[67899]: INFO nova.compute.manager [None req-7b088029-0146-4c76-a894-7f015ede2f7e tempest-ServersTestFqdnHostnames-61628670 tempest-ServersTestFqdnHostnames-61628670-project-member] [instance: 8d2a9e20-82d3-44cf-a725-59804debe1cc] Took 0.61 seconds to destroy the instance on the hypervisor. 
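The task records above show the polling pattern oslo.vmware applies to every long-running vCenter operation in this log (CopyVirtualDisk_Task, SearchDatastore_Task, DeleteDatastoreFile_Task): the SOAP invocation returns a Task reference immediately, and wait_for_task (oslo_vmware/api.py:397, per the records) polls the task's TaskInfo until its state reaches 'success' or 'error', translating an error fault such as InvalidArgument into a VimFaultException. Below is a minimal sketch of that loop; the session object and its get_task_info helper are illustrative assumptions, not oslo.vmware's actual internals, which drive _poll_task through a looping call rather than a sleep.

    import time

    class TaskFailed(Exception):
        """Illustrative stand-in for oslo_vmware.exceptions.VimFaultException."""

    def wait_for_task(session, task_ref, interval=0.5):
        """Poll a vSphere Task reference until it finishes.

        session.get_task_info(task_ref) is a hypothetical helper that reads
        the task's TaskInfo property, much as the PropertyCollector calls in
        the surrounding records do.
        """
        while True:
            info = session.get_task_info(task_ref)
            if info.state in ('queued', 'running'):
                # Corresponds to the "Task: {...} progress is 0%" DEBUG lines.
                time.sleep(interval)
                continue
            if info.state == 'success':
                return info.result
            # info.state == 'error': surface the fault, e.g. 'InvalidArgument'.
            raise TaskFailed(info.error.localizedMessage)

A task ending in state 'error' raises on the first poll that observes it, which is the path the CopyVirtualDisk_Task above took to become the "A specified parameter was not correct: fileType" VimFaultException, while task-3467948 and task-3467950 ran to 'success' and reported their duration_secs on completion.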
[ 1347.488506] env[67899]: DEBUG nova.compute.claims [None req-7b088029-0146-4c76-a894-7f015ede2f7e tempest-ServersTestFqdnHostnames-61628670 tempest-ServersTestFqdnHostnames-61628670-project-member] [instance: 8d2a9e20-82d3-44cf-a725-59804debe1cc] Aborting claim: {{(pid=67899) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1347.488692] env[67899]: DEBUG oslo_concurrency.lockutils [None req-7b088029-0146-4c76-a894-7f015ede2f7e tempest-ServersTestFqdnHostnames-61628670 tempest-ServersTestFqdnHostnames-61628670-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1347.488913] env[67899]: DEBUG oslo_concurrency.lockutils [None req-7b088029-0146-4c76-a894-7f015ede2f7e tempest-ServersTestFqdnHostnames-61628670 tempest-ServersTestFqdnHostnames-61628670-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1347.695097] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-9f2dbd61-5a36-44ea-b13c-793c018e3d97 tempest-ServerMetadataNegativeTestJSON-1266469450 tempest-ServerMetadataNegativeTestJSON-1266469450-project-member] [instance: 37ab08db-50ab-4c30-9e18-05007c5d1c27] Preparing fetch location {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1347.695352] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-9f2dbd61-5a36-44ea-b13c-793c018e3d97 tempest-ServerMetadataNegativeTestJSON-1266469450 tempest-ServerMetadataNegativeTestJSON-1266469450-project-member] Creating directory with path [datastore1] vmware_temp/345c15f9-e2f1-437f-95b5-1cd96c9b4aab/c655a05a-4a40-4b3f-b609-3ba8116ad90f {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1347.695646] env[67899]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-108fb4c7-2fbd-4fa0-9326-45c250db9e01 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1347.707293] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-9f2dbd61-5a36-44ea-b13c-793c018e3d97 tempest-ServerMetadataNegativeTestJSON-1266469450 tempest-ServerMetadataNegativeTestJSON-1266469450-project-member] Created directory with path [datastore1] vmware_temp/345c15f9-e2f1-437f-95b5-1cd96c9b4aab/c655a05a-4a40-4b3f-b609-3ba8116ad90f {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1347.707496] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-9f2dbd61-5a36-44ea-b13c-793c018e3d97 tempest-ServerMetadataNegativeTestJSON-1266469450 tempest-ServerMetadataNegativeTestJSON-1266469450-project-member] [instance: 37ab08db-50ab-4c30-9e18-05007c5d1c27] Fetch image to [datastore1] vmware_temp/345c15f9-e2f1-437f-95b5-1cd96c9b4aab/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1347.707712] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-9f2dbd61-5a36-44ea-b13c-793c018e3d97 tempest-ServerMetadataNegativeTestJSON-1266469450 tempest-ServerMetadataNegativeTestJSON-1266469450-project-member] [instance: 37ab08db-50ab-4c30-9e18-05007c5d1c27] Downloading image file data c655a05a-4a40-4b3f-b609-3ba8116ad90f to [datastore1] 
vmware_temp/345c15f9-e2f1-437f-95b5-1cd96c9b4aab/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk on the data store datastore1 {{(pid=67899) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1347.708481] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-40ccd5e8-0c40-49fe-a5a2-54530c357ddf {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1347.718060] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9dc6f2f9-9732-4109-9ac5-fd9e555dbaa6 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1347.729354] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-06902dae-986a-4243-826f-76a9adfbf20c {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1347.762246] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dc6111d0-a7e0-4fc4-bf3b-0a5c0f61b782 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1347.768113] env[67899]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-91ee4c5f-7993-4e8f-9ffd-3a9bf943c521 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1347.786687] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6173f05f-2c04-49c5-8141-271c3069a2c4 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1347.792553] env[67899]: DEBUG nova.virt.vmwareapi.images [None req-9f2dbd61-5a36-44ea-b13c-793c018e3d97 tempest-ServerMetadataNegativeTestJSON-1266469450 tempest-ServerMetadataNegativeTestJSON-1266469450-project-member] [instance: 37ab08db-50ab-4c30-9e18-05007c5d1c27] Downloading image file data c655a05a-4a40-4b3f-b609-3ba8116ad90f to the data store datastore1 {{(pid=67899) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1347.798251] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-01050b34-c847-441e-8761-dee8bb227708 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1347.831812] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c1e41e57-411e-406a-91f6-a1fd246a54dd {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1347.838539] env[67899]: DEBUG oslo_vmware.api [None req-e7f4b054-51ad-4989-bc09-ccbf6811411e tempest-ServerShowV257Test-1183244838 tempest-ServerShowV257Test-1183244838-project-member] Task: {'id': task-3467950, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.030957} completed successfully. 
{{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1347.844053] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-e7f4b054-51ad-4989-bc09-ccbf6811411e tempest-ServerShowV257Test-1183244838 tempest-ServerShowV257Test-1183244838-project-member] Deleted the datastore file {{(pid=67899) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1347.844053] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-e7f4b054-51ad-4989-bc09-ccbf6811411e tempest-ServerShowV257Test-1183244838 tempest-ServerShowV257Test-1183244838-project-member] [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] Deleted contents of the VM from datastore datastore1 {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1347.844053] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-e7f4b054-51ad-4989-bc09-ccbf6811411e tempest-ServerShowV257Test-1183244838 tempest-ServerShowV257Test-1183244838-project-member] [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] Instance destroyed {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1347.844053] env[67899]: INFO nova.compute.manager [None req-e7f4b054-51ad-4989-bc09-ccbf6811411e tempest-ServerShowV257Test-1183244838 tempest-ServerShowV257Test-1183244838-project-member] [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] Took 0.57 seconds to destroy the instance on the hypervisor. [ 1347.844236] env[67899]: DEBUG oslo.service.loopingcall [None req-e7f4b054-51ad-4989-bc09-ccbf6811411e tempest-ServerShowV257Test-1183244838 tempest-ServerShowV257Test-1183244838-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=67899) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1347.844523] env[67899]: DEBUG nova.compute.manager [-] [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] Skipping network deallocation for instance since networking was not requested.
{{(pid=67899) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1347.845679] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-024f426b-5957-4d0e-94f5-3666bec81ecb {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1347.851149] env[67899]: DEBUG nova.compute.claims [None req-e7f4b054-51ad-4989-bc09-ccbf6811411e tempest-ServerShowV257Test-1183244838 tempest-ServerShowV257Test-1183244838-project-member] [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] Aborting claim: {{(pid=67899) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1347.851338] env[67899]: DEBUG oslo_concurrency.lockutils [None req-e7f4b054-51ad-4989-bc09-ccbf6811411e tempest-ServerShowV257Test-1183244838 tempest-ServerShowV257Test-1183244838-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1347.859944] env[67899]: DEBUG nova.compute.provider_tree [None req-7b088029-0146-4c76-a894-7f015ede2f7e tempest-ServersTestFqdnHostnames-61628670 tempest-ServersTestFqdnHostnames-61628670-project-member] Inventory has not changed in ProviderTree for provider: fffa0b42-f65d-4394-a98c-0df038b9ed4b {{(pid=67899) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1347.869064] env[67899]: DEBUG nova.scheduler.client.report [None req-7b088029-0146-4c76-a894-7f015ede2f7e tempest-ServersTestFqdnHostnames-61628670 tempest-ServersTestFqdnHostnames-61628670-project-member] Inventory has not changed for provider fffa0b42-f65d-4394-a98c-0df038b9ed4b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67899) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1347.876173] env[67899]: DEBUG oslo_vmware.rw_handles [None req-9f2dbd61-5a36-44ea-b13c-793c018e3d97 tempest-ServerMetadataNegativeTestJSON-1266469450 tempest-ServerMetadataNegativeTestJSON-1266469450-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/345c15f9-e2f1-437f-95b5-1cd96c9b4aab/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67899) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1347.932225] env[67899]: DEBUG oslo_concurrency.lockutils [None req-7b088029-0146-4c76-a894-7f015ede2f7e tempest-ServersTestFqdnHostnames-61628670 tempest-ServersTestFqdnHostnames-61628670-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.443s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1347.932782] env[67899]: ERROR nova.compute.manager [None req-7b088029-0146-4c76-a894-7f015ede2f7e tempest-ServersTestFqdnHostnames-61628670 tempest-ServersTestFqdnHostnames-61628670-project-member] [instance: 8d2a9e20-82d3-44cf-a725-59804debe1cc] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1347.932782] env[67899]: Faults: ['InvalidArgument'] [ 1347.932782] env[67899]: ERROR nova.compute.manager [instance: 8d2a9e20-82d3-44cf-a725-59804debe1cc] Traceback (most recent call last): [ 1347.932782] env[67899]: ERROR nova.compute.manager [instance: 8d2a9e20-82d3-44cf-a725-59804debe1cc] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1347.932782] env[67899]: ERROR nova.compute.manager [instance: 8d2a9e20-82d3-44cf-a725-59804debe1cc] self.driver.spawn(context, instance, image_meta, [ 1347.932782] env[67899]: ERROR nova.compute.manager [instance: 8d2a9e20-82d3-44cf-a725-59804debe1cc] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1347.932782] env[67899]: ERROR nova.compute.manager [instance: 8d2a9e20-82d3-44cf-a725-59804debe1cc] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1347.932782] env[67899]: ERROR nova.compute.manager [instance: 8d2a9e20-82d3-44cf-a725-59804debe1cc] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1347.932782] env[67899]: ERROR nova.compute.manager [instance: 8d2a9e20-82d3-44cf-a725-59804debe1cc] self._fetch_image_if_missing(context, vi) [ 1347.932782] env[67899]: ERROR nova.compute.manager [instance: 8d2a9e20-82d3-44cf-a725-59804debe1cc] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1347.932782] env[67899]: ERROR nova.compute.manager [instance: 8d2a9e20-82d3-44cf-a725-59804debe1cc] image_cache(vi, tmp_image_ds_loc) [ 1347.932782] env[67899]: ERROR nova.compute.manager [instance: 8d2a9e20-82d3-44cf-a725-59804debe1cc] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1347.932782] env[67899]: ERROR nova.compute.manager [instance: 8d2a9e20-82d3-44cf-a725-59804debe1cc] vm_util.copy_virtual_disk( [ 1347.932782] env[67899]: ERROR nova.compute.manager [instance: 8d2a9e20-82d3-44cf-a725-59804debe1cc] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1347.932782] env[67899]: ERROR nova.compute.manager [instance: 8d2a9e20-82d3-44cf-a725-59804debe1cc] session._wait_for_task(vmdk_copy_task) [ 1347.932782] env[67899]: ERROR nova.compute.manager [instance: 8d2a9e20-82d3-44cf-a725-59804debe1cc] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1347.932782] env[67899]: ERROR nova.compute.manager [instance: 8d2a9e20-82d3-44cf-a725-59804debe1cc] return self.wait_for_task(task_ref) [ 1347.932782] env[67899]: ERROR nova.compute.manager [instance: 8d2a9e20-82d3-44cf-a725-59804debe1cc] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1347.932782] env[67899]: ERROR nova.compute.manager [instance: 8d2a9e20-82d3-44cf-a725-59804debe1cc] return evt.wait() [ 1347.932782] env[67899]: ERROR nova.compute.manager [instance: 8d2a9e20-82d3-44cf-a725-59804debe1cc] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1347.932782] env[67899]: ERROR nova.compute.manager [instance: 8d2a9e20-82d3-44cf-a725-59804debe1cc] result = hub.switch() [ 1347.932782] env[67899]: ERROR nova.compute.manager [instance: 8d2a9e20-82d3-44cf-a725-59804debe1cc] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1347.932782] env[67899]: ERROR nova.compute.manager [instance: 8d2a9e20-82d3-44cf-a725-59804debe1cc] return self.greenlet.switch() [ 1347.932782] env[67899]: ERROR nova.compute.manager [instance: 8d2a9e20-82d3-44cf-a725-59804debe1cc] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1347.932782] env[67899]: ERROR nova.compute.manager [instance: 8d2a9e20-82d3-44cf-a725-59804debe1cc] self.f(*self.args, **self.kw) [ 1347.932782] env[67899]: ERROR nova.compute.manager [instance: 8d2a9e20-82d3-44cf-a725-59804debe1cc] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1347.932782] env[67899]: ERROR nova.compute.manager [instance: 8d2a9e20-82d3-44cf-a725-59804debe1cc] raise exceptions.translate_fault(task_info.error) [ 1347.932782] env[67899]: ERROR nova.compute.manager [instance: 8d2a9e20-82d3-44cf-a725-59804debe1cc] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1347.932782] env[67899]: ERROR nova.compute.manager [instance: 8d2a9e20-82d3-44cf-a725-59804debe1cc] Faults: ['InvalidArgument'] [ 1347.932782] env[67899]: ERROR nova.compute.manager [instance: 8d2a9e20-82d3-44cf-a725-59804debe1cc] [ 1347.933666] env[67899]: DEBUG nova.compute.utils [None req-7b088029-0146-4c76-a894-7f015ede2f7e tempest-ServersTestFqdnHostnames-61628670 tempest-ServersTestFqdnHostnames-61628670-project-member] [instance: 8d2a9e20-82d3-44cf-a725-59804debe1cc] VimFaultException {{(pid=67899) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1347.935670] env[67899]: DEBUG oslo_concurrency.lockutils [None req-e7f4b054-51ad-4989-bc09-ccbf6811411e tempest-ServerShowV257Test-1183244838 tempest-ServerShowV257Test-1183244838-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.084s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1347.938489] env[67899]: DEBUG oslo_vmware.rw_handles [None req-9f2dbd61-5a36-44ea-b13c-793c018e3d97 tempest-ServerMetadataNegativeTestJSON-1266469450 tempest-ServerMetadataNegativeTestJSON-1266469450-project-member] Completed reading data from the image iterator. 
{{(pid=67899) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1347.938661] env[67899]: DEBUG oslo_vmware.rw_handles [None req-9f2dbd61-5a36-44ea-b13c-793c018e3d97 tempest-ServerMetadataNegativeTestJSON-1266469450 tempest-ServerMetadataNegativeTestJSON-1266469450-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/345c15f9-e2f1-437f-95b5-1cd96c9b4aab/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67899) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1347.939144] env[67899]: DEBUG nova.compute.manager [None req-7b088029-0146-4c76-a894-7f015ede2f7e tempest-ServersTestFqdnHostnames-61628670 tempest-ServersTestFqdnHostnames-61628670-project-member] [instance: 8d2a9e20-82d3-44cf-a725-59804debe1cc] Build of instance 8d2a9e20-82d3-44cf-a725-59804debe1cc was re-scheduled: A specified parameter was not correct: fileType [ 1347.939144] env[67899]: Faults: ['InvalidArgument'] {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1347.939524] env[67899]: DEBUG nova.compute.manager [None req-7b088029-0146-4c76-a894-7f015ede2f7e tempest-ServersTestFqdnHostnames-61628670 tempest-ServersTestFqdnHostnames-61628670-project-member] [instance: 8d2a9e20-82d3-44cf-a725-59804debe1cc] Unplugging VIFs for instance {{(pid=67899) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1347.939690] env[67899]: DEBUG nova.compute.manager [None req-7b088029-0146-4c76-a894-7f015ede2f7e tempest-ServersTestFqdnHostnames-61628670 tempest-ServersTestFqdnHostnames-61628670-project-member] Virt driver does not provide unplug_vifs method, so it is not possible to determine if VIFs should be unplugged.
{{(pid=67899) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1347.939856] env[67899]: DEBUG nova.compute.manager [None req-7b088029-0146-4c76-a894-7f015ede2f7e tempest-ServersTestFqdnHostnames-61628670 tempest-ServersTestFqdnHostnames-61628670-project-member] [instance: 8d2a9e20-82d3-44cf-a725-59804debe1cc] Deallocating network for instance {{(pid=67899) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1347.940034] env[67899]: DEBUG nova.network.neutron [None req-7b088029-0146-4c76-a894-7f015ede2f7e tempest-ServersTestFqdnHostnames-61628670 tempest-ServersTestFqdnHostnames-61628670-project-member] [instance: 8d2a9e20-82d3-44cf-a725-59804debe1cc] deallocate_for_instance() {{(pid=67899) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1348.211824] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c1274c7e-417b-4849-a6a7-6579f6bfa18b {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1348.219854] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0531e9c8-7e21-4e5d-80cc-7fa535a90828 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1348.251505] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-68bccdd8-1f3b-43d8-be21-d2b763dfb7ed {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1348.260104] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-37bd3094-939c-47fe-a83b-3380fc7358b4 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1348.273520] env[67899]: DEBUG nova.compute.provider_tree [None req-e7f4b054-51ad-4989-bc09-ccbf6811411e tempest-ServerShowV257Test-1183244838 tempest-ServerShowV257Test-1183244838-project-member] Inventory has not changed in ProviderTree for provider: fffa0b42-f65d-4394-a98c-0df038b9ed4b {{(pid=67899) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1348.286867] env[67899]: DEBUG nova.scheduler.client.report [None req-e7f4b054-51ad-4989-bc09-ccbf6811411e tempest-ServerShowV257Test-1183244838 tempest-ServerShowV257Test-1183244838-project-member] Inventory has not changed for provider fffa0b42-f65d-4394-a98c-0df038b9ed4b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67899) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1348.289402] env[67899]: DEBUG nova.network.neutron [None req-7b088029-0146-4c76-a894-7f015ede2f7e tempest-ServersTestFqdnHostnames-61628670 tempest-ServersTestFqdnHostnames-61628670-project-member] [instance: 8d2a9e20-82d3-44cf-a725-59804debe1cc] Updating instance_info_cache with network_info: [] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1348.301155] env[67899]: DEBUG oslo_concurrency.lockutils [None req-e7f4b054-51ad-4989-bc09-ccbf6811411e 
tempest-ServerShowV257Test-1183244838 tempest-ServerShowV257Test-1183244838-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.364s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1348.301155] env[67899]: ERROR nova.compute.manager [None req-e7f4b054-51ad-4989-bc09-ccbf6811411e tempest-ServerShowV257Test-1183244838 tempest-ServerShowV257Test-1183244838-project-member] [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] Failed to build and run instance: nova.exception.ImageNotAuthorized: Not authorized for image c655a05a-4a40-4b3f-b609-3ba8116ad90f. [ 1348.301155] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] Traceback (most recent call last): [ 1348.301155] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1348.301155] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1348.301155] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1348.301155] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] result = getattr(controller, method)(*args, **kwargs) [ 1348.301155] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get [ 1348.301155] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] return self._get(image_id) [ 1348.301155] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner [ 1348.301155] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1348.301155] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get [ 1348.301155] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] resp, body = self.http_client.get(url, headers=header) [ 1348.301155] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] File "/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 395, in get [ 1348.301155] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] return self.request(url, 'GET', **kwargs) [ 1348.301155] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request [ 1348.301155] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] return self._handle_response(resp) [ 1348.301155] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1348.301155] env[67899]: ERROR nova.compute.manager [instance: 
bb97988e-9f7f-4e4f-9904-fc560d0912ee] raise exc.from_response(resp, resp.content) [ 1348.301155] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. [ 1348.301155] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] [ 1348.301155] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] During handling of the above exception, another exception occurred: [ 1348.301155] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] [ 1348.301155] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] Traceback (most recent call last): [ 1348.301155] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1348.301155] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] self.driver.spawn(context, instance, image_meta, [ 1348.301155] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1348.301155] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1348.301155] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1348.301155] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] self._fetch_image_if_missing(context, vi) [ 1348.301155] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1348.301155] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] image_fetch(context, vi, tmp_image_ds_loc) [ 1348.301155] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1348.301155] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] images.fetch_image( [ 1348.301155] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1348.302121] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] metadata = IMAGE_API.get(context, image_ref) [ 1348.302121] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1348.302121] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] return session.show(context, image_id, [ 1348.302121] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1348.302121] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] 
_reraise_translated_image_exception(image_id) [ 1348.302121] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1348.302121] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] raise new_exc.with_traceback(exc_trace) [ 1348.302121] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1348.302121] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1348.302121] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1348.302121] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] result = getattr(controller, method)(*args, **kwargs) [ 1348.302121] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get [ 1348.302121] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] return self._get(image_id) [ 1348.302121] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner [ 1348.302121] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1348.302121] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get [ 1348.302121] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] resp, body = self.http_client.get(url, headers=header) [ 1348.302121] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] File "/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 395, in get [ 1348.302121] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] return self.request(url, 'GET', **kwargs) [ 1348.302121] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request [ 1348.302121] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] return self._handle_response(resp) [ 1348.302121] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1348.302121] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] raise exc.from_response(resp, resp.content) [ 1348.302121] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] nova.exception.ImageNotAuthorized: Not authorized for image c655a05a-4a40-4b3f-b609-3ba8116ad90f. 
[ 1348.302121] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] [ 1348.302121] env[67899]: DEBUG nova.compute.utils [None req-e7f4b054-51ad-4989-bc09-ccbf6811411e tempest-ServerShowV257Test-1183244838 tempest-ServerShowV257Test-1183244838-project-member] [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] Not authorized for image c655a05a-4a40-4b3f-b609-3ba8116ad90f. {{(pid=67899) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1348.302914] env[67899]: INFO nova.compute.manager [None req-7b088029-0146-4c76-a894-7f015ede2f7e tempest-ServersTestFqdnHostnames-61628670 tempest-ServersTestFqdnHostnames-61628670-project-member] [instance: 8d2a9e20-82d3-44cf-a725-59804debe1cc] Took 0.36 seconds to deallocate network for instance. [ 1348.305366] env[67899]: DEBUG nova.compute.manager [None req-e7f4b054-51ad-4989-bc09-ccbf6811411e tempest-ServerShowV257Test-1183244838 tempest-ServerShowV257Test-1183244838-project-member] [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] Build of instance bb97988e-9f7f-4e4f-9904-fc560d0912ee was re-scheduled: Not authorized for image c655a05a-4a40-4b3f-b609-3ba8116ad90f. {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1348.305838] env[67899]: DEBUG nova.compute.manager [None req-e7f4b054-51ad-4989-bc09-ccbf6811411e tempest-ServerShowV257Test-1183244838 tempest-ServerShowV257Test-1183244838-project-member] [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] Unplugging VIFs for instance {{(pid=67899) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1348.306073] env[67899]: DEBUG oslo_concurrency.lockutils [None req-e7f4b054-51ad-4989-bc09-ccbf6811411e tempest-ServerShowV257Test-1183244838 tempest-ServerShowV257Test-1183244838-project-member] Acquiring lock "refresh_cache-bb97988e-9f7f-4e4f-9904-fc560d0912ee" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1348.306224] env[67899]: DEBUG oslo_concurrency.lockutils [None req-e7f4b054-51ad-4989-bc09-ccbf6811411e tempest-ServerShowV257Test-1183244838 tempest-ServerShowV257Test-1183244838-project-member] Acquired lock "refresh_cache-bb97988e-9f7f-4e4f-9904-fc560d0912ee" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1348.306384] env[67899]: DEBUG nova.network.neutron [None req-e7f4b054-51ad-4989-bc09-ccbf6811411e tempest-ServerShowV257Test-1183244838 tempest-ServerShowV257Test-1183244838-project-member] [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] Building network info cache for instance {{(pid=67899) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1348.341897] env[67899]: DEBUG nova.network.neutron [None req-e7f4b054-51ad-4989-bc09-ccbf6811411e tempest-ServerShowV257Test-1183244838 tempest-ServerShowV257Test-1183244838-project-member] [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] Instance cache missing network info. 
{{(pid=67899) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1348.388466] env[67899]: INFO nova.scheduler.client.report [None req-7b088029-0146-4c76-a894-7f015ede2f7e tempest-ServersTestFqdnHostnames-61628670 tempest-ServersTestFqdnHostnames-61628670-project-member] Deleted allocations for instance 8d2a9e20-82d3-44cf-a725-59804debe1cc [ 1348.408129] env[67899]: DEBUG nova.network.neutron [None req-e7f4b054-51ad-4989-bc09-ccbf6811411e tempest-ServerShowV257Test-1183244838 tempest-ServerShowV257Test-1183244838-project-member] [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] Updating instance_info_cache with network_info: [] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1348.414722] env[67899]: DEBUG oslo_concurrency.lockutils [None req-7b088029-0146-4c76-a894-7f015ede2f7e tempest-ServersTestFqdnHostnames-61628670 tempest-ServersTestFqdnHostnames-61628670-project-member] Lock "8d2a9e20-82d3-44cf-a725-59804debe1cc" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 681.295s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1348.415775] env[67899]: DEBUG oslo_concurrency.lockutils [None req-6dd98723-db00-4d66-a748-819e3c251c34 tempest-ServersTestFqdnHostnames-61628670 tempest-ServersTestFqdnHostnames-61628670-project-member] Lock "8d2a9e20-82d3-44cf-a725-59804debe1cc" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 483.018s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1348.415999] env[67899]: DEBUG oslo_concurrency.lockutils [None req-6dd98723-db00-4d66-a748-819e3c251c34 tempest-ServersTestFqdnHostnames-61628670 tempest-ServersTestFqdnHostnames-61628670-project-member] Acquiring lock "8d2a9e20-82d3-44cf-a725-59804debe1cc-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1348.416215] env[67899]: DEBUG oslo_concurrency.lockutils [None req-6dd98723-db00-4d66-a748-819e3c251c34 tempest-ServersTestFqdnHostnames-61628670 tempest-ServersTestFqdnHostnames-61628670-project-member] Lock "8d2a9e20-82d3-44cf-a725-59804debe1cc-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1348.416381] env[67899]: DEBUG oslo_concurrency.lockutils [None req-6dd98723-db00-4d66-a748-819e3c251c34 tempest-ServersTestFqdnHostnames-61628670 tempest-ServersTestFqdnHostnames-61628670-project-member] Lock "8d2a9e20-82d3-44cf-a725-59804debe1cc-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1348.418348] env[67899]: INFO nova.compute.manager [None req-6dd98723-db00-4d66-a748-819e3c251c34 tempest-ServersTestFqdnHostnames-61628670 tempest-ServersTestFqdnHostnames-61628670-project-member] [instance: 8d2a9e20-82d3-44cf-a725-59804debe1cc] Terminating instance [ 1348.420140] env[67899]: DEBUG nova.compute.manager [None req-6dd98723-db00-4d66-a748-819e3c251c34 tempest-ServersTestFqdnHostnames-61628670
tempest-ServersTestFqdnHostnames-61628670-project-member] [instance: 8d2a9e20-82d3-44cf-a725-59804debe1cc] Start destroying the instance on the hypervisor. {{(pid=67899) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1348.420337] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-6dd98723-db00-4d66-a748-819e3c251c34 tempest-ServersTestFqdnHostnames-61628670 tempest-ServersTestFqdnHostnames-61628670-project-member] [instance: 8d2a9e20-82d3-44cf-a725-59804debe1cc] Destroying instance {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1348.420959] env[67899]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-8d8160e5-dfb4-469b-93f1-5241183b5578 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1348.425379] env[67899]: DEBUG nova.compute.manager [None req-2236b4c1-ddc4-48e2-b219-bf8992f19847 tempest-InstanceActionsV221TestJSON-802466451 tempest-InstanceActionsV221TestJSON-802466451-project-member] [instance: dc7bf2b7-631d-4933-92db-1679ad823379] Starting instance... {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1348.427850] env[67899]: DEBUG oslo_concurrency.lockutils [None req-e7f4b054-51ad-4989-bc09-ccbf6811411e tempest-ServerShowV257Test-1183244838 tempest-ServerShowV257Test-1183244838-project-member] Releasing lock "refresh_cache-bb97988e-9f7f-4e4f-9904-fc560d0912ee" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1348.428084] env[67899]: DEBUG nova.compute.manager [None req-e7f4b054-51ad-4989-bc09-ccbf6811411e tempest-ServerShowV257Test-1183244838 tempest-ServerShowV257Test-1183244838-project-member] Virt driver does not provide unplug_vifs method, so it is not possible to determine if VIFs should be unplugged. {{(pid=67899) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1348.428294] env[67899]: DEBUG nova.compute.manager [None req-e7f4b054-51ad-4989-bc09-ccbf6811411e tempest-ServerShowV257Test-1183244838 tempest-ServerShowV257Test-1183244838-project-member] [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] Skipping network deallocation for instance since networking was not requested. {{(pid=67899) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1348.434858] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-42cdc59c-4e84-40a4-808e-2806c6135810 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1348.468166] env[67899]: WARNING nova.virt.vmwareapi.vmops [None req-6dd98723-db00-4d66-a748-819e3c251c34 tempest-ServersTestFqdnHostnames-61628670 tempest-ServersTestFqdnHostnames-61628670-project-member] [instance: 8d2a9e20-82d3-44cf-a725-59804debe1cc] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 8d2a9e20-82d3-44cf-a725-59804debe1cc could not be found.
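The WARNING that closes the record above is the delete path tolerating an already-missing backend VM: the lookup raises InstanceNotFound, the driver logs it, and teardown continues as if the destroy succeeded. A minimal, self-contained sketch of that pattern follows; it is not Nova's actual code, and the dict plus helper names are stand-ins for the vCenter inventory and the SearchIndex/destroy calls seen in the log.

    import logging

    logging.basicConfig(level=logging.DEBUG)
    LOG = logging.getLogger("vmops")

    class InstanceNotFound(Exception):
        pass

    BACKEND_VMS = {}  # stand-in for the vCenter inventory: uuid -> VM object

    def _find_vm(uuid):
        # Stands in for the SearchIndex.FindAllByUuid call invoked above.
        if uuid not in BACKEND_VMS:
            raise InstanceNotFound(f"Instance {uuid} could not be found.")
        return BACKEND_VMS[uuid]

    def destroy(uuid):
        try:
            _find_vm(uuid)
            BACKEND_VMS.pop(uuid)  # stands in for the actual destroy task
        except InstanceNotFound as exc:
            # Deleting something that is already gone counts as success, so
            # warn and fall through; network and allocation cleanup still run.
            LOG.warning("Instance does not exist on backend: %s", exc)
        LOG.debug("Instance destroyed")  # reached either way, as in the log

    destroy("8d2a9e20-82d3-44cf-a725-59804debe1cc")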
[ 1348.468425] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-6dd98723-db00-4d66-a748-819e3c251c34 tempest-ServersTestFqdnHostnames-61628670 tempest-ServersTestFqdnHostnames-61628670-project-member] [instance: 8d2a9e20-82d3-44cf-a725-59804debe1cc] Instance destroyed {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1348.468561] env[67899]: INFO nova.compute.manager [None req-6dd98723-db00-4d66-a748-819e3c251c34 tempest-ServersTestFqdnHostnames-61628670 tempest-ServersTestFqdnHostnames-61628670-project-member] [instance: 8d2a9e20-82d3-44cf-a725-59804debe1cc] Took 0.05 seconds to destroy the instance on the hypervisor. [ 1348.468793] env[67899]: DEBUG oslo.service.loopingcall [None req-6dd98723-db00-4d66-a748-819e3c251c34 tempest-ServersTestFqdnHostnames-61628670 tempest-ServersTestFqdnHostnames-61628670-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=67899) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1348.474090] env[67899]: DEBUG nova.compute.manager [-] [instance: 8d2a9e20-82d3-44cf-a725-59804debe1cc] Deallocating network for instance {{(pid=67899) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1348.474218] env[67899]: DEBUG nova.network.neutron [-] [instance: 8d2a9e20-82d3-44cf-a725-59804debe1cc] deallocate_for_instance() {{(pid=67899) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1348.491991] env[67899]: DEBUG oslo_concurrency.lockutils [None req-2236b4c1-ddc4-48e2-b219-bf8992f19847 tempest-InstanceActionsV221TestJSON-802466451 tempest-InstanceActionsV221TestJSON-802466451-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1348.492277] env[67899]: DEBUG oslo_concurrency.lockutils [None req-2236b4c1-ddc4-48e2-b219-bf8992f19847 tempest-InstanceActionsV221TestJSON-802466451 tempest-InstanceActionsV221TestJSON-802466451-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1348.493755] env[67899]: INFO nova.compute.claims [None req-2236b4c1-ddc4-48e2-b219-bf8992f19847 tempest-InstanceActionsV221TestJSON-802466451 tempest-InstanceActionsV221TestJSON-802466451-project-member] [instance: dc7bf2b7-631d-4933-92db-1679ad823379] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1348.519377] env[67899]: DEBUG nova.network.neutron [-] [instance: 8d2a9e20-82d3-44cf-a725-59804debe1cc] Updating instance_info_cache with network_info: [] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1348.536414] env[67899]: INFO nova.scheduler.client.report [None req-e7f4b054-51ad-4989-bc09-ccbf6811411e tempest-ServerShowV257Test-1183244838 tempest-ServerShowV257Test-1183244838-project-member] Deleted allocations for instance bb97988e-9f7f-4e4f-9904-fc560d0912ee [ 1348.563674] env[67899]: INFO nova.compute.manager [-] [instance: 8d2a9e20-82d3-44cf-a725-59804debe1cc] Took 0.09 seconds to deallocate network for instance.
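The "compute_resources" Acquiring/acquired/released lines above are emitted by oslo.concurrency's lock wrapper: the resource tracker serializes every claim and abort behind one named lock, and the waited/held durations in the DEBUG records come from that wrapper. A rough sketch of the shape, assuming oslo.concurrency is installed (method bodies elided; fair=True requires a recent oslo.concurrency release):

    from oslo_concurrency import lockutils

    class ResourceTracker:
        # Every mutator takes the same named lock, so a claim for one build
        # cannot interleave with an abort or another claim.
        @lockutils.synchronized("compute_resources", fair=True)
        def instance_claim(self, context, instance, node):
            ...  # reserve VCPU/MEMORY_MB/DISK_GB usage against the node

        @lockutils.synchronized("compute_resources", fair=True)
        def abort_instance_claim(self, context, instance, node):
            ...  # return the reserved resources

Fair locking hands the lock to waiters in FIFO order, which matters here because claims, aborts, and the periodic resource-update task all contend on the same name.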
[ 1348.581591] env[67899]: DEBUG oslo_concurrency.lockutils [None req-e7f4b054-51ad-4989-bc09-ccbf6811411e tempest-ServerShowV257Test-1183244838 tempest-ServerShowV257Test-1183244838-project-member] Lock "bb97988e-9f7f-4e4f-9904-fc560d0912ee" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 629.789s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1348.582601] env[67899]: DEBUG oslo_concurrency.lockutils [None req-22229d4a-dfdd-42c9-a4ce-d252d1cd204b tempest-ServerShowV257Test-1183244838 tempest-ServerShowV257Test-1183244838-project-member] Lock "bb97988e-9f7f-4e4f-9904-fc560d0912ee" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 433.057s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1348.582808] env[67899]: DEBUG oslo_concurrency.lockutils [None req-22229d4a-dfdd-42c9-a4ce-d252d1cd204b tempest-ServerShowV257Test-1183244838 tempest-ServerShowV257Test-1183244838-project-member] Acquiring lock "bb97988e-9f7f-4e4f-9904-fc560d0912ee-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1348.583012] env[67899]: DEBUG oslo_concurrency.lockutils [None req-22229d4a-dfdd-42c9-a4ce-d252d1cd204b tempest-ServerShowV257Test-1183244838 tempest-ServerShowV257Test-1183244838-project-member] Lock "bb97988e-9f7f-4e4f-9904-fc560d0912ee-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1348.583185] env[67899]: DEBUG oslo_concurrency.lockutils [None req-22229d4a-dfdd-42c9-a4ce-d252d1cd204b tempest-ServerShowV257Test-1183244838 tempest-ServerShowV257Test-1183244838-project-member] Lock "bb97988e-9f7f-4e4f-9904-fc560d0912ee-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1348.584812] env[67899]: INFO nova.compute.manager [None req-22229d4a-dfdd-42c9-a4ce-d252d1cd204b tempest-ServerShowV257Test-1183244838 tempest-ServerShowV257Test-1183244838-project-member] [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] Terminating instance [ 1348.586508] env[67899]: DEBUG oslo_concurrency.lockutils [None req-22229d4a-dfdd-42c9-a4ce-d252d1cd204b tempest-ServerShowV257Test-1183244838 tempest-ServerShowV257Test-1183244838-project-member] Acquiring lock "refresh_cache-bb97988e-9f7f-4e4f-9904-fc560d0912ee" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1348.586848] env[67899]: DEBUG oslo_concurrency.lockutils [None req-22229d4a-dfdd-42c9-a4ce-d252d1cd204b tempest-ServerShowV257Test-1183244838 tempest-ServerShowV257Test-1183244838-project-member] Acquired lock "refresh_cache-bb97988e-9f7f-4e4f-9904-fc560d0912ee" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1348.586887] env[67899]: DEBUG nova.network.neutron [None req-22229d4a-dfdd-42c9-a4ce-d252d1cd204b tempest-ServerShowV257Test-1183244838 tempest-ServerShowV257Test-1183244838-project-member] [instance:
bb97988e-9f7f-4e4f-9904-fc560d0912ee] Building network info cache for instance {{(pid=67899) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1348.592277] env[67899]: DEBUG nova.compute.manager [None req-93cc0406-1f51-47a1-9589-0404788b4e93 tempest-MigrationsAdminTest-102068815 tempest-MigrationsAdminTest-102068815-project-member] [instance: 8a157747-34e2-48f7-bf21-d17810122954] Starting instance... {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1348.615232] env[67899]: DEBUG nova.network.neutron [None req-22229d4a-dfdd-42c9-a4ce-d252d1cd204b tempest-ServerShowV257Test-1183244838 tempest-ServerShowV257Test-1183244838-project-member] [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] Instance cache missing network info. {{(pid=67899) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1348.645231] env[67899]: DEBUG oslo_concurrency.lockutils [None req-93cc0406-1f51-47a1-9589-0404788b4e93 tempest-MigrationsAdminTest-102068815 tempest-MigrationsAdminTest-102068815-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1348.660938] env[67899]: DEBUG oslo_concurrency.lockutils [None req-6dd98723-db00-4d66-a748-819e3c251c34 tempest-ServersTestFqdnHostnames-61628670 tempest-ServersTestFqdnHostnames-61628670-project-member] Lock "8d2a9e20-82d3-44cf-a725-59804debe1cc" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.245s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1348.661934] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "8d2a9e20-82d3-44cf-a725-59804debe1cc" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 173.642s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1348.662044] env[67899]: INFO nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 8d2a9e20-82d3-44cf-a725-59804debe1cc] During sync_power_state the instance has a pending task (deleting). Skip.
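The "pending task (deleting). Skip." line above reflects a guard in the periodic power-state sync: an instance with a task in flight is left alone, because the task owner, not the sync loop, is responsible for its final state. A minimal illustrative sketch of that guard (not Nova's actual code; the dict stands in for the instance record):

    import logging

    logging.basicConfig(level=logging.INFO)
    LOG = logging.getLogger("nova.compute.manager")

    def query_driver_power_state_and_sync(instance, driver_power_state):
        if instance["task_state"] is not None:
            # Another operation (here: deleting) owns this instance; syncing
            # now could clobber the state that operation is about to write.
            LOG.info("During sync_power_state the instance has a pending "
                     "task (%s). Skip.", instance["task_state"])
            return
        if instance["power_state"] != driver_power_state:
            instance["power_state"] = driver_power_state  # reconcile with hypervisor

    query_driver_power_state_and_sync(
        {"task_state": "deleting", "power_state": 1}, driver_power_state=0)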
[ 1348.662221] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "8d2a9e20-82d3-44cf-a725-59804debe1cc" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1348.693107] env[67899]: DEBUG nova.network.neutron [None req-22229d4a-dfdd-42c9-a4ce-d252d1cd204b tempest-ServerShowV257Test-1183244838 tempest-ServerShowV257Test-1183244838-project-member] [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] Updating instance_info_cache with network_info: [] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1348.701350] env[67899]: DEBUG oslo_concurrency.lockutils [None req-22229d4a-dfdd-42c9-a4ce-d252d1cd204b tempest-ServerShowV257Test-1183244838 tempest-ServerShowV257Test-1183244838-project-member] Releasing lock "refresh_cache-bb97988e-9f7f-4e4f-9904-fc560d0912ee" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1348.701721] env[67899]: DEBUG nova.compute.manager [None req-22229d4a-dfdd-42c9-a4ce-d252d1cd204b tempest-ServerShowV257Test-1183244838 tempest-ServerShowV257Test-1183244838-project-member] [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] Start destroying the instance on the hypervisor. {{(pid=67899) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1348.701911] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-22229d4a-dfdd-42c9-a4ce-d252d1cd204b tempest-ServerShowV257Test-1183244838 tempest-ServerShowV257Test-1183244838-project-member] [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] Destroying instance {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1348.702445] env[67899]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-648aa428-5642-4e5f-ad22-98da7503f92f {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1348.711114] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d80d6923-f936-4ed0-9494-cbc08606da9d {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1348.743230] env[67899]: WARNING nova.virt.vmwareapi.vmops [None req-22229d4a-dfdd-42c9-a4ce-d252d1cd204b tempest-ServerShowV257Test-1183244838 tempest-ServerShowV257Test-1183244838-project-member] [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance bb97988e-9f7f-4e4f-9904-fc560d0912ee could not be found. [ 1348.743438] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-22229d4a-dfdd-42c9-a4ce-d252d1cd204b tempest-ServerShowV257Test-1183244838 tempest-ServerShowV257Test-1183244838-project-member] [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] Instance destroyed {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1348.743612] env[67899]: INFO nova.compute.manager [None req-22229d4a-dfdd-42c9-a4ce-d252d1cd204b tempest-ServerShowV257Test-1183244838 tempest-ServerShowV257Test-1183244838-project-member] [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] Took 0.04 seconds to destroy the instance on the hypervisor.
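The Acquiring/Acquired/Releasing records for "refresh_cache-<uuid>" in this stretch bracket the network-info rebuild: one named lock per instance serializes refreshes of the same cache entry while letting different instances proceed in parallel. A sketch using oslo.concurrency's lock() context manager (assumes the library is installed; the cache dict and helper names are illustrative):

    from oslo_concurrency import lockutils

    INSTANCE_INFO_CACHE = {}

    def refresh_nw_info_cache(instance_uuid, fetch_nw_info):
        # Lock name embeds the UUID, so refreshes of different instances
        # do not contend; concurrent refreshes of the same one serialize.
        with lockutils.lock(f"refresh_cache-{instance_uuid}"):
            nw_info = fetch_nw_info(instance_uuid)        # [] for an unbound instance
            INSTANCE_INFO_CACHE[instance_uuid] = nw_info  # "Updating instance_info_cache"
            return nw_info

    refresh_nw_info_cache("bb97988e-9f7f-4e4f-9904-fc560d0912ee", lambda uuid: [])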
[ 1348.744129] env[67899]: DEBUG oslo.service.loopingcall [None req-22229d4a-dfdd-42c9-a4ce-d252d1cd204b tempest-ServerShowV257Test-1183244838 tempest-ServerShowV257Test-1183244838-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=67899) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1348.746042] env[67899]: DEBUG nova.compute.manager [-] [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] Deallocating network for instance {{(pid=67899) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1348.746149] env[67899]: DEBUG nova.network.neutron [-] [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] deallocate_for_instance() {{(pid=67899) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1348.806758] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b063cd52-9699-4ce6-8e55-69761f6c6d5d {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1348.814317] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-627acbc4-78e3-4768-b4ff-e316a3a84848 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1348.850032] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-86506c24-7e7d-422e-8d81-ac79d90cb878 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1348.857541] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-95279b4e-e1a3-4add-a72f-dff74334479e {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1348.870625] env[67899]: DEBUG nova.compute.provider_tree [None req-2236b4c1-ddc4-48e2-b219-bf8992f19847 tempest-InstanceActionsV221TestJSON-802466451 tempest-InstanceActionsV221TestJSON-802466451-project-member] Inventory has not changed in ProviderTree for provider: fffa0b42-f65d-4394-a98c-0df038b9ed4b {{(pid=67899) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1348.879530] env[67899]: DEBUG nova.scheduler.client.report [None req-2236b4c1-ddc4-48e2-b219-bf8992f19847 tempest-InstanceActionsV221TestJSON-802466451 tempest-InstanceActionsV221TestJSON-802466451-project-member] Inventory has not changed for provider fffa0b42-f65d-4394-a98c-0df038b9ed4b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67899) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1348.897460] env[67899]: DEBUG oslo_concurrency.lockutils [None req-2236b4c1-ddc4-48e2-b219-bf8992f19847 tempest-InstanceActionsV221TestJSON-802466451 tempest-InstanceActionsV221TestJSON-802466451-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.405s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [
1348.897460] env[67899]: DEBUG nova.compute.manager [None req-2236b4c1-ddc4-48e2-b219-bf8992f19847 tempest-InstanceActionsV221TestJSON-802466451 tempest-InstanceActionsV221TestJSON-802466451-project-member] [instance: dc7bf2b7-631d-4933-92db-1679ad823379] Start building networks asynchronously for instance. {{(pid=67899) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1348.899743] env[67899]: DEBUG oslo_concurrency.lockutils [None req-93cc0406-1f51-47a1-9589-0404788b4e93 tempest-MigrationsAdminTest-102068815 tempest-MigrationsAdminTest-102068815-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.255s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1348.901102] env[67899]: INFO nova.compute.claims [None req-93cc0406-1f51-47a1-9589-0404788b4e93 tempest-MigrationsAdminTest-102068815 tempest-MigrationsAdminTest-102068815-project-member] [instance: 8a157747-34e2-48f7-bf21-d17810122954] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1348.904810] env[67899]: DEBUG neutronclient.v2_0.client [-] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=67899) _handle_fault_response /opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py:262}} [ 1348.905065] env[67899]: ERROR nova.network.neutron [-] Neutron client was not able to generate a valid admin token, please verify Neutron admin credential located in nova.conf: neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1348.906776] env[67899]: ERROR oslo.service.loopingcall [-] Dynamic interval looping call 'oslo_service.loopingcall.RetryDecorator.__call__.<locals>._func' failed: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception.
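The pair of ERROR records above shows neutronclient's raw 401 being translated into NeutronAdminCredentialConfigurationInvalid: nova wraps each client call, and an Unauthorized raised on an admin-token call is surfaced as a nova.conf credential problem rather than retried. A self-contained sketch of that translation (the exception classes here are local stand-ins, not imports of the real ones):

    class Unauthorized(Exception):
        """Stand-in for neutronclient.common.exceptions.Unauthorized."""

    class NeutronAdminCredentialConfigurationInvalid(Exception):
        """Stand-in for the nova.exception class named in the log."""

    def translate_neutron_exception(call, uses_admin_token):
        def wrapper(*args, **kwargs):
            try:
                return call(*args, **kwargs)
            except Unauthorized:
                if uses_admin_token:
                    # Admin tokens come from static [neutron] config, so a
                    # 401 is a deployment error, not something a retry fixes.
                    raise NeutronAdminCredentialConfigurationInvalid()
                raise
        return wrapper

    def list_ports(**search_opts):
        raise Unauthorized("401 - The request you have made requires authentication.")

    list_ports = translate_neutron_exception(list_ports, uses_admin_token=True)
    # Calling list_ports(...) now raises the configuration error, mirroring
    # the wrapper frames at nova/network/neutron.py:196 and :212 below.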
[ 1348.906776] env[67899]: ERROR oslo.service.loopingcall Traceback (most recent call last): [ 1348.906776] env[67899]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1348.906776] env[67899]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1348.906776] env[67899]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1348.906776] env[67899]: ERROR oslo.service.loopingcall exception_handler_v20(status_code, error_body) [ 1348.906776] env[67899]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1348.906776] env[67899]: ERROR oslo.service.loopingcall raise client_exc(message=error_message, [ 1348.906776] env[67899]: ERROR oslo.service.loopingcall neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1348.906776] env[67899]: ERROR oslo.service.loopingcall Neutron server returns request_ids: ['req-7070ecd8-d8f6-4b57-b5f1-c73fe8a5bdf4'] [ 1348.906776] env[67899]: ERROR oslo.service.loopingcall [ 1348.906776] env[67899]: ERROR oslo.service.loopingcall During handling of the above exception, another exception occurred: [ 1348.906776] env[67899]: ERROR oslo.service.loopingcall [ 1348.906776] env[67899]: ERROR oslo.service.loopingcall Traceback (most recent call last): [ 1348.906776] env[67899]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 1348.906776] env[67899]: ERROR oslo.service.loopingcall result = func(*self.args, **self.kw) [ 1348.906776] env[67899]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 1348.906776] env[67899]: ERROR oslo.service.loopingcall result = f(*args, **kwargs) [ 1348.906776] env[67899]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/compute/manager.py", line 3045, in _deallocate_network_with_retries [ 1348.906776] env[67899]: ERROR oslo.service.loopingcall self._deallocate_network( [ 1348.906776] env[67899]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/compute/manager.py", line 2265, in _deallocate_network [ 1348.906776] env[67899]: ERROR oslo.service.loopingcall self.network_api.deallocate_for_instance( [ 1348.906776] env[67899]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1348.906776] env[67899]: ERROR oslo.service.loopingcall data = neutron.list_ports(**search_opts) [ 1348.906776] env[67899]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1348.906776] env[67899]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1348.906776] env[67899]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1348.906776] env[67899]: ERROR oslo.service.loopingcall return self.list('ports', self.ports_path, retrieve_all, [ 1348.906776] env[67899]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1348.906776] env[67899]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1348.906776] env[67899]: ERROR 
oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1348.906776] env[67899]: ERROR oslo.service.loopingcall for r in self._pagination(collection, path, **params): [ 1348.906776] env[67899]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1348.906776] env[67899]: ERROR oslo.service.loopingcall res = self.get(path, params=params) [ 1348.906776] env[67899]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1348.906776] env[67899]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1348.906776] env[67899]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1348.906776] env[67899]: ERROR oslo.service.loopingcall return self.retry_request("GET", action, body=body, [ 1348.906776] env[67899]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1348.906776] env[67899]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1348.906776] env[67899]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1348.906776] env[67899]: ERROR oslo.service.loopingcall return self.do_request(method, action, body=body, [ 1348.906776] env[67899]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1348.906776] env[67899]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1348.906776] env[67899]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1348.906776] env[67899]: ERROR oslo.service.loopingcall self._handle_fault_response(status_code, replybody, resp) [ 1348.906776] env[67899]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 1348.906776] env[67899]: ERROR oslo.service.loopingcall raise exception.NeutronAdminCredentialConfigurationInvalid() [ 1348.906776] env[67899]: ERROR oslo.service.loopingcall nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1348.906776] env[67899]: ERROR oslo.service.loopingcall [ 1348.908094] env[67899]: ERROR nova.compute.manager [None req-22229d4a-dfdd-42c9-a4ce-d252d1cd204b tempest-ServerShowV257Test-1183244838 tempest-ServerShowV257Test-1183244838-project-member] [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] Failed to deallocate network for instance. Error: Networking client is experiencing an unauthorized exception.: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1348.947840] env[67899]: DEBUG nova.compute.utils [None req-2236b4c1-ddc4-48e2-b219-bf8992f19847 tempest-InstanceActionsV221TestJSON-802466451 tempest-InstanceActionsV221TestJSON-802466451-project-member] Using /dev/sd instead of None {{(pid=67899) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1348.949633] env[67899]: DEBUG nova.compute.manager [None req-2236b4c1-ddc4-48e2-b219-bf8992f19847 tempest-InstanceActionsV221TestJSON-802466451 tempest-InstanceActionsV221TestJSON-802466451-project-member] [instance: dc7bf2b7-631d-4933-92db-1679ad823379] Allocating IP information in the background. 
{{(pid=67899) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1348.949801] env[67899]: DEBUG nova.network.neutron [None req-2236b4c1-ddc4-48e2-b219-bf8992f19847 tempest-InstanceActionsV221TestJSON-802466451 tempest-InstanceActionsV221TestJSON-802466451-project-member] [instance: dc7bf2b7-631d-4933-92db-1679ad823379] allocate_for_instance() {{(pid=67899) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1348.953221] env[67899]: ERROR nova.compute.manager [None req-22229d4a-dfdd-42c9-a4ce-d252d1cd204b tempest-ServerShowV257Test-1183244838 tempest-ServerShowV257Test-1183244838-project-member] [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] Setting instance vm_state to ERROR: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1348.953221] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] Traceback (most recent call last): [ 1348.953221] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1348.953221] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] ret = obj(*args, **kwargs) [ 1348.953221] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1348.953221] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] exception_handler_v20(status_code, error_body) [ 1348.953221] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1348.953221] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] raise client_exc(message=error_message, [ 1348.953221] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1348.953221] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] Neutron server returns request_ids: ['req-7070ecd8-d8f6-4b57-b5f1-c73fe8a5bdf4'] [ 1348.953221] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] [ 1348.953221] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] During handling of the above exception, another exception occurred: [ 1348.953221] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] [ 1348.953221] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] Traceback (most recent call last): [ 1348.953221] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] File "/opt/stack/nova/nova/compute/manager.py", line 3315, in do_terminate_instance [ 1348.953221] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] self._delete_instance(context, instance, bdms) [ 1348.953221] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] File "/opt/stack/nova/nova/compute/manager.py", line 3250, in _delete_instance [ 1348.953221] env[67899]: ERROR 
nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] self._shutdown_instance(context, instance, bdms) [ 1348.953221] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] File "/opt/stack/nova/nova/compute/manager.py", line 3144, in _shutdown_instance [ 1348.953221] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] self._try_deallocate_network(context, instance, requested_networks) [ 1348.953221] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] File "/opt/stack/nova/nova/compute/manager.py", line 3058, in _try_deallocate_network [ 1348.953221] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] with excutils.save_and_reraise_exception(): [ 1348.953221] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1348.953221] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] self.force_reraise() [ 1348.953221] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1348.953221] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] raise self.value [ 1348.953221] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] File "/opt/stack/nova/nova/compute/manager.py", line 3056, in _try_deallocate_network [ 1348.953221] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] _deallocate_network_with_retries() [ 1348.953221] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 436, in func [ 1348.953221] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] return evt.wait() [ 1348.953221] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1348.953221] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] result = hub.switch() [ 1348.953221] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1348.953221] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] return self.greenlet.switch() [ 1348.953221] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 1348.953221] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] result = func(*self.args, **self.kw) [ 1348.953221] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 1348.954078] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] result = f(*args, **kwargs) [ 1348.954078] env[67899]: ERROR nova.compute.manager [instance: 
bb97988e-9f7f-4e4f-9904-fc560d0912ee] File "/opt/stack/nova/nova/compute/manager.py", line 3045, in _deallocate_network_with_retries [ 1348.954078] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] self._deallocate_network( [ 1348.954078] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] File "/opt/stack/nova/nova/compute/manager.py", line 2265, in _deallocate_network [ 1348.954078] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] self.network_api.deallocate_for_instance( [ 1348.954078] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1348.954078] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] data = neutron.list_ports(**search_opts) [ 1348.954078] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1348.954078] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] ret = obj(*args, **kwargs) [ 1348.954078] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1348.954078] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] return self.list('ports', self.ports_path, retrieve_all, [ 1348.954078] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1348.954078] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] ret = obj(*args, **kwargs) [ 1348.954078] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1348.954078] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] for r in self._pagination(collection, path, **params): [ 1348.954078] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1348.954078] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] res = self.get(path, params=params) [ 1348.954078] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1348.954078] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] ret = obj(*args, **kwargs) [ 1348.954078] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1348.954078] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] return self.retry_request("GET", action, body=body, [ 1348.954078] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1348.954078] env[67899]: ERROR nova.compute.manager [instance: 
bb97988e-9f7f-4e4f-9904-fc560d0912ee] ret = obj(*args, **kwargs) [ 1348.954078] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1348.954078] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] return self.do_request(method, action, body=body, [ 1348.954078] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1348.954078] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] ret = obj(*args, **kwargs) [ 1348.954078] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1348.954078] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] self._handle_fault_response(status_code, replybody, resp) [ 1348.954078] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 1348.954078] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] raise exception.NeutronAdminCredentialConfigurationInvalid() [ 1348.954078] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1348.954078] env[67899]: ERROR nova.compute.manager [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] [ 1348.958263] env[67899]: DEBUG nova.compute.manager [None req-2236b4c1-ddc4-48e2-b219-bf8992f19847 tempest-InstanceActionsV221TestJSON-802466451 tempest-InstanceActionsV221TestJSON-802466451-project-member] [instance: dc7bf2b7-631d-4933-92db-1679ad823379] Start building block device mappings for instance. {{(pid=67899) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1348.981365] env[67899]: DEBUG oslo_concurrency.lockutils [None req-22229d4a-dfdd-42c9-a4ce-d252d1cd204b tempest-ServerShowV257Test-1183244838 tempest-ServerShowV257Test-1183244838-project-member] Lock "bb97988e-9f7f-4e4f-9904-fc560d0912ee" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.399s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1348.984685] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "bb97988e-9f7f-4e4f-9904-fc560d0912ee" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 173.965s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1348.984890] env[67899]: INFO nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] During sync_power_state the instance has a pending task (deleting). Skip.
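The _deallocate_network_with_retries frames running through the traceback above show network deallocation wrapped in a retry loop, so transient Neutron failures don't leak ports; a non-retryable error like the credential problem escapes the loop immediately and fails the terminate. A generic sketch of that shape (parameter names and the fatal default are illustrative, not oslo.service's exact API):

    import time

    def with_retries(func, max_retries=3, delay=1.0, fatal=(ValueError,)):
        """Call func, retrying transient failures; fatal errors re-raise at once."""
        def wrapper(*args, **kwargs):
            for attempt in range(1, max_retries + 1):
                try:
                    return func(*args, **kwargs)
                except fatal:
                    raise          # e.g. bad admin credentials: retrying is useless
                except Exception:
                    if attempt == max_retries:
                        raise      # out of attempts: let the caller handle it
                    time.sleep(delay)
        return wrapper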
[ 1348.985100] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "bb97988e-9f7f-4e4f-9904-fc560d0912ee" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1348.996916] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager.update_available_resource {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1349.010942] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1349.012549] env[67899]: DEBUG nova.policy [None req-2236b4c1-ddc4-48e2-b219-bf8992f19847 tempest-InstanceActionsV221TestJSON-802466451 tempest-InstanceActionsV221TestJSON-802466451-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'df761ae5fb8c4652a1c7ee3042d6762c', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '7dc2733ec0f2400791b974bfe444aa6e', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67899) authorize /opt/stack/nova/nova/policy.py:203}} [ 1349.025578] env[67899]: DEBUG nova.compute.manager [None req-2236b4c1-ddc4-48e2-b219-bf8992f19847 tempest-InstanceActionsV221TestJSON-802466451 tempest-InstanceActionsV221TestJSON-802466451-project-member] [instance: dc7bf2b7-631d-4933-92db-1679ad823379] Start spawning the instance on the hypervisor. {{(pid=67899) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1349.033766] env[67899]: INFO nova.compute.manager [None req-22229d4a-dfdd-42c9-a4ce-d252d1cd204b tempest-ServerShowV257Test-1183244838 tempest-ServerShowV257Test-1183244838-project-member] [instance: bb97988e-9f7f-4e4f-9904-fc560d0912ee] Successfully reverted task state from None on failure for instance. [ 1349.040229] env[67899]: ERROR oslo_messaging.rpc.server [None req-22229d4a-dfdd-42c9-a4ce-d252d1cd204b tempest-ServerShowV257Test-1183244838 tempest-ServerShowV257Test-1183244838-project-member] Exception during message handling: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
[ 1349.040229] env[67899]: ERROR oslo_messaging.rpc.server Traceback (most recent call last): [ 1349.040229] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1349.040229] env[67899]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1349.040229] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1349.040229] env[67899]: ERROR oslo_messaging.rpc.server exception_handler_v20(status_code, error_body) [ 1349.040229] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1349.040229] env[67899]: ERROR oslo_messaging.rpc.server raise client_exc(message=error_message, [ 1349.040229] env[67899]: ERROR oslo_messaging.rpc.server neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1349.040229] env[67899]: ERROR oslo_messaging.rpc.server Neutron server returns request_ids: ['req-7070ecd8-d8f6-4b57-b5f1-c73fe8a5bdf4'] [ 1349.040229] env[67899]: ERROR oslo_messaging.rpc.server [ 1349.040229] env[67899]: ERROR oslo_messaging.rpc.server During handling of the above exception, another exception occurred: [ 1349.040229] env[67899]: ERROR oslo_messaging.rpc.server [ 1349.040229] env[67899]: ERROR oslo_messaging.rpc.server Traceback (most recent call last): [ 1349.040229] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/server.py", line 165, in _process_incoming [ 1349.040229] env[67899]: ERROR oslo_messaging.rpc.server res = self.dispatcher.dispatch(message) [ 1349.040229] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/dispatcher.py", line 309, in dispatch [ 1349.040229] env[67899]: ERROR oslo_messaging.rpc.server return self._do_dispatch(endpoint, method, ctxt, args) [ 1349.040229] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/dispatcher.py", line 229, in _do_dispatch [ 1349.040229] env[67899]: ERROR oslo_messaging.rpc.server result = func(ctxt, **new_args) [ 1349.040229] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/exception_wrapper.py", line 65, in wrapped [ 1349.040229] env[67899]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1349.040229] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1349.040229] env[67899]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1349.040229] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1349.040229] env[67899]: ERROR oslo_messaging.rpc.server raise self.value [ 1349.040229] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/exception_wrapper.py", line 63, in wrapped [ 1349.040229] env[67899]: ERROR oslo_messaging.rpc.server return f(self, context, *args, **kw) [ 1349.040229] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 166, in decorated_function [ 1349.040229] env[67899]: ERROR oslo_messaging.rpc.server with 
excutils.save_and_reraise_exception(): [ 1349.040229] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1349.040229] env[67899]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1349.040229] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1349.040229] env[67899]: ERROR oslo_messaging.rpc.server raise self.value [ 1349.040229] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 157, in decorated_function [ 1349.040229] env[67899]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) [ 1349.040229] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/utils.py", line 1439, in decorated_function [ 1349.040229] env[67899]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) [ 1349.040229] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 213, in decorated_function [ 1349.040229] env[67899]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1349.040229] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1349.040229] env[67899]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1349.040229] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1349.040229] env[67899]: ERROR oslo_messaging.rpc.server raise self.value [ 1349.040229] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 203, in decorated_function [ 1349.040229] env[67899]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) [ 1349.040229] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3327, in terminate_instance [ 1349.040229] env[67899]: ERROR oslo_messaging.rpc.server do_terminate_instance(instance, bdms) [ 1349.040229] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 1349.040229] env[67899]: ERROR oslo_messaging.rpc.server return f(*args, **kwargs) [ 1349.040229] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3322, in do_terminate_instance [ 1349.040229] env[67899]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1349.041421] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1349.041421] env[67899]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1349.041421] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1349.041421] env[67899]: ERROR oslo_messaging.rpc.server raise self.value [ 1349.041421] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3315, in do_terminate_instance [ 1349.041421] env[67899]: ERROR oslo_messaging.rpc.server self._delete_instance(context, instance, bdms) [ 1349.041421] env[67899]: ERROR oslo_messaging.rpc.server File 
"/opt/stack/nova/nova/compute/manager.py", line 3250, in _delete_instance [ 1349.041421] env[67899]: ERROR oslo_messaging.rpc.server self._shutdown_instance(context, instance, bdms) [ 1349.041421] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3144, in _shutdown_instance [ 1349.041421] env[67899]: ERROR oslo_messaging.rpc.server self._try_deallocate_network(context, instance, requested_networks) [ 1349.041421] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3058, in _try_deallocate_network [ 1349.041421] env[67899]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1349.041421] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1349.041421] env[67899]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1349.041421] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1349.041421] env[67899]: ERROR oslo_messaging.rpc.server raise self.value [ 1349.041421] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3056, in _try_deallocate_network [ 1349.041421] env[67899]: ERROR oslo_messaging.rpc.server _deallocate_network_with_retries() [ 1349.041421] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 436, in func [ 1349.041421] env[67899]: ERROR oslo_messaging.rpc.server return evt.wait() [ 1349.041421] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1349.041421] env[67899]: ERROR oslo_messaging.rpc.server result = hub.switch() [ 1349.041421] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1349.041421] env[67899]: ERROR oslo_messaging.rpc.server return self.greenlet.switch() [ 1349.041421] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 1349.041421] env[67899]: ERROR oslo_messaging.rpc.server result = func(*self.args, **self.kw) [ 1349.041421] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 1349.041421] env[67899]: ERROR oslo_messaging.rpc.server result = f(*args, **kwargs) [ 1349.041421] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3045, in _deallocate_network_with_retries [ 1349.041421] env[67899]: ERROR oslo_messaging.rpc.server self._deallocate_network( [ 1349.041421] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 2265, in _deallocate_network [ 1349.041421] env[67899]: ERROR oslo_messaging.rpc.server self.network_api.deallocate_for_instance( [ 1349.041421] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1349.041421] env[67899]: ERROR oslo_messaging.rpc.server data = neutron.list_ports(**search_opts) [ 1349.041421] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1349.041421] env[67899]: ERROR 
oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1349.041421] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1349.041421] env[67899]: ERROR oslo_messaging.rpc.server return self.list('ports', self.ports_path, retrieve_all, [ 1349.041421] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1349.041421] env[67899]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1349.041421] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1349.041421] env[67899]: ERROR oslo_messaging.rpc.server for r in self._pagination(collection, path, **params): [ 1349.041421] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1349.041421] env[67899]: ERROR oslo_messaging.rpc.server res = self.get(path, params=params) [ 1349.041421] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1349.041421] env[67899]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1349.041421] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1349.041421] env[67899]: ERROR oslo_messaging.rpc.server return self.retry_request("GET", action, body=body, [ 1349.041421] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1349.041421] env[67899]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1349.041421] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1349.041421] env[67899]: ERROR oslo_messaging.rpc.server return self.do_request(method, action, body=body, [ 1349.041421] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1349.041421] env[67899]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1349.042671] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1349.042671] env[67899]: ERROR oslo_messaging.rpc.server self._handle_fault_response(status_code, replybody, resp) [ 1349.042671] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 1349.042671] env[67899]: ERROR oslo_messaging.rpc.server raise exception.NeutronAdminCredentialConfigurationInvalid() [ 1349.042671] env[67899]: ERROR oslo_messaging.rpc.server nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
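[editor's note] The repeated __exit__/force_reraise frames in this traceback come from oslo_utils.excutils.save_and_reraise_exception: each layer (exception_wrapper, the decorated_function wrappers, do_terminate_instance, _try_deallocate_network) runs its cleanup inside that context manager, which re-raises the original exception on exit, so a single failure surfaces once per layer. A toy version, simplified for illustration (the real helper also supports logging and suppressing the re-raise; see oslo.utils for the actual code):

    # Toy sketch of the save_and_reraise_exception control flow.
    import sys

    class save_and_reraise_exception:
        def __enter__(self):
            # Capture the exception active in the enclosing 'except' block.
            self.exc = sys.exc_info()[1]
            return self

        def __exit__(self, exc_type, exc_val, exc_tb):
            if exc_type is None and self.exc is not None:
                raise self.exc  # cleanup finished; surface the original error
            return False  # a new exception raised in the block wins instead

    def cleanup_then_reraise():
        try:
            raise RuntimeError("deallocation failed")
        except RuntimeError:
            with save_and_reraise_exception():
                pass  # e.g. mark the instance as ERROR before re-raising

    try:
        cleanup_then_reraise()
    except RuntimeError as e:
        print("original error propagated:", e)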
[ 1349.042671] env[67899]: ERROR oslo_messaging.rpc.server [ 1349.052198] env[67899]: DEBUG nova.virt.hardware [None req-2236b4c1-ddc4-48e2-b219-bf8992f19847 tempest-InstanceActionsV221TestJSON-802466451 tempest-InstanceActionsV221TestJSON-802466451-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T22:07:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T22:07:13Z,direct_url=,disk_format='vmdk',id=c655a05a-4a40-4b3f-b609-3ba8116ad90f,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c1aaa2970e964d7b86557399120d12c1',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T22:07:14Z,virtual_size=,visibility=), allow threads: False {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1349.052417] env[67899]: DEBUG nova.virt.hardware [None req-2236b4c1-ddc4-48e2-b219-bf8992f19847 tempest-InstanceActionsV221TestJSON-802466451 tempest-InstanceActionsV221TestJSON-802466451-project-member] Flavor limits 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1349.052567] env[67899]: DEBUG nova.virt.hardware [None req-2236b4c1-ddc4-48e2-b219-bf8992f19847 tempest-InstanceActionsV221TestJSON-802466451 tempest-InstanceActionsV221TestJSON-802466451-project-member] Image limits 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1349.052750] env[67899]: DEBUG nova.virt.hardware [None req-2236b4c1-ddc4-48e2-b219-bf8992f19847 tempest-InstanceActionsV221TestJSON-802466451 tempest-InstanceActionsV221TestJSON-802466451-project-member] Flavor pref 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1349.052891] env[67899]: DEBUG nova.virt.hardware [None req-2236b4c1-ddc4-48e2-b219-bf8992f19847 tempest-InstanceActionsV221TestJSON-802466451 tempest-InstanceActionsV221TestJSON-802466451-project-member] Image pref 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1349.055016] env[67899]: DEBUG nova.virt.hardware [None req-2236b4c1-ddc4-48e2-b219-bf8992f19847 tempest-InstanceActionsV221TestJSON-802466451 tempest-InstanceActionsV221TestJSON-802466451-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1349.055016] env[67899]: DEBUG nova.virt.hardware [None req-2236b4c1-ddc4-48e2-b219-bf8992f19847 tempest-InstanceActionsV221TestJSON-802466451 tempest-InstanceActionsV221TestJSON-802466451-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1349.055016] env[67899]: DEBUG nova.virt.hardware [None req-2236b4c1-ddc4-48e2-b219-bf8992f19847 tempest-InstanceActionsV221TestJSON-802466451 tempest-InstanceActionsV221TestJSON-802466451-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67899) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 
1349.055016] env[67899]: DEBUG nova.virt.hardware [None req-2236b4c1-ddc4-48e2-b219-bf8992f19847 tempest-InstanceActionsV221TestJSON-802466451 tempest-InstanceActionsV221TestJSON-802466451-project-member] Got 1 possible topologies {{(pid=67899) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1349.055016] env[67899]: DEBUG nova.virt.hardware [None req-2236b4c1-ddc4-48e2-b219-bf8992f19847 tempest-InstanceActionsV221TestJSON-802466451 tempest-InstanceActionsV221TestJSON-802466451-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1349.055016] env[67899]: DEBUG nova.virt.hardware [None req-2236b4c1-ddc4-48e2-b219-bf8992f19847 tempest-InstanceActionsV221TestJSON-802466451 tempest-InstanceActionsV221TestJSON-802466451-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1349.055016] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-17f9469f-af23-4064-87b3-777318d6e5b3 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1349.062729] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d92a5ac1-c811-4c65-b6f0-fb71178966d1 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1349.188192] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-da86e76d-b9e0-4dca-a2c3-6c41edff6b54 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1349.195678] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d3526db0-9fe6-421f-b421-c4cc08390d78 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1349.224983] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d0dc9c57-a5f8-4330-a93e-b252d2870222 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1349.231895] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-05b267ea-75f3-48bc-b5a7-77036c1c1f42 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1349.244589] env[67899]: DEBUG nova.compute.provider_tree [None req-93cc0406-1f51-47a1-9589-0404788b4e93 tempest-MigrationsAdminTest-102068815 tempest-MigrationsAdminTest-102068815-project-member] Inventory has not changed in ProviderTree for provider: fffa0b42-f65d-4394-a98c-0df038b9ed4b {{(pid=67899) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1349.255239] env[67899]: DEBUG nova.scheduler.client.report [None req-93cc0406-1f51-47a1-9589-0404788b4e93 tempest-MigrationsAdminTest-102068815 tempest-MigrationsAdminTest-102068815-project-member] Inventory has not changed for provider fffa0b42-f65d-4394-a98c-0df038b9ed4b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 
65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67899) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1349.269925] env[67899]: DEBUG oslo_concurrency.lockutils [None req-93cc0406-1f51-47a1-9589-0404788b4e93 tempest-MigrationsAdminTest-102068815 tempest-MigrationsAdminTest-102068815-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.370s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1349.270408] env[67899]: DEBUG nova.compute.manager [None req-93cc0406-1f51-47a1-9589-0404788b4e93 tempest-MigrationsAdminTest-102068815 tempest-MigrationsAdminTest-102068815-project-member] [instance: 8a157747-34e2-48f7-bf21-d17810122954] Start building networks asynchronously for instance. {{(pid=67899) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1349.272526] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.262s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1349.272709] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1349.272861] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67899) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1349.273957] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f1a24eea-f385-4d2c-8b99-70dd83461752 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1349.282229] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d2ae2689-3975-4253-831e-31620d34cbf7 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1349.297599] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bfcd53d7-d09b-4df1-b3db-45d52985492c {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1349.303178] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c4ea5fde-769b-46ef-9292-533a52d27882 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1349.336256] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180915MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=67899) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 
1349.336372] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1349.336556] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1349.338949] env[67899]: DEBUG nova.compute.utils [None req-93cc0406-1f51-47a1-9589-0404788b4e93 tempest-MigrationsAdminTest-102068815 tempest-MigrationsAdminTest-102068815-project-member] Using /dev/sd instead of None {{(pid=67899) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1349.343135] env[67899]: DEBUG nova.compute.manager [None req-93cc0406-1f51-47a1-9589-0404788b4e93 tempest-MigrationsAdminTest-102068815 tempest-MigrationsAdminTest-102068815-project-member] [instance: 8a157747-34e2-48f7-bf21-d17810122954] Allocating IP information in the background. {{(pid=67899) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1349.343135] env[67899]: DEBUG nova.network.neutron [None req-93cc0406-1f51-47a1-9589-0404788b4e93 tempest-MigrationsAdminTest-102068815 tempest-MigrationsAdminTest-102068815-project-member] [instance: 8a157747-34e2-48f7-bf21-d17810122954] allocate_for_instance() {{(pid=67899) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1349.349831] env[67899]: DEBUG nova.network.neutron [None req-2236b4c1-ddc4-48e2-b219-bf8992f19847 tempest-InstanceActionsV221TestJSON-802466451 tempest-InstanceActionsV221TestJSON-802466451-project-member] [instance: dc7bf2b7-631d-4933-92db-1679ad823379] Successfully created port: c83bd0c3-48a4-468d-b5f7-5245a4aacb1b {{(pid=67899) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1349.355881] env[67899]: DEBUG nova.compute.manager [None req-93cc0406-1f51-47a1-9589-0404788b4e93 tempest-MigrationsAdminTest-102068815 tempest-MigrationsAdminTest-102068815-project-member] [instance: 8a157747-34e2-48f7-bf21-d17810122954] Start building block device mappings for instance. {{(pid=67899) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1349.444282] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 37ab08db-50ab-4c30-9e18-05007c5d1c27 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1349.444457] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1349.444587] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance b9282eeb-09db-4138-a1f0-9e03828021b8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1349.444956] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 6c4977f7-c53d-4c96-9028-86d7561f0d0d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1349.449019] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance ec826735-4cc4-4847-8750-c5480e62134a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1349.449019] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance c7ad553b-2149-4211-aee3-057ea83069f5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1349.449019] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 6fda2654-4579-4b9a-a97c-97e0128fff14 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1349.449019] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 7a82e877-8a39-4684-8b75-711b7bedddac actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1349.449019] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance dc7bf2b7-631d-4933-92db-1679ad823379 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1349.449019] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 8a157747-34e2-48f7-bf21-d17810122954 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1349.465288] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 03684169-e2c8-4cf5-8e79-b118725927f1 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1349.467105] env[67899]: DEBUG nova.compute.manager [None req-93cc0406-1f51-47a1-9589-0404788b4e93 tempest-MigrationsAdminTest-102068815 tempest-MigrationsAdminTest-102068815-project-member] [instance: 8a157747-34e2-48f7-bf21-d17810122954] Start spawning the instance on the hypervisor. {{(pid=67899) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1349.477479] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 49c65e6c-9e16-40f5-9754-fe81681f9714 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1349.492078] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance ac2f9cf9-f573-4f21-aeb4-6cea5c94f843 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1349.496598] env[67899]: DEBUG nova.policy [None req-93cc0406-1f51-47a1-9589-0404788b4e93 tempest-MigrationsAdminTest-102068815 tempest-MigrationsAdminTest-102068815-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '063b560a20d547b7851193ba932afab7', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '53462a4c49974da5b32f4498b906da6e', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67899) authorize /opt/stack/nova/nova/policy.py:203}} [ 1349.500516] env[67899]: DEBUG nova.virt.hardware [None req-93cc0406-1f51-47a1-9589-0404788b4e93 tempest-MigrationsAdminTest-102068815 tempest-MigrationsAdminTest-102068815-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T22:19:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='56799a09-6f4d-4c07-a020-1bfe595ef26e',id=38,is_public=True,memory_mb=128,name='tempest-test_resize_flavor_-2022420155',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T22:07:13Z,direct_url=,disk_format='vmdk',id=c655a05a-4a40-4b3f-b609-3ba8116ad90f,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c1aaa2970e964d7b86557399120d12c1',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T22:07:14Z,virtual_size=,visibility=), allow threads: False {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1349.503017] env[67899]: DEBUG nova.virt.hardware [None req-93cc0406-1f51-47a1-9589-0404788b4e93 tempest-MigrationsAdminTest-102068815 tempest-MigrationsAdminTest-102068815-project-member] Flavor limits 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1349.503017] env[67899]: DEBUG nova.virt.hardware [None req-93cc0406-1f51-47a1-9589-0404788b4e93 tempest-MigrationsAdminTest-102068815 tempest-MigrationsAdminTest-102068815-project-member] Image limits 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1349.503017] env[67899]: DEBUG nova.virt.hardware [None req-93cc0406-1f51-47a1-9589-0404788b4e93 tempest-MigrationsAdminTest-102068815 tempest-MigrationsAdminTest-102068815-project-member] Flavor pref 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1349.503017] env[67899]: DEBUG nova.virt.hardware [None req-93cc0406-1f51-47a1-9589-0404788b4e93 tempest-MigrationsAdminTest-102068815 tempest-MigrationsAdminTest-102068815-project-member] Image pref 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1349.503017] env[67899]: DEBUG nova.virt.hardware [None req-93cc0406-1f51-47a1-9589-0404788b4e93 tempest-MigrationsAdminTest-102068815 tempest-MigrationsAdminTest-102068815-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 
1349.503017] env[67899]: DEBUG nova.virt.hardware [None req-93cc0406-1f51-47a1-9589-0404788b4e93 tempest-MigrationsAdminTest-102068815 tempest-MigrationsAdminTest-102068815-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1349.503017] env[67899]: DEBUG nova.virt.hardware [None req-93cc0406-1f51-47a1-9589-0404788b4e93 tempest-MigrationsAdminTest-102068815 tempest-MigrationsAdminTest-102068815-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67899) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1349.503017] env[67899]: DEBUG nova.virt.hardware [None req-93cc0406-1f51-47a1-9589-0404788b4e93 tempest-MigrationsAdminTest-102068815 tempest-MigrationsAdminTest-102068815-project-member] Got 1 possible topologies {{(pid=67899) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1349.503017] env[67899]: DEBUG nova.virt.hardware [None req-93cc0406-1f51-47a1-9589-0404788b4e93 tempest-MigrationsAdminTest-102068815 tempest-MigrationsAdminTest-102068815-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1349.503017] env[67899]: DEBUG nova.virt.hardware [None req-93cc0406-1f51-47a1-9589-0404788b4e93 tempest-MigrationsAdminTest-102068815 tempest-MigrationsAdminTest-102068815-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1349.503802] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-005752c3-aca0-4ba6-9aba-85906fee84e0 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1349.506684] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 5bb22bfa-4f1f-42a8-a7e3-5e806c70ae45 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1349.513488] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aa975d8f-e740-46d0-b75b-3b636b85cbce {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1349.518017] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 3a077713-f7a2-4a61-bb17-987af6a52c4a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1349.529596] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 53a54716-a3cd-4234-977d-0c82370025d1 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1349.541431] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 5d8b3009-ba5e-4f29-81fc-6d389ec30808 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1349.553367] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance a292a68e-deff-465b-81f0-727e75c2e212 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1349.553596] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67899) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1349.553742] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67899) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1349.796176] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-047ebe02-8337-47f5-aacc-89e37e2b7d77 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1349.805452] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-80688f28-19c9-439b-ae9c-82c33c3b3800 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1349.835346] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6256d2df-2ded-4efe-af21-91c4e66eb796 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1349.842939] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-33eef6c8-5ad8-4249-8a85-c8e94af0eae2 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1349.855887] env[67899]: DEBUG nova.compute.provider_tree [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Inventory has not changed in ProviderTree for provider: 
fffa0b42-f65d-4394-a98c-0df038b9ed4b {{(pid=67899) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1349.864472] env[67899]: DEBUG nova.scheduler.client.report [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Inventory has not changed for provider fffa0b42-f65d-4394-a98c-0df038b9ed4b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67899) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1349.879858] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67899) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1349.880073] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.543s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1350.398233] env[67899]: DEBUG nova.network.neutron [None req-93cc0406-1f51-47a1-9589-0404788b4e93 tempest-MigrationsAdminTest-102068815 tempest-MigrationsAdminTest-102068815-project-member] [instance: 8a157747-34e2-48f7-bf21-d17810122954] Successfully created port: 6708207f-73db-4713-91a7-c36ad324bda1 {{(pid=67899) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1350.515931] env[67899]: DEBUG nova.compute.manager [req-7bf63c25-bcd9-4cdc-9cf5-e3af792c16fb req-b1b77af5-eaba-4dec-b8bb-088a7679f832 service nova] [instance: dc7bf2b7-631d-4933-92db-1679ad823379] Received event network-vif-plugged-c83bd0c3-48a4-468d-b5f7-5245a4aacb1b {{(pid=67899) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1350.516159] env[67899]: DEBUG oslo_concurrency.lockutils [req-7bf63c25-bcd9-4cdc-9cf5-e3af792c16fb req-b1b77af5-eaba-4dec-b8bb-088a7679f832 service nova] Acquiring lock "dc7bf2b7-631d-4933-92db-1679ad823379-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1350.516367] env[67899]: DEBUG oslo_concurrency.lockutils [req-7bf63c25-bcd9-4cdc-9cf5-e3af792c16fb req-b1b77af5-eaba-4dec-b8bb-088a7679f832 service nova] Lock "dc7bf2b7-631d-4933-92db-1679ad823379-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1350.516537] env[67899]: DEBUG oslo_concurrency.lockutils [req-7bf63c25-bcd9-4cdc-9cf5-e3af792c16fb req-b1b77af5-eaba-4dec-b8bb-088a7679f832 service nova] Lock "dc7bf2b7-631d-4933-92db-1679ad823379-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1350.516699] env[67899]: DEBUG nova.compute.manager [req-7bf63c25-bcd9-4cdc-9cf5-e3af792c16fb req-b1b77af5-eaba-4dec-b8bb-088a7679f832 service 
nova] [instance: dc7bf2b7-631d-4933-92db-1679ad823379] No waiting events found dispatching network-vif-plugged-c83bd0c3-48a4-468d-b5f7-5245a4aacb1b {{(pid=67899) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1350.516929] env[67899]: WARNING nova.compute.manager [req-7bf63c25-bcd9-4cdc-9cf5-e3af792c16fb req-b1b77af5-eaba-4dec-b8bb-088a7679f832 service nova] [instance: dc7bf2b7-631d-4933-92db-1679ad823379] Received unexpected event network-vif-plugged-c83bd0c3-48a4-468d-b5f7-5245a4aacb1b for instance with vm_state building and task_state spawning. [ 1350.687582] env[67899]: DEBUG nova.network.neutron [None req-2236b4c1-ddc4-48e2-b219-bf8992f19847 tempest-InstanceActionsV221TestJSON-802466451 tempest-InstanceActionsV221TestJSON-802466451-project-member] [instance: dc7bf2b7-631d-4933-92db-1679ad823379] Successfully updated port: c83bd0c3-48a4-468d-b5f7-5245a4aacb1b {{(pid=67899) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1350.702037] env[67899]: DEBUG oslo_concurrency.lockutils [None req-2236b4c1-ddc4-48e2-b219-bf8992f19847 tempest-InstanceActionsV221TestJSON-802466451 tempest-InstanceActionsV221TestJSON-802466451-project-member] Acquiring lock "refresh_cache-dc7bf2b7-631d-4933-92db-1679ad823379" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1350.702197] env[67899]: DEBUG oslo_concurrency.lockutils [None req-2236b4c1-ddc4-48e2-b219-bf8992f19847 tempest-InstanceActionsV221TestJSON-802466451 tempest-InstanceActionsV221TestJSON-802466451-project-member] Acquired lock "refresh_cache-dc7bf2b7-631d-4933-92db-1679ad823379" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1350.702344] env[67899]: DEBUG nova.network.neutron [None req-2236b4c1-ddc4-48e2-b219-bf8992f19847 tempest-InstanceActionsV221TestJSON-802466451 tempest-InstanceActionsV221TestJSON-802466451-project-member] [instance: dc7bf2b7-631d-4933-92db-1679ad823379] Building network info cache for instance {{(pid=67899) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1350.760919] env[67899]: DEBUG nova.network.neutron [None req-2236b4c1-ddc4-48e2-b219-bf8992f19847 tempest-InstanceActionsV221TestJSON-802466451 tempest-InstanceActionsV221TestJSON-802466451-project-member] [instance: dc7bf2b7-631d-4933-92db-1679ad823379] Instance cache missing network info. 
{{(pid=67899) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1351.032763] env[67899]: DEBUG nova.network.neutron [None req-2236b4c1-ddc4-48e2-b219-bf8992f19847 tempest-InstanceActionsV221TestJSON-802466451 tempest-InstanceActionsV221TestJSON-802466451-project-member] [instance: dc7bf2b7-631d-4933-92db-1679ad823379] Updating instance_info_cache with network_info: [{"id": "c83bd0c3-48a4-468d-b5f7-5245a4aacb1b", "address": "fa:16:3e:dc:a9:17", "network": {"id": "c61a78cd-a57a-4516-bc5e-a3d1d7e8a093", "bridge": "br-int", "label": "tempest-InstanceActionsV221TestJSON-1825199359-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "7dc2733ec0f2400791b974bfe444aa6e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "8e272539-d425-489f-9a63-aba692e88933", "external-id": "nsx-vlan-transportzone-869", "segmentation_id": 869, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapc83bd0c3-48", "ovs_interfaceid": "c83bd0c3-48a4-468d-b5f7-5245a4aacb1b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1351.044541] env[67899]: DEBUG oslo_concurrency.lockutils [None req-2236b4c1-ddc4-48e2-b219-bf8992f19847 tempest-InstanceActionsV221TestJSON-802466451 tempest-InstanceActionsV221TestJSON-802466451-project-member] Releasing lock "refresh_cache-dc7bf2b7-631d-4933-92db-1679ad823379" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1351.044854] env[67899]: DEBUG nova.compute.manager [None req-2236b4c1-ddc4-48e2-b219-bf8992f19847 tempest-InstanceActionsV221TestJSON-802466451 tempest-InstanceActionsV221TestJSON-802466451-project-member] [instance: dc7bf2b7-631d-4933-92db-1679ad823379] Instance network_info: |[{"id": "c83bd0c3-48a4-468d-b5f7-5245a4aacb1b", "address": "fa:16:3e:dc:a9:17", "network": {"id": "c61a78cd-a57a-4516-bc5e-a3d1d7e8a093", "bridge": "br-int", "label": "tempest-InstanceActionsV221TestJSON-1825199359-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "7dc2733ec0f2400791b974bfe444aa6e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "8e272539-d425-489f-9a63-aba692e88933", "external-id": "nsx-vlan-transportzone-869", "segmentation_id": 869, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapc83bd0c3-48", "ovs_interfaceid": "c83bd0c3-48a4-468d-b5f7-5245a4aacb1b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67899) 
_allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1351.045253] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-2236b4c1-ddc4-48e2-b219-bf8992f19847 tempest-InstanceActionsV221TestJSON-802466451 tempest-InstanceActionsV221TestJSON-802466451-project-member] [instance: dc7bf2b7-631d-4933-92db-1679ad823379] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:dc:a9:17', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '8e272539-d425-489f-9a63-aba692e88933', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'c83bd0c3-48a4-468d-b5f7-5245a4aacb1b', 'vif_model': 'vmxnet3'}] {{(pid=67899) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1351.052970] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [None req-2236b4c1-ddc4-48e2-b219-bf8992f19847 tempest-InstanceActionsV221TestJSON-802466451 tempest-InstanceActionsV221TestJSON-802466451-project-member] Creating folder: Project (7dc2733ec0f2400791b974bfe444aa6e). Parent ref: group-v692900. {{(pid=67899) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1351.053486] env[67899]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-ffaedca3-ed07-42df-b57c-34e996d97fae {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1351.064618] env[67899]: INFO nova.virt.vmwareapi.vm_util [None req-2236b4c1-ddc4-48e2-b219-bf8992f19847 tempest-InstanceActionsV221TestJSON-802466451 tempest-InstanceActionsV221TestJSON-802466451-project-member] Created folder: Project (7dc2733ec0f2400791b974bfe444aa6e) in parent group-v692900. [ 1351.064746] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [None req-2236b4c1-ddc4-48e2-b219-bf8992f19847 tempest-InstanceActionsV221TestJSON-802466451 tempest-InstanceActionsV221TestJSON-802466451-project-member] Creating folder: Instances. Parent ref: group-v692979. {{(pid=67899) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1351.064969] env[67899]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-1c0ef889-ba74-4253-927f-75348f87c174 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1351.073808] env[67899]: INFO nova.virt.vmwareapi.vm_util [None req-2236b4c1-ddc4-48e2-b219-bf8992f19847 tempest-InstanceActionsV221TestJSON-802466451 tempest-InstanceActionsV221TestJSON-802466451-project-member] Created folder: Instances in parent group-v692979. [ 1351.076301] env[67899]: DEBUG oslo.service.loopingcall [None req-2236b4c1-ddc4-48e2-b219-bf8992f19847 tempest-InstanceActionsV221TestJSON-802466451 tempest-InstanceActionsV221TestJSON-802466451-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=67899) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1351.076301] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: dc7bf2b7-631d-4933-92db-1679ad823379] Creating VM on the ESX host {{(pid=67899) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1351.076301] env[67899]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-1ee8faca-90c5-49c9-84f8-46fe2962a8e3 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1351.096403] env[67899]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1351.096403] env[67899]: value = "task-3467953" [ 1351.096403] env[67899]: _type = "Task" [ 1351.096403] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1351.103969] env[67899]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467953, 'name': CreateVM_Task} progress is 0%. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1351.397257] env[67899]: DEBUG nova.network.neutron [None req-93cc0406-1f51-47a1-9589-0404788b4e93 tempest-MigrationsAdminTest-102068815 tempest-MigrationsAdminTest-102068815-project-member] [instance: 8a157747-34e2-48f7-bf21-d17810122954] Successfully updated port: 6708207f-73db-4713-91a7-c36ad324bda1 {{(pid=67899) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1351.405589] env[67899]: DEBUG oslo_concurrency.lockutils [None req-93cc0406-1f51-47a1-9589-0404788b4e93 tempest-MigrationsAdminTest-102068815 tempest-MigrationsAdminTest-102068815-project-member] Acquiring lock "refresh_cache-8a157747-34e2-48f7-bf21-d17810122954" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1351.405830] env[67899]: DEBUG oslo_concurrency.lockutils [None req-93cc0406-1f51-47a1-9589-0404788b4e93 tempest-MigrationsAdminTest-102068815 tempest-MigrationsAdminTest-102068815-project-member] Acquired lock "refresh_cache-8a157747-34e2-48f7-bf21-d17810122954" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1351.405867] env[67899]: DEBUG nova.network.neutron [None req-93cc0406-1f51-47a1-9589-0404788b4e93 tempest-MigrationsAdminTest-102068815 tempest-MigrationsAdminTest-102068815-project-member] [instance: 8a157747-34e2-48f7-bf21-d17810122954] Building network info cache for instance {{(pid=67899) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1351.471958] env[67899]: DEBUG nova.network.neutron [None req-93cc0406-1f51-47a1-9589-0404788b4e93 tempest-MigrationsAdminTest-102068815 tempest-MigrationsAdminTest-102068815-project-member] [instance: 8a157747-34e2-48f7-bf21-d17810122954] Instance cache missing network info. {{(pid=67899) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1351.605455] env[67899]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467953, 'name': CreateVM_Task} progress is 25%. 
{{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1351.630895] env[67899]: DEBUG nova.network.neutron [None req-93cc0406-1f51-47a1-9589-0404788b4e93 tempest-MigrationsAdminTest-102068815 tempest-MigrationsAdminTest-102068815-project-member] [instance: 8a157747-34e2-48f7-bf21-d17810122954] Updating instance_info_cache with network_info: [{"id": "6708207f-73db-4713-91a7-c36ad324bda1", "address": "fa:16:3e:5d:63:6e", "network": {"id": "87a2c8e7-d332-4028-bf39-1da2df6ff034", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.224", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "c1aaa2970e964d7b86557399120d12c1", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "2180b40f-2bb0-47da-ba80-c2fbe7f98af0", "external-id": "nsx-vlan-transportzone-970", "segmentation_id": 970, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap6708207f-73", "ovs_interfaceid": "6708207f-73db-4713-91a7-c36ad324bda1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1351.645625] env[67899]: DEBUG oslo_concurrency.lockutils [None req-93cc0406-1f51-47a1-9589-0404788b4e93 tempest-MigrationsAdminTest-102068815 tempest-MigrationsAdminTest-102068815-project-member] Releasing lock "refresh_cache-8a157747-34e2-48f7-bf21-d17810122954" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1351.645931] env[67899]: DEBUG nova.compute.manager [None req-93cc0406-1f51-47a1-9589-0404788b4e93 tempest-MigrationsAdminTest-102068815 tempest-MigrationsAdminTest-102068815-project-member] [instance: 8a157747-34e2-48f7-bf21-d17810122954] Instance network_info: |[{"id": "6708207f-73db-4713-91a7-c36ad324bda1", "address": "fa:16:3e:5d:63:6e", "network": {"id": "87a2c8e7-d332-4028-bf39-1da2df6ff034", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.224", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "c1aaa2970e964d7b86557399120d12c1", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "2180b40f-2bb0-47da-ba80-c2fbe7f98af0", "external-id": "nsx-vlan-transportzone-970", "segmentation_id": 970, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap6708207f-73", "ovs_interfaceid": "6708207f-73db-4713-91a7-c36ad324bda1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67899) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1351.646318] env[67899]: DEBUG nova.virt.vmwareapi.vmops 
[None req-93cc0406-1f51-47a1-9589-0404788b4e93 tempest-MigrationsAdminTest-102068815 tempest-MigrationsAdminTest-102068815-project-member] [instance: 8a157747-34e2-48f7-bf21-d17810122954] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:5d:63:6e', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '2180b40f-2bb0-47da-ba80-c2fbe7f98af0', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '6708207f-73db-4713-91a7-c36ad324bda1', 'vif_model': 'vmxnet3'}] {{(pid=67899) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1351.653450] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [None req-93cc0406-1f51-47a1-9589-0404788b4e93 tempest-MigrationsAdminTest-102068815 tempest-MigrationsAdminTest-102068815-project-member] Creating folder: Project (53462a4c49974da5b32f4498b906da6e). Parent ref: group-v692900. {{(pid=67899) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1351.653940] env[67899]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-3962b79c-89ec-4857-9673-2556301f8a21 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1351.664823] env[67899]: INFO nova.virt.vmwareapi.vm_util [None req-93cc0406-1f51-47a1-9589-0404788b4e93 tempest-MigrationsAdminTest-102068815 tempest-MigrationsAdminTest-102068815-project-member] Created folder: Project (53462a4c49974da5b32f4498b906da6e) in parent group-v692900. [ 1351.665007] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [None req-93cc0406-1f51-47a1-9589-0404788b4e93 tempest-MigrationsAdminTest-102068815 tempest-MigrationsAdminTest-102068815-project-member] Creating folder: Instances. Parent ref: group-v692982. {{(pid=67899) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1351.665221] env[67899]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-c4517b5e-8aaa-4305-8d23-3f12c8e2af09 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1351.674585] env[67899]: INFO nova.virt.vmwareapi.vm_util [None req-93cc0406-1f51-47a1-9589-0404788b4e93 tempest-MigrationsAdminTest-102068815 tempest-MigrationsAdminTest-102068815-project-member] Created folder: Instances in parent group-v692982. [ 1351.674800] env[67899]: DEBUG oslo.service.loopingcall [None req-93cc0406-1f51-47a1-9589-0404788b4e93 tempest-MigrationsAdminTest-102068815 tempest-MigrationsAdminTest-102068815-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67899) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1351.674976] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 8a157747-34e2-48f7-bf21-d17810122954] Creating VM on the ESX host {{(pid=67899) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1351.675171] env[67899]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-b3a04435-d8c0-4692-9611-bd3b237929d5 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1351.692580] env[67899]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1351.692580] env[67899]: value = "task-3467956" [ 1351.692580] env[67899]: _type = "Task" [ 1351.692580] env[67899]: } to complete. 
{{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1351.699785] env[67899]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467956, 'name': CreateVM_Task} progress is 0%. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1352.107903] env[67899]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467953, 'name': CreateVM_Task} progress is 25%. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1352.203020] env[67899]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467956, 'name': CreateVM_Task, 'duration_secs': 0.326471} completed successfully. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1352.203229] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 8a157747-34e2-48f7-bf21-d17810122954] Created VM on the ESX host {{(pid=67899) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1352.203879] env[67899]: DEBUG oslo_concurrency.lockutils [None req-93cc0406-1f51-47a1-9589-0404788b4e93 tempest-MigrationsAdminTest-102068815 tempest-MigrationsAdminTest-102068815-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1352.204057] env[67899]: DEBUG oslo_concurrency.lockutils [None req-93cc0406-1f51-47a1-9589-0404788b4e93 tempest-MigrationsAdminTest-102068815 tempest-MigrationsAdminTest-102068815-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1352.204368] env[67899]: DEBUG oslo_concurrency.lockutils [None req-93cc0406-1f51-47a1-9589-0404788b4e93 tempest-MigrationsAdminTest-102068815 tempest-MigrationsAdminTest-102068815-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1352.204613] env[67899]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-73fd1a59-846c-4234-93ef-481fbfa5e896 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1352.209137] env[67899]: DEBUG oslo_vmware.api [None req-93cc0406-1f51-47a1-9589-0404788b4e93 tempest-MigrationsAdminTest-102068815 tempest-MigrationsAdminTest-102068815-project-member] Waiting for the task: (returnval){ [ 1352.209137] env[67899]: value = "session[5298859c-589e-10f3-4f70-bf47a7ea371b]525aa4df-56f1-4027-36e7-5e433872fae2" [ 1352.209137] env[67899]: _type = "Task" [ 1352.209137] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1352.216440] env[67899]: DEBUG oslo_vmware.api [None req-93cc0406-1f51-47a1-9589-0404788b4e93 tempest-MigrationsAdminTest-102068815 tempest-MigrationsAdminTest-102068815-project-member] Task: {'id': session[5298859c-589e-10f3-4f70-bf47a7ea371b]525aa4df-56f1-4027-36e7-5e433872fae2, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1352.548617] env[67899]: DEBUG nova.compute.manager [req-c656954f-407a-4aff-bcf3-e955bcf59a5e req-64c4fbbe-fffb-43bd-8807-02602e1b1aac service nova] [instance: dc7bf2b7-631d-4933-92db-1679ad823379] Received event network-changed-c83bd0c3-48a4-468d-b5f7-5245a4aacb1b {{(pid=67899) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1352.548881] env[67899]: DEBUG nova.compute.manager [req-c656954f-407a-4aff-bcf3-e955bcf59a5e req-64c4fbbe-fffb-43bd-8807-02602e1b1aac service nova] [instance: dc7bf2b7-631d-4933-92db-1679ad823379] Refreshing instance network info cache due to event network-changed-c83bd0c3-48a4-468d-b5f7-5245a4aacb1b. {{(pid=67899) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1352.549075] env[67899]: DEBUG oslo_concurrency.lockutils [req-c656954f-407a-4aff-bcf3-e955bcf59a5e req-64c4fbbe-fffb-43bd-8807-02602e1b1aac service nova] Acquiring lock "refresh_cache-dc7bf2b7-631d-4933-92db-1679ad823379" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1352.549075] env[67899]: DEBUG oslo_concurrency.lockutils [req-c656954f-407a-4aff-bcf3-e955bcf59a5e req-64c4fbbe-fffb-43bd-8807-02602e1b1aac service nova] Acquired lock "refresh_cache-dc7bf2b7-631d-4933-92db-1679ad823379" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1352.549211] env[67899]: DEBUG nova.network.neutron [req-c656954f-407a-4aff-bcf3-e955bcf59a5e req-64c4fbbe-fffb-43bd-8807-02602e1b1aac service nova] [instance: dc7bf2b7-631d-4933-92db-1679ad823379] Refreshing network info cache for port c83bd0c3-48a4-468d-b5f7-5245a4aacb1b {{(pid=67899) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1352.608951] env[67899]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467953, 'name': CreateVM_Task} progress is 25%. 
{{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1352.721352] env[67899]: DEBUG oslo_concurrency.lockutils [None req-93cc0406-1f51-47a1-9589-0404788b4e93 tempest-MigrationsAdminTest-102068815 tempest-MigrationsAdminTest-102068815-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1352.721597] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-93cc0406-1f51-47a1-9589-0404788b4e93 tempest-MigrationsAdminTest-102068815 tempest-MigrationsAdminTest-102068815-project-member] [instance: 8a157747-34e2-48f7-bf21-d17810122954] Processing image c655a05a-4a40-4b3f-b609-3ba8116ad90f {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1352.721827] env[67899]: DEBUG oslo_concurrency.lockutils [None req-93cc0406-1f51-47a1-9589-0404788b4e93 tempest-MigrationsAdminTest-102068815 tempest-MigrationsAdminTest-102068815-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1352.810452] env[67899]: DEBUG nova.network.neutron [req-c656954f-407a-4aff-bcf3-e955bcf59a5e req-64c4fbbe-fffb-43bd-8807-02602e1b1aac service nova] [instance: dc7bf2b7-631d-4933-92db-1679ad823379] Updated VIF entry in instance network info cache for port c83bd0c3-48a4-468d-b5f7-5245a4aacb1b. {{(pid=67899) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1352.810806] env[67899]: DEBUG nova.network.neutron [req-c656954f-407a-4aff-bcf3-e955bcf59a5e req-64c4fbbe-fffb-43bd-8807-02602e1b1aac service nova] [instance: dc7bf2b7-631d-4933-92db-1679ad823379] Updating instance_info_cache with network_info: [{"id": "c83bd0c3-48a4-468d-b5f7-5245a4aacb1b", "address": "fa:16:3e:dc:a9:17", "network": {"id": "c61a78cd-a57a-4516-bc5e-a3d1d7e8a093", "bridge": "br-int", "label": "tempest-InstanceActionsV221TestJSON-1825199359-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "7dc2733ec0f2400791b974bfe444aa6e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "8e272539-d425-489f-9a63-aba692e88933", "external-id": "nsx-vlan-transportzone-869", "segmentation_id": 869, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapc83bd0c3-48", "ovs_interfaceid": "c83bd0c3-48a4-468d-b5f7-5245a4aacb1b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1352.820207] env[67899]: DEBUG oslo_concurrency.lockutils [req-c656954f-407a-4aff-bcf3-e955bcf59a5e req-64c4fbbe-fffb-43bd-8807-02602e1b1aac service nova] Releasing lock "refresh_cache-dc7bf2b7-631d-4933-92db-1679ad823379" {{(pid=67899) lock 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1352.820450] env[67899]: DEBUG nova.compute.manager [req-c656954f-407a-4aff-bcf3-e955bcf59a5e req-64c4fbbe-fffb-43bd-8807-02602e1b1aac service nova] [instance: 8a157747-34e2-48f7-bf21-d17810122954] Received event network-vif-plugged-6708207f-73db-4713-91a7-c36ad324bda1 {{(pid=67899) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1352.820644] env[67899]: DEBUG oslo_concurrency.lockutils [req-c656954f-407a-4aff-bcf3-e955bcf59a5e req-64c4fbbe-fffb-43bd-8807-02602e1b1aac service nova] Acquiring lock "8a157747-34e2-48f7-bf21-d17810122954-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1352.821975] env[67899]: DEBUG oslo_concurrency.lockutils [req-c656954f-407a-4aff-bcf3-e955bcf59a5e req-64c4fbbe-fffb-43bd-8807-02602e1b1aac service nova] Lock "8a157747-34e2-48f7-bf21-d17810122954-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1352.821975] env[67899]: DEBUG oslo_concurrency.lockutils [req-c656954f-407a-4aff-bcf3-e955bcf59a5e req-64c4fbbe-fffb-43bd-8807-02602e1b1aac service nova] Lock "8a157747-34e2-48f7-bf21-d17810122954-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1352.821975] env[67899]: DEBUG nova.compute.manager [req-c656954f-407a-4aff-bcf3-e955bcf59a5e req-64c4fbbe-fffb-43bd-8807-02602e1b1aac service nova] [instance: 8a157747-34e2-48f7-bf21-d17810122954] No waiting events found dispatching network-vif-plugged-6708207f-73db-4713-91a7-c36ad324bda1 {{(pid=67899) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1352.821975] env[67899]: WARNING nova.compute.manager [req-c656954f-407a-4aff-bcf3-e955bcf59a5e req-64c4fbbe-fffb-43bd-8807-02602e1b1aac service nova] [instance: 8a157747-34e2-48f7-bf21-d17810122954] Received unexpected event network-vif-plugged-6708207f-73db-4713-91a7-c36ad324bda1 for instance with vm_state building and task_state spawning. [ 1352.821975] env[67899]: DEBUG nova.compute.manager [req-c656954f-407a-4aff-bcf3-e955bcf59a5e req-64c4fbbe-fffb-43bd-8807-02602e1b1aac service nova] [instance: 8a157747-34e2-48f7-bf21-d17810122954] Received event network-changed-6708207f-73db-4713-91a7-c36ad324bda1 {{(pid=67899) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1352.821975] env[67899]: DEBUG nova.compute.manager [req-c656954f-407a-4aff-bcf3-e955bcf59a5e req-64c4fbbe-fffb-43bd-8807-02602e1b1aac service nova] [instance: 8a157747-34e2-48f7-bf21-d17810122954] Refreshing instance network info cache due to event network-changed-6708207f-73db-4713-91a7-c36ad324bda1. 
{{(pid=67899) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1352.821975] env[67899]: DEBUG oslo_concurrency.lockutils [req-c656954f-407a-4aff-bcf3-e955bcf59a5e req-64c4fbbe-fffb-43bd-8807-02602e1b1aac service nova] Acquiring lock "refresh_cache-8a157747-34e2-48f7-bf21-d17810122954" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1352.821975] env[67899]: DEBUG oslo_concurrency.lockutils [req-c656954f-407a-4aff-bcf3-e955bcf59a5e req-64c4fbbe-fffb-43bd-8807-02602e1b1aac service nova] Acquired lock "refresh_cache-8a157747-34e2-48f7-bf21-d17810122954" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1352.822227] env[67899]: DEBUG nova.network.neutron [req-c656954f-407a-4aff-bcf3-e955bcf59a5e req-64c4fbbe-fffb-43bd-8807-02602e1b1aac service nova] [instance: 8a157747-34e2-48f7-bf21-d17810122954] Refreshing network info cache for port 6708207f-73db-4713-91a7-c36ad324bda1 {{(pid=67899) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1353.782376] env[67899]: DEBUG nova.network.neutron [req-c656954f-407a-4aff-bcf3-e955bcf59a5e req-64c4fbbe-fffb-43bd-8807-02602e1b1aac service nova] [instance: 8a157747-34e2-48f7-bf21-d17810122954] Updated VIF entry in instance network info cache for port 6708207f-73db-4713-91a7-c36ad324bda1. {{(pid=67899) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1353.782725] env[67899]: DEBUG nova.network.neutron [req-c656954f-407a-4aff-bcf3-e955bcf59a5e req-64c4fbbe-fffb-43bd-8807-02602e1b1aac service nova] [instance: 8a157747-34e2-48f7-bf21-d17810122954] Updating instance_info_cache with network_info: [{"id": "6708207f-73db-4713-91a7-c36ad324bda1", "address": "fa:16:3e:5d:63:6e", "network": {"id": "87a2c8e7-d332-4028-bf39-1da2df6ff034", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.224", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "c1aaa2970e964d7b86557399120d12c1", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "2180b40f-2bb0-47da-ba80-c2fbe7f98af0", "external-id": "nsx-vlan-transportzone-970", "segmentation_id": 970, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap6708207f-73", "ovs_interfaceid": "6708207f-73db-4713-91a7-c36ad324bda1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1353.789993] env[67899]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467953, 'name': CreateVM_Task, 'duration_secs': 2.087854} completed successfully. 
{{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1353.790164] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: dc7bf2b7-631d-4933-92db-1679ad823379] Created VM on the ESX host {{(pid=67899) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1353.790767] env[67899]: DEBUG oslo_concurrency.lockutils [None req-2236b4c1-ddc4-48e2-b219-bf8992f19847 tempest-InstanceActionsV221TestJSON-802466451 tempest-InstanceActionsV221TestJSON-802466451-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1353.790927] env[67899]: DEBUG oslo_concurrency.lockutils [None req-2236b4c1-ddc4-48e2-b219-bf8992f19847 tempest-InstanceActionsV221TestJSON-802466451 tempest-InstanceActionsV221TestJSON-802466451-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1353.791247] env[67899]: DEBUG oslo_concurrency.lockutils [None req-2236b4c1-ddc4-48e2-b219-bf8992f19847 tempest-InstanceActionsV221TestJSON-802466451 tempest-InstanceActionsV221TestJSON-802466451-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1353.791500] env[67899]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-6e2796a2-6c52-49f9-bddf-f53eccc4a28f {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1353.793863] env[67899]: DEBUG oslo_concurrency.lockutils [req-c656954f-407a-4aff-bcf3-e955bcf59a5e req-64c4fbbe-fffb-43bd-8807-02602e1b1aac service nova] Releasing lock "refresh_cache-8a157747-34e2-48f7-bf21-d17810122954" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1353.797034] env[67899]: DEBUG oslo_vmware.api [None req-2236b4c1-ddc4-48e2-b219-bf8992f19847 tempest-InstanceActionsV221TestJSON-802466451 tempest-InstanceActionsV221TestJSON-802466451-project-member] Waiting for the task: (returnval){ [ 1353.797034] env[67899]: value = "session[5298859c-589e-10f3-4f70-bf47a7ea371b]526bcb99-f976-905e-5ce8-c5ea41ae37c8" [ 1353.797034] env[67899]: _type = "Task" [ 1353.797034] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1353.804785] env[67899]: DEBUG oslo_vmware.api [None req-2236b4c1-ddc4-48e2-b219-bf8992f19847 tempest-InstanceActionsV221TestJSON-802466451 tempest-InstanceActionsV221TestJSON-802466451-project-member] Task: {'id': session[5298859c-589e-10f3-4f70-bf47a7ea371b]526bcb99-f976-905e-5ce8-c5ea41ae37c8, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1354.307509] env[67899]: DEBUG oslo_concurrency.lockutils [None req-2236b4c1-ddc4-48e2-b219-bf8992f19847 tempest-InstanceActionsV221TestJSON-802466451 tempest-InstanceActionsV221TestJSON-802466451-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1354.307818] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-2236b4c1-ddc4-48e2-b219-bf8992f19847 tempest-InstanceActionsV221TestJSON-802466451 tempest-InstanceActionsV221TestJSON-802466451-project-member] [instance: dc7bf2b7-631d-4933-92db-1679ad823379] Processing image c655a05a-4a40-4b3f-b609-3ba8116ad90f {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1354.308078] env[67899]: DEBUG oslo_concurrency.lockutils [None req-2236b4c1-ddc4-48e2-b219-bf8992f19847 tempest-InstanceActionsV221TestJSON-802466451 tempest-InstanceActionsV221TestJSON-802466451-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1363.414669] env[67899]: DEBUG oslo_concurrency.lockutils [None req-776b3004-f999-4da3-8d13-d0ff4fb667e0 tempest-ServersNegativeTestJSON-421907278 tempest-ServersNegativeTestJSON-421907278-project-member] Acquiring lock "7a82e877-8a39-4684-8b75-711b7bedddac" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1397.111054] env[67899]: WARNING oslo_vmware.rw_handles [None req-9f2dbd61-5a36-44ea-b13c-793c018e3d97 tempest-ServerMetadataNegativeTestJSON-1266469450 tempest-ServerMetadataNegativeTestJSON-1266469450-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1397.111054] env[67899]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1397.111054] env[67899]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1397.111054] env[67899]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1397.111054] env[67899]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1397.111054] env[67899]: ERROR oslo_vmware.rw_handles response.begin() [ 1397.111054] env[67899]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1397.111054] env[67899]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1397.111054] env[67899]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1397.111054] env[67899]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1397.111054] env[67899]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1397.111054] env[67899]: ERROR oslo_vmware.rw_handles [ 1397.111054] env[67899]: DEBUG nova.virt.vmwareapi.images [None req-9f2dbd61-5a36-44ea-b13c-793c018e3d97 tempest-ServerMetadataNegativeTestJSON-1266469450 
tempest-ServerMetadataNegativeTestJSON-1266469450-project-member] [instance: 37ab08db-50ab-4c30-9e18-05007c5d1c27] Downloaded image file data c655a05a-4a40-4b3f-b609-3ba8116ad90f to vmware_temp/345c15f9-e2f1-437f-95b5-1cd96c9b4aab/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk on the data store datastore1 {{(pid=67899) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1397.113761] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-9f2dbd61-5a36-44ea-b13c-793c018e3d97 tempest-ServerMetadataNegativeTestJSON-1266469450 tempest-ServerMetadataNegativeTestJSON-1266469450-project-member] [instance: 37ab08db-50ab-4c30-9e18-05007c5d1c27] Caching image {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1397.114059] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [None req-9f2dbd61-5a36-44ea-b13c-793c018e3d97 tempest-ServerMetadataNegativeTestJSON-1266469450 tempest-ServerMetadataNegativeTestJSON-1266469450-project-member] Copying Virtual Disk [datastore1] vmware_temp/345c15f9-e2f1-437f-95b5-1cd96c9b4aab/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk to [datastore1] vmware_temp/345c15f9-e2f1-437f-95b5-1cd96c9b4aab/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk {{(pid=67899) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1397.114389] env[67899]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-9571e526-e5c0-410a-8b55-c60af38e802b {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1397.123299] env[67899]: DEBUG oslo_vmware.api [None req-9f2dbd61-5a36-44ea-b13c-793c018e3d97 tempest-ServerMetadataNegativeTestJSON-1266469450 tempest-ServerMetadataNegativeTestJSON-1266469450-project-member] Waiting for the task: (returnval){ [ 1397.123299] env[67899]: value = "task-3467957" [ 1397.123299] env[67899]: _type = "Task" [ 1397.123299] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1397.130782] env[67899]: DEBUG oslo_vmware.api [None req-9f2dbd61-5a36-44ea-b13c-793c018e3d97 tempest-ServerMetadataNegativeTestJSON-1266469450 tempest-ServerMetadataNegativeTestJSON-1266469450-project-member] Task: {'id': task-3467957, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1397.633433] env[67899]: DEBUG oslo_vmware.exceptions [None req-9f2dbd61-5a36-44ea-b13c-793c018e3d97 tempest-ServerMetadataNegativeTestJSON-1266469450 tempest-ServerMetadataNegativeTestJSON-1266469450-project-member] Fault InvalidArgument not matched. 
{{(pid=67899) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1397.633729] env[67899]: DEBUG oslo_concurrency.lockutils [None req-9f2dbd61-5a36-44ea-b13c-793c018e3d97 tempest-ServerMetadataNegativeTestJSON-1266469450 tempest-ServerMetadataNegativeTestJSON-1266469450-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1397.634330] env[67899]: ERROR nova.compute.manager [None req-9f2dbd61-5a36-44ea-b13c-793c018e3d97 tempest-ServerMetadataNegativeTestJSON-1266469450 tempest-ServerMetadataNegativeTestJSON-1266469450-project-member] [instance: 37ab08db-50ab-4c30-9e18-05007c5d1c27] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1397.634330] env[67899]: Faults: ['InvalidArgument'] [ 1397.634330] env[67899]: ERROR nova.compute.manager [instance: 37ab08db-50ab-4c30-9e18-05007c5d1c27] Traceback (most recent call last): [ 1397.634330] env[67899]: ERROR nova.compute.manager [instance: 37ab08db-50ab-4c30-9e18-05007c5d1c27] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1397.634330] env[67899]: ERROR nova.compute.manager [instance: 37ab08db-50ab-4c30-9e18-05007c5d1c27] yield resources [ 1397.634330] env[67899]: ERROR nova.compute.manager [instance: 37ab08db-50ab-4c30-9e18-05007c5d1c27] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1397.634330] env[67899]: ERROR nova.compute.manager [instance: 37ab08db-50ab-4c30-9e18-05007c5d1c27] self.driver.spawn(context, instance, image_meta, [ 1397.634330] env[67899]: ERROR nova.compute.manager [instance: 37ab08db-50ab-4c30-9e18-05007c5d1c27] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1397.634330] env[67899]: ERROR nova.compute.manager [instance: 37ab08db-50ab-4c30-9e18-05007c5d1c27] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1397.634330] env[67899]: ERROR nova.compute.manager [instance: 37ab08db-50ab-4c30-9e18-05007c5d1c27] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1397.634330] env[67899]: ERROR nova.compute.manager [instance: 37ab08db-50ab-4c30-9e18-05007c5d1c27] self._fetch_image_if_missing(context, vi) [ 1397.634330] env[67899]: ERROR nova.compute.manager [instance: 37ab08db-50ab-4c30-9e18-05007c5d1c27] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1397.634330] env[67899]: ERROR nova.compute.manager [instance: 37ab08db-50ab-4c30-9e18-05007c5d1c27] image_cache(vi, tmp_image_ds_loc) [ 1397.634330] env[67899]: ERROR nova.compute.manager [instance: 37ab08db-50ab-4c30-9e18-05007c5d1c27] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1397.634330] env[67899]: ERROR nova.compute.manager [instance: 37ab08db-50ab-4c30-9e18-05007c5d1c27] vm_util.copy_virtual_disk( [ 1397.634330] env[67899]: ERROR nova.compute.manager [instance: 37ab08db-50ab-4c30-9e18-05007c5d1c27] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1397.634330] env[67899]: ERROR nova.compute.manager [instance: 37ab08db-50ab-4c30-9e18-05007c5d1c27] session._wait_for_task(vmdk_copy_task) [ 1397.634330] env[67899]: ERROR nova.compute.manager [instance: 37ab08db-50ab-4c30-9e18-05007c5d1c27] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1397.634330] env[67899]: ERROR nova.compute.manager [instance: 37ab08db-50ab-4c30-9e18-05007c5d1c27] return self.wait_for_task(task_ref) [ 1397.634330] env[67899]: ERROR nova.compute.manager [instance: 37ab08db-50ab-4c30-9e18-05007c5d1c27] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1397.634330] env[67899]: ERROR nova.compute.manager [instance: 37ab08db-50ab-4c30-9e18-05007c5d1c27] return evt.wait() [ 1397.634330] env[67899]: ERROR nova.compute.manager [instance: 37ab08db-50ab-4c30-9e18-05007c5d1c27] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1397.634330] env[67899]: ERROR nova.compute.manager [instance: 37ab08db-50ab-4c30-9e18-05007c5d1c27] result = hub.switch() [ 1397.634330] env[67899]: ERROR nova.compute.manager [instance: 37ab08db-50ab-4c30-9e18-05007c5d1c27] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1397.634330] env[67899]: ERROR nova.compute.manager [instance: 37ab08db-50ab-4c30-9e18-05007c5d1c27] return self.greenlet.switch() [ 1397.634330] env[67899]: ERROR nova.compute.manager [instance: 37ab08db-50ab-4c30-9e18-05007c5d1c27] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1397.634330] env[67899]: ERROR nova.compute.manager [instance: 37ab08db-50ab-4c30-9e18-05007c5d1c27] self.f(*self.args, **self.kw) [ 1397.634330] env[67899]: ERROR nova.compute.manager [instance: 37ab08db-50ab-4c30-9e18-05007c5d1c27] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1397.634330] env[67899]: ERROR nova.compute.manager [instance: 37ab08db-50ab-4c30-9e18-05007c5d1c27] raise exceptions.translate_fault(task_info.error) [ 1397.634330] env[67899]: ERROR nova.compute.manager [instance: 37ab08db-50ab-4c30-9e18-05007c5d1c27] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1397.634330] env[67899]: ERROR nova.compute.manager [instance: 37ab08db-50ab-4c30-9e18-05007c5d1c27] Faults: ['InvalidArgument'] [ 1397.634330] env[67899]: ERROR nova.compute.manager [instance: 37ab08db-50ab-4c30-9e18-05007c5d1c27] [ 1397.635363] env[67899]: INFO nova.compute.manager [None req-9f2dbd61-5a36-44ea-b13c-793c018e3d97 tempest-ServerMetadataNegativeTestJSON-1266469450 tempest-ServerMetadataNegativeTestJSON-1266469450-project-member] [instance: 37ab08db-50ab-4c30-9e18-05007c5d1c27] Terminating instance [ 1397.636837] env[67899]: DEBUG oslo_concurrency.lockutils [None req-6a55d310-17fb-4564-8f2a-4ee92b2e7084 tempest-ListServersNegativeTestJSON-1933841315 tempest-ListServersNegativeTestJSON-1933841315-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1397.636837] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-6a55d310-17fb-4564-8f2a-4ee92b2e7084 tempest-ListServersNegativeTestJSON-1933841315 tempest-ListServersNegativeTestJSON-1933841315-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1397.636837] env[67899]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with 
opID=oslo.vmware-fb96d63d-9380-4fe8-80ec-16b4b232aa28 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1397.639057] env[67899]: DEBUG nova.compute.manager [None req-9f2dbd61-5a36-44ea-b13c-793c018e3d97 tempest-ServerMetadataNegativeTestJSON-1266469450 tempest-ServerMetadataNegativeTestJSON-1266469450-project-member] [instance: 37ab08db-50ab-4c30-9e18-05007c5d1c27] Start destroying the instance on the hypervisor. {{(pid=67899) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1397.639394] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-9f2dbd61-5a36-44ea-b13c-793c018e3d97 tempest-ServerMetadataNegativeTestJSON-1266469450 tempest-ServerMetadataNegativeTestJSON-1266469450-project-member] [instance: 37ab08db-50ab-4c30-9e18-05007c5d1c27] Destroying instance {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1397.640116] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cb4a213e-56b4-4ef8-9fcf-4b31e4db9ceb {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1397.646570] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-9f2dbd61-5a36-44ea-b13c-793c018e3d97 tempest-ServerMetadataNegativeTestJSON-1266469450 tempest-ServerMetadataNegativeTestJSON-1266469450-project-member] [instance: 37ab08db-50ab-4c30-9e18-05007c5d1c27] Unregistering the VM {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1397.646774] env[67899]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-2688cd9c-0a5f-4615-9526-89350edc55dc {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1397.649184] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-6a55d310-17fb-4564-8f2a-4ee92b2e7084 tempest-ListServersNegativeTestJSON-1933841315 tempest-ListServersNegativeTestJSON-1933841315-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1397.649440] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-6a55d310-17fb-4564-8f2a-4ee92b2e7084 tempest-ListServersNegativeTestJSON-1933841315 tempest-ListServersNegativeTestJSON-1933841315-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67899) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1397.650114] env[67899]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-9e10edae-27f1-4c3a-b1c3-9d74a5a2db60 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1397.654804] env[67899]: DEBUG oslo_vmware.api [None req-6a55d310-17fb-4564-8f2a-4ee92b2e7084 tempest-ListServersNegativeTestJSON-1933841315 tempest-ListServersNegativeTestJSON-1933841315-project-member] Waiting for the task: (returnval){ [ 1397.654804] env[67899]: value = "session[5298859c-589e-10f3-4f70-bf47a7ea371b]52bae85b-cf99-2ed5-1fc8-5835504691e9" [ 1397.654804] env[67899]: _type = "Task" [ 1397.654804] env[67899]: } to complete. 
{{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1397.661810] env[67899]: DEBUG oslo_vmware.api [None req-6a55d310-17fb-4564-8f2a-4ee92b2e7084 tempest-ListServersNegativeTestJSON-1933841315 tempest-ListServersNegativeTestJSON-1933841315-project-member] Task: {'id': session[5298859c-589e-10f3-4f70-bf47a7ea371b]52bae85b-cf99-2ed5-1fc8-5835504691e9, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1397.742638] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-9f2dbd61-5a36-44ea-b13c-793c018e3d97 tempest-ServerMetadataNegativeTestJSON-1266469450 tempest-ServerMetadataNegativeTestJSON-1266469450-project-member] [instance: 37ab08db-50ab-4c30-9e18-05007c5d1c27] Unregistered the VM {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1397.742866] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-9f2dbd61-5a36-44ea-b13c-793c018e3d97 tempest-ServerMetadataNegativeTestJSON-1266469450 tempest-ServerMetadataNegativeTestJSON-1266469450-project-member] [instance: 37ab08db-50ab-4c30-9e18-05007c5d1c27] Deleting contents of the VM from datastore datastore1 {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1397.743058] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-9f2dbd61-5a36-44ea-b13c-793c018e3d97 tempest-ServerMetadataNegativeTestJSON-1266469450 tempest-ServerMetadataNegativeTestJSON-1266469450-project-member] Deleting the datastore file [datastore1] 37ab08db-50ab-4c30-9e18-05007c5d1c27 {{(pid=67899) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1397.743333] env[67899]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-63cd4e85-9e2f-4860-881a-356d6a89dd68 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1397.749347] env[67899]: DEBUG oslo_vmware.api [None req-9f2dbd61-5a36-44ea-b13c-793c018e3d97 tempest-ServerMetadataNegativeTestJSON-1266469450 tempest-ServerMetadataNegativeTestJSON-1266469450-project-member] Waiting for the task: (returnval){ [ 1397.749347] env[67899]: value = "task-3467959" [ 1397.749347] env[67899]: _type = "Task" [ 1397.749347] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1397.757207] env[67899]: DEBUG oslo_vmware.api [None req-9f2dbd61-5a36-44ea-b13c-793c018e3d97 tempest-ServerMetadataNegativeTestJSON-1266469450 tempest-ServerMetadataNegativeTestJSON-1266469450-project-member] Task: {'id': task-3467959, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1398.166822] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-6a55d310-17fb-4564-8f2a-4ee92b2e7084 tempest-ListServersNegativeTestJSON-1933841315 tempest-ListServersNegativeTestJSON-1933841315-project-member] [instance: 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4] Preparing fetch location {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1398.167131] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-6a55d310-17fb-4564-8f2a-4ee92b2e7084 tempest-ListServersNegativeTestJSON-1933841315 tempest-ListServersNegativeTestJSON-1933841315-project-member] Creating directory with path [datastore1] vmware_temp/8b23a959-1174-4c26-9836-11a38b3a80db/c655a05a-4a40-4b3f-b609-3ba8116ad90f {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1398.167131] env[67899]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-7e8a26ba-b979-413c-b58f-975f47694b33 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1398.179603] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-6a55d310-17fb-4564-8f2a-4ee92b2e7084 tempest-ListServersNegativeTestJSON-1933841315 tempest-ListServersNegativeTestJSON-1933841315-project-member] Created directory with path [datastore1] vmware_temp/8b23a959-1174-4c26-9836-11a38b3a80db/c655a05a-4a40-4b3f-b609-3ba8116ad90f {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1398.179811] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-6a55d310-17fb-4564-8f2a-4ee92b2e7084 tempest-ListServersNegativeTestJSON-1933841315 tempest-ListServersNegativeTestJSON-1933841315-project-member] [instance: 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4] Fetch image to [datastore1] vmware_temp/8b23a959-1174-4c26-9836-11a38b3a80db/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1398.180024] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-6a55d310-17fb-4564-8f2a-4ee92b2e7084 tempest-ListServersNegativeTestJSON-1933841315 tempest-ListServersNegativeTestJSON-1933841315-project-member] [instance: 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4] Downloading image file data c655a05a-4a40-4b3f-b609-3ba8116ad90f to [datastore1] vmware_temp/8b23a959-1174-4c26-9836-11a38b3a80db/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk on the data store datastore1 {{(pid=67899) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1398.180689] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3f69c3ee-d121-4500-967e-f89982dfe263 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1398.187072] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c8d90ab7-bb63-4f3e-9861-33fe202dc63c {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1398.195968] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5dd83495-8564-4555-81da-eb23660415c0 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1398.226772] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-e55ebea5-3008-403d-ac7a-d40e9fee4d13 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1398.231887] env[67899]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-f37d8b6e-3493-4434-848b-451fb467d58a {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1398.254557] env[67899]: DEBUG nova.virt.vmwareapi.images [None req-6a55d310-17fb-4564-8f2a-4ee92b2e7084 tempest-ListServersNegativeTestJSON-1933841315 tempest-ListServersNegativeTestJSON-1933841315-project-member] [instance: 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4] Downloading image file data c655a05a-4a40-4b3f-b609-3ba8116ad90f to the data store datastore1 {{(pid=67899) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1398.260694] env[67899]: DEBUG oslo_vmware.api [None req-9f2dbd61-5a36-44ea-b13c-793c018e3d97 tempest-ServerMetadataNegativeTestJSON-1266469450 tempest-ServerMetadataNegativeTestJSON-1266469450-project-member] Task: {'id': task-3467959, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.063882} completed successfully. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1398.260925] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-9f2dbd61-5a36-44ea-b13c-793c018e3d97 tempest-ServerMetadataNegativeTestJSON-1266469450 tempest-ServerMetadataNegativeTestJSON-1266469450-project-member] Deleted the datastore file {{(pid=67899) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1398.261120] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-9f2dbd61-5a36-44ea-b13c-793c018e3d97 tempest-ServerMetadataNegativeTestJSON-1266469450 tempest-ServerMetadataNegativeTestJSON-1266469450-project-member] [instance: 37ab08db-50ab-4c30-9e18-05007c5d1c27] Deleted contents of the VM from datastore datastore1 {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1398.261290] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-9f2dbd61-5a36-44ea-b13c-793c018e3d97 tempest-ServerMetadataNegativeTestJSON-1266469450 tempest-ServerMetadataNegativeTestJSON-1266469450-project-member] [instance: 37ab08db-50ab-4c30-9e18-05007c5d1c27] Instance destroyed {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1398.261460] env[67899]: INFO nova.compute.manager [None req-9f2dbd61-5a36-44ea-b13c-793c018e3d97 tempest-ServerMetadataNegativeTestJSON-1266469450 tempest-ServerMetadataNegativeTestJSON-1266469450-project-member] [instance: 37ab08db-50ab-4c30-9e18-05007c5d1c27] Took 0.62 seconds to destroy the instance on the hypervisor. 
[ 1398.263569] env[67899]: DEBUG nova.compute.claims [None req-9f2dbd61-5a36-44ea-b13c-793c018e3d97 tempest-ServerMetadataNegativeTestJSON-1266469450 tempest-ServerMetadataNegativeTestJSON-1266469450-project-member] [instance: 37ab08db-50ab-4c30-9e18-05007c5d1c27] Aborting claim: {{(pid=67899) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1398.263743] env[67899]: DEBUG oslo_concurrency.lockutils [None req-9f2dbd61-5a36-44ea-b13c-793c018e3d97 tempest-ServerMetadataNegativeTestJSON-1266469450 tempest-ServerMetadataNegativeTestJSON-1266469450-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1398.263957] env[67899]: DEBUG oslo_concurrency.lockutils [None req-9f2dbd61-5a36-44ea-b13c-793c018e3d97 tempest-ServerMetadataNegativeTestJSON-1266469450 tempest-ServerMetadataNegativeTestJSON-1266469450-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1398.303807] env[67899]: DEBUG oslo_vmware.rw_handles [None req-6a55d310-17fb-4564-8f2a-4ee92b2e7084 tempest-ListServersNegativeTestJSON-1933841315 tempest-ListServersNegativeTestJSON-1933841315-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/8b23a959-1174-4c26-9836-11a38b3a80db/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67899) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1398.365787] env[67899]: DEBUG oslo_vmware.rw_handles [None req-6a55d310-17fb-4564-8f2a-4ee92b2e7084 tempest-ListServersNegativeTestJSON-1933841315 tempest-ListServersNegativeTestJSON-1933841315-project-member] Completed reading data from the image iterator. {{(pid=67899) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1398.365984] env[67899]: DEBUG oslo_vmware.rw_handles [None req-6a55d310-17fb-4564-8f2a-4ee92b2e7084 tempest-ListServersNegativeTestJSON-1933841315 tempest-ListServersNegativeTestJSON-1933841315-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/8b23a959-1174-4c26-9836-11a38b3a80db/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67899) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}}
[ 1398.565624] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4446ca76-09ed-47c4-a63b-d2ae67affc47 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1398.573268] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dbfb3946-39e1-4608-828c-581148f1c618 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1398.603044] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-09e9904a-4625-45be-8056-ec10747c4bfe {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1398.609466] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5b889f59-57db-4047-abc3-ecd33e9271f9 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1398.622170] env[67899]: DEBUG nova.compute.provider_tree [None req-9f2dbd61-5a36-44ea-b13c-793c018e3d97 tempest-ServerMetadataNegativeTestJSON-1266469450 tempest-ServerMetadataNegativeTestJSON-1266469450-project-member] Inventory has not changed in ProviderTree for provider: fffa0b42-f65d-4394-a98c-0df038b9ed4b {{(pid=67899) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1398.631224] env[67899]: DEBUG nova.scheduler.client.report [None req-9f2dbd61-5a36-44ea-b13c-793c018e3d97 tempest-ServerMetadataNegativeTestJSON-1266469450 tempest-ServerMetadataNegativeTestJSON-1266469450-project-member] Inventory has not changed for provider fffa0b42-f65d-4394-a98c-0df038b9ed4b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67899) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1398.645591] env[67899]: DEBUG oslo_concurrency.lockutils [None req-9f2dbd61-5a36-44ea-b13c-793c018e3d97 tempest-ServerMetadataNegativeTestJSON-1266469450 tempest-ServerMetadataNegativeTestJSON-1266469450-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.382s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1398.646132] env[67899]: ERROR nova.compute.manager [None req-9f2dbd61-5a36-44ea-b13c-793c018e3d97 tempest-ServerMetadataNegativeTestJSON-1266469450 tempest-ServerMetadataNegativeTestJSON-1266469450-project-member] [instance: 37ab08db-50ab-4c30-9e18-05007c5d1c27] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1398.646132] env[67899]: Faults: ['InvalidArgument']
[ 1398.646132] env[67899]: ERROR nova.compute.manager [instance: 37ab08db-50ab-4c30-9e18-05007c5d1c27] Traceback (most recent call last):
[ 1398.646132] env[67899]: ERROR nova.compute.manager [instance: 37ab08db-50ab-4c30-9e18-05007c5d1c27] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 1398.646132] env[67899]: ERROR nova.compute.manager [instance: 37ab08db-50ab-4c30-9e18-05007c5d1c27] self.driver.spawn(context, instance, image_meta,
[ 1398.646132] env[67899]: ERROR nova.compute.manager [instance: 37ab08db-50ab-4c30-9e18-05007c5d1c27] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 1398.646132] env[67899]: ERROR nova.compute.manager [instance: 37ab08db-50ab-4c30-9e18-05007c5d1c27] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 1398.646132] env[67899]: ERROR nova.compute.manager [instance: 37ab08db-50ab-4c30-9e18-05007c5d1c27] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 1398.646132] env[67899]: ERROR nova.compute.manager [instance: 37ab08db-50ab-4c30-9e18-05007c5d1c27] self._fetch_image_if_missing(context, vi)
[ 1398.646132] env[67899]: ERROR nova.compute.manager [instance: 37ab08db-50ab-4c30-9e18-05007c5d1c27] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 1398.646132] env[67899]: ERROR nova.compute.manager [instance: 37ab08db-50ab-4c30-9e18-05007c5d1c27] image_cache(vi, tmp_image_ds_loc)
[ 1398.646132] env[67899]: ERROR nova.compute.manager [instance: 37ab08db-50ab-4c30-9e18-05007c5d1c27] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 1398.646132] env[67899]: ERROR nova.compute.manager [instance: 37ab08db-50ab-4c30-9e18-05007c5d1c27] vm_util.copy_virtual_disk(
[ 1398.646132] env[67899]: ERROR nova.compute.manager [instance: 37ab08db-50ab-4c30-9e18-05007c5d1c27] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 1398.646132] env[67899]: ERROR nova.compute.manager [instance: 37ab08db-50ab-4c30-9e18-05007c5d1c27] session._wait_for_task(vmdk_copy_task)
[ 1398.646132] env[67899]: ERROR nova.compute.manager [instance: 37ab08db-50ab-4c30-9e18-05007c5d1c27] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 1398.646132] env[67899]: ERROR nova.compute.manager [instance: 37ab08db-50ab-4c30-9e18-05007c5d1c27] return self.wait_for_task(task_ref)
[ 1398.646132] env[67899]: ERROR nova.compute.manager [instance: 37ab08db-50ab-4c30-9e18-05007c5d1c27] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 1398.646132] env[67899]: ERROR nova.compute.manager [instance: 37ab08db-50ab-4c30-9e18-05007c5d1c27] return evt.wait()
[ 1398.646132] env[67899]: ERROR nova.compute.manager [instance: 37ab08db-50ab-4c30-9e18-05007c5d1c27] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 1398.646132] env[67899]: ERROR nova.compute.manager [instance: 37ab08db-50ab-4c30-9e18-05007c5d1c27] result = hub.switch()
[ 1398.646132] env[67899]: ERROR nova.compute.manager [instance: 37ab08db-50ab-4c30-9e18-05007c5d1c27] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 1398.646132] env[67899]: ERROR nova.compute.manager [instance: 37ab08db-50ab-4c30-9e18-05007c5d1c27] return self.greenlet.switch()
[ 1398.646132] env[67899]: ERROR nova.compute.manager [instance: 37ab08db-50ab-4c30-9e18-05007c5d1c27] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 1398.646132] env[67899]: ERROR nova.compute.manager [instance: 37ab08db-50ab-4c30-9e18-05007c5d1c27] self.f(*self.args, **self.kw)
[ 1398.646132] env[67899]: ERROR nova.compute.manager [instance: 37ab08db-50ab-4c30-9e18-05007c5d1c27] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 1398.646132] env[67899]: ERROR nova.compute.manager [instance: 37ab08db-50ab-4c30-9e18-05007c5d1c27] raise exceptions.translate_fault(task_info.error)
[ 1398.646132] env[67899]: ERROR nova.compute.manager [instance: 37ab08db-50ab-4c30-9e18-05007c5d1c27] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1398.646132] env[67899]: ERROR nova.compute.manager [instance: 37ab08db-50ab-4c30-9e18-05007c5d1c27] Faults: ['InvalidArgument']
[ 1398.646132] env[67899]: ERROR nova.compute.manager [instance: 37ab08db-50ab-4c30-9e18-05007c5d1c27]
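The traceback above bottoms out in the vCenter CopyVirtualDisk_Task call rejecting the disk's fileType with an 'InvalidArgument' fault. For orientation, here is a hedged sketch of that call pattern through oslo.vmware; the helper name and arguments are illustrative, not Nova's actual vm_util.copy_virtual_disk:

    from oslo_vmware import api

    def copy_vmdk(session, source_path, dest_path, dc_ref):
        # VirtualDiskManager.CopyVirtualDisk_Task is the vSphere call that
        # raised the fault above; dc_ref is a Datacenter managed-object ref.
        vdm = session.vim.service_content.virtualDiskManager
        task = session.invoke_api(session.vim, 'CopyVirtualDisk_Task', vdm,
                                  sourceName=source_path,
                                  sourceDatacenter=dc_ref,
                                  destName=dest_path,
                                  destDatacenter=dc_ref)
        # wait_for_task() polls the task and translates a task error into
        # the VimFaultException seen in the traceback.
        return session.wait_for_task(task)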
[ 1398.646946] env[67899]: DEBUG nova.compute.utils [None req-9f2dbd61-5a36-44ea-b13c-793c018e3d97 tempest-ServerMetadataNegativeTestJSON-1266469450 tempest-ServerMetadataNegativeTestJSON-1266469450-project-member] [instance: 37ab08db-50ab-4c30-9e18-05007c5d1c27] VimFaultException {{(pid=67899) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 1398.648233] env[67899]: DEBUG nova.compute.manager [None req-9f2dbd61-5a36-44ea-b13c-793c018e3d97 tempest-ServerMetadataNegativeTestJSON-1266469450 tempest-ServerMetadataNegativeTestJSON-1266469450-project-member] [instance: 37ab08db-50ab-4c30-9e18-05007c5d1c27] Build of instance 37ab08db-50ab-4c30-9e18-05007c5d1c27 was re-scheduled: A specified parameter was not correct: fileType
[ 1398.648233] env[67899]: Faults: ['InvalidArgument'] {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}}
[ 1398.648619] env[67899]: DEBUG nova.compute.manager [None req-9f2dbd61-5a36-44ea-b13c-793c018e3d97 tempest-ServerMetadataNegativeTestJSON-1266469450 tempest-ServerMetadataNegativeTestJSON-1266469450-project-member] [instance: 37ab08db-50ab-4c30-9e18-05007c5d1c27] Unplugging VIFs for instance {{(pid=67899) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}}
[ 1398.648797] env[67899]: DEBUG nova.compute.manager [None req-9f2dbd61-5a36-44ea-b13c-793c018e3d97 tempest-ServerMetadataNegativeTestJSON-1266469450 tempest-ServerMetadataNegativeTestJSON-1266469450-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=67899) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}}
[ 1398.648964] env[67899]: DEBUG nova.compute.manager [None req-9f2dbd61-5a36-44ea-b13c-793c018e3d97 tempest-ServerMetadataNegativeTestJSON-1266469450 tempest-ServerMetadataNegativeTestJSON-1266469450-project-member] [instance: 37ab08db-50ab-4c30-9e18-05007c5d1c27] Deallocating network for instance {{(pid=67899) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}}
[ 1398.649161] env[67899]: DEBUG nova.network.neutron [None req-9f2dbd61-5a36-44ea-b13c-793c018e3d97 tempest-ServerMetadataNegativeTestJSON-1266469450 tempest-ServerMetadataNegativeTestJSON-1266469450-project-member] [instance: 37ab08db-50ab-4c30-9e18-05007c5d1c27] deallocate_for_instance() {{(pid=67899) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 1399.114526] env[67899]: DEBUG nova.network.neutron [None req-9f2dbd61-5a36-44ea-b13c-793c018e3d97 tempest-ServerMetadataNegativeTestJSON-1266469450 tempest-ServerMetadataNegativeTestJSON-1266469450-project-member] [instance: 37ab08db-50ab-4c30-9e18-05007c5d1c27] Updating instance_info_cache with network_info: [] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1399.127717] env[67899]: INFO nova.compute.manager [None req-9f2dbd61-5a36-44ea-b13c-793c018e3d97 tempest-ServerMetadataNegativeTestJSON-1266469450 tempest-ServerMetadataNegativeTestJSON-1266469450-project-member] [instance: 37ab08db-50ab-4c30-9e18-05007c5d1c27] Took 0.48 seconds to deallocate network for instance.
[ 1399.224559] env[67899]: INFO nova.scheduler.client.report [None req-9f2dbd61-5a36-44ea-b13c-793c018e3d97 tempest-ServerMetadataNegativeTestJSON-1266469450 tempest-ServerMetadataNegativeTestJSON-1266469450-project-member] Deleted allocations for instance 37ab08db-50ab-4c30-9e18-05007c5d1c27
[ 1399.248261] env[67899]: DEBUG oslo_concurrency.lockutils [None req-9f2dbd61-5a36-44ea-b13c-793c018e3d97 tempest-ServerMetadataNegativeTestJSON-1266469450 tempest-ServerMetadataNegativeTestJSON-1266469450-project-member] Lock "37ab08db-50ab-4c30-9e18-05007c5d1c27" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 629.311s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1399.249526] env[67899]: DEBUG oslo_concurrency.lockutils [None req-4c053ca0-ad20-49b5-8063-64ecb8b0fcd4 tempest-ServerMetadataNegativeTestJSON-1266469450 tempest-ServerMetadataNegativeTestJSON-1266469450-project-member] Lock "37ab08db-50ab-4c30-9e18-05007c5d1c27" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 432.903s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1399.249793] env[67899]: DEBUG oslo_concurrency.lockutils [None req-4c053ca0-ad20-49b5-8063-64ecb8b0fcd4 tempest-ServerMetadataNegativeTestJSON-1266469450 tempest-ServerMetadataNegativeTestJSON-1266469450-project-member] Acquiring lock "37ab08db-50ab-4c30-9e18-05007c5d1c27-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1399.250011] env[67899]: DEBUG oslo_concurrency.lockutils [None req-4c053ca0-ad20-49b5-8063-64ecb8b0fcd4 tempest-ServerMetadataNegativeTestJSON-1266469450 tempest-ServerMetadataNegativeTestJSON-1266469450-project-member] Lock "37ab08db-50ab-4c30-9e18-05007c5d1c27-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1399.250194] env[67899]: DEBUG oslo_concurrency.lockutils [None req-4c053ca0-ad20-49b5-8063-64ecb8b0fcd4 tempest-ServerMetadataNegativeTestJSON-1266469450 tempest-ServerMetadataNegativeTestJSON-1266469450-project-member] Lock "37ab08db-50ab-4c30-9e18-05007c5d1c27-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1399.252516] env[67899]: INFO nova.compute.manager [None req-4c053ca0-ad20-49b5-8063-64ecb8b0fcd4 tempest-ServerMetadataNegativeTestJSON-1266469450 tempest-ServerMetadataNegativeTestJSON-1266469450-project-member] [instance: 37ab08db-50ab-4c30-9e18-05007c5d1c27] Terminating instance
[ 1399.254076] env[67899]: DEBUG nova.compute.manager [None req-4c053ca0-ad20-49b5-8063-64ecb8b0fcd4 tempest-ServerMetadataNegativeTestJSON-1266469450 tempest-ServerMetadataNegativeTestJSON-1266469450-project-member] [instance: 37ab08db-50ab-4c30-9e18-05007c5d1c27] Start destroying the instance on the hypervisor. {{(pid=67899) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}}
[ 1399.254276] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-4c053ca0-ad20-49b5-8063-64ecb8b0fcd4 tempest-ServerMetadataNegativeTestJSON-1266469450 tempest-ServerMetadataNegativeTestJSON-1266469450-project-member] [instance: 37ab08db-50ab-4c30-9e18-05007c5d1c27] Destroying instance {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 1399.254924] env[67899]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-7eaf4a11-f117-4af0-8b19-fcef21841b2d {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1399.265033] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-72565b5c-e3a3-4c6a-9976-4b609a1e135b {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1399.276196] env[67899]: DEBUG nova.compute.manager [None req-4f3b59ba-f200-4682-b584-0c61035dfcc2 tempest-SecurityGroupsTestJSON-311274219 tempest-SecurityGroupsTestJSON-311274219-project-member] [instance: 03684169-e2c8-4cf5-8e79-b118725927f1] Starting instance... {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}}
[ 1399.297946] env[67899]: WARNING nova.virt.vmwareapi.vmops [None req-4c053ca0-ad20-49b5-8063-64ecb8b0fcd4 tempest-ServerMetadataNegativeTestJSON-1266469450 tempest-ServerMetadataNegativeTestJSON-1266469450-project-member] [instance: 37ab08db-50ab-4c30-9e18-05007c5d1c27] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 37ab08db-50ab-4c30-9e18-05007c5d1c27 could not be found.
[ 1399.298169] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-4c053ca0-ad20-49b5-8063-64ecb8b0fcd4 tempest-ServerMetadataNegativeTestJSON-1266469450 tempest-ServerMetadataNegativeTestJSON-1266469450-project-member] [instance: 37ab08db-50ab-4c30-9e18-05007c5d1c27] Instance destroyed {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 1399.298383] env[67899]: INFO nova.compute.manager [None req-4c053ca0-ad20-49b5-8063-64ecb8b0fcd4 tempest-ServerMetadataNegativeTestJSON-1266469450 tempest-ServerMetadataNegativeTestJSON-1266469450-project-member] [instance: 37ab08db-50ab-4c30-9e18-05007c5d1c27] Took 0.04 seconds to destroy the instance on the hypervisor.
[ 1399.298630] env[67899]: DEBUG oslo.service.loopingcall [None req-4c053ca0-ad20-49b5-8063-64ecb8b0fcd4 tempest-ServerMetadataNegativeTestJSON-1266469450 tempest-ServerMetadataNegativeTestJSON-1266469450-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=67899) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 1399.298847] env[67899]: DEBUG nova.compute.manager [-] [instance: 37ab08db-50ab-4c30-9e18-05007c5d1c27] Deallocating network for instance {{(pid=67899) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}}
[ 1399.298941] env[67899]: DEBUG nova.network.neutron [-] [instance: 37ab08db-50ab-4c30-9e18-05007c5d1c27] deallocate_for_instance() {{(pid=67899) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 1399.321400] env[67899]: DEBUG oslo_concurrency.lockutils [None req-4f3b59ba-f200-4682-b584-0c61035dfcc2 tempest-SecurityGroupsTestJSON-311274219 tempest-SecurityGroupsTestJSON-311274219-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1399.321625] env[67899]: DEBUG oslo_concurrency.lockutils [None req-4f3b59ba-f200-4682-b584-0c61035dfcc2 tempest-SecurityGroupsTestJSON-311274219 tempest-SecurityGroupsTestJSON-311274219-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1399.323084] env[67899]: INFO nova.compute.claims [None req-4f3b59ba-f200-4682-b584-0c61035dfcc2 tempest-SecurityGroupsTestJSON-311274219 tempest-SecurityGroupsTestJSON-311274219-project-member] [instance: 03684169-e2c8-4cf5-8e79-b118725927f1] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 1399.325656] env[67899]: DEBUG nova.network.neutron [-] [instance: 37ab08db-50ab-4c30-9e18-05007c5d1c27] Updating instance_info_cache with network_info: [] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1399.332911] env[67899]: INFO nova.compute.manager [-] [instance: 37ab08db-50ab-4c30-9e18-05007c5d1c27] Took 0.03 seconds to deallocate network for instance.
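The "Acquiring lock ... / acquired ... waited / released ... held" triplets throughout this log come from oslo.concurrency's lockutils wrapper. A minimal sketch of the pattern (the function below is hypothetical, not the resource tracker's code):

    from oslo_concurrency import lockutils

    @lockutils.synchronized('compute_resources')
    def instance_claim(instance_uuid):
        # Runs with the process-local "compute_resources" lock held,
        # serializing claims exactly as the wait/held timings above show.
        pass

    # The same named lock is also available as a context manager:
    with lockutils.lock('compute_resources'):
        pass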
[ 1399.427148] env[67899]: DEBUG oslo_concurrency.lockutils [None req-4c053ca0-ad20-49b5-8063-64ecb8b0fcd4 tempest-ServerMetadataNegativeTestJSON-1266469450 tempest-ServerMetadataNegativeTestJSON-1266469450-project-member] Lock "37ab08db-50ab-4c30-9e18-05007c5d1c27" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.178s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1399.428294] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "37ab08db-50ab-4c30-9e18-05007c5d1c27" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 224.408s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1399.428569] env[67899]: INFO nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 37ab08db-50ab-4c30-9e18-05007c5d1c27] During sync_power_state the instance has a pending task (deleting). Skip.
[ 1399.428811] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "37ab08db-50ab-4c30-9e18-05007c5d1c27" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.001s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1399.588561] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1b4de319-7722-46ca-8d5c-7b89520d8b2f {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1399.595899] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9b3ce78f-9020-4ee1-9980-8ca04db34ff2 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1399.624839] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e6601406-8a83-4548-a16c-249f4a51b358 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1399.631907] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ccbb7f9a-eb3c-4436-b872-141428ce6065 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1399.644552] env[67899]: DEBUG nova.compute.provider_tree [None req-4f3b59ba-f200-4682-b584-0c61035dfcc2 tempest-SecurityGroupsTestJSON-311274219 tempest-SecurityGroupsTestJSON-311274219-project-member] Inventory has not changed in ProviderTree for provider: fffa0b42-f65d-4394-a98c-0df038b9ed4b {{(pid=67899) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1399.653133] env[67899]: DEBUG nova.scheduler.client.report [None req-4f3b59ba-f200-4682-b584-0c61035dfcc2 tempest-SecurityGroupsTestJSON-311274219 tempest-SecurityGroupsTestJSON-311274219-project-member] Inventory has not changed for provider fffa0b42-f65d-4394-a98c-0df038b9ed4b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67899) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1399.666210] env[67899]: DEBUG oslo_concurrency.lockutils [None req-4f3b59ba-f200-4682-b584-0c61035dfcc2 tempest-SecurityGroupsTestJSON-311274219 tempest-SecurityGroupsTestJSON-311274219-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.345s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1399.666681] env[67899]: DEBUG nova.compute.manager [None req-4f3b59ba-f200-4682-b584-0c61035dfcc2 tempest-SecurityGroupsTestJSON-311274219 tempest-SecurityGroupsTestJSON-311274219-project-member] [instance: 03684169-e2c8-4cf5-8e79-b118725927f1] Start building networks asynchronously for instance. {{(pid=67899) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}}
[ 1399.701911] env[67899]: DEBUG nova.compute.utils [None req-4f3b59ba-f200-4682-b584-0c61035dfcc2 tempest-SecurityGroupsTestJSON-311274219 tempest-SecurityGroupsTestJSON-311274219-project-member] Using /dev/sd instead of None {{(pid=67899) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 1399.703949] env[67899]: DEBUG nova.compute.manager [None req-4f3b59ba-f200-4682-b584-0c61035dfcc2 tempest-SecurityGroupsTestJSON-311274219 tempest-SecurityGroupsTestJSON-311274219-project-member] [instance: 03684169-e2c8-4cf5-8e79-b118725927f1] Allocating IP information in the background. {{(pid=67899) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}}
[ 1399.703949] env[67899]: DEBUG nova.network.neutron [None req-4f3b59ba-f200-4682-b584-0c61035dfcc2 tempest-SecurityGroupsTestJSON-311274219 tempest-SecurityGroupsTestJSON-311274219-project-member] [instance: 03684169-e2c8-4cf5-8e79-b118725927f1] allocate_for_instance() {{(pid=67899) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 1399.713028] env[67899]: DEBUG nova.compute.manager [None req-4f3b59ba-f200-4682-b584-0c61035dfcc2 tempest-SecurityGroupsTestJSON-311274219 tempest-SecurityGroupsTestJSON-311274219-project-member] [instance: 03684169-e2c8-4cf5-8e79-b118725927f1] Start building block device mappings for instance. {{(pid=67899) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}}
[ 1399.757070] env[67899]: DEBUG nova.policy [None req-4f3b59ba-f200-4682-b584-0c61035dfcc2 tempest-SecurityGroupsTestJSON-311274219 tempest-SecurityGroupsTestJSON-311274219-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'efefb07c7b59480ea902af870f8a3a81', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '367cc1fda0f644c4a22b67636cd3573a', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67899) authorize /opt/stack/nova/nova/policy.py:203}}
[ 1399.773632] env[67899]: DEBUG nova.compute.manager [None req-4f3b59ba-f200-4682-b584-0c61035dfcc2 tempest-SecurityGroupsTestJSON-311274219 tempest-SecurityGroupsTestJSON-311274219-project-member] [instance: 03684169-e2c8-4cf5-8e79-b118725927f1] Start spawning the instance on the hypervisor. {{(pid=67899) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}}
[ 1399.797690] env[67899]: DEBUG nova.virt.hardware [None req-4f3b59ba-f200-4682-b584-0c61035dfcc2 tempest-SecurityGroupsTestJSON-311274219 tempest-SecurityGroupsTestJSON-311274219-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T22:07:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T22:07:13Z,direct_url=,disk_format='vmdk',id=c655a05a-4a40-4b3f-b609-3ba8116ad90f,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c1aaa2970e964d7b86557399120d12c1',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T22:07:14Z,virtual_size=,visibility=), allow threads: False {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 1399.797922] env[67899]: DEBUG nova.virt.hardware [None req-4f3b59ba-f200-4682-b584-0c61035dfcc2 tempest-SecurityGroupsTestJSON-311274219 tempest-SecurityGroupsTestJSON-311274219-project-member] Flavor limits 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 1399.798086] env[67899]: DEBUG nova.virt.hardware [None req-4f3b59ba-f200-4682-b584-0c61035dfcc2 tempest-SecurityGroupsTestJSON-311274219 tempest-SecurityGroupsTestJSON-311274219-project-member] Image limits 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 1399.798292] env[67899]: DEBUG nova.virt.hardware [None req-4f3b59ba-f200-4682-b584-0c61035dfcc2 tempest-SecurityGroupsTestJSON-311274219 tempest-SecurityGroupsTestJSON-311274219-project-member] Flavor pref 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 1399.798440] env[67899]: DEBUG nova.virt.hardware [None req-4f3b59ba-f200-4682-b584-0c61035dfcc2 tempest-SecurityGroupsTestJSON-311274219 tempest-SecurityGroupsTestJSON-311274219-project-member] Image pref 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 1399.798586] env[67899]: DEBUG nova.virt.hardware [None req-4f3b59ba-f200-4682-b584-0c61035dfcc2 tempest-SecurityGroupsTestJSON-311274219 tempest-SecurityGroupsTestJSON-311274219-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 1399.798785] env[67899]: DEBUG nova.virt.hardware [None req-4f3b59ba-f200-4682-b584-0c61035dfcc2 tempest-SecurityGroupsTestJSON-311274219 tempest-SecurityGroupsTestJSON-311274219-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 1399.798939] env[67899]: DEBUG nova.virt.hardware [None req-4f3b59ba-f200-4682-b584-0c61035dfcc2 tempest-SecurityGroupsTestJSON-311274219 tempest-SecurityGroupsTestJSON-311274219-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67899) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 1399.799112] env[67899]: DEBUG nova.virt.hardware [None req-4f3b59ba-f200-4682-b584-0c61035dfcc2 tempest-SecurityGroupsTestJSON-311274219 tempest-SecurityGroupsTestJSON-311274219-project-member] Got 1 possible topologies {{(pid=67899) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
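The hardware lines above enumerate CPU topologies whose sockets*cores*threads product matches the flavor's vCPU count, bounded by the logged 65536 limits. A rough illustration of that enumeration (an approximation of what nova.virt.hardware does, not its actual code):

    def possible_topologies(vcpus, max_sockets=65536, max_cores=65536,
                            max_threads=65536):
        # Yield (sockets, cores, threads) triples that multiply to vcpus
        # and stay within the per-dimension limits.
        for s in range(1, min(vcpus, max_sockets) + 1):
            for c in range(1, min(vcpus, max_cores) + 1):
                for t in range(1, min(vcpus, max_threads) + 1):
                    if s * c * t == vcpus:
                        yield (s, c, t)

    print(list(possible_topologies(1)))  # [(1, 1, 1)], as logged above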
[ 1399.799306] env[67899]: DEBUG nova.virt.hardware [None req-4f3b59ba-f200-4682-b584-0c61035dfcc2 tempest-SecurityGroupsTestJSON-311274219 tempest-SecurityGroupsTestJSON-311274219-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 1399.799470] env[67899]: DEBUG nova.virt.hardware [None req-4f3b59ba-f200-4682-b584-0c61035dfcc2 tempest-SecurityGroupsTestJSON-311274219 tempest-SecurityGroupsTestJSON-311274219-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 1399.800305] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f755e56d-930f-4a68-a9fa-652c901362cd {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1399.807944] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8b13025b-d066-42dc-9e7f-327bda74634f {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1400.081996] env[67899]: DEBUG nova.network.neutron [None req-4f3b59ba-f200-4682-b584-0c61035dfcc2 tempest-SecurityGroupsTestJSON-311274219 tempest-SecurityGroupsTestJSON-311274219-project-member] [instance: 03684169-e2c8-4cf5-8e79-b118725927f1] Successfully created port: e426cfe9-eeeb-437a-9fe3-608b15f1b8ac {{(pid=67899) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 1400.780467] env[67899]: DEBUG nova.network.neutron [None req-4f3b59ba-f200-4682-b584-0c61035dfcc2 tempest-SecurityGroupsTestJSON-311274219 tempest-SecurityGroupsTestJSON-311274219-project-member] [instance: 03684169-e2c8-4cf5-8e79-b118725927f1] Successfully updated port: e426cfe9-eeeb-437a-9fe3-608b15f1b8ac {{(pid=67899) _update_port /opt/stack/nova/nova/network/neutron.py:586}}
[ 1400.795631] env[67899]: DEBUG oslo_concurrency.lockutils [None req-4f3b59ba-f200-4682-b584-0c61035dfcc2 tempest-SecurityGroupsTestJSON-311274219 tempest-SecurityGroupsTestJSON-311274219-project-member] Acquiring lock "refresh_cache-03684169-e2c8-4cf5-8e79-b118725927f1" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1400.795795] env[67899]: DEBUG oslo_concurrency.lockutils [None req-4f3b59ba-f200-4682-b584-0c61035dfcc2 tempest-SecurityGroupsTestJSON-311274219 tempest-SecurityGroupsTestJSON-311274219-project-member] Acquired lock "refresh_cache-03684169-e2c8-4cf5-8e79-b118725927f1" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1400.795945] env[67899]: DEBUG nova.network.neutron [None req-4f3b59ba-f200-4682-b584-0c61035dfcc2 tempest-SecurityGroupsTestJSON-311274219 tempest-SecurityGroupsTestJSON-311274219-project-member] [instance: 03684169-e2c8-4cf5-8e79-b118725927f1] Building network info cache for instance {{(pid=67899) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 1400.831941] env[67899]: DEBUG nova.network.neutron [None req-4f3b59ba-f200-4682-b584-0c61035dfcc2 tempest-SecurityGroupsTestJSON-311274219 tempest-SecurityGroupsTestJSON-311274219-project-member] [instance: 03684169-e2c8-4cf5-8e79-b118725927f1] Instance cache missing network info. {{(pid=67899) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 1401.007172] env[67899]: DEBUG nova.network.neutron [None req-4f3b59ba-f200-4682-b584-0c61035dfcc2 tempest-SecurityGroupsTestJSON-311274219 tempest-SecurityGroupsTestJSON-311274219-project-member] [instance: 03684169-e2c8-4cf5-8e79-b118725927f1] Updating instance_info_cache with network_info: [{"id": "e426cfe9-eeeb-437a-9fe3-608b15f1b8ac", "address": "fa:16:3e:59:da:8a", "network": {"id": "b877e544-5c1f-470d-bfe5-b32a369a2d85", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-2118923995-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "367cc1fda0f644c4a22b67636cd3573a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "7869cc8e-e58f-4fd6-88d7-85a18e43cd3a", "external-id": "nsx-vlan-transportzone-927", "segmentation_id": 927, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape426cfe9-ee", "ovs_interfaceid": "e426cfe9-eeeb-437a-9fe3-608b15f1b8ac", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1401.023303] env[67899]: DEBUG oslo_concurrency.lockutils [None req-4f3b59ba-f200-4682-b584-0c61035dfcc2 tempest-SecurityGroupsTestJSON-311274219 tempest-SecurityGroupsTestJSON-311274219-project-member] Releasing lock "refresh_cache-03684169-e2c8-4cf5-8e79-b118725927f1" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
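The network_info blob above is a list of VIF dicts. Pulling the fixed address out of it takes a few lines of plain Python (the structure below is abridged from the logged entry):

    network_info = [{
        'id': 'e426cfe9-eeeb-437a-9fe3-608b15f1b8ac',
        'network': {'subnets': [{'cidr': '192.168.128.0/28',
                                 'ips': [{'address': '192.168.128.9',
                                          'type': 'fixed'}]}]},
    }]
    for vif in network_info:
        for subnet in vif['network']['subnets']:
            for ip in subnet['ips']:
                if ip['type'] == 'fixed':
                    # prints: e426cfe9-eeeb-437a-9fe3-608b15f1b8ac 192.168.128.9
                    print(vif['id'], ip['address'])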
"profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67899) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1401.024113] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-4f3b59ba-f200-4682-b584-0c61035dfcc2 tempest-SecurityGroupsTestJSON-311274219 tempest-SecurityGroupsTestJSON-311274219-project-member] [instance: 03684169-e2c8-4cf5-8e79-b118725927f1] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:59:da:8a', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '7869cc8e-e58f-4fd6-88d7-85a18e43cd3a', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'e426cfe9-eeeb-437a-9fe3-608b15f1b8ac', 'vif_model': 'vmxnet3'}] {{(pid=67899) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1401.032089] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [None req-4f3b59ba-f200-4682-b584-0c61035dfcc2 tempest-SecurityGroupsTestJSON-311274219 tempest-SecurityGroupsTestJSON-311274219-project-member] Creating folder: Project (367cc1fda0f644c4a22b67636cd3573a). Parent ref: group-v692900. {{(pid=67899) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1401.032674] env[67899]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-d31a1092-2337-4f55-b2e5-5de9c8242860 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1401.044100] env[67899]: INFO nova.virt.vmwareapi.vm_util [None req-4f3b59ba-f200-4682-b584-0c61035dfcc2 tempest-SecurityGroupsTestJSON-311274219 tempest-SecurityGroupsTestJSON-311274219-project-member] Created folder: Project (367cc1fda0f644c4a22b67636cd3573a) in parent group-v692900. [ 1401.044304] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [None req-4f3b59ba-f200-4682-b584-0c61035dfcc2 tempest-SecurityGroupsTestJSON-311274219 tempest-SecurityGroupsTestJSON-311274219-project-member] Creating folder: Instances. Parent ref: group-v692985. {{(pid=67899) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1401.044564] env[67899]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-1408b3dc-54d4-4a26-9972-4a693ba31b28 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1401.054224] env[67899]: INFO nova.virt.vmwareapi.vm_util [None req-4f3b59ba-f200-4682-b584-0c61035dfcc2 tempest-SecurityGroupsTestJSON-311274219 tempest-SecurityGroupsTestJSON-311274219-project-member] Created folder: Instances in parent group-v692985. [ 1401.054224] env[67899]: DEBUG oslo.service.loopingcall [None req-4f3b59ba-f200-4682-b584-0c61035dfcc2 tempest-SecurityGroupsTestJSON-311274219 tempest-SecurityGroupsTestJSON-311274219-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
[ 1401.054937] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 03684169-e2c8-4cf5-8e79-b118725927f1] Creating VM on the ESX host {{(pid=67899) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}}
[ 1401.054937] env[67899]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-aabc3907-9729-49c0-9623-55a14ba0199f {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1401.074018] env[67899]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){
[ 1401.074018] env[67899]: value = "task-3467962"
[ 1401.074018] env[67899]: _type = "Task"
[ 1401.074018] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1401.081708] env[67899]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467962, 'name': CreateVM_Task} progress is 0%. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1401.150754] env[67899]: DEBUG nova.compute.manager [req-fe46726a-142d-4766-a118-b3d09a565106 req-56cc0859-45d8-4536-be90-fc78707fc149 service nova] [instance: 03684169-e2c8-4cf5-8e79-b118725927f1] Received event network-vif-plugged-e426cfe9-eeeb-437a-9fe3-608b15f1b8ac {{(pid=67899) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}}
[ 1401.151019] env[67899]: DEBUG oslo_concurrency.lockutils [req-fe46726a-142d-4766-a118-b3d09a565106 req-56cc0859-45d8-4536-be90-fc78707fc149 service nova] Acquiring lock "03684169-e2c8-4cf5-8e79-b118725927f1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1401.151193] env[67899]: DEBUG oslo_concurrency.lockutils [req-fe46726a-142d-4766-a118-b3d09a565106 req-56cc0859-45d8-4536-be90-fc78707fc149 service nova] Lock "03684169-e2c8-4cf5-8e79-b118725927f1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1401.151360] env[67899]: DEBUG oslo_concurrency.lockutils [req-fe46726a-142d-4766-a118-b3d09a565106 req-56cc0859-45d8-4536-be90-fc78707fc149 service nova] Lock "03684169-e2c8-4cf5-8e79-b118725927f1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1401.151522] env[67899]: DEBUG nova.compute.manager [req-fe46726a-142d-4766-a118-b3d09a565106 req-56cc0859-45d8-4536-be90-fc78707fc149 service nova] [instance: 03684169-e2c8-4cf5-8e79-b118725927f1] No waiting events found dispatching network-vif-plugged-e426cfe9-eeeb-437a-9fe3-608b15f1b8ac {{(pid=67899) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}}
[ 1401.151682] env[67899]: WARNING nova.compute.manager [req-fe46726a-142d-4766-a118-b3d09a565106 req-56cc0859-45d8-4536-be90-fc78707fc149 service nova] [instance: 03684169-e2c8-4cf5-8e79-b118725927f1] Received unexpected event network-vif-plugged-e426cfe9-eeeb-437a-9fe3-608b15f1b8ac for instance with vm_state building and task_state spawning.
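The CreateVM_Task invocation and the "progress is 0%" polling above follow oslo.vmware's session/task pattern. A minimal sketch, assuming a reachable vCenter and placeholder refs for the folder, config spec and resource pool (all illustrative, not Nova's vm_util.create_vm):

    from oslo_vmware import api

    session = api.VMwareAPISession('vc1.example.test', 'user', 'secret',
                                   api_retry_count=3, task_poll_interval=0.5)

    def create_vm(folder_ref, config_spec, pool_ref):
        # Returns the task info once the Task reaches 'success';
        # a task error is raised as a translated fault exception.
        task = session.invoke_api(session.vim, 'CreateVM_Task', folder_ref,
                                  config=config_spec, pool=pool_ref)
        return session.wait_for_task(task)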
[ 1401.151840] env[67899]: DEBUG nova.compute.manager [req-fe46726a-142d-4766-a118-b3d09a565106 req-56cc0859-45d8-4536-be90-fc78707fc149 service nova] [instance: 03684169-e2c8-4cf5-8e79-b118725927f1] Received event network-changed-e426cfe9-eeeb-437a-9fe3-608b15f1b8ac {{(pid=67899) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}}
[ 1401.151992] env[67899]: DEBUG nova.compute.manager [req-fe46726a-142d-4766-a118-b3d09a565106 req-56cc0859-45d8-4536-be90-fc78707fc149 service nova] [instance: 03684169-e2c8-4cf5-8e79-b118725927f1] Refreshing instance network info cache due to event network-changed-e426cfe9-eeeb-437a-9fe3-608b15f1b8ac. {{(pid=67899) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}}
[ 1401.152196] env[67899]: DEBUG oslo_concurrency.lockutils [req-fe46726a-142d-4766-a118-b3d09a565106 req-56cc0859-45d8-4536-be90-fc78707fc149 service nova] Acquiring lock "refresh_cache-03684169-e2c8-4cf5-8e79-b118725927f1" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1401.152334] env[67899]: DEBUG oslo_concurrency.lockutils [req-fe46726a-142d-4766-a118-b3d09a565106 req-56cc0859-45d8-4536-be90-fc78707fc149 service nova] Acquired lock "refresh_cache-03684169-e2c8-4cf5-8e79-b118725927f1" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1401.152484] env[67899]: DEBUG nova.network.neutron [req-fe46726a-142d-4766-a118-b3d09a565106 req-56cc0859-45d8-4536-be90-fc78707fc149 service nova] [instance: 03684169-e2c8-4cf5-8e79-b118725927f1] Refreshing network info cache for port e426cfe9-eeeb-437a-9fe3-608b15f1b8ac {{(pid=67899) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}}
[ 1401.421575] env[67899]: DEBUG nova.network.neutron [req-fe46726a-142d-4766-a118-b3d09a565106 req-56cc0859-45d8-4536-be90-fc78707fc149 service nova] [instance: 03684169-e2c8-4cf5-8e79-b118725927f1] Updated VIF entry in instance network info cache for port e426cfe9-eeeb-437a-9fe3-608b15f1b8ac. {{(pid=67899) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}}
[ 1401.421951] env[67899]: DEBUG nova.network.neutron [req-fe46726a-142d-4766-a118-b3d09a565106 req-56cc0859-45d8-4536-be90-fc78707fc149 service nova] [instance: 03684169-e2c8-4cf5-8e79-b118725927f1] Updating instance_info_cache with network_info: [{"id": "e426cfe9-eeeb-437a-9fe3-608b15f1b8ac", "address": "fa:16:3e:59:da:8a", "network": {"id": "b877e544-5c1f-470d-bfe5-b32a369a2d85", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-2118923995-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "367cc1fda0f644c4a22b67636cd3573a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "7869cc8e-e58f-4fd6-88d7-85a18e43cd3a", "external-id": "nsx-vlan-transportzone-927", "segmentation_id": 927, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape426cfe9-ee", "ovs_interfaceid": "e426cfe9-eeeb-437a-9fe3-608b15f1b8ac", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1401.430945] env[67899]: DEBUG oslo_concurrency.lockutils [req-fe46726a-142d-4766-a118-b3d09a565106 req-56cc0859-45d8-4536-be90-fc78707fc149 service nova] Releasing lock "refresh_cache-03684169-e2c8-4cf5-8e79-b118725927f1" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1401.583829] env[67899]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467962, 'name': CreateVM_Task, 'duration_secs': 0.298359} completed successfully. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 1401.583973] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 03684169-e2c8-4cf5-8e79-b118725927f1] Created VM on the ESX host {{(pid=67899) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}}
[ 1401.584641] env[67899]: DEBUG oslo_concurrency.lockutils [None req-4f3b59ba-f200-4682-b584-0c61035dfcc2 tempest-SecurityGroupsTestJSON-311274219 tempest-SecurityGroupsTestJSON-311274219-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1401.584799] env[67899]: DEBUG oslo_concurrency.lockutils [None req-4f3b59ba-f200-4682-b584-0c61035dfcc2 tempest-SecurityGroupsTestJSON-311274219 tempest-SecurityGroupsTestJSON-311274219-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1401.585116] env[67899]: DEBUG oslo_concurrency.lockutils [None req-4f3b59ba-f200-4682-b584-0c61035dfcc2 tempest-SecurityGroupsTestJSON-311274219 tempest-SecurityGroupsTestJSON-311274219-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}}
[ 1401.585352] env[67899]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-3e936c15-0665-4079-88b2-adcbe7aadfb7 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1401.589664] env[67899]: DEBUG oslo_vmware.api [None req-4f3b59ba-f200-4682-b584-0c61035dfcc2 tempest-SecurityGroupsTestJSON-311274219 tempest-SecurityGroupsTestJSON-311274219-project-member] Waiting for the task: (returnval){
[ 1401.589664] env[67899]: value = "session[5298859c-589e-10f3-4f70-bf47a7ea371b]52b64a3e-699e-e1e4-00a7-6b05ebba8373"
[ 1401.589664] env[67899]: _type = "Task"
[ 1401.589664] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1401.597682] env[67899]: DEBUG oslo_vmware.api [None req-4f3b59ba-f200-4682-b584-0c61035dfcc2 tempest-SecurityGroupsTestJSON-311274219 tempest-SecurityGroupsTestJSON-311274219-project-member] Task: {'id': session[5298859c-589e-10f3-4f70-bf47a7ea371b]52b64a3e-699e-e1e4-00a7-6b05ebba8373, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1401.874587] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1401.995945] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1401.996148] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Starting heal instance info cache {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}}
[ 1401.996309] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Rebuilding the list of instances to heal {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}}
[ 1402.017955] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1402.018129] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1402.018286] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 6c4977f7-c53d-4c96-9028-86d7561f0d0d] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1402.018431] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: ec826735-4cc4-4847-8750-c5480e62134a] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1402.018554] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: c7ad553b-2149-4211-aee3-057ea83069f5] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1402.018675] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 6fda2654-4579-4b9a-a97c-97e0128fff14] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1402.018794] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 7a82e877-8a39-4684-8b75-711b7bedddac] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1402.018910] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: dc7bf2b7-631d-4933-92db-1679ad823379] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1402.019037] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 8a157747-34e2-48f7-bf21-d17810122954] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1402.019156] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 03684169-e2c8-4cf5-8e79-b118725927f1] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1402.019274] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Didn't find any instances for network info cache update. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}}
[ 1402.099810] env[67899]: DEBUG oslo_concurrency.lockutils [None req-4f3b59ba-f200-4682-b584-0c61035dfcc2 tempest-SecurityGroupsTestJSON-311274219 tempest-SecurityGroupsTestJSON-311274219-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1402.099810] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-4f3b59ba-f200-4682-b584-0c61035dfcc2 tempest-SecurityGroupsTestJSON-311274219 tempest-SecurityGroupsTestJSON-311274219-project-member] [instance: 03684169-e2c8-4cf5-8e79-b118725927f1] Processing image c655a05a-4a40-4b3f-b609-3ba8116ad90f {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}}
[ 1402.099986] env[67899]: DEBUG oslo_concurrency.lockutils [None req-4f3b59ba-f200-4682-b584-0c61035dfcc2 tempest-SecurityGroupsTestJSON-311274219 tempest-SecurityGroupsTestJSON-311274219-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1402.996576] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1403.997313] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1403.997560] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
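The "Running periodic task ComputeManager._*" lines come from oslo.service's periodic task runner. A hedged sketch of the decorator pattern (the manager class and spacing below are illustrative, not Nova's ComputeManager):

    from oslo_config import cfg
    from oslo_service import periodic_task

    class Manager(periodic_task.PeriodicTasks):
        @periodic_task.periodic_task(spacing=60)
        def _poll_volume_usage(self, context):
            # Body intentionally empty; the runner logs each invocation,
            # producing lines like those above.
            pass

    mgr = Manager(cfg.CONF)
    mgr.run_periodic_tasks(context=None)  # normally driven on a timer loop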
[ 1404.997715] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1406.991350] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1407.015235] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1407.015419] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=67899) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}}
[ 1408.996916] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1410.996660] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager.update_available_resource {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1411.008562] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1411.008877] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1411.009073] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1411.009483] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67899) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}}
[ 1411.010336] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-abbfa9be-fa38-4e31-adc7-c4cfef1df041 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1411.019020] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3b4ebcc4-e8b8-429c-ba18-bfadcded128d {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1411.032655] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-749453c8-90cd-4a1b-923d-93280cd39359 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1411.038718] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3cce818c-cbc7-4a92-8cad-8e062ea4624a {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1411.066985] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180942MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=67899) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}}
[ 1411.067161] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1411.067347] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1411.140253] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 1411.140418] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance b9282eeb-09db-4138-a1f0-9e03828021b8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 1411.140548] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 6c4977f7-c53d-4c96-9028-86d7561f0d0d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 1411.140676] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance ec826735-4cc4-4847-8750-c5480e62134a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 1411.140788] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance c7ad553b-2149-4211-aee3-057ea83069f5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 1411.140904] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 6fda2654-4579-4b9a-a97c-97e0128fff14 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 1411.141030] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 7a82e877-8a39-4684-8b75-711b7bedddac actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 1411.141150] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance dc7bf2b7-631d-4933-92db-1679ad823379 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 1411.141263] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 8a157747-34e2-48f7-bf21-d17810122954 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 1411.141376] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 03684169-e2c8-4cf5-8e79-b118725927f1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 1411.151816] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 49c65e6c-9e16-40f5-9754-fe81681f9714 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 1411.161334] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance ac2f9cf9-f573-4f21-aeb4-6cea5c94f843 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 1411.170176] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 5bb22bfa-4f1f-42a8-a7e3-5e806c70ae45 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 1411.179525] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 3a077713-f7a2-4a61-bb17-987af6a52c4a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 1411.188569] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 53a54716-a3cd-4234-977d-0c82370025d1 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 1411.197916] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 5d8b3009-ba5e-4f29-81fc-6d389ec30808 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 1411.207194] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance a292a68e-deff-465b-81f0-727e75c2e212 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}.
{{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1411.207410] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67899) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1411.207558] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67899) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1411.396723] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8aeebd9d-5bb3-4d6b-83c6-d90d6f7f7c69 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1411.404206] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e64b0dc1-3017-4590-bd14-16de4fb56b07 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1411.433217] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-32e650f1-372d-487e-af9f-c663681c2772 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1411.439781] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7beb84d9-1885-41cc-8d5b-0c822d7f5c25 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1411.452310] env[67899]: DEBUG nova.compute.provider_tree [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Inventory has not changed in ProviderTree for provider: fffa0b42-f65d-4394-a98c-0df038b9ed4b {{(pid=67899) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1411.460177] env[67899]: DEBUG nova.scheduler.client.report [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Inventory has not changed for provider fffa0b42-f65d-4394-a98c-0df038b9ed4b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67899) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1411.474620] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67899) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1411.474798] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.407s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1418.632373] env[67899]: DEBUG oslo_concurrency.lockutils [None 
req-83531eca-9f13-4643-a930-67182badfa53 tempest-InstanceActionsV221TestJSON-802466451 tempest-InstanceActionsV221TestJSON-802466451-project-member] Acquiring lock "dc7bf2b7-631d-4933-92db-1679ad823379" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1419.732166] env[67899]: DEBUG oslo_concurrency.lockutils [None req-ece956f7-ae8e-482f-a329-67ac356bc66c tempest-MigrationsAdminTest-102068815 tempest-MigrationsAdminTest-102068815-project-member] Acquiring lock "8a157747-34e2-48f7-bf21-d17810122954" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1427.236053] env[67899]: DEBUG oslo_concurrency.lockutils [None req-8bf23f3f-a113-4465-a1a5-030e0b144f69 tempest-SecurityGroupsTestJSON-311274219 tempest-SecurityGroupsTestJSON-311274219-project-member] Acquiring lock "03684169-e2c8-4cf5-8e79-b118725927f1" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1436.438582] env[67899]: DEBUG oslo_concurrency.lockutils [None req-457d9d77-97fa-4b41-8f02-5a00e92d0ccc tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Acquiring lock "e179db1d-ee0c-4f47-a958-40dd69209d26" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1436.438913] env[67899]: DEBUG oslo_concurrency.lockutils [None req-457d9d77-97fa-4b41-8f02-5a00e92d0ccc tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Lock "e179db1d-ee0c-4f47-a958-40dd69209d26" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1441.066108] env[67899]: DEBUG oslo_concurrency.lockutils [None req-0034688e-73b1-4561-afa0-7f613b171593 tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] Acquiring lock "addcc88a-6bb5-4a70-938e-49c0c79c8414" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1441.066443] env[67899]: DEBUG oslo_concurrency.lockutils [None req-0034688e-73b1-4561-afa0-7f613b171593 tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] Lock "addcc88a-6bb5-4a70-938e-49c0c79c8414" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1446.790124] env[67899]: DEBUG oslo_concurrency.lockutils [None req-a9f9c866-17da-47e5-9277-2051f1ca187a tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Acquiring lock "a6544af8-879d-4c45-bee4-8551b861fc66" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67899)
inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1446.790574] env[67899]: DEBUG oslo_concurrency.lockutils [None req-a9f9c866-17da-47e5-9277-2051f1ca187a tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Lock "a6544af8-879d-4c45-bee4-8551b861fc66" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1446.822926] env[67899]: DEBUG oslo_concurrency.lockutils [None req-a9f9c866-17da-47e5-9277-2051f1ca187a tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Acquiring lock "c56980f8-68e2-4501-a6a9-b713b208f895" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1446.822926] env[67899]: DEBUG oslo_concurrency.lockutils [None req-a9f9c866-17da-47e5-9277-2051f1ca187a tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Lock "c56980f8-68e2-4501-a6a9-b713b208f895" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1448.293238] env[67899]: WARNING oslo_vmware.rw_handles [None req-6a55d310-17fb-4564-8f2a-4ee92b2e7084 tempest-ListServersNegativeTestJSON-1933841315 tempest-ListServersNegativeTestJSON-1933841315-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1448.293238] env[67899]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1448.293238] env[67899]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1448.293238] env[67899]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1448.293238] env[67899]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1448.293238] env[67899]: ERROR oslo_vmware.rw_handles response.begin() [ 1448.293238] env[67899]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1448.293238] env[67899]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1448.293238] env[67899]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1448.293238] env[67899]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1448.293238] env[67899]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1448.293238] env[67899]: ERROR oslo_vmware.rw_handles [ 1448.293815] env[67899]: DEBUG nova.virt.vmwareapi.images [None req-6a55d310-17fb-4564-8f2a-4ee92b2e7084 tempest-ListServersNegativeTestJSON-1933841315 tempest-ListServersNegativeTestJSON-1933841315-project-member] [instance: 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4] Downloaded image file data c655a05a-4a40-4b3f-b609-3ba8116ad90f to vmware_temp/8b23a959-1174-4c26-9836-11a38b3a80db/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk on the data store datastore1 {{(pid=67899) fetch_image
/opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1448.295615] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-6a55d310-17fb-4564-8f2a-4ee92b2e7084 tempest-ListServersNegativeTestJSON-1933841315 tempest-ListServersNegativeTestJSON-1933841315-project-member] [instance: 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4] Caching image {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1448.295884] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [None req-6a55d310-17fb-4564-8f2a-4ee92b2e7084 tempest-ListServersNegativeTestJSON-1933841315 tempest-ListServersNegativeTestJSON-1933841315-project-member] Copying Virtual Disk [datastore1] vmware_temp/8b23a959-1174-4c26-9836-11a38b3a80db/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk to [datastore1] vmware_temp/8b23a959-1174-4c26-9836-11a38b3a80db/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk {{(pid=67899) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1448.296189] env[67899]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-dca5030c-8aee-4f34-8c4c-f0643e7297f2 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1448.304715] env[67899]: DEBUG oslo_vmware.api [None req-6a55d310-17fb-4564-8f2a-4ee92b2e7084 tempest-ListServersNegativeTestJSON-1933841315 tempest-ListServersNegativeTestJSON-1933841315-project-member] Waiting for the task: (returnval){ [ 1448.304715] env[67899]: value = "task-3467963" [ 1448.304715] env[67899]: _type = "Task" [ 1448.304715] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1448.313087] env[67899]: DEBUG oslo_vmware.api [None req-6a55d310-17fb-4564-8f2a-4ee92b2e7084 tempest-ListServersNegativeTestJSON-1933841315 tempest-ListServersNegativeTestJSON-1933841315-project-member] Task: {'id': task-3467963, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1448.815322] env[67899]: DEBUG oslo_vmware.exceptions [None req-6a55d310-17fb-4564-8f2a-4ee92b2e7084 tempest-ListServersNegativeTestJSON-1933841315 tempest-ListServersNegativeTestJSON-1933841315-project-member] Fault InvalidArgument not matched. 
{{(pid=67899) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1448.815615] env[67899]: DEBUG oslo_concurrency.lockutils [None req-6a55d310-17fb-4564-8f2a-4ee92b2e7084 tempest-ListServersNegativeTestJSON-1933841315 tempest-ListServersNegativeTestJSON-1933841315-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1448.816200] env[67899]: ERROR nova.compute.manager [None req-6a55d310-17fb-4564-8f2a-4ee92b2e7084 tempest-ListServersNegativeTestJSON-1933841315 tempest-ListServersNegativeTestJSON-1933841315-project-member] [instance: 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1448.816200] env[67899]: Faults: ['InvalidArgument'] [ 1448.816200] env[67899]: ERROR nova.compute.manager [instance: 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4] Traceback (most recent call last): [ 1448.816200] env[67899]: ERROR nova.compute.manager [instance: 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1448.816200] env[67899]: ERROR nova.compute.manager [instance: 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4] yield resources [ 1448.816200] env[67899]: ERROR nova.compute.manager [instance: 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1448.816200] env[67899]: ERROR nova.compute.manager [instance: 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4] self.driver.spawn(context, instance, image_meta, [ 1448.816200] env[67899]: ERROR nova.compute.manager [instance: 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1448.816200] env[67899]: ERROR nova.compute.manager [instance: 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1448.816200] env[67899]: ERROR nova.compute.manager [instance: 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1448.816200] env[67899]: ERROR nova.compute.manager [instance: 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4] self._fetch_image_if_missing(context, vi) [ 1448.816200] env[67899]: ERROR nova.compute.manager [instance: 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1448.816200] env[67899]: ERROR nova.compute.manager [instance: 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4] image_cache(vi, tmp_image_ds_loc) [ 1448.816200] env[67899]: ERROR nova.compute.manager [instance: 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1448.816200] env[67899]: ERROR nova.compute.manager [instance: 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4] vm_util.copy_virtual_disk( [ 1448.816200] env[67899]: ERROR nova.compute.manager [instance: 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1448.816200] env[67899]: ERROR nova.compute.manager [instance: 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4] session._wait_for_task(vmdk_copy_task) [ 1448.816200] env[67899]: ERROR nova.compute.manager [instance: 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1448.816200] env[67899]: ERROR nova.compute.manager [instance: 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4] return self.wait_for_task(task_ref) [ 1448.816200] env[67899]: ERROR nova.compute.manager [instance: 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1448.816200] env[67899]: ERROR nova.compute.manager [instance: 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4] return evt.wait() [ 1448.816200] env[67899]: ERROR nova.compute.manager [instance: 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1448.816200] env[67899]: ERROR nova.compute.manager [instance: 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4] result = hub.switch() [ 1448.816200] env[67899]: ERROR nova.compute.manager [instance: 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1448.816200] env[67899]: ERROR nova.compute.manager [instance: 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4] return self.greenlet.switch() [ 1448.816200] env[67899]: ERROR nova.compute.manager [instance: 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1448.816200] env[67899]: ERROR nova.compute.manager [instance: 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4] self.f(*self.args, **self.kw) [ 1448.816200] env[67899]: ERROR nova.compute.manager [instance: 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1448.816200] env[67899]: ERROR nova.compute.manager [instance: 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4] raise exceptions.translate_fault(task_info.error) [ 1448.816200] env[67899]: ERROR nova.compute.manager [instance: 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1448.816200] env[67899]: ERROR nova.compute.manager [instance: 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4] Faults: ['InvalidArgument'] [ 1448.816200] env[67899]: ERROR nova.compute.manager [instance: 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4] [ 1448.817293] env[67899]: INFO nova.compute.manager [None req-6a55d310-17fb-4564-8f2a-4ee92b2e7084 tempest-ListServersNegativeTestJSON-1933841315 tempest-ListServersNegativeTestJSON-1933841315-project-member] [instance: 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4] Terminating instance [ 1448.818368] env[67899]: DEBUG oslo_concurrency.lockutils [None req-92a7a379-f753-4ac6-96db-b7811ca464e0 tempest-TenantUsagesTestJSON-377809165 tempest-TenantUsagesTestJSON-377809165-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1448.818576] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-92a7a379-f753-4ac6-96db-b7811ca464e0 tempest-TenantUsagesTestJSON-377809165 tempest-TenantUsagesTestJSON-377809165-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1448.819287] env[67899]: DEBUG nova.compute.manager [None req-6a55d310-17fb-4564-8f2a-4ee92b2e7084 tempest-ListServersNegativeTestJSON-1933841315 
tempest-ListServersNegativeTestJSON-1933841315-project-member] [instance: 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4] Start destroying the instance on the hypervisor. {{(pid=67899) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1448.819557] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-6a55d310-17fb-4564-8f2a-4ee92b2e7084 tempest-ListServersNegativeTestJSON-1933841315 tempest-ListServersNegativeTestJSON-1933841315-project-member] [instance: 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4] Destroying instance {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1448.819796] env[67899]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-c08dca1a-f8c0-4e12-8941-9b13001ad752 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1448.822217] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-784faa42-b569-44fd-a3a6-3e1e5207df24 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1448.829676] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-6a55d310-17fb-4564-8f2a-4ee92b2e7084 tempest-ListServersNegativeTestJSON-1933841315 tempest-ListServersNegativeTestJSON-1933841315-project-member] [instance: 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4] Unregistering the VM {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1448.830795] env[67899]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-7e71eab9-b13a-44f6-b732-15606b6f20f5 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1448.832282] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-92a7a379-f753-4ac6-96db-b7811ca464e0 tempest-TenantUsagesTestJSON-377809165 tempest-TenantUsagesTestJSON-377809165-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1448.832455] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-92a7a379-f753-4ac6-96db-b7811ca464e0 tempest-TenantUsagesTestJSON-377809165 tempest-TenantUsagesTestJSON-377809165-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67899) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1448.833182] env[67899]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-61ee2b27-f5d0-4de5-bea5-cea9b6a8c175 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1448.838056] env[67899]: DEBUG oslo_vmware.api [None req-92a7a379-f753-4ac6-96db-b7811ca464e0 tempest-TenantUsagesTestJSON-377809165 tempest-TenantUsagesTestJSON-377809165-project-member] Waiting for the task: (returnval){ [ 1448.838056] env[67899]: value = "session[5298859c-589e-10f3-4f70-bf47a7ea371b]528a7ca8-e0d8-e20e-dcc3-b20c44f448a4" [ 1448.838056] env[67899]: _type = "Task" [ 1448.838056] env[67899]: } to complete. 
{{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1448.845592] env[67899]: DEBUG oslo_vmware.api [None req-92a7a379-f753-4ac6-96db-b7811ca464e0 tempest-TenantUsagesTestJSON-377809165 tempest-TenantUsagesTestJSON-377809165-project-member] Task: {'id': session[5298859c-589e-10f3-4f70-bf47a7ea371b]528a7ca8-e0d8-e20e-dcc3-b20c44f448a4, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1448.912455] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-6a55d310-17fb-4564-8f2a-4ee92b2e7084 tempest-ListServersNegativeTestJSON-1933841315 tempest-ListServersNegativeTestJSON-1933841315-project-member] [instance: 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4] Unregistered the VM {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1448.912634] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-6a55d310-17fb-4564-8f2a-4ee92b2e7084 tempest-ListServersNegativeTestJSON-1933841315 tempest-ListServersNegativeTestJSON-1933841315-project-member] [instance: 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4] Deleting contents of the VM from datastore datastore1 {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1448.912806] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-6a55d310-17fb-4564-8f2a-4ee92b2e7084 tempest-ListServersNegativeTestJSON-1933841315 tempest-ListServersNegativeTestJSON-1933841315-project-member] Deleting the datastore file [datastore1] 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4 {{(pid=67899) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1448.913143] env[67899]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-3d36f35e-99f8-49cf-9d87-dd7fd5a9e450 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1448.919461] env[67899]: DEBUG oslo_vmware.api [None req-6a55d310-17fb-4564-8f2a-4ee92b2e7084 tempest-ListServersNegativeTestJSON-1933841315 tempest-ListServersNegativeTestJSON-1933841315-project-member] Waiting for the task: (returnval){ [ 1448.919461] env[67899]: value = "task-3467965" [ 1448.919461] env[67899]: _type = "Task" [ 1448.919461] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1448.927013] env[67899]: DEBUG oslo_vmware.api [None req-6a55d310-17fb-4564-8f2a-4ee92b2e7084 tempest-ListServersNegativeTestJSON-1933841315 tempest-ListServersNegativeTestJSON-1933841315-project-member] Task: {'id': task-3467965, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1449.348473] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-92a7a379-f753-4ac6-96db-b7811ca464e0 tempest-TenantUsagesTestJSON-377809165 tempest-TenantUsagesTestJSON-377809165-project-member] [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] Preparing fetch location {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1449.348816] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-92a7a379-f753-4ac6-96db-b7811ca464e0 tempest-TenantUsagesTestJSON-377809165 tempest-TenantUsagesTestJSON-377809165-project-member] Creating directory with path [datastore1] vmware_temp/95c50e09-508e-4ea2-9d5d-c12a4fc20ee4/c655a05a-4a40-4b3f-b609-3ba8116ad90f {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1449.349043] env[67899]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-4927b43c-cafc-468c-a45e-d36a6d1fc1de {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1449.360589] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-92a7a379-f753-4ac6-96db-b7811ca464e0 tempest-TenantUsagesTestJSON-377809165 tempest-TenantUsagesTestJSON-377809165-project-member] Created directory with path [datastore1] vmware_temp/95c50e09-508e-4ea2-9d5d-c12a4fc20ee4/c655a05a-4a40-4b3f-b609-3ba8116ad90f {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1449.360782] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-92a7a379-f753-4ac6-96db-b7811ca464e0 tempest-TenantUsagesTestJSON-377809165 tempest-TenantUsagesTestJSON-377809165-project-member] [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] Fetch image to [datastore1] vmware_temp/95c50e09-508e-4ea2-9d5d-c12a4fc20ee4/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1449.360973] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-92a7a379-f753-4ac6-96db-b7811ca464e0 tempest-TenantUsagesTestJSON-377809165 tempest-TenantUsagesTestJSON-377809165-project-member] [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] Downloading image file data c655a05a-4a40-4b3f-b609-3ba8116ad90f to [datastore1] vmware_temp/95c50e09-508e-4ea2-9d5d-c12a4fc20ee4/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk on the data store datastore1 {{(pid=67899) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1449.361707] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-85f88ea6-f557-4963-96db-c5025f7b4073 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1449.368015] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8b7fdf7c-612e-43ad-a860-d4f543c61049 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1449.376904] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2667fd31-db0a-4c00-b3c1-0bdb0b21b166 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1449.407861] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bdd717d3-dfcc-4046-8646-b720a8f52256 {{(pid=67899) 
request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1449.413419] env[67899]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-d450ebd2-0adb-4f79-bf42-ec6600ae30dd {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1449.427733] env[67899]: DEBUG oslo_vmware.api [None req-6a55d310-17fb-4564-8f2a-4ee92b2e7084 tempest-ListServersNegativeTestJSON-1933841315 tempest-ListServersNegativeTestJSON-1933841315-project-member] Task: {'id': task-3467965, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.075239} completed successfully. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1449.427970] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-6a55d310-17fb-4564-8f2a-4ee92b2e7084 tempest-ListServersNegativeTestJSON-1933841315 tempest-ListServersNegativeTestJSON-1933841315-project-member] Deleted the datastore file {{(pid=67899) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1449.428194] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-6a55d310-17fb-4564-8f2a-4ee92b2e7084 tempest-ListServersNegativeTestJSON-1933841315 tempest-ListServersNegativeTestJSON-1933841315-project-member] [instance: 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4] Deleted contents of the VM from datastore datastore1 {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1449.428410] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-6a55d310-17fb-4564-8f2a-4ee92b2e7084 tempest-ListServersNegativeTestJSON-1933841315 tempest-ListServersNegativeTestJSON-1933841315-project-member] [instance: 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4] Instance destroyed {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1449.428597] env[67899]: INFO nova.compute.manager [None req-6a55d310-17fb-4564-8f2a-4ee92b2e7084 tempest-ListServersNegativeTestJSON-1933841315 tempest-ListServersNegativeTestJSON-1933841315-project-member] [instance: 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4] Took 0.61 seconds to destroy the instance on the hypervisor. 
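The "Task: {'id': task-..., 'name': ...} progress is N%" records above come from oslo.vmware's wait_for_task/_poll_task machinery, which keeps re-reading task state from vCenter until the task reaches a terminal state. A minimal sketch of that polling pattern follows; it is an illustration only, and get_task_info/poll_interval are assumed stand-ins rather than the real oslo.vmware internals (which run the poll inside a looping call and raise a translated VMware fault on error):

    import time

    def wait_for_vcenter_task(get_task_info, poll_interval=0.5):
        # Re-read the task until it finishes, mirroring the repeated
        # "progress is 0%." poll records in the log above.
        while True:
            info = get_task_info()  # e.g. one property-collector round trip
            if info["state"] == "success":
                return info.get("result")
            if info["state"] == "error":
                # oslo.vmware raises a translated fault here instead.
                raise RuntimeError(info.get("error", "task failed"))
            time.sleep(poll_interval)

    # Usage: a fake task that completes on the third poll.
    states = iter([{"state": "running"}, {"state": "running"},
                   {"state": "success", "result": "task-done"}])
    print(wait_for_vcenter_task(lambda: next(states), poll_interval=0))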
[ 1449.430690] env[67899]: DEBUG nova.compute.claims [None req-6a55d310-17fb-4564-8f2a-4ee92b2e7084 tempest-ListServersNegativeTestJSON-1933841315 tempest-ListServersNegativeTestJSON-1933841315-project-member] [instance: 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4] Aborting claim: {{(pid=67899) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1449.430867] env[67899]: DEBUG oslo_concurrency.lockutils [None req-6a55d310-17fb-4564-8f2a-4ee92b2e7084 tempest-ListServersNegativeTestJSON-1933841315 tempest-ListServersNegativeTestJSON-1933841315-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1449.431087] env[67899]: DEBUG oslo_concurrency.lockutils [None req-6a55d310-17fb-4564-8f2a-4ee92b2e7084 tempest-ListServersNegativeTestJSON-1933841315 tempest-ListServersNegativeTestJSON-1933841315-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1449.434930] env[67899]: DEBUG nova.virt.vmwareapi.images [None req-92a7a379-f753-4ac6-96db-b7811ca464e0 tempest-TenantUsagesTestJSON-377809165 tempest-TenantUsagesTestJSON-377809165-project-member] [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] Downloading image file data c655a05a-4a40-4b3f-b609-3ba8116ad90f to the data store datastore1 {{(pid=67899) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1449.483016] env[67899]: DEBUG oslo_vmware.rw_handles [None req-92a7a379-f753-4ac6-96db-b7811ca464e0 tempest-TenantUsagesTestJSON-377809165 tempest-TenantUsagesTestJSON-377809165-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/95c50e09-508e-4ea2-9d5d-c12a4fc20ee4/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67899) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1449.542534] env[67899]: DEBUG oslo_vmware.rw_handles [None req-92a7a379-f753-4ac6-96db-b7811ca464e0 tempest-TenantUsagesTestJSON-377809165 tempest-TenantUsagesTestJSON-377809165-project-member] Completed reading data from the image iterator. {{(pid=67899) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1449.542737] env[67899]: DEBUG oslo_vmware.rw_handles [None req-92a7a379-f753-4ac6-96db-b7811ca464e0 tempest-TenantUsagesTestJSON-377809165 tempest-TenantUsagesTestJSON-377809165-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/95c50e09-508e-4ea2-9d5d-c12a4fc20ee4/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67899) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1449.716512] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e3a966db-f83d-4a8c-93b0-3247fddcba60 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1449.723974] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8d973fe6-d5f2-464a-aabf-ac314ee8400f {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1449.753336] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c4d24fcb-b3b4-423a-ac2e-982891b9f55f {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1449.760179] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-804f8995-e6df-4e40-84f5-39d9e4fa5cbb {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1449.773382] env[67899]: DEBUG nova.compute.provider_tree [None req-6a55d310-17fb-4564-8f2a-4ee92b2e7084 tempest-ListServersNegativeTestJSON-1933841315 tempest-ListServersNegativeTestJSON-1933841315-project-member] Inventory has not changed in ProviderTree for provider: fffa0b42-f65d-4394-a98c-0df038b9ed4b {{(pid=67899) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1449.781935] env[67899]: DEBUG nova.scheduler.client.report [None req-6a55d310-17fb-4564-8f2a-4ee92b2e7084 tempest-ListServersNegativeTestJSON-1933841315 tempest-ListServersNegativeTestJSON-1933841315-project-member] Inventory has not changed for provider fffa0b42-f65d-4394-a98c-0df038b9ed4b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67899) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1449.795154] env[67899]: DEBUG oslo_concurrency.lockutils [None req-6a55d310-17fb-4564-8f2a-4ee92b2e7084 tempest-ListServersNegativeTestJSON-1933841315 tempest-ListServersNegativeTestJSON-1933841315-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.364s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1449.795634] env[67899]: ERROR nova.compute.manager [None req-6a55d310-17fb-4564-8f2a-4ee92b2e7084 tempest-ListServersNegativeTestJSON-1933841315 tempest-ListServersNegativeTestJSON-1933841315-project-member] [instance: 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1449.795634] env[67899]: Faults: ['InvalidArgument'] [ 1449.795634] env[67899]: ERROR nova.compute.manager [instance: 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4] Traceback (most recent call last): [ 1449.795634] env[67899]: ERROR nova.compute.manager [instance: 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in 
_build_and_run_instance [ 1449.795634] env[67899]: ERROR nova.compute.manager [instance: 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4] self.driver.spawn(context, instance, image_meta, [ 1449.795634] env[67899]: ERROR nova.compute.manager [instance: 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1449.795634] env[67899]: ERROR nova.compute.manager [instance: 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1449.795634] env[67899]: ERROR nova.compute.manager [instance: 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1449.795634] env[67899]: ERROR nova.compute.manager [instance: 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4] self._fetch_image_if_missing(context, vi) [ 1449.795634] env[67899]: ERROR nova.compute.manager [instance: 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1449.795634] env[67899]: ERROR nova.compute.manager [instance: 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4] image_cache(vi, tmp_image_ds_loc) [ 1449.795634] env[67899]: ERROR nova.compute.manager [instance: 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1449.795634] env[67899]: ERROR nova.compute.manager [instance: 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4] vm_util.copy_virtual_disk( [ 1449.795634] env[67899]: ERROR nova.compute.manager [instance: 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1449.795634] env[67899]: ERROR nova.compute.manager [instance: 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4] session._wait_for_task(vmdk_copy_task) [ 1449.795634] env[67899]: ERROR nova.compute.manager [instance: 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1449.795634] env[67899]: ERROR nova.compute.manager [instance: 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4] return self.wait_for_task(task_ref) [ 1449.795634] env[67899]: ERROR nova.compute.manager [instance: 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1449.795634] env[67899]: ERROR nova.compute.manager [instance: 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4] return evt.wait() [ 1449.795634] env[67899]: ERROR nova.compute.manager [instance: 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1449.795634] env[67899]: ERROR nova.compute.manager [instance: 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4] result = hub.switch() [ 1449.795634] env[67899]: ERROR nova.compute.manager [instance: 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1449.795634] env[67899]: ERROR nova.compute.manager [instance: 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4] return self.greenlet.switch() [ 1449.795634] env[67899]: ERROR nova.compute.manager [instance: 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1449.795634] env[67899]: ERROR nova.compute.manager [instance: 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4] self.f(*self.args, **self.kw) [ 1449.795634] env[67899]: ERROR nova.compute.manager [instance: 
4c50d280-8833-4c3f-8571-8ffa5c7ea2a4] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1449.795634] env[67899]: ERROR nova.compute.manager [instance: 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4] raise exceptions.translate_fault(task_info.error) [ 1449.795634] env[67899]: ERROR nova.compute.manager [instance: 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1449.795634] env[67899]: ERROR nova.compute.manager [instance: 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4] Faults: ['InvalidArgument'] [ 1449.795634] env[67899]: ERROR nova.compute.manager [instance: 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4] [ 1449.796490] env[67899]: DEBUG nova.compute.utils [None req-6a55d310-17fb-4564-8f2a-4ee92b2e7084 tempest-ListServersNegativeTestJSON-1933841315 tempest-ListServersNegativeTestJSON-1933841315-project-member] [instance: 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4] VimFaultException {{(pid=67899) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1449.797644] env[67899]: DEBUG nova.compute.manager [None req-6a55d310-17fb-4564-8f2a-4ee92b2e7084 tempest-ListServersNegativeTestJSON-1933841315 tempest-ListServersNegativeTestJSON-1933841315-project-member] [instance: 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4] Build of instance 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4 was re-scheduled: A specified parameter was not correct: fileType [ 1449.797644] env[67899]: Faults: ['InvalidArgument'] {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1449.798032] env[67899]: DEBUG nova.compute.manager [None req-6a55d310-17fb-4564-8f2a-4ee92b2e7084 tempest-ListServersNegativeTestJSON-1933841315 tempest-ListServersNegativeTestJSON-1933841315-project-member] [instance: 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4] Unplugging VIFs for instance {{(pid=67899) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1449.798185] env[67899]: DEBUG nova.compute.manager [None req-6a55d310-17fb-4564-8f2a-4ee92b2e7084 tempest-ListServersNegativeTestJSON-1933841315 tempest-ListServersNegativeTestJSON-1933841315-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67899) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1449.798355] env[67899]: DEBUG nova.compute.manager [None req-6a55d310-17fb-4564-8f2a-4ee92b2e7084 tempest-ListServersNegativeTestJSON-1933841315 tempest-ListServersNegativeTestJSON-1933841315-project-member] [instance: 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4] Deallocating network for instance {{(pid=67899) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1449.798515] env[67899]: DEBUG nova.network.neutron [None req-6a55d310-17fb-4564-8f2a-4ee92b2e7084 tempest-ListServersNegativeTestJSON-1933841315 tempest-ListServersNegativeTestJSON-1933841315-project-member] [instance: 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4] deallocate_for_instance() {{(pid=67899) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1450.314076] env[67899]: DEBUG nova.network.neutron [None req-6a55d310-17fb-4564-8f2a-4ee92b2e7084 tempest-ListServersNegativeTestJSON-1933841315 tempest-ListServersNegativeTestJSON-1933841315-project-member] [instance: 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4] Updating instance_info_cache with network_info: [] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1450.332214] env[67899]: INFO nova.compute.manager [None req-6a55d310-17fb-4564-8f2a-4ee92b2e7084 tempest-ListServersNegativeTestJSON-1933841315 tempest-ListServersNegativeTestJSON-1933841315-project-member] [instance: 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4] Took 0.53 seconds to deallocate network for instance. [ 1450.439758] env[67899]: INFO nova.scheduler.client.report [None req-6a55d310-17fb-4564-8f2a-4ee92b2e7084 tempest-ListServersNegativeTestJSON-1933841315 tempest-ListServersNegativeTestJSON-1933841315-project-member] Deleted allocations for instance 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4 [ 1450.459866] env[67899]: DEBUG oslo_concurrency.lockutils [None req-6a55d310-17fb-4564-8f2a-4ee92b2e7084 tempest-ListServersNegativeTestJSON-1933841315 tempest-ListServersNegativeTestJSON-1933841315-project-member] Lock "4c50d280-8833-4c3f-8571-8ffa5c7ea2a4" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 632.297s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1450.460985] env[67899]: DEBUG oslo_concurrency.lockutils [None req-cfd44001-16ed-4e48-a22e-fe9b4ad3c923 tempest-ListServersNegativeTestJSON-1933841315 tempest-ListServersNegativeTestJSON-1933841315-project-member] Lock "4c50d280-8833-4c3f-8571-8ffa5c7ea2a4" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 435.096s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1450.461229] env[67899]: DEBUG oslo_concurrency.lockutils [None req-cfd44001-16ed-4e48-a22e-fe9b4ad3c923 tempest-ListServersNegativeTestJSON-1933841315 tempest-ListServersNegativeTestJSON-1933841315-project-member] Acquiring lock "4c50d280-8833-4c3f-8571-8ffa5c7ea2a4-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1450.461443] env[67899]: DEBUG oslo_concurrency.lockutils [None req-cfd44001-16ed-4e48-a22e-fe9b4ad3c923 tempest-ListServersNegativeTestJSON-1933841315 tempest-ListServersNegativeTestJSON-1933841315-project-member] Lock "4c50d280-8833-4c3f-8571-8ffa5c7ea2a4-events"
acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1450.461607] env[67899]: DEBUG oslo_concurrency.lockutils [None req-cfd44001-16ed-4e48-a22e-fe9b4ad3c923 tempest-ListServersNegativeTestJSON-1933841315 tempest-ListServersNegativeTestJSON-1933841315-project-member] Lock "4c50d280-8833-4c3f-8571-8ffa5c7ea2a4-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1450.464760] env[67899]: INFO nova.compute.manager [None req-cfd44001-16ed-4e48-a22e-fe9b4ad3c923 tempest-ListServersNegativeTestJSON-1933841315 tempest-ListServersNegativeTestJSON-1933841315-project-member] [instance: 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4] Terminating instance [ 1450.470686] env[67899]: DEBUG nova.compute.manager [None req-cfd44001-16ed-4e48-a22e-fe9b4ad3c923 tempest-ListServersNegativeTestJSON-1933841315 tempest-ListServersNegativeTestJSON-1933841315-project-member] [instance: 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4] Start destroying the instance on the hypervisor. {{(pid=67899) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1450.470946] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-cfd44001-16ed-4e48-a22e-fe9b4ad3c923 tempest-ListServersNegativeTestJSON-1933841315 tempest-ListServersNegativeTestJSON-1933841315-project-member] [instance: 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4] Destroying instance {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1450.471515] env[67899]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-a0571414-45a0-4ff7-895c-ddd6085f4c73 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1450.481753] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e2454190-d7bd-4217-b0d8-8ff8dbe4c49f {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1450.493669] env[67899]: DEBUG nova.compute.manager [None req-65de0df3-895b-4fbd-8d87-17ee086072d5 tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: 49c65e6c-9e16-40f5-9754-fe81681f9714] Starting instance... {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1450.514794] env[67899]: WARNING nova.virt.vmwareapi.vmops [None req-cfd44001-16ed-4e48-a22e-fe9b4ad3c923 tempest-ListServersNegativeTestJSON-1933841315 tempest-ListServersNegativeTestJSON-1933841315-project-member] [instance: 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4 could not be found.
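The Acquiring/acquired/released triples that recur throughout this log are emitted by oslo.concurrency's lockutils, which wraps a function in a named semaphore and logs each phase with the wait and hold times. A minimal sketch of the same pattern (lockutils.synchronized is the real API; the lock name and guarded body here are illustrative, oslo.concurrency must be installed, and the records only appear with debug logging enabled):

    from oslo_concurrency import lockutils

    @lockutils.synchronized('compute_resources')
    def claim_resources():
        # Runs under the process-local "compute_resources" semaphore; entry
        # and exit produce the acquired ... :: waited N.NNNs and "released"
        # ... :: held N.NNNs records seen above.
        pass

    claim_resources()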
[ 1450.515029] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-cfd44001-16ed-4e48-a22e-fe9b4ad3c923 tempest-ListServersNegativeTestJSON-1933841315 tempest-ListServersNegativeTestJSON-1933841315-project-member] [instance: 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4] Instance destroyed {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1450.515245] env[67899]: INFO nova.compute.manager [None req-cfd44001-16ed-4e48-a22e-fe9b4ad3c923 tempest-ListServersNegativeTestJSON-1933841315 tempest-ListServersNegativeTestJSON-1933841315-project-member] [instance: 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1450.515495] env[67899]: DEBUG oslo.service.loopingcall [None req-cfd44001-16ed-4e48-a22e-fe9b4ad3c923 tempest-ListServersNegativeTestJSON-1933841315 tempest-ListServersNegativeTestJSON-1933841315-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=67899) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1450.515727] env[67899]: DEBUG nova.compute.manager [-] [instance: 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4] Deallocating network for instance {{(pid=67899) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1450.515826] env[67899]: DEBUG nova.network.neutron [-] [instance: 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4] deallocate_for_instance() {{(pid=67899) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1450.519592] env[67899]: DEBUG nova.compute.manager [None req-65de0df3-895b-4fbd-8d87-17ee086072d5 tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: 49c65e6c-9e16-40f5-9754-fe81681f9714] Instance disappeared before build. {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1450.541197] env[67899]: DEBUG nova.network.neutron [-] [instance: 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4] Updating instance_info_cache with network_info: [] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1450.543372] env[67899]: DEBUG oslo_concurrency.lockutils [None req-65de0df3-895b-4fbd-8d87-17ee086072d5 tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Lock "49c65e6c-9e16-40f5-9754-fe81681f9714" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 211.208s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1450.552690] env[67899]: DEBUG nova.compute.manager [None req-e46202ba-29e3-4bfd-9e7d-4f47bc76deae tempest-DeleteServersAdminTestJSON-329788213 tempest-DeleteServersAdminTestJSON-329788213-project-member] [instance: ac2f9cf9-f573-4f21-aeb4-6cea5c94f843] Starting instance... {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1450.564429] env[67899]: INFO nova.compute.manager [-] [instance: 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4] Took 0.05 seconds to deallocate network for instance. [ 1450.575418] env[67899]: DEBUG nova.compute.manager [None req-e46202ba-29e3-4bfd-9e7d-4f47bc76deae tempest-DeleteServersAdminTestJSON-329788213 tempest-DeleteServersAdminTestJSON-329788213-project-member] [instance: ac2f9cf9-f573-4f21-aeb4-6cea5c94f843] Instance disappeared before build. 
{{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1450.594061] env[67899]: DEBUG oslo_concurrency.lockutils [None req-e46202ba-29e3-4bfd-9e7d-4f47bc76deae tempest-DeleteServersAdminTestJSON-329788213 tempest-DeleteServersAdminTestJSON-329788213-project-member] Lock "ac2f9cf9-f573-4f21-aeb4-6cea5c94f843" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 209.483s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1450.603036] env[67899]: DEBUG nova.compute.manager [None req-82dbbae0-b072-4fbd-860f-b156dd250541 tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] [instance: 5bb22bfa-4f1f-42a8-a7e3-5e806c70ae45] Starting instance... {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1450.625849] env[67899]: DEBUG nova.compute.manager [None req-82dbbae0-b072-4fbd-860f-b156dd250541 tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] [instance: 5bb22bfa-4f1f-42a8-a7e3-5e806c70ae45] Instance disappeared before build. {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1450.644187] env[67899]: DEBUG oslo_concurrency.lockutils [None req-cfd44001-16ed-4e48-a22e-fe9b4ad3c923 tempest-ListServersNegativeTestJSON-1933841315 tempest-ListServersNegativeTestJSON-1933841315-project-member] Lock "4c50d280-8833-4c3f-8571-8ffa5c7ea2a4" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.183s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1450.645235] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "4c50d280-8833-4c3f-8571-8ffa5c7ea2a4" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 275.625s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1450.645446] env[67899]: INFO nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 4c50d280-8833-4c3f-8571-8ffa5c7ea2a4] During sync_power_state the instance has a pending task (deleting). Skip.
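The _sync_power_states entries above show the periodic task winning the per-instance lock only after waiting 275 seconds (the terminate path held it for the whole build/teardown), then immediately backing off because the instance has a pending task (deleting). A self-contained sketch of that skip guard, with illustrative names rather than Nova's actual signatures:

from dataclasses import dataclass
from typing import Optional

@dataclass
class Instance:
    uuid: str
    task_state: Optional[str]

def query_driver_power_state_and_sync(instance, driver_power_state, sync):
    # If another operation owns the instance (task_state is set, e.g.
    # 'deleting' above), skip the sync rather than race the state machine.
    if instance.task_state is not None:
        print(f"[instance: {instance.uuid}] During sync_power_state the "
              f"instance has a pending task ({instance.task_state}). Skip.")
        return
    sync(instance, driver_power_state(instance))

# The instance being terminated above is skipped; an idle one would sync.
query_driver_power_state_and_sync(
    Instance('4c50d280-8833-4c3f-8571-8ffa5c7ea2a4', 'deleting'),
    driver_power_state=lambda inst: 'running',
    sync=lambda inst, state: print('synced', inst.uuid, state))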
[ 1450.645642] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "4c50d280-8833-4c3f-8571-8ffa5c7ea2a4" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1450.646908] env[67899]: DEBUG oslo_concurrency.lockutils [None req-82dbbae0-b072-4fbd-860f-b156dd250541 tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] Lock "5bb22bfa-4f1f-42a8-a7e3-5e806c70ae45" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 206.350s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1450.656499] env[67899]: DEBUG nova.compute.manager [None req-0ad992bc-0784-48e6-aae6-156834e2439f tempest-ListImageFiltersTestJSON-1482543128 tempest-ListImageFiltersTestJSON-1482543128-project-member] [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] Starting instance... {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1450.704187] env[67899]: DEBUG oslo_concurrency.lockutils [None req-0ad992bc-0784-48e6-aae6-156834e2439f tempest-ListImageFiltersTestJSON-1482543128 tempest-ListImageFiltersTestJSON-1482543128-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1450.704442] env[67899]: DEBUG oslo_concurrency.lockutils [None req-0ad992bc-0784-48e6-aae6-156834e2439f tempest-ListImageFiltersTestJSON-1482543128 tempest-ListImageFiltersTestJSON-1482543128-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1450.705871] env[67899]: INFO nova.compute.claims [None req-0ad992bc-0784-48e6-aae6-156834e2439f tempest-ListImageFiltersTestJSON-1482543128 tempest-ListImageFiltersTestJSON-1482543128-project-member] [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1450.941300] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-02d6c3a6-b5c7-4652-af80-fc40db124fb3 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1450.949407] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4fc3db41-89e3-42da-a8ec-e694f9b1c639 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1450.979426] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d42939ec-c7eb-490b-b1c4-b66c9e37c485 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1450.986669] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bf86b5ed-82a4-4740-af95-82636fcb9cd6 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1451.000215] env[67899]: DEBUG
nova.compute.provider_tree [None req-0ad992bc-0784-48e6-aae6-156834e2439f tempest-ListImageFiltersTestJSON-1482543128 tempest-ListImageFiltersTestJSON-1482543128-project-member] Inventory has not changed in ProviderTree for provider: fffa0b42-f65d-4394-a98c-0df038b9ed4b {{(pid=67899) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1451.008550] env[67899]: DEBUG nova.scheduler.client.report [None req-0ad992bc-0784-48e6-aae6-156834e2439f tempest-ListImageFiltersTestJSON-1482543128 tempest-ListImageFiltersTestJSON-1482543128-project-member] Inventory has not changed for provider fffa0b42-f65d-4394-a98c-0df038b9ed4b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67899) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1451.021978] env[67899]: DEBUG oslo_concurrency.lockutils [None req-0ad992bc-0784-48e6-aae6-156834e2439f tempest-ListImageFiltersTestJSON-1482543128 tempest-ListImageFiltersTestJSON-1482543128-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.317s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1451.022473] env[67899]: DEBUG nova.compute.manager [None req-0ad992bc-0784-48e6-aae6-156834e2439f tempest-ListImageFiltersTestJSON-1482543128 tempest-ListImageFiltersTestJSON-1482543128-project-member] [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] Start building networks asynchronously for instance. {{(pid=67899) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1451.056263] env[67899]: DEBUG nova.compute.utils [None req-0ad992bc-0784-48e6-aae6-156834e2439f tempest-ListImageFiltersTestJSON-1482543128 tempest-ListImageFiltersTestJSON-1482543128-project-member] Using /dev/sd instead of None {{(pid=67899) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1451.057756] env[67899]: DEBUG nova.compute.manager [None req-0ad992bc-0784-48e6-aae6-156834e2439f tempest-ListImageFiltersTestJSON-1482543128 tempest-ListImageFiltersTestJSON-1482543128-project-member] [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] Allocating IP information in the background. {{(pid=67899) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1451.057930] env[67899]: DEBUG nova.network.neutron [None req-0ad992bc-0784-48e6-aae6-156834e2439f tempest-ListImageFiltersTestJSON-1482543128 tempest-ListImageFiltersTestJSON-1482543128-project-member] [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] allocate_for_instance() {{(pid=67899) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1451.065724] env[67899]: DEBUG nova.compute.manager [None req-0ad992bc-0784-48e6-aae6-156834e2439f tempest-ListImageFiltersTestJSON-1482543128 tempest-ListImageFiltersTestJSON-1482543128-project-member] [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] Start building block device mappings for instance. 
{{(pid=67899) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1451.131984] env[67899]: DEBUG nova.compute.manager [None req-0ad992bc-0784-48e6-aae6-156834e2439f tempest-ListImageFiltersTestJSON-1482543128 tempest-ListImageFiltersTestJSON-1482543128-project-member] [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] Start spawning the instance on the hypervisor. {{(pid=67899) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1451.158149] env[67899]: DEBUG nova.virt.hardware [None req-0ad992bc-0784-48e6-aae6-156834e2439f tempest-ListImageFiltersTestJSON-1482543128 tempest-ListImageFiltersTestJSON-1482543128-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T22:07:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T22:07:13Z,direct_url=<?>,disk_format='vmdk',id=c655a05a-4a40-4b3f-b609-3ba8116ad90f,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c1aaa2970e964d7b86557399120d12c1',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2025-04-16T22:07:14Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1451.158414] env[67899]: DEBUG nova.virt.hardware [None req-0ad992bc-0784-48e6-aae6-156834e2439f tempest-ListImageFiltersTestJSON-1482543128 tempest-ListImageFiltersTestJSON-1482543128-project-member] Flavor limits 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1451.158572] env[67899]: DEBUG nova.virt.hardware [None req-0ad992bc-0784-48e6-aae6-156834e2439f tempest-ListImageFiltersTestJSON-1482543128 tempest-ListImageFiltersTestJSON-1482543128-project-member] Image limits 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1451.158812] env[67899]: DEBUG nova.virt.hardware [None req-0ad992bc-0784-48e6-aae6-156834e2439f tempest-ListImageFiltersTestJSON-1482543128 tempest-ListImageFiltersTestJSON-1482543128-project-member] Flavor pref 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1451.158977] env[67899]: DEBUG nova.virt.hardware [None req-0ad992bc-0784-48e6-aae6-156834e2439f tempest-ListImageFiltersTestJSON-1482543128 tempest-ListImageFiltersTestJSON-1482543128-project-member] Image pref 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1451.159146] env[67899]: DEBUG nova.virt.hardware [None req-0ad992bc-0784-48e6-aae6-156834e2439f tempest-ListImageFiltersTestJSON-1482543128 tempest-ListImageFiltersTestJSON-1482543128-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1451.159361] env[67899]: DEBUG nova.virt.hardware [None req-0ad992bc-0784-48e6-aae6-156834e2439f tempest-ListImageFiltersTestJSON-1482543128 tempest-ListImageFiltersTestJSON-1482543128-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67899) _get_desirable_cpu_topologies
/opt/stack/nova/nova/virt/hardware.py:569}} [ 1451.159518] env[67899]: DEBUG nova.virt.hardware [None req-0ad992bc-0784-48e6-aae6-156834e2439f tempest-ListImageFiltersTestJSON-1482543128 tempest-ListImageFiltersTestJSON-1482543128-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67899) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1451.159683] env[67899]: DEBUG nova.virt.hardware [None req-0ad992bc-0784-48e6-aae6-156834e2439f tempest-ListImageFiltersTestJSON-1482543128 tempest-ListImageFiltersTestJSON-1482543128-project-member] Got 1 possible topologies {{(pid=67899) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1451.159895] env[67899]: DEBUG nova.virt.hardware [None req-0ad992bc-0784-48e6-aae6-156834e2439f tempest-ListImageFiltersTestJSON-1482543128 tempest-ListImageFiltersTestJSON-1482543128-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1451.160087] env[67899]: DEBUG nova.virt.hardware [None req-0ad992bc-0784-48e6-aae6-156834e2439f tempest-ListImageFiltersTestJSON-1482543128 tempest-ListImageFiltersTestJSON-1482543128-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1451.160927] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-799bf30d-5d02-4e21-b3e2-b728ca36a904 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1451.164550] env[67899]: DEBUG nova.policy [None req-0ad992bc-0784-48e6-aae6-156834e2439f tempest-ListImageFiltersTestJSON-1482543128 tempest-ListImageFiltersTestJSON-1482543128-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '251e654a527e4f748af317abce58259d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '01e27e2f5cff44ce9fa07e0f7206708a', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67899) authorize /opt/stack/nova/nova/policy.py:203}} [ 1451.171601] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a2c02ab4-ccd2-401f-9c04-ef3a574340c0 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1451.583729] env[67899]: DEBUG nova.network.neutron [None req-0ad992bc-0784-48e6-aae6-156834e2439f tempest-ListImageFiltersTestJSON-1482543128 tempest-ListImageFiltersTestJSON-1482543128-project-member] [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] Successfully created port: 8b9c5734-85d2-4b4f-8aea-b28e93fce565 {{(pid=67899) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1452.366305] env[67899]: DEBUG nova.network.neutron [None req-0ad992bc-0784-48e6-aae6-156834e2439f tempest-ListImageFiltersTestJSON-1482543128 tempest-ListImageFiltersTestJSON-1482543128-project-member] [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] Successfully updated port: 8b9c5734-85d2-4b4f-8aea-b28e93fce565 {{(pid=67899) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1452.377040] env[67899]: DEBUG oslo_concurrency.lockutils 
[None req-0ad992bc-0784-48e6-aae6-156834e2439f tempest-ListImageFiltersTestJSON-1482543128 tempest-ListImageFiltersTestJSON-1482543128-project-member] Acquiring lock "refresh_cache-3a077713-f7a2-4a61-bb17-987af6a52c4a" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1452.377209] env[67899]: DEBUG oslo_concurrency.lockutils [None req-0ad992bc-0784-48e6-aae6-156834e2439f tempest-ListImageFiltersTestJSON-1482543128 tempest-ListImageFiltersTestJSON-1482543128-project-member] Acquired lock "refresh_cache-3a077713-f7a2-4a61-bb17-987af6a52c4a" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1452.377358] env[67899]: DEBUG nova.network.neutron [None req-0ad992bc-0784-48e6-aae6-156834e2439f tempest-ListImageFiltersTestJSON-1482543128 tempest-ListImageFiltersTestJSON-1482543128-project-member] [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] Building network info cache for instance {{(pid=67899) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1452.439393] env[67899]: DEBUG nova.compute.manager [req-7bea5633-a615-4c07-9c39-0479835c1f11 req-836f2b41-aaf3-4cc9-856a-19a4d200b9e2 service nova] [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] Received event network-vif-plugged-8b9c5734-85d2-4b4f-8aea-b28e93fce565 {{(pid=67899) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1452.439636] env[67899]: DEBUG oslo_concurrency.lockutils [req-7bea5633-a615-4c07-9c39-0479835c1f11 req-836f2b41-aaf3-4cc9-856a-19a4d200b9e2 service nova] Acquiring lock "3a077713-f7a2-4a61-bb17-987af6a52c4a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1452.439920] env[67899]: DEBUG oslo_concurrency.lockutils [req-7bea5633-a615-4c07-9c39-0479835c1f11 req-836f2b41-aaf3-4cc9-856a-19a4d200b9e2 service nova] Lock "3a077713-f7a2-4a61-bb17-987af6a52c4a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1452.440211] env[67899]: DEBUG oslo_concurrency.lockutils [req-7bea5633-a615-4c07-9c39-0479835c1f11 req-836f2b41-aaf3-4cc9-856a-19a4d200b9e2 service nova] Lock "3a077713-f7a2-4a61-bb17-987af6a52c4a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1452.440329] env[67899]: DEBUG nova.compute.manager [req-7bea5633-a615-4c07-9c39-0479835c1f11 req-836f2b41-aaf3-4cc9-856a-19a4d200b9e2 service nova] [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] No waiting events found dispatching network-vif-plugged-8b9c5734-85d2-4b4f-8aea-b28e93fce565 {{(pid=67899) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1452.440436] env[67899]: WARNING nova.compute.manager [req-7bea5633-a615-4c07-9c39-0479835c1f11 req-836f2b41-aaf3-4cc9-856a-19a4d200b9e2 service nova] [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] Received unexpected event network-vif-plugged-8b9c5734-85d2-4b4f-8aea-b28e93fce565 for instance with vm_state building and task_state spawning.
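The "-events" lock, pop_instance_event and the "No waiting events found ... Received unexpected event" warning above reflect the external-event handshake: a build path may register an event it intends to wait on, and the Neutron notification pops and signals that waiter; if nothing was registered, the event is simply logged and dropped. A toy version of that registry follows; it is a simplified assumption, not Nova's implementation.

import threading

class InstanceEvents:
    """Builders register events they will wait on; external notifications
    pop and signal them. A single lock stands in for the per-instance
    "-events" lock seen in the log."""

    def __init__(self):
        self._lock = threading.Lock()
        self._events = {}  # instance_uuid -> {event_name: threading.Event}

    def prepare_for_event(self, instance_uuid, event_name):
        ev = threading.Event()
        with self._lock:
            self._events.setdefault(instance_uuid, {})[event_name] = ev
        return ev

    def pop_instance_event(self, instance_uuid, event_name):
        with self._lock:
            return self._events.get(instance_uuid, {}).pop(event_name, None)

events = InstanceEvents()
waiter = events.pop_instance_event(
    '3a077713-f7a2-4a61-bb17-987af6a52c4a',
    'network-vif-plugged-8b9c5734-85d2-4b4f-8aea-b28e93fce565')
if waiter is None:
    # Matches the log: nothing registered, so the event is "unexpected"
    # for an instance still in vm_state building / task_state spawning.
    print('No waiting events found; unexpected event, logged and dropped')
else:
    waiter.set()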
[ 1452.440602] env[67899]: DEBUG nova.compute.manager [req-7bea5633-a615-4c07-9c39-0479835c1f11 req-836f2b41-aaf3-4cc9-856a-19a4d200b9e2 service nova] [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] Received event network-changed-8b9c5734-85d2-4b4f-8aea-b28e93fce565 {{(pid=67899) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1452.440830] env[67899]: DEBUG nova.compute.manager [req-7bea5633-a615-4c07-9c39-0479835c1f11 req-836f2b41-aaf3-4cc9-856a-19a4d200b9e2 service nova] [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] Refreshing instance network info cache due to event network-changed-8b9c5734-85d2-4b4f-8aea-b28e93fce565. {{(pid=67899) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1452.441034] env[67899]: DEBUG oslo_concurrency.lockutils [req-7bea5633-a615-4c07-9c39-0479835c1f11 req-836f2b41-aaf3-4cc9-856a-19a4d200b9e2 service nova] Acquiring lock "refresh_cache-3a077713-f7a2-4a61-bb17-987af6a52c4a" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1452.444013] env[67899]: DEBUG nova.network.neutron [None req-0ad992bc-0784-48e6-aae6-156834e2439f tempest-ListImageFiltersTestJSON-1482543128 tempest-ListImageFiltersTestJSON-1482543128-project-member] [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] Instance cache missing network info. {{(pid=67899) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1452.644965] env[67899]: DEBUG nova.network.neutron [None req-0ad992bc-0784-48e6-aae6-156834e2439f tempest-ListImageFiltersTestJSON-1482543128 tempest-ListImageFiltersTestJSON-1482543128-project-member] [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] Updating instance_info_cache with network_info: [{"id": "8b9c5734-85d2-4b4f-8aea-b28e93fce565", "address": "fa:16:3e:00:64:35", "network": {"id": "87a2c8e7-d332-4028-bf39-1da2df6ff034", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "c1aaa2970e964d7b86557399120d12c1", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "2180b40f-2bb0-47da-ba80-c2fbe7f98af0", "external-id": "nsx-vlan-transportzone-970", "segmentation_id": 970, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap8b9c5734-85", "ovs_interfaceid": "8b9c5734-85d2-4b4f-8aea-b28e93fce565", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1452.657814] env[67899]: DEBUG oslo_concurrency.lockutils [None req-0ad992bc-0784-48e6-aae6-156834e2439f tempest-ListImageFiltersTestJSON-1482543128 tempest-ListImageFiltersTestJSON-1482543128-project-member] Releasing lock "refresh_cache-3a077713-f7a2-4a61-bb17-987af6a52c4a" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1452.658121] env[67899]: DEBUG nova.compute.manager [None req-0ad992bc-0784-48e6-aae6-156834e2439f tempest-ListImageFiltersTestJSON-1482543128 
tempest-ListImageFiltersTestJSON-1482543128-project-member] [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] Instance network_info: |[{"id": "8b9c5734-85d2-4b4f-8aea-b28e93fce565", "address": "fa:16:3e:00:64:35", "network": {"id": "87a2c8e7-d332-4028-bf39-1da2df6ff034", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "c1aaa2970e964d7b86557399120d12c1", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "2180b40f-2bb0-47da-ba80-c2fbe7f98af0", "external-id": "nsx-vlan-transportzone-970", "segmentation_id": 970, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap8b9c5734-85", "ovs_interfaceid": "8b9c5734-85d2-4b4f-8aea-b28e93fce565", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67899) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1452.658528] env[67899]: DEBUG oslo_concurrency.lockutils [req-7bea5633-a615-4c07-9c39-0479835c1f11 req-836f2b41-aaf3-4cc9-856a-19a4d200b9e2 service nova] Acquired lock "refresh_cache-3a077713-f7a2-4a61-bb17-987af6a52c4a" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1452.658806] env[67899]: DEBUG nova.network.neutron [req-7bea5633-a615-4c07-9c39-0479835c1f11 req-836f2b41-aaf3-4cc9-856a-19a4d200b9e2 service nova] [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] Refreshing network info cache for port 8b9c5734-85d2-4b4f-8aea-b28e93fce565 {{(pid=67899) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1452.659819] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-0ad992bc-0784-48e6-aae6-156834e2439f tempest-ListImageFiltersTestJSON-1482543128 tempest-ListImageFiltersTestJSON-1482543128-project-member] [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:00:64:35', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '2180b40f-2bb0-47da-ba80-c2fbe7f98af0', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '8b9c5734-85d2-4b4f-8aea-b28e93fce565', 'vif_model': 'vmxnet3'}] {{(pid=67899) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1452.667422] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [None req-0ad992bc-0784-48e6-aae6-156834e2439f tempest-ListImageFiltersTestJSON-1482543128 tempest-ListImageFiltersTestJSON-1482543128-project-member] Creating folder: Project (01e27e2f5cff44ce9fa07e0f7206708a). Parent ref: group-v692900. 
{{(pid=67899) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1452.670465] env[67899]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-627744c3-d04b-4ad5-8fc9-afd6ddb3c659 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1452.682344] env[67899]: INFO nova.virt.vmwareapi.vm_util [None req-0ad992bc-0784-48e6-aae6-156834e2439f tempest-ListImageFiltersTestJSON-1482543128 tempest-ListImageFiltersTestJSON-1482543128-project-member] Created folder: Project (01e27e2f5cff44ce9fa07e0f7206708a) in parent group-v692900. [ 1452.682607] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [None req-0ad992bc-0784-48e6-aae6-156834e2439f tempest-ListImageFiltersTestJSON-1482543128 tempest-ListImageFiltersTestJSON-1482543128-project-member] Creating folder: Instances. Parent ref: group-v692988. {{(pid=67899) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1452.682835] env[67899]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-fa7ea4d1-07b5-4e28-8229-2bd056a72a87 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1452.691944] env[67899]: INFO nova.virt.vmwareapi.vm_util [None req-0ad992bc-0784-48e6-aae6-156834e2439f tempest-ListImageFiltersTestJSON-1482543128 tempest-ListImageFiltersTestJSON-1482543128-project-member] Created folder: Instances in parent group-v692988. [ 1452.692202] env[67899]: DEBUG oslo.service.loopingcall [None req-0ad992bc-0784-48e6-aae6-156834e2439f tempest-ListImageFiltersTestJSON-1482543128 tempest-ListImageFiltersTestJSON-1482543128-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67899) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1452.692393] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] Creating VM on the ESX host {{(pid=67899) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1452.692592] env[67899]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-f1d89003-0aac-4a62-b3e0-247b0c910c6b {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1452.712795] env[67899]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1452.712795] env[67899]: value = "task-3467968" [ 1452.712795] env[67899]: _type = "Task" [ 1452.712795] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1452.720120] env[67899]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467968, 'name': CreateVM_Task} progress is 0%. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1452.985880] env[67899]: DEBUG nova.network.neutron [req-7bea5633-a615-4c07-9c39-0479835c1f11 req-836f2b41-aaf3-4cc9-856a-19a4d200b9e2 service nova] [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] Updated VIF entry in instance network info cache for port 8b9c5734-85d2-4b4f-8aea-b28e93fce565. 
{{(pid=67899) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1452.986341] env[67899]: DEBUG nova.network.neutron [req-7bea5633-a615-4c07-9c39-0479835c1f11 req-836f2b41-aaf3-4cc9-856a-19a4d200b9e2 service nova] [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] Updating instance_info_cache with network_info: [{"id": "8b9c5734-85d2-4b4f-8aea-b28e93fce565", "address": "fa:16:3e:00:64:35", "network": {"id": "87a2c8e7-d332-4028-bf39-1da2df6ff034", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "c1aaa2970e964d7b86557399120d12c1", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "2180b40f-2bb0-47da-ba80-c2fbe7f98af0", "external-id": "nsx-vlan-transportzone-970", "segmentation_id": 970, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap8b9c5734-85", "ovs_interfaceid": "8b9c5734-85d2-4b4f-8aea-b28e93fce565", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1452.995490] env[67899]: DEBUG oslo_concurrency.lockutils [req-7bea5633-a615-4c07-9c39-0479835c1f11 req-836f2b41-aaf3-4cc9-856a-19a4d200b9e2 service nova] Releasing lock "refresh_cache-3a077713-f7a2-4a61-bb17-987af6a52c4a" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1453.222847] env[67899]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467968, 'name': CreateVM_Task} progress is 99%. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1453.725838] env[67899]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467968, 'name': CreateVM_Task} progress is 99%. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1454.224923] env[67899]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467968, 'name': CreateVM_Task, 'duration_secs': 1.285574} completed successfully. 
{{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1454.225116] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] Created VM on the ESX host {{(pid=67899) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1454.225774] env[67899]: DEBUG oslo_concurrency.lockutils [None req-0ad992bc-0784-48e6-aae6-156834e2439f tempest-ListImageFiltersTestJSON-1482543128 tempest-ListImageFiltersTestJSON-1482543128-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1454.225933] env[67899]: DEBUG oslo_concurrency.lockutils [None req-0ad992bc-0784-48e6-aae6-156834e2439f tempest-ListImageFiltersTestJSON-1482543128 tempest-ListImageFiltersTestJSON-1482543128-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1454.226280] env[67899]: DEBUG oslo_concurrency.lockutils [None req-0ad992bc-0784-48e6-aae6-156834e2439f tempest-ListImageFiltersTestJSON-1482543128 tempest-ListImageFiltersTestJSON-1482543128-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1454.226532] env[67899]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-dace2bae-20cc-4a65-b296-f94642d009bb {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1454.231230] env[67899]: DEBUG oslo_vmware.api [None req-0ad992bc-0784-48e6-aae6-156834e2439f tempest-ListImageFiltersTestJSON-1482543128 tempest-ListImageFiltersTestJSON-1482543128-project-member] Waiting for the task: (returnval){ [ 1454.231230] env[67899]: value = "session[5298859c-589e-10f3-4f70-bf47a7ea371b]52d9516a-78bd-6708-2c04-2c0598fdb49b" [ 1454.231230] env[67899]: _type = "Task" [ 1454.231230] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1454.242427] env[67899]: DEBUG oslo_vmware.api [None req-0ad992bc-0784-48e6-aae6-156834e2439f tempest-ListImageFiltersTestJSON-1482543128 tempest-ListImageFiltersTestJSON-1482543128-project-member] Task: {'id': session[5298859c-589e-10f3-4f70-bf47a7ea371b]52d9516a-78bd-6708-2c04-2c0598fdb49b, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1454.661307] env[67899]: DEBUG oslo_concurrency.lockutils [None req-bb2aacf0-6d67-4744-ac3b-ad8f8e88f68e tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Acquiring lock "9b4a7c14-84dc-4222-a758-3f8f10e23b7a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1454.661444] env[67899]: DEBUG oslo_concurrency.lockutils [None req-bb2aacf0-6d67-4744-ac3b-ad8f8e88f68e tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Lock "9b4a7c14-84dc-4222-a758-3f8f10e23b7a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1454.742020] env[67899]: DEBUG oslo_concurrency.lockutils [None req-0ad992bc-0784-48e6-aae6-156834e2439f tempest-ListImageFiltersTestJSON-1482543128 tempest-ListImageFiltersTestJSON-1482543128-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1454.742020] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-0ad992bc-0784-48e6-aae6-156834e2439f tempest-ListImageFiltersTestJSON-1482543128 tempest-ListImageFiltersTestJSON-1482543128-project-member] [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] Processing image c655a05a-4a40-4b3f-b609-3ba8116ad90f {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1454.742386] env[67899]: DEBUG oslo_concurrency.lockutils [None req-0ad992bc-0784-48e6-aae6-156834e2439f tempest-ListImageFiltersTestJSON-1482543128 tempest-ListImageFiltersTestJSON-1482543128-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1455.520982] env[67899]: DEBUG oslo_concurrency.lockutils [None req-cae3bf71-8cf1-4c24-b78a-0aa3efce6f6a tempest-ListImageFiltersTestJSON-1482543128 tempest-ListImageFiltersTestJSON-1482543128-project-member] Acquiring lock "3a077713-f7a2-4a61-bb17-987af6a52c4a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1455.996787] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1461.000231] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1463.996369] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67899) run_periodic_tasks
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1463.996637] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Starting heal instance info cache {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1463.996677] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Rebuilding the list of instances to heal {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1464.019296] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1464.019456] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 6c4977f7-c53d-4c96-9028-86d7561f0d0d] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1464.019587] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: ec826735-4cc4-4847-8750-c5480e62134a] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1464.019715] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: c7ad553b-2149-4211-aee3-057ea83069f5] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1464.019839] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 6fda2654-4579-4b9a-a97c-97e0128fff14] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1464.019992] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 7a82e877-8a39-4684-8b75-711b7bedddac] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1464.020156] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: dc7bf2b7-631d-4933-92db-1679ad823379] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1464.020280] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 8a157747-34e2-48f7-bf21-d17810122954] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1464.020397] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 03684169-e2c8-4cf5-8e79-b118725927f1] Skipping network cache update for instance because it is Building. 
{{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1464.020511] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1464.020628] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Didn't find any instances for network info cache update. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1464.996692] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1465.996230] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1465.996492] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1465.996665] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1465.996825] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Cleaning up deleted instances with incomplete migration {{(pid=67899) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11236}} [ 1467.006683] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1467.007042] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1467.007202] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Cleaning up deleted instances {{(pid=67899) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11198}} [ 1467.018859] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] There are 0 instances to clean {{(pid=67899) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11207}} [ 1468.009013] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1468.009013] env[67899]: DEBUG nova.compute.manager [None 
req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=67899) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1468.996781] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1472.997986] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager.update_available_resource {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1473.009350] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1473.009557] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1473.009719] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1473.009874] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67899) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1473.011054] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d0f50f64-a069-440d-bae4-43e365d19703 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1473.021151] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ad5f0e97-2c08-418a-8f98-5435d3c152a0 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1473.034928] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c54227dd-286e-41b3-934f-e686cbb4e629 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1473.041009] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-86e5557e-64e4-4e94-b6b8-ccba12facb25 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1473.069655] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180928MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=67899) 
_report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1473.069807] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1473.069974] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1473.236048] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance b9282eeb-09db-4138-a1f0-9e03828021b8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1473.236238] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 6c4977f7-c53d-4c96-9028-86d7561f0d0d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1473.236370] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance ec826735-4cc4-4847-8750-c5480e62134a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1473.236524] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance c7ad553b-2149-4211-aee3-057ea83069f5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1473.236652] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 6fda2654-4579-4b9a-a97c-97e0128fff14 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1473.236772] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 7a82e877-8a39-4684-8b75-711b7bedddac actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1473.236903] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance dc7bf2b7-631d-4933-92db-1679ad823379 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1473.237073] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 8a157747-34e2-48f7-bf21-d17810122954 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1473.237196] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 03684169-e2c8-4cf5-8e79-b118725927f1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1473.237311] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 3a077713-f7a2-4a61-bb17-987af6a52c4a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1473.249290] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance a292a68e-deff-465b-81f0-727e75c2e212 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1473.259364] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance e179db1d-ee0c-4f47-a958-40dd69209d26 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1473.269285] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance addcc88a-6bb5-4a70-938e-49c0c79c8414 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1473.279057] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance a6544af8-879d-4c45-bee4-8551b861fc66 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1473.288789] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance c56980f8-68e2-4501-a6a9-b713b208f895 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1473.298651] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 9b4a7c14-84dc-4222-a758-3f8f10e23b7a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1473.298873] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67899) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1473.299064] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67899) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1473.314506] env[67899]: DEBUG nova.scheduler.client.report [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Refreshing inventories for resource provider fffa0b42-f65d-4394-a98c-0df038b9ed4b {{(pid=67899) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} [ 1473.328587] env[67899]: DEBUG nova.scheduler.client.report [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Updating ProviderTree inventory for provider fffa0b42-f65d-4394-a98c-0df038b9ed4b from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67899) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} [ 1473.328587] env[67899]: DEBUG nova.compute.provider_tree [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Updating inventory in ProviderTree for provider fffa0b42-f65d-4394-a98c-0df038b9ed4b with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67899) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 1473.339818] env[67899]: DEBUG nova.scheduler.client.report [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Refreshing 
[ 1473.358358] env[67899]: DEBUG nova.scheduler.client.report [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Refreshing trait associations for resource provider fffa0b42-f65d-4394-a98c-0df038b9ed4b, traits: COMPUTE_NODE,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SAME_HOST_COLD_MIGRATE,COMPUTE_IMAGE_TYPE_VMDK,COMPUTE_IMAGE_TYPE_ISO {{(pid=67899) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} [ 1473.539164] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3783a269-697d-4448-94cc-ede999b51f75 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1473.548108] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a323a7a3-fd76-49f0-91d5-b4b8128c7893 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1473.577703] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5bf36f25-094a-480c-9ea1-e7b8a6f9de18 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1473.584599] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c8b050f9-523d-4301-8391-83f6ae188a89 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1473.597591] env[67899]: DEBUG nova.compute.provider_tree [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Inventory has not changed in ProviderTree for provider: fffa0b42-f65d-4394-a98c-0df038b9ed4b {{(pid=67899) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1473.605760] env[67899]: DEBUG nova.scheduler.client.report [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Inventory has not changed for provider fffa0b42-f65d-4394-a98c-0df038b9ed4b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67899) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1473.618890] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67899) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1473.619105] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.549s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1483.809102] env[67899]: DEBUG oslo_concurrency.lockutils [None req-c8d97b42-c6d1-4386-8857-f061e595e961 tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] Acquiring lock "cd4ae8d3-63d9-463d-9428-fa2c1e8d1679" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1483.809403] env[67899]: DEBUG oslo_concurrency.lockutils [None req-c8d97b42-c6d1-4386-8857-f061e595e961 tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] Lock "cd4ae8d3-63d9-463d-9428-fa2c1e8d1679" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1498.312564] env[67899]: WARNING oslo_vmware.rw_handles [None req-92a7a379-f753-4ac6-96db-b7811ca464e0 tempest-TenantUsagesTestJSON-377809165 tempest-TenantUsagesTestJSON-377809165-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1498.312564] env[67899]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1498.312564] env[67899]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1498.312564] env[67899]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1498.312564] env[67899]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1498.312564] env[67899]: ERROR oslo_vmware.rw_handles response.begin() [ 1498.312564] env[67899]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1498.312564] env[67899]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1498.312564] env[67899]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1498.312564] env[67899]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1498.312564] env[67899]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1498.312564] env[67899]: ERROR oslo_vmware.rw_handles [ 1498.313130] env[67899]: DEBUG nova.virt.vmwareapi.images [None req-92a7a379-f753-4ac6-96db-b7811ca464e0 tempest-TenantUsagesTestJSON-377809165 tempest-TenantUsagesTestJSON-377809165-project-member] [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] Downloaded image file data c655a05a-4a40-4b3f-b609-3ba8116ad90f to vmware_temp/95c50e09-508e-4ea2-9d5d-c12a4fc20ee4/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk on the data store datastore1 {{(pid=67899) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1498.314812] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-92a7a379-f753-4ac6-96db-b7811ca464e0 tempest-TenantUsagesTestJSON-377809165 tempest-TenantUsagesTestJSON-377809165-project-member] [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] Caching image {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1498.315061] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [None req-92a7a379-f753-4ac6-96db-b7811ca464e0 tempest-TenantUsagesTestJSON-377809165 tempest-TenantUsagesTestJSON-377809165-project-member] Copying Virtual Disk [datastore1] vmware_temp/95c50e09-508e-4ea2-9d5d-c12a4fc20ee4/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk to [datastore1] vmware_temp/95c50e09-508e-4ea2-9d5d-c12a4fc20ee4/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk {{(pid=67899) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}}
[ 1498.315359] env[67899]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-ea9671f9-36e1-4e6b-b102-235a148cb18e {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1498.323659] env[67899]: DEBUG oslo_vmware.api [None req-92a7a379-f753-4ac6-96db-b7811ca464e0 tempest-TenantUsagesTestJSON-377809165 tempest-TenantUsagesTestJSON-377809165-project-member] Waiting for the task: (returnval){ [ 1498.323659] env[67899]: value = "task-3467969" [ 1498.323659] env[67899]: _type = "Task" [ 1498.323659] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1498.332894] env[67899]: DEBUG oslo_vmware.api [None req-92a7a379-f753-4ac6-96db-b7811ca464e0 tempest-TenantUsagesTestJSON-377809165 tempest-TenantUsagesTestJSON-377809165-project-member] Task: {'id': task-3467969, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1498.833768] env[67899]: DEBUG oslo_vmware.exceptions [None req-92a7a379-f753-4ac6-96db-b7811ca464e0 tempest-TenantUsagesTestJSON-377809165 tempest-TenantUsagesTestJSON-377809165-project-member] Fault InvalidArgument not matched. {{(pid=67899) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1498.834070] env[67899]: DEBUG oslo_concurrency.lockutils [None req-92a7a379-f753-4ac6-96db-b7811ca464e0 tempest-TenantUsagesTestJSON-377809165 tempest-TenantUsagesTestJSON-377809165-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1498.834627] env[67899]: ERROR nova.compute.manager [None req-92a7a379-f753-4ac6-96db-b7811ca464e0 tempest-TenantUsagesTestJSON-377809165 tempest-TenantUsagesTestJSON-377809165-project-member] [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1498.834627] env[67899]: Faults: ['InvalidArgument'] [ 1498.834627] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] Traceback (most recent call last): [ 1498.834627] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1498.834627] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] yield resources [ 1498.834627] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1498.834627] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] self.driver.spawn(context, instance, image_meta, [ 1498.834627] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1498.834627] env[67899]: ERROR nova.compute.manager [instance:
b9282eeb-09db-4138-a1f0-9e03828021b8] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1498.834627] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1498.834627] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] self._fetch_image_if_missing(context, vi) [ 1498.834627] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1498.834627] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] image_cache(vi, tmp_image_ds_loc) [ 1498.834627] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1498.834627] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] vm_util.copy_virtual_disk( [ 1498.834627] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1498.834627] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] session._wait_for_task(vmdk_copy_task) [ 1498.834627] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1498.834627] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] return self.wait_for_task(task_ref) [ 1498.834627] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1498.834627] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] return evt.wait() [ 1498.834627] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1498.834627] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] result = hub.switch() [ 1498.834627] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1498.834627] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] return self.greenlet.switch() [ 1498.834627] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1498.834627] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] self.f(*self.args, **self.kw) [ 1498.834627] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1498.834627] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] raise exceptions.translate_fault(task_info.error) [ 1498.834627] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] 
oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1498.834627] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] Faults: ['InvalidArgument'] [ 1498.834627] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] [ 1498.835658] env[67899]: INFO nova.compute.manager [None req-92a7a379-f753-4ac6-96db-b7811ca464e0 tempest-TenantUsagesTestJSON-377809165 tempest-TenantUsagesTestJSON-377809165-project-member] [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] Terminating instance [ 1498.836813] env[67899]: DEBUG oslo_concurrency.lockutils [None req-817f6f80-1cb7-455c-81df-97b9e82bf3f4 tempest-ServerAddressesNegativeTestJSON-374254023 tempest-ServerAddressesNegativeTestJSON-374254023-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1498.836813] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-817f6f80-1cb7-455c-81df-97b9e82bf3f4 tempest-ServerAddressesNegativeTestJSON-374254023 tempest-ServerAddressesNegativeTestJSON-374254023-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1498.836982] env[67899]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-38e6a791-2c7a-40f1-83e5-753190c51e4b {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1498.839281] env[67899]: DEBUG nova.compute.manager [None req-92a7a379-f753-4ac6-96db-b7811ca464e0 tempest-TenantUsagesTestJSON-377809165 tempest-TenantUsagesTestJSON-377809165-project-member] [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] Start destroying the instance on the hypervisor. 
{{(pid=67899) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1498.839470] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-92a7a379-f753-4ac6-96db-b7811ca464e0 tempest-TenantUsagesTestJSON-377809165 tempest-TenantUsagesTestJSON-377809165-project-member] [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] Destroying instance {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1498.840207] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-faeeda94-b236-468f-86f4-581d55b94ebc {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1498.846784] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-92a7a379-f753-4ac6-96db-b7811ca464e0 tempest-TenantUsagesTestJSON-377809165 tempest-TenantUsagesTestJSON-377809165-project-member] [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] Unregistering the VM {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1498.846993] env[67899]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-4aff8b5e-947e-4b95-b59a-d8570ab6ddc1 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1498.849091] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-817f6f80-1cb7-455c-81df-97b9e82bf3f4 tempest-ServerAddressesNegativeTestJSON-374254023 tempest-ServerAddressesNegativeTestJSON-374254023-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1498.849290] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-817f6f80-1cb7-455c-81df-97b9e82bf3f4 tempest-ServerAddressesNegativeTestJSON-374254023 tempest-ServerAddressesNegativeTestJSON-374254023-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67899) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1498.850218] env[67899]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-2dbc78a7-268e-4d4d-bf2e-d8ad32e22518 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1498.855672] env[67899]: DEBUG oslo_vmware.api [None req-817f6f80-1cb7-455c-81df-97b9e82bf3f4 tempest-ServerAddressesNegativeTestJSON-374254023 tempest-ServerAddressesNegativeTestJSON-374254023-project-member] Waiting for the task: (returnval){ [ 1498.855672] env[67899]: value = "session[5298859c-589e-10f3-4f70-bf47a7ea371b]528a20e8-e0fa-b960-65b6-260477d092df" [ 1498.855672] env[67899]: _type = "Task" [ 1498.855672] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1498.865230] env[67899]: DEBUG oslo_vmware.api [None req-817f6f80-1cb7-455c-81df-97b9e82bf3f4 tempest-ServerAddressesNegativeTestJSON-374254023 tempest-ServerAddressesNegativeTestJSON-374254023-project-member] Task: {'id': session[5298859c-589e-10f3-4f70-bf47a7ea371b]528a20e8-e0fa-b960-65b6-260477d092df, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1498.919981] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-92a7a379-f753-4ac6-96db-b7811ca464e0 tempest-TenantUsagesTestJSON-377809165 tempest-TenantUsagesTestJSON-377809165-project-member] [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] Unregistered the VM {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1498.920197] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-92a7a379-f753-4ac6-96db-b7811ca464e0 tempest-TenantUsagesTestJSON-377809165 tempest-TenantUsagesTestJSON-377809165-project-member] [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] Deleting contents of the VM from datastore datastore1 {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1498.920419] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-92a7a379-f753-4ac6-96db-b7811ca464e0 tempest-TenantUsagesTestJSON-377809165 tempest-TenantUsagesTestJSON-377809165-project-member] Deleting the datastore file [datastore1] b9282eeb-09db-4138-a1f0-9e03828021b8 {{(pid=67899) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1498.920684] env[67899]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-8ff39606-adf9-4eb6-a834-2bcaa62aefa6 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1498.927280] env[67899]: DEBUG oslo_vmware.api [None req-92a7a379-f753-4ac6-96db-b7811ca464e0 tempest-TenantUsagesTestJSON-377809165 tempest-TenantUsagesTestJSON-377809165-project-member] Waiting for the task: (returnval){ [ 1498.927280] env[67899]: value = "task-3467971" [ 1498.927280] env[67899]: _type = "Task" [ 1498.927280] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1498.934762] env[67899]: DEBUG oslo_vmware.api [None req-92a7a379-f753-4ac6-96db-b7811ca464e0 tempest-TenantUsagesTestJSON-377809165 tempest-TenantUsagesTestJSON-377809165-project-member] Task: {'id': task-3467971, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1499.365876] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-817f6f80-1cb7-455c-81df-97b9e82bf3f4 tempest-ServerAddressesNegativeTestJSON-374254023 tempest-ServerAddressesNegativeTestJSON-374254023-project-member] [instance: 6c4977f7-c53d-4c96-9028-86d7561f0d0d] Preparing fetch location {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1499.366186] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-817f6f80-1cb7-455c-81df-97b9e82bf3f4 tempest-ServerAddressesNegativeTestJSON-374254023 tempest-ServerAddressesNegativeTestJSON-374254023-project-member] Creating directory with path [datastore1] vmware_temp/cbc59dd1-3e20-4d1a-8828-f206a6048ae1/c655a05a-4a40-4b3f-b609-3ba8116ad90f {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1499.366296] env[67899]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-c8446231-962f-466a-a996-ba4ecfd06d34 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1499.377218] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-817f6f80-1cb7-455c-81df-97b9e82bf3f4 tempest-ServerAddressesNegativeTestJSON-374254023 tempest-ServerAddressesNegativeTestJSON-374254023-project-member] Created directory with path [datastore1] vmware_temp/cbc59dd1-3e20-4d1a-8828-f206a6048ae1/c655a05a-4a40-4b3f-b609-3ba8116ad90f {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1499.377433] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-817f6f80-1cb7-455c-81df-97b9e82bf3f4 tempest-ServerAddressesNegativeTestJSON-374254023 tempest-ServerAddressesNegativeTestJSON-374254023-project-member] [instance: 6c4977f7-c53d-4c96-9028-86d7561f0d0d] Fetch image to [datastore1] vmware_temp/cbc59dd1-3e20-4d1a-8828-f206a6048ae1/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1499.377599] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-817f6f80-1cb7-455c-81df-97b9e82bf3f4 tempest-ServerAddressesNegativeTestJSON-374254023 tempest-ServerAddressesNegativeTestJSON-374254023-project-member] [instance: 6c4977f7-c53d-4c96-9028-86d7561f0d0d] Downloading image file data c655a05a-4a40-4b3f-b609-3ba8116ad90f to [datastore1] vmware_temp/cbc59dd1-3e20-4d1a-8828-f206a6048ae1/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk on the data store datastore1 {{(pid=67899) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1499.378359] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fe621a45-5132-4b7f-8f84-92367d3f8ccf {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1499.384913] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-00d05f2b-9ba4-407f-ade9-be16ebb650a4 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1499.394723] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-50b3bc60-5fc7-4104-b94f-c34e61f57ba9 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1499.425808] env[67899]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8e1a1f5d-5842-44df-a793-17e3ee0ad49e {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1499.437380] env[67899]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-76b57f3b-0407-4f01-9725-b26ce5f725b6 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1499.439269] env[67899]: DEBUG oslo_vmware.api [None req-92a7a379-f753-4ac6-96db-b7811ca464e0 tempest-TenantUsagesTestJSON-377809165 tempest-TenantUsagesTestJSON-377809165-project-member] Task: {'id': task-3467971, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.070728} completed successfully. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1499.439524] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-92a7a379-f753-4ac6-96db-b7811ca464e0 tempest-TenantUsagesTestJSON-377809165 tempest-TenantUsagesTestJSON-377809165-project-member] Deleted the datastore file {{(pid=67899) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1499.439774] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-92a7a379-f753-4ac6-96db-b7811ca464e0 tempest-TenantUsagesTestJSON-377809165 tempest-TenantUsagesTestJSON-377809165-project-member] [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] Deleted contents of the VM from datastore datastore1 {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1499.439916] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-92a7a379-f753-4ac6-96db-b7811ca464e0 tempest-TenantUsagesTestJSON-377809165 tempest-TenantUsagesTestJSON-377809165-project-member] [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] Instance destroyed {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1499.440036] env[67899]: INFO nova.compute.manager [None req-92a7a379-f753-4ac6-96db-b7811ca464e0 tempest-TenantUsagesTestJSON-377809165 tempest-TenantUsagesTestJSON-377809165-project-member] [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] Took 0.60 seconds to destroy the instance on the hypervisor. 
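
All of the *_Task records above follow the same pattern: invoke the vSphere task, then poll its TaskInfo until it reaches a terminal state. A rough sketch of that loop follows; the names are illustrative, not the oslo.vmware implementation (the real one lives in oslo_vmware/api.py's wait_for_task/_poll_task, which the log cites):

```python
import time

class TaskFailed(Exception):
    """Stands in for oslo_vmware.exceptions.VimFaultException in this sketch."""

def wait_for_task(get_task_info, poll_interval=0.5):
    # get_task_info is assumed to return a dict mirroring vSphere TaskInfo,
    # e.g. {'id': 'task-3467969', 'state': 'running', 'progress': 0}.
    while True:
        info = get_task_info()
        if info['state'] == 'success':
            return info  # logged above as "completed successfully" with duration_secs
        if info['state'] == 'error':
            raise TaskFailed(info['error'])  # e.g. the InvalidArgument: fileType fault
        # 'queued' / 'running': emit the "progress is N%" debug line and retry.
        print(f"Task {info['id']} progress is {info.get('progress') or 0}%.")
        time.sleep(poll_interval)
```
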
[ 1499.442558] env[67899]: DEBUG nova.compute.claims [None req-92a7a379-f753-4ac6-96db-b7811ca464e0 tempest-TenantUsagesTestJSON-377809165 tempest-TenantUsagesTestJSON-377809165-project-member] [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] Aborting claim: {{(pid=67899) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1499.442675] env[67899]: DEBUG oslo_concurrency.lockutils [None req-92a7a379-f753-4ac6-96db-b7811ca464e0 tempest-TenantUsagesTestJSON-377809165 tempest-TenantUsagesTestJSON-377809165-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1499.442877] env[67899]: DEBUG oslo_concurrency.lockutils [None req-92a7a379-f753-4ac6-96db-b7811ca464e0 tempest-TenantUsagesTestJSON-377809165 tempest-TenantUsagesTestJSON-377809165-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1499.460078] env[67899]: DEBUG nova.virt.vmwareapi.images [None req-817f6f80-1cb7-455c-81df-97b9e82bf3f4 tempest-ServerAddressesNegativeTestJSON-374254023 tempest-ServerAddressesNegativeTestJSON-374254023-project-member] [instance: 6c4977f7-c53d-4c96-9028-86d7561f0d0d] Downloading image file data c655a05a-4a40-4b3f-b609-3ba8116ad90f to the data store datastore1 {{(pid=67899) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1499.516816] env[67899]: DEBUG oslo_vmware.rw_handles [None req-817f6f80-1cb7-455c-81df-97b9e82bf3f4 tempest-ServerAddressesNegativeTestJSON-374254023 tempest-ServerAddressesNegativeTestJSON-374254023-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/cbc59dd1-3e20-4d1a-8828-f206a6048ae1/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67899) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1499.576230] env[67899]: DEBUG oslo_vmware.rw_handles [None req-817f6f80-1cb7-455c-81df-97b9e82bf3f4 tempest-ServerAddressesNegativeTestJSON-374254023 tempest-ServerAddressesNegativeTestJSON-374254023-project-member] Completed reading data from the image iterator. {{(pid=67899) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1499.576427] env[67899]: DEBUG oslo_vmware.rw_handles [None req-817f6f80-1cb7-455c-81df-97b9e82bf3f4 tempest-ServerAddressesNegativeTestJSON-374254023 tempest-ServerAddressesNegativeTestJSON-374254023-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/cbc59dd1-3e20-4d1a-8828-f206a6048ae1/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67899) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1499.732170] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-28402397-bed4-4d98-a822-329d1d2dfb43 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1499.739903] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0f136fac-78f7-4417-9e7c-c764582ba17a {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1499.769305] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dac3a522-f37f-495c-a0d3-ef688419e63e {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1499.775885] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-105a6f10-1c5e-492f-82d6-3227c517b229 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1499.788612] env[67899]: DEBUG nova.compute.provider_tree [None req-92a7a379-f753-4ac6-96db-b7811ca464e0 tempest-TenantUsagesTestJSON-377809165 tempest-TenantUsagesTestJSON-377809165-project-member] Inventory has not changed in ProviderTree for provider: fffa0b42-f65d-4394-a98c-0df038b9ed4b {{(pid=67899) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1499.797378] env[67899]: DEBUG nova.scheduler.client.report [None req-92a7a379-f753-4ac6-96db-b7811ca464e0 tempest-TenantUsagesTestJSON-377809165 tempest-TenantUsagesTestJSON-377809165-project-member] Inventory has not changed for provider fffa0b42-f65d-4394-a98c-0df038b9ed4b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67899) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1499.813812] env[67899]: DEBUG oslo_concurrency.lockutils [None req-92a7a379-f753-4ac6-96db-b7811ca464e0 tempest-TenantUsagesTestJSON-377809165 tempest-TenantUsagesTestJSON-377809165-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.371s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1499.814347] env[67899]: ERROR nova.compute.manager [None req-92a7a379-f753-4ac6-96db-b7811ca464e0 tempest-TenantUsagesTestJSON-377809165 tempest-TenantUsagesTestJSON-377809165-project-member] [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1499.814347] env[67899]: Faults: ['InvalidArgument'] [ 1499.814347] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] Traceback (most recent call last): [ 1499.814347] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1499.814347] env[67899]: ERROR nova.compute.manager 
[instance: b9282eeb-09db-4138-a1f0-9e03828021b8] self.driver.spawn(context, instance, image_meta, [ 1499.814347] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1499.814347] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1499.814347] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1499.814347] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] self._fetch_image_if_missing(context, vi) [ 1499.814347] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1499.814347] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] image_cache(vi, tmp_image_ds_loc) [ 1499.814347] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1499.814347] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] vm_util.copy_virtual_disk( [ 1499.814347] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1499.814347] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] session._wait_for_task(vmdk_copy_task) [ 1499.814347] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1499.814347] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] return self.wait_for_task(task_ref) [ 1499.814347] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1499.814347] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] return evt.wait() [ 1499.814347] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1499.814347] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] result = hub.switch() [ 1499.814347] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1499.814347] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] return self.greenlet.switch() [ 1499.814347] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1499.814347] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] self.f(*self.args, **self.kw) [ 1499.814347] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1499.814347] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] raise exceptions.translate_fault(task_info.error) [ 1499.814347] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1499.814347] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] Faults: ['InvalidArgument'] [ 1499.814347] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] [ 1499.815291] env[67899]: DEBUG nova.compute.utils [None req-92a7a379-f753-4ac6-96db-b7811ca464e0 tempest-TenantUsagesTestJSON-377809165 tempest-TenantUsagesTestJSON-377809165-project-member] [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] VimFaultException {{(pid=67899) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1499.816370] env[67899]: DEBUG nova.compute.manager [None req-92a7a379-f753-4ac6-96db-b7811ca464e0 tempest-TenantUsagesTestJSON-377809165 tempest-TenantUsagesTestJSON-377809165-project-member] [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] Build of instance b9282eeb-09db-4138-a1f0-9e03828021b8 was re-scheduled: A specified parameter was not correct: fileType [ 1499.816370] env[67899]: Faults: ['InvalidArgument'] {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1499.816773] env[67899]: DEBUG nova.compute.manager [None req-92a7a379-f753-4ac6-96db-b7811ca464e0 tempest-TenantUsagesTestJSON-377809165 tempest-TenantUsagesTestJSON-377809165-project-member] [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] Unplugging VIFs for instance {{(pid=67899) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1499.817022] env[67899]: DEBUG nova.compute.manager [None req-92a7a379-f753-4ac6-96db-b7811ca464e0 tempest-TenantUsagesTestJSON-377809165 tempest-TenantUsagesTestJSON-377809165-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67899) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1499.817192] env[67899]: DEBUG nova.compute.manager [None req-92a7a379-f753-4ac6-96db-b7811ca464e0 tempest-TenantUsagesTestJSON-377809165 tempest-TenantUsagesTestJSON-377809165-project-member] [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] Deallocating network for instance {{(pid=67899) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1499.817355] env[67899]: DEBUG nova.network.neutron [None req-92a7a379-f753-4ac6-96db-b7811ca464e0 tempest-TenantUsagesTestJSON-377809165 tempest-TenantUsagesTestJSON-377809165-project-member] [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] deallocate_for_instance() {{(pid=67899) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1499.913380] env[67899]: DEBUG neutronclient.v2_0.client [None req-92a7a379-f753-4ac6-96db-b7811ca464e0 tempest-TenantUsagesTestJSON-377809165 tempest-TenantUsagesTestJSON-377809165-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=67899) _handle_fault_response /opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py:262}} [ 1499.914549] env[67899]: ERROR nova.compute.manager [None req-92a7a379-f753-4ac6-96db-b7811ca464e0 tempest-TenantUsagesTestJSON-377809165 tempest-TenantUsagesTestJSON-377809165-project-member] [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized. [ 1499.914549] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] Traceback (most recent call last): [ 1499.914549] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1499.914549] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] self.driver.spawn(context, instance, image_meta, [ 1499.914549] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1499.914549] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1499.914549] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1499.914549] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] self._fetch_image_if_missing(context, vi) [ 1499.914549] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1499.914549] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] image_cache(vi, tmp_image_ds_loc) [ 1499.914549] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1499.914549] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] vm_util.copy_virtual_disk( [ 1499.914549] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in 
copy_virtual_disk [ 1499.914549] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] session._wait_for_task(vmdk_copy_task) [ 1499.914549] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1499.914549] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] return self.wait_for_task(task_ref) [ 1499.914549] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1499.914549] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] return evt.wait() [ 1499.914549] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1499.914549] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] result = hub.switch() [ 1499.914549] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1499.914549] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] return self.greenlet.switch() [ 1499.914549] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1499.914549] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] self.f(*self.args, **self.kw) [ 1499.914549] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1499.914549] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] raise exceptions.translate_fault(task_info.error) [ 1499.914549] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1499.914549] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] Faults: ['InvalidArgument'] [ 1499.914549] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] [ 1499.914549] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] During handling of the above exception, another exception occurred: [ 1499.914549] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] [ 1499.914549] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] Traceback (most recent call last): [ 1499.914549] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] File "/opt/stack/nova/nova/compute/manager.py", line 2430, in _do_build_and_run_instance [ 1499.914549] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] self._build_and_run_instance(context, instance, image, [ 1499.914549] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] File "/opt/stack/nova/nova/compute/manager.py", line 2722, in _build_and_run_instance [ 
1499.914549] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] raise exception.RescheduledException( [ 1499.914549] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] nova.exception.RescheduledException: Build of instance b9282eeb-09db-4138-a1f0-9e03828021b8 was re-scheduled: A specified parameter was not correct: fileType [ 1499.914549] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] Faults: ['InvalidArgument'] [ 1499.914549] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] [ 1499.915868] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] During handling of the above exception, another exception occurred: [ 1499.915868] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] [ 1499.915868] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] Traceback (most recent call last): [ 1499.915868] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1499.915868] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] ret = obj(*args, **kwargs) [ 1499.915868] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1499.915868] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] exception_handler_v20(status_code, error_body) [ 1499.915868] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1499.915868] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] raise client_exc(message=error_message, [ 1499.915868] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1499.915868] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] Neutron server returns request_ids: ['req-d3f6601a-39cd-4863-b7f8-fc2cf5f51826'] [ 1499.915868] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] [ 1499.915868] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] During handling of the above exception, another exception occurred: [ 1499.915868] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] [ 1499.915868] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] Traceback (most recent call last): [ 1499.915868] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] File "/opt/stack/nova/nova/compute/manager.py", line 3019, in _cleanup_allocated_networks [ 1499.915868] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] self._deallocate_network(context, instance, requested_networks) [ 1499.915868] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] File 
"/opt/stack/nova/nova/compute/manager.py", line 2265, in _deallocate_network [ 1499.915868] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] self.network_api.deallocate_for_instance( [ 1499.915868] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1499.915868] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] data = neutron.list_ports(**search_opts) [ 1499.915868] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1499.915868] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] ret = obj(*args, **kwargs) [ 1499.915868] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1499.915868] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] return self.list('ports', self.ports_path, retrieve_all, [ 1499.915868] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1499.915868] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] ret = obj(*args, **kwargs) [ 1499.915868] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1499.915868] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] for r in self._pagination(collection, path, **params): [ 1499.915868] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1499.915868] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] res = self.get(path, params=params) [ 1499.915868] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1499.915868] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] ret = obj(*args, **kwargs) [ 1499.915868] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1499.915868] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] return self.retry_request("GET", action, body=body, [ 1499.915868] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1499.915868] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] ret = obj(*args, **kwargs) [ 1499.915868] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1499.915868] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] return 
self.do_request(method, action, body=body, [ 1499.916962] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1499.916962] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] ret = obj(*args, **kwargs) [ 1499.916962] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1499.916962] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] self._handle_fault_response(status_code, replybody, resp) [ 1499.916962] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper [ 1499.916962] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] raise exception.Unauthorized() [ 1499.916962] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] nova.exception.Unauthorized: Not authorized. [ 1499.916962] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] [ 1499.968598] env[67899]: INFO nova.scheduler.client.report [None req-92a7a379-f753-4ac6-96db-b7811ca464e0 tempest-TenantUsagesTestJSON-377809165 tempest-TenantUsagesTestJSON-377809165-project-member] Deleted allocations for instance b9282eeb-09db-4138-a1f0-9e03828021b8 [ 1499.986880] env[67899]: DEBUG oslo_concurrency.lockutils [None req-92a7a379-f753-4ac6-96db-b7811ca464e0 tempest-TenantUsagesTestJSON-377809165 tempest-TenantUsagesTestJSON-377809165-project-member] Lock "b9282eeb-09db-4138-a1f0-9e03828021b8" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 641.124s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1499.987959] env[67899]: DEBUG oslo_concurrency.lockutils [None req-d13baaa7-778b-4748-89cd-771e48d59c26 tempest-TenantUsagesTestJSON-377809165 tempest-TenantUsagesTestJSON-377809165-project-member] Lock "b9282eeb-09db-4138-a1f0-9e03828021b8" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 443.938s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1499.988559] env[67899]: DEBUG oslo_concurrency.lockutils [None req-d13baaa7-778b-4748-89cd-771e48d59c26 tempest-TenantUsagesTestJSON-377809165 tempest-TenantUsagesTestJSON-377809165-project-member] Acquiring lock "b9282eeb-09db-4138-a1f0-9e03828021b8-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1499.988787] env[67899]: DEBUG oslo_concurrency.lockutils [None req-d13baaa7-778b-4748-89cd-771e48d59c26 tempest-TenantUsagesTestJSON-377809165 tempest-TenantUsagesTestJSON-377809165-project-member] Lock "b9282eeb-09db-4138-a1f0-9e03828021b8-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.001s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1499.988960] env[67899]: DEBUG oslo_concurrency.lockutils [None req-d13baaa7-778b-4748-89cd-771e48d59c26 tempest-TenantUsagesTestJSON-377809165 
tempest-TenantUsagesTestJSON-377809165-project-member] Lock "b9282eeb-09db-4138-a1f0-9e03828021b8-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1499.990859] env[67899]: INFO nova.compute.manager [None req-d13baaa7-778b-4748-89cd-771e48d59c26 tempest-TenantUsagesTestJSON-377809165 tempest-TenantUsagesTestJSON-377809165-project-member] [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] Terminating instance [ 1499.992378] env[67899]: DEBUG oslo_concurrency.lockutils [None req-d13baaa7-778b-4748-89cd-771e48d59c26 tempest-TenantUsagesTestJSON-377809165 tempest-TenantUsagesTestJSON-377809165-project-member] Acquiring lock "refresh_cache-b9282eeb-09db-4138-a1f0-9e03828021b8" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1499.992537] env[67899]: DEBUG oslo_concurrency.lockutils [None req-d13baaa7-778b-4748-89cd-771e48d59c26 tempest-TenantUsagesTestJSON-377809165 tempest-TenantUsagesTestJSON-377809165-project-member] Acquired lock "refresh_cache-b9282eeb-09db-4138-a1f0-9e03828021b8" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1499.992704] env[67899]: DEBUG nova.network.neutron [None req-d13baaa7-778b-4748-89cd-771e48d59c26 tempest-TenantUsagesTestJSON-377809165 tempest-TenantUsagesTestJSON-377809165-project-member] [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] Building network info cache for instance {{(pid=67899) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1499.998796] env[67899]: DEBUG nova.compute.manager [None req-67bbdfd5-5385-4e85-b8cb-1f97b40f5bad tempest-ListImageFiltersTestJSON-1482543128 tempest-ListImageFiltersTestJSON-1482543128-project-member] [instance: 53a54716-a3cd-4234-977d-0c82370025d1] Starting instance... {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1500.021243] env[67899]: DEBUG nova.compute.manager [None req-67bbdfd5-5385-4e85-b8cb-1f97b40f5bad tempest-ListImageFiltersTestJSON-1482543128 tempest-ListImageFiltersTestJSON-1482543128-project-member] [instance: 53a54716-a3cd-4234-977d-0c82370025d1] Instance disappeared before build. {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1500.043823] env[67899]: DEBUG oslo_concurrency.lockutils [None req-67bbdfd5-5385-4e85-b8cb-1f97b40f5bad tempest-ListImageFiltersTestJSON-1482543128 tempest-ListImageFiltersTestJSON-1482543128-project-member] Lock "53a54716-a3cd-4234-977d-0c82370025d1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 239.885s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1500.053023] env[67899]: DEBUG nova.compute.manager [None req-3149d882-2ab5-4849-a2dc-744547851dfa tempest-ServerActionsTestOtherB-1017232890 tempest-ServerActionsTestOtherB-1017232890-project-member] [instance: 5d8b3009-ba5e-4f29-81fc-6d389ec30808] Starting instance... 
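The lock traffic above is Nova serializing all work on a single instance: the build for b9282eeb held the instance-UUID lock for 641 seconds, the terminate request then waited 443 seconds for it, and the short-lived "-events" lock guards the instance's event table. A minimal sketch of the oslo.concurrency primitive behind those "acquired ... waited" / "released ... held" lines, assuming only the library's public synchronized() decorator (the function name below is illustrative):

    from oslo_concurrency import lockutils

    INSTANCE_UUID = "b9282eeb-09db-4138-a1f0-9e03828021b8"  # from the log above

    @lockutils.synchronized(INSTANCE_UUID)
    def do_terminate_instance():
        # Callers block here until the previous holder releases the lock;
        # the wrapper emits the "acquired ... waited Ns" and "released ...
        # held Ns" lines seen throughout this log.
        pass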
{{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1500.077990] env[67899]: DEBUG nova.compute.manager [None req-3149d882-2ab5-4849-a2dc-744547851dfa tempest-ServerActionsTestOtherB-1017232890 tempest-ServerActionsTestOtherB-1017232890-project-member] [instance: 5d8b3009-ba5e-4f29-81fc-6d389ec30808] Instance disappeared before build. {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1500.101029] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3149d882-2ab5-4849-a2dc-744547851dfa tempest-ServerActionsTestOtherB-1017232890 tempest-ServerActionsTestOtherB-1017232890-project-member] Lock "5d8b3009-ba5e-4f29-81fc-6d389ec30808" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 227.466s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1500.110434] env[67899]: DEBUG nova.compute.manager [None req-d9ba9f59-d540-457f-98ff-d4f3cc4ec481 tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] [instance: a292a68e-deff-465b-81f0-727e75c2e212] Starting instance... {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1500.134447] env[67899]: DEBUG nova.compute.manager [None req-d9ba9f59-d540-457f-98ff-d4f3cc4ec481 tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] [instance: a292a68e-deff-465b-81f0-727e75c2e212] Instance disappeared before build. {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1500.159543] env[67899]: DEBUG oslo_concurrency.lockutils [None req-d9ba9f59-d540-457f-98ff-d4f3cc4ec481 tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] Lock "a292a68e-deff-465b-81f0-727e75c2e212" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 213.723s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1500.175604] env[67899]: DEBUG nova.compute.manager [None req-457d9d77-97fa-4b41-8f02-5a00e92d0ccc tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: e179db1d-ee0c-4f47-a958-40dd69209d26] Starting instance... 
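The repeated "Instance disappeared before build" pairs above are the benign side of the same serialization: by the time each queued build finally got its instance lock, a delete had already won the race, so _do_build_and_run_instance re-reads the record and bails out instead of building. A toy version of that check (illustrative names, not Nova's code):

    def do_build_and_run(lookup_instance, uuid):
        instance = lookup_instance(uuid)  # refresh from the database
        if instance is None or instance.get("deleted"):
            return "skipped: instance disappeared before build"
        return "built"

    # A delete that won the race turns the queued build into a no-op.
    print(do_build_and_run(lambda u: None, "53a54716-a3cd-4234-977d-0c82370025d1"))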
{{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1500.226271] env[67899]: DEBUG oslo_concurrency.lockutils [None req-457d9d77-97fa-4b41-8f02-5a00e92d0ccc tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1500.226579] env[67899]: DEBUG oslo_concurrency.lockutils [None req-457d9d77-97fa-4b41-8f02-5a00e92d0ccc tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1500.228088] env[67899]: INFO nova.compute.claims [None req-457d9d77-97fa-4b41-8f02-5a00e92d0ccc tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: e179db1d-ee0c-4f47-a958-40dd69209d26] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1500.260204] env[67899]: DEBUG nova.network.neutron [None req-d13baaa7-778b-4748-89cd-771e48d59c26 tempest-TenantUsagesTestJSON-377809165 tempest-TenantUsagesTestJSON-377809165-project-member] [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] Updating instance_info_cache with network_info: [{"id": "d989bd9c-71bc-401e-951f-522fbd4539f1", "address": "fa:16:3e:26:04:cb", "network": {"id": "87a2c8e7-d332-4028-bf39-1da2df6ff034", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.112", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "c1aaa2970e964d7b86557399120d12c1", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "2180b40f-2bb0-47da-ba80-c2fbe7f98af0", "external-id": "nsx-vlan-transportzone-970", "segmentation_id": 970, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd989bd9c-71", "ovs_interfaceid": "d989bd9c-71bc-401e-951f-522fbd4539f1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1500.269269] env[67899]: DEBUG oslo_concurrency.lockutils [None req-d13baaa7-778b-4748-89cd-771e48d59c26 tempest-TenantUsagesTestJSON-377809165 tempest-TenantUsagesTestJSON-377809165-project-member] Releasing lock "refresh_cache-b9282eeb-09db-4138-a1f0-9e03828021b8" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1500.269647] env[67899]: DEBUG nova.compute.manager [None req-d13baaa7-778b-4748-89cd-771e48d59c26 tempest-TenantUsagesTestJSON-377809165 tempest-TenantUsagesTestJSON-377809165-project-member] [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] Start destroying the instance on the hypervisor. 
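Mixed into that destroy sequence, req-457d9d77 claims resources for e179db1d under the host-wide "compute_resources" lock; serializing claims is what stops two concurrent builds from both counting the same free capacity. A stripped-down sketch of the claim-under-one-lock pattern (a toy class, not Nova's ResourceTracker):

    import threading

    class ToyResourceTracker:
        def __init__(self, vcpus_total):
            self._lock = threading.Lock()  # stands in for "compute_resources"
            self.vcpus_free = vcpus_total

        def instance_claim(self, vcpus_wanted):
            with self._lock:  # one claim at a time per host
                if self.vcpus_free < vcpus_wanted:
                    raise RuntimeError("insufficient vCPUs")
                self.vcpus_free -= vcpus_wanted

    rt = ToyResourceTracker(vcpus_total=48)  # matches the VCPU inventory logged below
    rt.instance_claim(1)                     # "Claim successful on node ..."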
{{(pid=67899) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1500.269834] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-d13baaa7-778b-4748-89cd-771e48d59c26 tempest-TenantUsagesTestJSON-377809165 tempest-TenantUsagesTestJSON-377809165-project-member] [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] Destroying instance {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1500.270376] env[67899]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-8785f891-2b4e-4748-ae30-be716f4c22ed {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1500.282538] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-23f1ed9c-53a8-4baf-8de1-b5d7a79353e2 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1500.315200] env[67899]: WARNING nova.virt.vmwareapi.vmops [None req-d13baaa7-778b-4748-89cd-771e48d59c26 tempest-TenantUsagesTestJSON-377809165 tempest-TenantUsagesTestJSON-377809165-project-member] [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance b9282eeb-09db-4138-a1f0-9e03828021b8 could not be found. [ 1500.315423] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-d13baaa7-778b-4748-89cd-771e48d59c26 tempest-TenantUsagesTestJSON-377809165 tempest-TenantUsagesTestJSON-377809165-project-member] [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] Instance destroyed {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1500.315596] env[67899]: INFO nova.compute.manager [None req-d13baaa7-778b-4748-89cd-771e48d59c26 tempest-TenantUsagesTestJSON-377809165 tempest-TenantUsagesTestJSON-377809165-project-member] [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] Took 0.05 seconds to destroy the instance on the hypervisor. [ 1500.315836] env[67899]: DEBUG oslo.service.loopingcall [None req-d13baaa7-778b-4748-89cd-771e48d59c26 tempest-TenantUsagesTestJSON-377809165 tempest-TenantUsagesTestJSON-377809165-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
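_deallocate_network_with_retries wraps the Neutron port cleanup in oslo.service's RetryDecorator, which is why the failure below surfaces through loopingcall and eventlet frames rather than directly. A minimal sketch of that decorator, assuming only its documented signature; the exception class here is illustrative:

    from oslo_service import loopingcall

    class TransientNeutronError(Exception):
        """Stands in for the error types the caller treats as retryable."""

    calls = {"n": 0}

    @loopingcall.RetryDecorator(max_retry_count=3, inc_sleep_time=1,
                                max_sleep_time=3,
                                exceptions=(TransientNeutronError,))
    def deallocate():
        calls["n"] += 1
        if calls["n"] < 2:
            raise TransientNeutronError()  # retried with increasing sleeps
        return "deallocated"

    print(deallocate())  # succeeds on the second attempt

An exception type outside exceptions= is not retried, which is exactly what happens below: the 401 maps to NeutronAdminCredentialConfigurationInvalid and the looping call fails on its first pass.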
{{(pid=67899) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1500.316077] env[67899]: DEBUG nova.compute.manager [-] [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] Deallocating network for instance {{(pid=67899) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1500.316175] env[67899]: DEBUG nova.network.neutron [-] [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] deallocate_for_instance() {{(pid=67899) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1500.409989] env[67899]: DEBUG neutronclient.v2_0.client [-] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=67899) _handle_fault_response /opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py:262}} [ 1500.410251] env[67899]: ERROR nova.network.neutron [-] Neutron client was not able to generate a valid admin token, please verify Neutron admin credential located in nova.conf: neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1500.410842] env[67899]: ERROR oslo.service.loopingcall [-] Dynamic interval looping call 'oslo_service.loopingcall.RetryDecorator.__call__.._func' failed: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1500.410842] env[67899]: ERROR oslo.service.loopingcall Traceback (most recent call last): [ 1500.410842] env[67899]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1500.410842] env[67899]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1500.410842] env[67899]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1500.410842] env[67899]: ERROR oslo.service.loopingcall exception_handler_v20(status_code, error_body) [ 1500.410842] env[67899]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1500.410842] env[67899]: ERROR oslo.service.loopingcall raise client_exc(message=error_message, [ 1500.410842] env[67899]: ERROR oslo.service.loopingcall neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1500.410842] env[67899]: ERROR oslo.service.loopingcall Neutron server returns request_ids: ['req-8ecc7fd0-f8c7-47e7-93a6-1c187f910739'] [ 1500.410842] env[67899]: ERROR oslo.service.loopingcall [ 1500.410842] env[67899]: ERROR oslo.service.loopingcall During handling of the above exception, another exception occurred: [ 1500.410842] env[67899]: ERROR oslo.service.loopingcall [ 1500.410842] env[67899]: ERROR oslo.service.loopingcall Traceback (most recent call last): [ 1500.410842] env[67899]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 1500.410842] env[67899]: ERROR oslo.service.loopingcall result = func(*self.args, **self.kw) [ 1500.410842] env[67899]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 1500.410842] env[67899]: ERROR oslo.service.loopingcall result = f(*args, 
**kwargs) [ 1500.410842] env[67899]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/compute/manager.py", line 3045, in _deallocate_network_with_retries [ 1500.410842] env[67899]: ERROR oslo.service.loopingcall self._deallocate_network( [ 1500.410842] env[67899]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/compute/manager.py", line 2265, in _deallocate_network [ 1500.410842] env[67899]: ERROR oslo.service.loopingcall self.network_api.deallocate_for_instance( [ 1500.410842] env[67899]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1500.410842] env[67899]: ERROR oslo.service.loopingcall data = neutron.list_ports(**search_opts) [ 1500.410842] env[67899]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1500.410842] env[67899]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1500.410842] env[67899]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1500.410842] env[67899]: ERROR oslo.service.loopingcall return self.list('ports', self.ports_path, retrieve_all, [ 1500.410842] env[67899]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1500.410842] env[67899]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1500.410842] env[67899]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1500.410842] env[67899]: ERROR oslo.service.loopingcall for r in self._pagination(collection, path, **params): [ 1500.410842] env[67899]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1500.410842] env[67899]: ERROR oslo.service.loopingcall res = self.get(path, params=params) [ 1500.410842] env[67899]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1500.410842] env[67899]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1500.410842] env[67899]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1500.410842] env[67899]: ERROR oslo.service.loopingcall return self.retry_request("GET", action, body=body, [ 1500.410842] env[67899]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1500.410842] env[67899]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1500.410842] env[67899]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1500.410842] env[67899]: ERROR oslo.service.loopingcall return self.do_request(method, action, body=body, [ 1500.410842] env[67899]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1500.410842] env[67899]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1500.410842] env[67899]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1500.410842] env[67899]: ERROR oslo.service.loopingcall self._handle_fault_response(status_code, replybody, resp) [ 1500.410842] env[67899]: ERROR oslo.service.loopingcall File 
"/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 1500.410842] env[67899]: ERROR oslo.service.loopingcall raise exception.NeutronAdminCredentialConfigurationInvalid() [ 1500.410842] env[67899]: ERROR oslo.service.loopingcall nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1500.410842] env[67899]: ERROR oslo.service.loopingcall [ 1500.412237] env[67899]: ERROR nova.compute.manager [None req-d13baaa7-778b-4748-89cd-771e48d59c26 tempest-TenantUsagesTestJSON-377809165 tempest-TenantUsagesTestJSON-377809165-project-member] [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] Failed to deallocate network for instance. Error: Networking client is experiencing an unauthorized exception.: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1500.442956] env[67899]: ERROR nova.compute.manager [None req-d13baaa7-778b-4748-89cd-771e48d59c26 tempest-TenantUsagesTestJSON-377809165 tempest-TenantUsagesTestJSON-377809165-project-member] [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] Setting instance vm_state to ERROR: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1500.442956] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] Traceback (most recent call last): [ 1500.442956] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1500.442956] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] ret = obj(*args, **kwargs) [ 1500.442956] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1500.442956] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] exception_handler_v20(status_code, error_body) [ 1500.442956] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1500.442956] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] raise client_exc(message=error_message, [ 1500.442956] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1500.442956] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] Neutron server returns request_ids: ['req-8ecc7fd0-f8c7-47e7-93a6-1c187f910739'] [ 1500.442956] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] [ 1500.442956] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] During handling of the above exception, another exception occurred: [ 1500.442956] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] [ 1500.442956] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] Traceback (most recent call last): [ 1500.442956] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] File 
"/opt/stack/nova/nova/compute/manager.py", line 3315, in do_terminate_instance [ 1500.442956] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] self._delete_instance(context, instance, bdms) [ 1500.442956] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] File "/opt/stack/nova/nova/compute/manager.py", line 3250, in _delete_instance [ 1500.442956] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] self._shutdown_instance(context, instance, bdms) [ 1500.442956] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] File "/opt/stack/nova/nova/compute/manager.py", line 3144, in _shutdown_instance [ 1500.442956] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] self._try_deallocate_network(context, instance, requested_networks) [ 1500.442956] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] File "/opt/stack/nova/nova/compute/manager.py", line 3058, in _try_deallocate_network [ 1500.442956] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] with excutils.save_and_reraise_exception(): [ 1500.442956] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1500.442956] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] self.force_reraise() [ 1500.442956] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1500.442956] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] raise self.value [ 1500.442956] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] File "/opt/stack/nova/nova/compute/manager.py", line 3056, in _try_deallocate_network [ 1500.442956] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] _deallocate_network_with_retries() [ 1500.442956] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 436, in func [ 1500.442956] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] return evt.wait() [ 1500.442956] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1500.442956] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] result = hub.switch() [ 1500.442956] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1500.442956] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] return self.greenlet.switch() [ 1500.442956] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 1500.442956] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] result = 
func(*self.args, **self.kw) [ 1500.442956] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 1500.444069] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] result = f(*args, **kwargs) [ 1500.444069] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] File "/opt/stack/nova/nova/compute/manager.py", line 3045, in _deallocate_network_with_retries [ 1500.444069] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] self._deallocate_network( [ 1500.444069] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] File "/opt/stack/nova/nova/compute/manager.py", line 2265, in _deallocate_network [ 1500.444069] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] self.network_api.deallocate_for_instance( [ 1500.444069] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1500.444069] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] data = neutron.list_ports(**search_opts) [ 1500.444069] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1500.444069] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] ret = obj(*args, **kwargs) [ 1500.444069] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1500.444069] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] return self.list('ports', self.ports_path, retrieve_all, [ 1500.444069] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1500.444069] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] ret = obj(*args, **kwargs) [ 1500.444069] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1500.444069] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] for r in self._pagination(collection, path, **params): [ 1500.444069] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1500.444069] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] res = self.get(path, params=params) [ 1500.444069] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1500.444069] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] ret = obj(*args, **kwargs) [ 1500.444069] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 
356, in get [ 1500.444069] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] return self.retry_request("GET", action, body=body, [ 1500.444069] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1500.444069] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] ret = obj(*args, **kwargs) [ 1500.444069] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1500.444069] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] return self.do_request(method, action, body=body, [ 1500.444069] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1500.444069] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] ret = obj(*args, **kwargs) [ 1500.444069] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1500.444069] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] self._handle_fault_response(status_code, replybody, resp) [ 1500.444069] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 1500.444069] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] raise exception.NeutronAdminCredentialConfigurationInvalid() [ 1500.444069] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
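The exception text points at the fix: the service credentials Nova uses to call Neutron live in the [neutron] section of nova.conf. A hedged example of that section using the standard keystoneauth option names; every value below is a placeholder, not this deployment's configuration:

    [neutron]
    auth_type = password
    auth_url = http://controller:5000/v3   # Keystone endpoint (placeholder)
    username = nova                        # service user (placeholder)
    password = SERVICE_PASSWORD            # placeholder
    project_name = service
    user_domain_name = Default
    project_domain_name = Default

A 401 like the one above means the request reached Neutron (it returned a request_id) but Keystone rejected the token, so a stale password, wrong user, or wrong auth_url in this block is the usual culprit.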
[ 1500.444069] env[67899]: ERROR nova.compute.manager [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] [ 1500.460587] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d35d7524-2116-4c78-8579-fea009a13640 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1500.467825] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bfa3525d-6c39-413c-97ac-c626c140cb93 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1500.475237] env[67899]: DEBUG oslo_concurrency.lockutils [None req-d13baaa7-778b-4748-89cd-771e48d59c26 tempest-TenantUsagesTestJSON-377809165 tempest-TenantUsagesTestJSON-377809165-project-member] Lock "b9282eeb-09db-4138-a1f0-9e03828021b8" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.485s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1500.475237] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "b9282eeb-09db-4138-a1f0-9e03828021b8" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 325.454s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1500.475237] env[67899]: INFO nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] During sync_power_state the instance has a pending task (deleting). Skip. [ 1500.475430] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "b9282eeb-09db-4138-a1f0-9e03828021b8" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1500.503421] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-78a4cee4-34c8-4c80-81d9-5e095b3e3477 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1500.511774] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fb33ea70-f3d0-4eee-a9ef-65600f2ca040 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1500.529446] env[67899]: DEBUG nova.compute.provider_tree [None req-457d9d77-97fa-4b41-8f02-5a00e92d0ccc tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Inventory has not changed in ProviderTree for provider: fffa0b42-f65d-4394-a98c-0df038b9ed4b {{(pid=67899) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1500.538154] env[67899]: DEBUG nova.scheduler.client.report [None req-457d9d77-97fa-4b41-8f02-5a00e92d0ccc tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Inventory has not changed for provider fffa0b42-f65d-4394-a98c-0df038b9ed4b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 
65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67899) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1500.557443] env[67899]: INFO nova.compute.manager [None req-d13baaa7-778b-4748-89cd-771e48d59c26 tempest-TenantUsagesTestJSON-377809165 tempest-TenantUsagesTestJSON-377809165-project-member] [instance: b9282eeb-09db-4138-a1f0-9e03828021b8] Successfully reverted task state from None on failure for instance. [ 1500.560786] env[67899]: DEBUG oslo_concurrency.lockutils [None req-457d9d77-97fa-4b41-8f02-5a00e92d0ccc tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.334s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1500.561309] env[67899]: DEBUG nova.compute.manager [None req-457d9d77-97fa-4b41-8f02-5a00e92d0ccc tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: e179db1d-ee0c-4f47-a958-40dd69209d26] Start building networks asynchronously for instance. {{(pid=67899) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1500.564305] env[67899]: ERROR oslo_messaging.rpc.server [None req-d13baaa7-778b-4748-89cd-771e48d59c26 tempest-TenantUsagesTestJSON-377809165 tempest-TenantUsagesTestJSON-377809165-project-member] Exception during message handling: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1500.564305] env[67899]: ERROR oslo_messaging.rpc.server Traceback (most recent call last): [ 1500.564305] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1500.564305] env[67899]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1500.564305] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1500.564305] env[67899]: ERROR oslo_messaging.rpc.server exception_handler_v20(status_code, error_body) [ 1500.564305] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1500.564305] env[67899]: ERROR oslo_messaging.rpc.server raise client_exc(message=error_message, [ 1500.564305] env[67899]: ERROR oslo_messaging.rpc.server neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1500.564305] env[67899]: ERROR oslo_messaging.rpc.server Neutron server returns request_ids: ['req-8ecc7fd0-f8c7-47e7-93a6-1c187f910739'] [ 1500.564305] env[67899]: ERROR oslo_messaging.rpc.server [ 1500.564305] env[67899]: ERROR oslo_messaging.rpc.server During handling of the above exception, another exception occurred: [ 1500.564305] env[67899]: ERROR oslo_messaging.rpc.server [ 1500.564305] env[67899]: ERROR oslo_messaging.rpc.server Traceback (most recent call last): [ 1500.564305] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/server.py", line 165, in _process_incoming [ 1500.564305] env[67899]: ERROR oslo_messaging.rpc.server res = 
self.dispatcher.dispatch(message) [ 1500.564305] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/dispatcher.py", line 309, in dispatch [ 1500.564305] env[67899]: ERROR oslo_messaging.rpc.server return self._do_dispatch(endpoint, method, ctxt, args) [ 1500.564305] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/dispatcher.py", line 229, in _do_dispatch [ 1500.564305] env[67899]: ERROR oslo_messaging.rpc.server result = func(ctxt, **new_args) [ 1500.564305] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/exception_wrapper.py", line 65, in wrapped [ 1500.564305] env[67899]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1500.564305] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1500.564305] env[67899]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1500.564305] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1500.564305] env[67899]: ERROR oslo_messaging.rpc.server raise self.value [ 1500.564305] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/exception_wrapper.py", line 63, in wrapped [ 1500.564305] env[67899]: ERROR oslo_messaging.rpc.server return f(self, context, *args, **kw) [ 1500.564305] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 166, in decorated_function [ 1500.564305] env[67899]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1500.564305] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1500.564305] env[67899]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1500.564305] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1500.564305] env[67899]: ERROR oslo_messaging.rpc.server raise self.value [ 1500.564305] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 157, in decorated_function [ 1500.564305] env[67899]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) [ 1500.564305] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/utils.py", line 1439, in decorated_function [ 1500.564305] env[67899]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) [ 1500.564305] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 213, in decorated_function [ 1500.564305] env[67899]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1500.564305] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1500.564305] env[67899]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1500.564305] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1500.564305] env[67899]: ERROR oslo_messaging.rpc.server raise self.value [ 1500.564305] env[67899]: ERROR oslo_messaging.rpc.server File 
"/opt/stack/nova/nova/compute/manager.py", line 203, in decorated_function [ 1500.564305] env[67899]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) [ 1500.564305] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3327, in terminate_instance [ 1500.564305] env[67899]: ERROR oslo_messaging.rpc.server do_terminate_instance(instance, bdms) [ 1500.564305] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 1500.564305] env[67899]: ERROR oslo_messaging.rpc.server return f(*args, **kwargs) [ 1500.564305] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3322, in do_terminate_instance [ 1500.564305] env[67899]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1500.566776] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1500.566776] env[67899]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1500.566776] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1500.566776] env[67899]: ERROR oslo_messaging.rpc.server raise self.value [ 1500.566776] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3315, in do_terminate_instance [ 1500.566776] env[67899]: ERROR oslo_messaging.rpc.server self._delete_instance(context, instance, bdms) [ 1500.566776] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3250, in _delete_instance [ 1500.566776] env[67899]: ERROR oslo_messaging.rpc.server self._shutdown_instance(context, instance, bdms) [ 1500.566776] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3144, in _shutdown_instance [ 1500.566776] env[67899]: ERROR oslo_messaging.rpc.server self._try_deallocate_network(context, instance, requested_networks) [ 1500.566776] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3058, in _try_deallocate_network [ 1500.566776] env[67899]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1500.566776] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1500.566776] env[67899]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1500.566776] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1500.566776] env[67899]: ERROR oslo_messaging.rpc.server raise self.value [ 1500.566776] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3056, in _try_deallocate_network [ 1500.566776] env[67899]: ERROR oslo_messaging.rpc.server _deallocate_network_with_retries() [ 1500.566776] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 436, in func [ 1500.566776] env[67899]: ERROR oslo_messaging.rpc.server return evt.wait() [ 1500.566776] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1500.566776] env[67899]: 
ERROR oslo_messaging.rpc.server result = hub.switch() [ 1500.566776] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1500.566776] env[67899]: ERROR oslo_messaging.rpc.server return self.greenlet.switch() [ 1500.566776] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 1500.566776] env[67899]: ERROR oslo_messaging.rpc.server result = func(*self.args, **self.kw) [ 1500.566776] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 1500.566776] env[67899]: ERROR oslo_messaging.rpc.server result = f(*args, **kwargs) [ 1500.566776] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3045, in _deallocate_network_with_retries [ 1500.566776] env[67899]: ERROR oslo_messaging.rpc.server self._deallocate_network( [ 1500.566776] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 2265, in _deallocate_network [ 1500.566776] env[67899]: ERROR oslo_messaging.rpc.server self.network_api.deallocate_for_instance( [ 1500.566776] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1500.566776] env[67899]: ERROR oslo_messaging.rpc.server data = neutron.list_ports(**search_opts) [ 1500.566776] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1500.566776] env[67899]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1500.566776] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1500.566776] env[67899]: ERROR oslo_messaging.rpc.server return self.list('ports', self.ports_path, retrieve_all, [ 1500.566776] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1500.566776] env[67899]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1500.566776] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1500.566776] env[67899]: ERROR oslo_messaging.rpc.server for r in self._pagination(collection, path, **params): [ 1500.566776] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1500.566776] env[67899]: ERROR oslo_messaging.rpc.server res = self.get(path, params=params) [ 1500.566776] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1500.566776] env[67899]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1500.566776] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1500.566776] env[67899]: ERROR oslo_messaging.rpc.server return self.retry_request("GET", action, body=body, [ 1500.566776] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1500.566776] env[67899]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1500.566776] env[67899]: ERROR 
oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1500.566776] env[67899]: ERROR oslo_messaging.rpc.server return self.do_request(method, action, body=body, [ 1500.566776] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1500.566776] env[67899]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1500.569473] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1500.569473] env[67899]: ERROR oslo_messaging.rpc.server self._handle_fault_response(status_code, replybody, resp) [ 1500.569473] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 1500.569473] env[67899]: ERROR oslo_messaging.rpc.server raise exception.NeutronAdminCredentialConfigurationInvalid() [ 1500.569473] env[67899]: ERROR oslo_messaging.rpc.server nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1500.569473] env[67899]: ERROR oslo_messaging.rpc.server [ 1500.596449] env[67899]: DEBUG nova.compute.utils [None req-457d9d77-97fa-4b41-8f02-5a00e92d0ccc tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Using /dev/sd instead of None {{(pid=67899) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1500.597870] env[67899]: DEBUG nova.compute.manager [None req-457d9d77-97fa-4b41-8f02-5a00e92d0ccc tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: e179db1d-ee0c-4f47-a958-40dd69209d26] Allocating IP information in the background. {{(pid=67899) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1500.598053] env[67899]: DEBUG nova.network.neutron [None req-457d9d77-97fa-4b41-8f02-5a00e92d0ccc tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: e179db1d-ee0c-4f47-a958-40dd69209d26] allocate_for_instance() {{(pid=67899) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1500.608649] env[67899]: DEBUG nova.compute.manager [None req-457d9d77-97fa-4b41-8f02-5a00e92d0ccc tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: e179db1d-ee0c-4f47-a958-40dd69209d26] Start building block device mappings for instance. {{(pid=67899) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1500.674014] env[67899]: DEBUG nova.compute.manager [None req-457d9d77-97fa-4b41-8f02-5a00e92d0ccc tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: e179db1d-ee0c-4f47-a958-40dd69209d26] Start spawning the instance on the hypervisor. 
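Note the ordering for e179db1d above: IP allocation is started "in the background" and block-device work proceeds immediately, with the build only blocking once the driver needs the network info to spawn. Nova does this with eventlet green threads; a thread-pool stand-in sketches the shape of it (function names illustrative):

    from concurrent.futures import ThreadPoolExecutor

    def allocate_for_instance():
        # stands in for the slow Neutron port-allocation round trips
        return [{"port_id": "0014e82b-759d-4aae-9c36-dfb3e2e4a268"}]

    def build_block_device_mappings():
        return ["/dev/sda"]  # proceeds while ports are being created

    with ThreadPoolExecutor(max_workers=1) as pool:
        nw_future = pool.submit(allocate_for_instance)  # background allocation
        bdms = build_block_device_mappings()            # overlaps with it
        network_info = nw_future.result()               # block only at spawn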
{{(pid=67899) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1500.685677] env[67899]: DEBUG nova.policy [None req-457d9d77-97fa-4b41-8f02-5a00e92d0ccc tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a5206226ca404a07b10db199a6436504', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'bdf895619b34412fb20488318e170d23', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67899) authorize /opt/stack/nova/nova/policy.py:203}} [ 1500.702447] env[67899]: DEBUG nova.virt.hardware [None req-457d9d77-97fa-4b41-8f02-5a00e92d0ccc tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T22:07:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T22:07:13Z,direct_url=,disk_format='vmdk',id=c655a05a-4a40-4b3f-b609-3ba8116ad90f,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c1aaa2970e964d7b86557399120d12c1',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T22:07:14Z,virtual_size=,visibility=), allow threads: False {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1500.702447] env[67899]: DEBUG nova.virt.hardware [None req-457d9d77-97fa-4b41-8f02-5a00e92d0ccc tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Flavor limits 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1500.702447] env[67899]: DEBUG nova.virt.hardware [None req-457d9d77-97fa-4b41-8f02-5a00e92d0ccc tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Image limits 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1500.702447] env[67899]: DEBUG nova.virt.hardware [None req-457d9d77-97fa-4b41-8f02-5a00e92d0ccc tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Flavor pref 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1500.702447] env[67899]: DEBUG nova.virt.hardware [None req-457d9d77-97fa-4b41-8f02-5a00e92d0ccc tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Image pref 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1500.702447] env[67899]: DEBUG nova.virt.hardware [None req-457d9d77-97fa-4b41-8f02-5a00e92d0ccc tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1500.702696] env[67899]: DEBUG nova.virt.hardware [None 
req-457d9d77-97fa-4b41-8f02-5a00e92d0ccc tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1500.702696] env[67899]: DEBUG nova.virt.hardware [None req-457d9d77-97fa-4b41-8f02-5a00e92d0ccc tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67899) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1500.703399] env[67899]: DEBUG nova.virt.hardware [None req-457d9d77-97fa-4b41-8f02-5a00e92d0ccc tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Got 1 possible topologies {{(pid=67899) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1500.703399] env[67899]: DEBUG nova.virt.hardware [None req-457d9d77-97fa-4b41-8f02-5a00e92d0ccc tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1500.703399] env[67899]: DEBUG nova.virt.hardware [None req-457d9d77-97fa-4b41-8f02-5a00e92d0ccc tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1500.707019] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8087e387-1873-4252-95c7-284af9ec3680 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1500.713180] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-adf3eb45-daf6-4514-aa1c-77df6242a4d5 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1501.091338] env[67899]: DEBUG nova.network.neutron [None req-457d9d77-97fa-4b41-8f02-5a00e92d0ccc tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: e179db1d-ee0c-4f47-a958-40dd69209d26] Successfully created port: 0014e82b-759d-4aae-9c36-dfb3e2e4a268 {{(pid=67899) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1501.649378] env[67899]: DEBUG nova.compute.manager [req-daa3ddd8-af99-41f4-a9af-59082e1b40f0 req-828280ab-8783-4dc2-a706-c147d3098920 service nova] [instance: e179db1d-ee0c-4f47-a958-40dd69209d26] Received event network-vif-plugged-0014e82b-759d-4aae-9c36-dfb3e2e4a268 {{(pid=67899) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1501.649680] env[67899]: DEBUG oslo_concurrency.lockutils [req-daa3ddd8-af99-41f4-a9af-59082e1b40f0 req-828280ab-8783-4dc2-a706-c147d3098920 service nova] Acquiring lock "e179db1d-ee0c-4f47-a958-40dd69209d26-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1501.649813] env[67899]: DEBUG oslo_concurrency.lockutils [req-daa3ddd8-af99-41f4-a9af-59082e1b40f0 req-828280ab-8783-4dc2-a706-c147d3098920 service nova] Lock 
"e179db1d-ee0c-4f47-a958-40dd69209d26-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1501.649983] env[67899]: DEBUG oslo_concurrency.lockutils [req-daa3ddd8-af99-41f4-a9af-59082e1b40f0 req-828280ab-8783-4dc2-a706-c147d3098920 service nova] Lock "e179db1d-ee0c-4f47-a958-40dd69209d26-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1501.650343] env[67899]: DEBUG nova.compute.manager [req-daa3ddd8-af99-41f4-a9af-59082e1b40f0 req-828280ab-8783-4dc2-a706-c147d3098920 service nova] [instance: e179db1d-ee0c-4f47-a958-40dd69209d26] No waiting events found dispatching network-vif-plugged-0014e82b-759d-4aae-9c36-dfb3e2e4a268 {{(pid=67899) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1501.650544] env[67899]: WARNING nova.compute.manager [req-daa3ddd8-af99-41f4-a9af-59082e1b40f0 req-828280ab-8783-4dc2-a706-c147d3098920 service nova] [instance: e179db1d-ee0c-4f47-a958-40dd69209d26] Received unexpected event network-vif-plugged-0014e82b-759d-4aae-9c36-dfb3e2e4a268 for instance with vm_state building and task_state spawning. [ 1501.730019] env[67899]: DEBUG nova.network.neutron [None req-457d9d77-97fa-4b41-8f02-5a00e92d0ccc tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: e179db1d-ee0c-4f47-a958-40dd69209d26] Successfully updated port: 0014e82b-759d-4aae-9c36-dfb3e2e4a268 {{(pid=67899) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1501.749423] env[67899]: DEBUG oslo_concurrency.lockutils [None req-457d9d77-97fa-4b41-8f02-5a00e92d0ccc tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Acquiring lock "refresh_cache-e179db1d-ee0c-4f47-a958-40dd69209d26" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1501.749507] env[67899]: DEBUG oslo_concurrency.lockutils [None req-457d9d77-97fa-4b41-8f02-5a00e92d0ccc tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Acquired lock "refresh_cache-e179db1d-ee0c-4f47-a958-40dd69209d26" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1501.749660] env[67899]: DEBUG nova.network.neutron [None req-457d9d77-97fa-4b41-8f02-5a00e92d0ccc tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: e179db1d-ee0c-4f47-a958-40dd69209d26] Building network info cache for instance {{(pid=67899) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1501.808714] env[67899]: DEBUG nova.network.neutron [None req-457d9d77-97fa-4b41-8f02-5a00e92d0ccc tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: e179db1d-ee0c-4f47-a958-40dd69209d26] Instance cache missing network info. 
{{(pid=67899) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1502.191124] env[67899]: DEBUG nova.network.neutron [None req-457d9d77-97fa-4b41-8f02-5a00e92d0ccc tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: e179db1d-ee0c-4f47-a958-40dd69209d26] Updating instance_info_cache with network_info: [{"id": "0014e82b-759d-4aae-9c36-dfb3e2e4a268", "address": "fa:16:3e:c3:d2:3b", "network": {"id": "8255316b-777c-4721-b67b-7f6fdf4bd351", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-2146531806-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "bdf895619b34412fb20488318e170d23", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "357d2811-e990-4985-9f9e-b158d10d3699", "external-id": "nsx-vlan-transportzone-641", "segmentation_id": 641, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap0014e82b-75", "ovs_interfaceid": "0014e82b-759d-4aae-9c36-dfb3e2e4a268", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1502.202360] env[67899]: DEBUG oslo_concurrency.lockutils [None req-457d9d77-97fa-4b41-8f02-5a00e92d0ccc tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Releasing lock "refresh_cache-e179db1d-ee0c-4f47-a958-40dd69209d26" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1502.202702] env[67899]: DEBUG nova.compute.manager [None req-457d9d77-97fa-4b41-8f02-5a00e92d0ccc tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: e179db1d-ee0c-4f47-a958-40dd69209d26] Instance network_info: |[{"id": "0014e82b-759d-4aae-9c36-dfb3e2e4a268", "address": "fa:16:3e:c3:d2:3b", "network": {"id": "8255316b-777c-4721-b67b-7f6fdf4bd351", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-2146531806-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "bdf895619b34412fb20488318e170d23", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "357d2811-e990-4985-9f9e-b158d10d3699", "external-id": "nsx-vlan-transportzone-641", "segmentation_id": 641, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap0014e82b-75", "ovs_interfaceid": "0014e82b-759d-4aae-9c36-dfb3e2e4a268", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67899) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1971}} [ 1502.203136] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-457d9d77-97fa-4b41-8f02-5a00e92d0ccc tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: e179db1d-ee0c-4f47-a958-40dd69209d26] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:c3:d2:3b', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '357d2811-e990-4985-9f9e-b158d10d3699', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '0014e82b-759d-4aae-9c36-dfb3e2e4a268', 'vif_model': 'vmxnet3'}] {{(pid=67899) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1502.210530] env[67899]: DEBUG oslo.service.loopingcall [None req-457d9d77-97fa-4b41-8f02-5a00e92d0ccc tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67899) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1502.211036] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: e179db1d-ee0c-4f47-a958-40dd69209d26] Creating VM on the ESX host {{(pid=67899) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1502.211315] env[67899]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-9a398576-0892-44d4-8375-cbf013e3ef20 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1502.231918] env[67899]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1502.231918] env[67899]: value = "task-3467972" [ 1502.231918] env[67899]: _type = "Task" [ 1502.231918] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1502.240986] env[67899]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467972, 'name': CreateVM_Task} progress is 0%. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1502.741606] env[67899]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467972, 'name': CreateVM_Task} progress is 99%. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1503.242778] env[67899]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467972, 'name': CreateVM_Task, 'duration_secs': 0.542477} completed successfully. 
{{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1503.242967] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: e179db1d-ee0c-4f47-a958-40dd69209d26] Created VM on the ESX host {{(pid=67899) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1503.243658] env[67899]: DEBUG oslo_concurrency.lockutils [None req-457d9d77-97fa-4b41-8f02-5a00e92d0ccc tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1503.243842] env[67899]: DEBUG oslo_concurrency.lockutils [None req-457d9d77-97fa-4b41-8f02-5a00e92d0ccc tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1503.244153] env[67899]: DEBUG oslo_concurrency.lockutils [None req-457d9d77-97fa-4b41-8f02-5a00e92d0ccc tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1503.244407] env[67899]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-166b4ee4-3db6-45bf-ae15-2c78aea61de8 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1503.248805] env[67899]: DEBUG oslo_vmware.api [None req-457d9d77-97fa-4b41-8f02-5a00e92d0ccc tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Waiting for the task: (returnval){ [ 1503.248805] env[67899]: value = "session[5298859c-589e-10f3-4f70-bf47a7ea371b]528e2e58-7465-b3b6-cecc-6581c9042d6e" [ 1503.248805] env[67899]: _type = "Task" [ 1503.248805] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1503.257728] env[67899]: DEBUG oslo_vmware.api [None req-457d9d77-97fa-4b41-8f02-5a00e92d0ccc tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Task: {'id': session[5298859c-589e-10f3-4f70-bf47a7ea371b]528e2e58-7465-b3b6-cecc-6581c9042d6e, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1503.674917] env[67899]: DEBUG nova.compute.manager [req-e245b3bd-1c16-49ae-9a18-2eef726102dd req-f004b548-99bc-4eac-8580-551ac4155ed9 service nova] [instance: e179db1d-ee0c-4f47-a958-40dd69209d26] Received event network-changed-0014e82b-759d-4aae-9c36-dfb3e2e4a268 {{(pid=67899) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1503.675132] env[67899]: DEBUG nova.compute.manager [req-e245b3bd-1c16-49ae-9a18-2eef726102dd req-f004b548-99bc-4eac-8580-551ac4155ed9 service nova] [instance: e179db1d-ee0c-4f47-a958-40dd69209d26] Refreshing instance network info cache due to event network-changed-0014e82b-759d-4aae-9c36-dfb3e2e4a268. 
{{(pid=67899) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1503.675350] env[67899]: DEBUG oslo_concurrency.lockutils [req-e245b3bd-1c16-49ae-9a18-2eef726102dd req-f004b548-99bc-4eac-8580-551ac4155ed9 service nova] Acquiring lock "refresh_cache-e179db1d-ee0c-4f47-a958-40dd69209d26" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1503.675494] env[67899]: DEBUG oslo_concurrency.lockutils [req-e245b3bd-1c16-49ae-9a18-2eef726102dd req-f004b548-99bc-4eac-8580-551ac4155ed9 service nova] Acquired lock "refresh_cache-e179db1d-ee0c-4f47-a958-40dd69209d26" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1503.675656] env[67899]: DEBUG nova.network.neutron [req-e245b3bd-1c16-49ae-9a18-2eef726102dd req-f004b548-99bc-4eac-8580-551ac4155ed9 service nova] [instance: e179db1d-ee0c-4f47-a958-40dd69209d26] Refreshing network info cache for port 0014e82b-759d-4aae-9c36-dfb3e2e4a268 {{(pid=67899) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1503.761212] env[67899]: DEBUG oslo_concurrency.lockutils [None req-457d9d77-97fa-4b41-8f02-5a00e92d0ccc tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1503.761564] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-457d9d77-97fa-4b41-8f02-5a00e92d0ccc tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: e179db1d-ee0c-4f47-a958-40dd69209d26] Processing image c655a05a-4a40-4b3f-b609-3ba8116ad90f {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1503.761802] env[67899]: DEBUG oslo_concurrency.lockutils [None req-457d9d77-97fa-4b41-8f02-5a00e92d0ccc tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1503.961524] env[67899]: DEBUG nova.network.neutron [req-e245b3bd-1c16-49ae-9a18-2eef726102dd req-f004b548-99bc-4eac-8580-551ac4155ed9 service nova] [instance: e179db1d-ee0c-4f47-a958-40dd69209d26] Updated VIF entry in instance network info cache for port 0014e82b-759d-4aae-9c36-dfb3e2e4a268. 
{{(pid=67899) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1503.962764] env[67899]: DEBUG nova.network.neutron [req-e245b3bd-1c16-49ae-9a18-2eef726102dd req-f004b548-99bc-4eac-8580-551ac4155ed9 service nova] [instance: e179db1d-ee0c-4f47-a958-40dd69209d26] Updating instance_info_cache with network_info: [{"id": "0014e82b-759d-4aae-9c36-dfb3e2e4a268", "address": "fa:16:3e:c3:d2:3b", "network": {"id": "8255316b-777c-4721-b67b-7f6fdf4bd351", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-2146531806-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "bdf895619b34412fb20488318e170d23", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "357d2811-e990-4985-9f9e-b158d10d3699", "external-id": "nsx-vlan-transportzone-641", "segmentation_id": 641, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap0014e82b-75", "ovs_interfaceid": "0014e82b-759d-4aae-9c36-dfb3e2e4a268", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1503.971998] env[67899]: DEBUG oslo_concurrency.lockutils [req-e245b3bd-1c16-49ae-9a18-2eef726102dd req-f004b548-99bc-4eac-8580-551ac4155ed9 service nova] Releasing lock "refresh_cache-e179db1d-ee0c-4f47-a958-40dd69209d26" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1522.612875] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1524.996739] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1524.997094] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Starting heal instance info cache {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1524.997137] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Rebuilding the list of instances to heal {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1525.019453] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 6c4977f7-c53d-4c96-9028-86d7561f0d0d] Skipping network cache update for instance because it is Building. 
{{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1525.019619] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: ec826735-4cc4-4847-8750-c5480e62134a] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1525.019834] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: c7ad553b-2149-4211-aee3-057ea83069f5] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1525.019974] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 6fda2654-4579-4b9a-a97c-97e0128fff14] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1525.020113] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 7a82e877-8a39-4684-8b75-711b7bedddac] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1525.020237] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: dc7bf2b7-631d-4933-92db-1679ad823379] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1525.020358] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 8a157747-34e2-48f7-bf21-d17810122954] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1525.020479] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 03684169-e2c8-4cf5-8e79-b118725927f1] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1525.020597] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1525.020710] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: e179db1d-ee0c-4f47-a958-40dd69209d26] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1525.020828] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Didn't find any instances for network info cache update. 
{{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1525.996516] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1526.991610] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1527.013289] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1527.997293] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1528.996927] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1528.997210] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1528.997394] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1528.997659] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=67899) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1534.996621] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager.update_available_resource {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1535.007842] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1535.008117] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1535.008325] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1535.008535] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67899) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1535.009663] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-25f7ebdc-2f25-47f7-8a3a-0e4334621f52 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1535.018521] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-536d2a43-b745-48f0-8d2c-21380b0a61ea {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1535.032419] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a9e93a4c-f8ad-4311-85e8-30e7bb5b9e87 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1535.038681] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-234d210c-2d90-4ee4-85bc-a001f2be1564 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1535.071414] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180904MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=67899) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1535.071565] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 
1535.071745] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1535.144551] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 6c4977f7-c53d-4c96-9028-86d7561f0d0d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1535.144717] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance ec826735-4cc4-4847-8750-c5480e62134a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1535.144843] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance c7ad553b-2149-4211-aee3-057ea83069f5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1535.144964] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 6fda2654-4579-4b9a-a97c-97e0128fff14 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1535.145095] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 7a82e877-8a39-4684-8b75-711b7bedddac actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1535.145213] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance dc7bf2b7-631d-4933-92db-1679ad823379 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1535.145325] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 8a157747-34e2-48f7-bf21-d17810122954 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1535.145438] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 03684169-e2c8-4cf5-8e79-b118725927f1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1535.145549] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 3a077713-f7a2-4a61-bb17-987af6a52c4a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1535.145661] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance e179db1d-ee0c-4f47-a958-40dd69209d26 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1535.156306] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance addcc88a-6bb5-4a70-938e-49c0c79c8414 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1535.166199] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance a6544af8-879d-4c45-bee4-8551b861fc66 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1535.178303] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance c56980f8-68e2-4501-a6a9-b713b208f895 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1535.192230] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 9b4a7c14-84dc-4222-a758-3f8f10e23b7a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1535.201994] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance cd4ae8d3-63d9-463d-9428-fa2c1e8d1679 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1535.202312] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67899) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1535.202484] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67899) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1535.385373] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5bb21aa1-5b7d-4842-8632-66af33ca64b9 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1535.393049] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c1beed34-e5a7-4948-91be-0cbced75608f {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1535.423234] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3472a26c-1154-4706-8cf6-94ea6f7e1b91 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1535.430307] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-59d10d6e-5c98-4533-bf86-8c138d91aa02 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1535.442830] env[67899]: DEBUG nova.compute.provider_tree [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Inventory has not changed in ProviderTree for provider: fffa0b42-f65d-4394-a98c-0df038b9ed4b {{(pid=67899) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1535.450625] env[67899]: DEBUG nova.scheduler.client.report [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Inventory has not changed for provider fffa0b42-f65d-4394-a98c-0df038b9ed4b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67899) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1535.463418] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67899) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1535.463592] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.392s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1547.180055] env[67899]: WARNING oslo_vmware.rw_handles [None 
req-817f6f80-1cb7-455c-81df-97b9e82bf3f4 tempest-ServerAddressesNegativeTestJSON-374254023 tempest-ServerAddressesNegativeTestJSON-374254023-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1547.180055] env[67899]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1547.180055] env[67899]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1547.180055] env[67899]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1547.180055] env[67899]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1547.180055] env[67899]: ERROR oslo_vmware.rw_handles response.begin() [ 1547.180055] env[67899]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1547.180055] env[67899]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1547.180055] env[67899]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1547.180055] env[67899]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1547.180055] env[67899]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1547.180055] env[67899]: ERROR oslo_vmware.rw_handles [ 1547.180739] env[67899]: DEBUG nova.virt.vmwareapi.images [None req-817f6f80-1cb7-455c-81df-97b9e82bf3f4 tempest-ServerAddressesNegativeTestJSON-374254023 tempest-ServerAddressesNegativeTestJSON-374254023-project-member] [instance: 6c4977f7-c53d-4c96-9028-86d7561f0d0d] Downloaded image file data c655a05a-4a40-4b3f-b609-3ba8116ad90f to vmware_temp/cbc59dd1-3e20-4d1a-8828-f206a6048ae1/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk on the data store datastore1 {{(pid=67899) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1547.182294] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-817f6f80-1cb7-455c-81df-97b9e82bf3f4 tempest-ServerAddressesNegativeTestJSON-374254023 tempest-ServerAddressesNegativeTestJSON-374254023-project-member] [instance: 6c4977f7-c53d-4c96-9028-86d7561f0d0d] Caching image {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1547.182532] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [None req-817f6f80-1cb7-455c-81df-97b9e82bf3f4 tempest-ServerAddressesNegativeTestJSON-374254023 tempest-ServerAddressesNegativeTestJSON-374254023-project-member] Copying Virtual Disk [datastore1] vmware_temp/cbc59dd1-3e20-4d1a-8828-f206a6048ae1/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk to [datastore1] vmware_temp/cbc59dd1-3e20-4d1a-8828-f206a6048ae1/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk {{(pid=67899) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1547.182834] env[67899]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-8e0abe71-e1c6-4128-aec0-8fecd9b89d07 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1547.191560] env[67899]: DEBUG oslo_vmware.api [None req-817f6f80-1cb7-455c-81df-97b9e82bf3f4 tempest-ServerAddressesNegativeTestJSON-374254023 tempest-ServerAddressesNegativeTestJSON-374254023-project-member] Waiting for the task: (returnval){ [ 1547.191560] env[67899]: value = "task-3467973" [ 
1547.191560] env[67899]: _type = "Task" [ 1547.191560] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1547.199515] env[67899]: DEBUG oslo_vmware.api [None req-817f6f80-1cb7-455c-81df-97b9e82bf3f4 tempest-ServerAddressesNegativeTestJSON-374254023 tempest-ServerAddressesNegativeTestJSON-374254023-project-member] Task: {'id': task-3467973, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1547.703530] env[67899]: DEBUG oslo_vmware.exceptions [None req-817f6f80-1cb7-455c-81df-97b9e82bf3f4 tempest-ServerAddressesNegativeTestJSON-374254023 tempest-ServerAddressesNegativeTestJSON-374254023-project-member] Fault InvalidArgument not matched. {{(pid=67899) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1547.703822] env[67899]: DEBUG oslo_concurrency.lockutils [None req-817f6f80-1cb7-455c-81df-97b9e82bf3f4 tempest-ServerAddressesNegativeTestJSON-374254023 tempest-ServerAddressesNegativeTestJSON-374254023-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1547.704399] env[67899]: ERROR nova.compute.manager [None req-817f6f80-1cb7-455c-81df-97b9e82bf3f4 tempest-ServerAddressesNegativeTestJSON-374254023 tempest-ServerAddressesNegativeTestJSON-374254023-project-member] [instance: 6c4977f7-c53d-4c96-9028-86d7561f0d0d] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1547.704399] env[67899]: Faults: ['InvalidArgument'] [ 1547.704399] env[67899]: ERROR nova.compute.manager [instance: 6c4977f7-c53d-4c96-9028-86d7561f0d0d] Traceback (most recent call last): [ 1547.704399] env[67899]: ERROR nova.compute.manager [instance: 6c4977f7-c53d-4c96-9028-86d7561f0d0d] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1547.704399] env[67899]: ERROR nova.compute.manager [instance: 6c4977f7-c53d-4c96-9028-86d7561f0d0d] yield resources [ 1547.704399] env[67899]: ERROR nova.compute.manager [instance: 6c4977f7-c53d-4c96-9028-86d7561f0d0d] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1547.704399] env[67899]: ERROR nova.compute.manager [instance: 6c4977f7-c53d-4c96-9028-86d7561f0d0d] self.driver.spawn(context, instance, image_meta, [ 1547.704399] env[67899]: ERROR nova.compute.manager [instance: 6c4977f7-c53d-4c96-9028-86d7561f0d0d] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1547.704399] env[67899]: ERROR nova.compute.manager [instance: 6c4977f7-c53d-4c96-9028-86d7561f0d0d] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1547.704399] env[67899]: ERROR nova.compute.manager [instance: 6c4977f7-c53d-4c96-9028-86d7561f0d0d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1547.704399] env[67899]: ERROR nova.compute.manager [instance: 6c4977f7-c53d-4c96-9028-86d7561f0d0d] self._fetch_image_if_missing(context, vi) [ 1547.704399] env[67899]: ERROR nova.compute.manager [instance: 6c4977f7-c53d-4c96-9028-86d7561f0d0d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1547.704399] env[67899]: ERROR nova.compute.manager [instance: 
6c4977f7-c53d-4c96-9028-86d7561f0d0d] image_cache(vi, tmp_image_ds_loc) [ 1547.704399] env[67899]: ERROR nova.compute.manager [instance: 6c4977f7-c53d-4c96-9028-86d7561f0d0d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1547.704399] env[67899]: ERROR nova.compute.manager [instance: 6c4977f7-c53d-4c96-9028-86d7561f0d0d] vm_util.copy_virtual_disk( [ 1547.704399] env[67899]: ERROR nova.compute.manager [instance: 6c4977f7-c53d-4c96-9028-86d7561f0d0d] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1547.704399] env[67899]: ERROR nova.compute.manager [instance: 6c4977f7-c53d-4c96-9028-86d7561f0d0d] session._wait_for_task(vmdk_copy_task) [ 1547.704399] env[67899]: ERROR nova.compute.manager [instance: 6c4977f7-c53d-4c96-9028-86d7561f0d0d] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1547.704399] env[67899]: ERROR nova.compute.manager [instance: 6c4977f7-c53d-4c96-9028-86d7561f0d0d] return self.wait_for_task(task_ref) [ 1547.704399] env[67899]: ERROR nova.compute.manager [instance: 6c4977f7-c53d-4c96-9028-86d7561f0d0d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1547.704399] env[67899]: ERROR nova.compute.manager [instance: 6c4977f7-c53d-4c96-9028-86d7561f0d0d] return evt.wait() [ 1547.704399] env[67899]: ERROR nova.compute.manager [instance: 6c4977f7-c53d-4c96-9028-86d7561f0d0d] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1547.704399] env[67899]: ERROR nova.compute.manager [instance: 6c4977f7-c53d-4c96-9028-86d7561f0d0d] result = hub.switch() [ 1547.704399] env[67899]: ERROR nova.compute.manager [instance: 6c4977f7-c53d-4c96-9028-86d7561f0d0d] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1547.704399] env[67899]: ERROR nova.compute.manager [instance: 6c4977f7-c53d-4c96-9028-86d7561f0d0d] return self.greenlet.switch() [ 1547.704399] env[67899]: ERROR nova.compute.manager [instance: 6c4977f7-c53d-4c96-9028-86d7561f0d0d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1547.704399] env[67899]: ERROR nova.compute.manager [instance: 6c4977f7-c53d-4c96-9028-86d7561f0d0d] self.f(*self.args, **self.kw) [ 1547.704399] env[67899]: ERROR nova.compute.manager [instance: 6c4977f7-c53d-4c96-9028-86d7561f0d0d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1547.704399] env[67899]: ERROR nova.compute.manager [instance: 6c4977f7-c53d-4c96-9028-86d7561f0d0d] raise exceptions.translate_fault(task_info.error) [ 1547.704399] env[67899]: ERROR nova.compute.manager [instance: 6c4977f7-c53d-4c96-9028-86d7561f0d0d] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1547.704399] env[67899]: ERROR nova.compute.manager [instance: 6c4977f7-c53d-4c96-9028-86d7561f0d0d] Faults: ['InvalidArgument'] [ 1547.704399] env[67899]: ERROR nova.compute.manager [instance: 6c4977f7-c53d-4c96-9028-86d7561f0d0d] [ 1547.705482] env[67899]: INFO nova.compute.manager [None req-817f6f80-1cb7-455c-81df-97b9e82bf3f4 tempest-ServerAddressesNegativeTestJSON-374254023 tempest-ServerAddressesNegativeTestJSON-374254023-project-member] [instance: 6c4977f7-c53d-4c96-9028-86d7561f0d0d] Terminating instance [ 1547.706249] env[67899]: DEBUG oslo_concurrency.lockutils [None 
req-d00a6b81-c8df-494b-a8a0-988649b40bae tempest-ImagesOneServerTestJSON-886727362 tempest-ImagesOneServerTestJSON-886727362-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1547.706456] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-d00a6b81-c8df-494b-a8a0-988649b40bae tempest-ImagesOneServerTestJSON-886727362 tempest-ImagesOneServerTestJSON-886727362-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1547.706703] env[67899]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-e1a99cb0-591c-4aee-8e4e-88f645ef2a0e {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1547.709006] env[67899]: DEBUG nova.compute.manager [None req-817f6f80-1cb7-455c-81df-97b9e82bf3f4 tempest-ServerAddressesNegativeTestJSON-374254023 tempest-ServerAddressesNegativeTestJSON-374254023-project-member] [instance: 6c4977f7-c53d-4c96-9028-86d7561f0d0d] Start destroying the instance on the hypervisor. {{(pid=67899) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1547.709212] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-817f6f80-1cb7-455c-81df-97b9e82bf3f4 tempest-ServerAddressesNegativeTestJSON-374254023 tempest-ServerAddressesNegativeTestJSON-374254023-project-member] [instance: 6c4977f7-c53d-4c96-9028-86d7561f0d0d] Destroying instance {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1547.709951] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e145f1ad-2f34-48c1-969e-60751408c280 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1547.716733] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-817f6f80-1cb7-455c-81df-97b9e82bf3f4 tempest-ServerAddressesNegativeTestJSON-374254023 tempest-ServerAddressesNegativeTestJSON-374254023-project-member] [instance: 6c4977f7-c53d-4c96-9028-86d7561f0d0d] Unregistering the VM {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1547.716962] env[67899]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-1ae6e153-a5ce-42a1-9781-c16586cfa778 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1547.719045] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-d00a6b81-c8df-494b-a8a0-988649b40bae tempest-ImagesOneServerTestJSON-886727362 tempest-ImagesOneServerTestJSON-886727362-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1547.719224] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-d00a6b81-c8df-494b-a8a0-988649b40bae tempest-ImagesOneServerTestJSON-886727362 tempest-ImagesOneServerTestJSON-886727362-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=67899) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1547.720176] env[67899]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-780cf183-3397-49ae-97e5-422eab29573e {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1547.724859] env[67899]: DEBUG oslo_vmware.api [None req-d00a6b81-c8df-494b-a8a0-988649b40bae tempest-ImagesOneServerTestJSON-886727362 tempest-ImagesOneServerTestJSON-886727362-project-member] Waiting for the task: (returnval){ [ 1547.724859] env[67899]: value = "session[5298859c-589e-10f3-4f70-bf47a7ea371b]52682429-1cc4-4fe1-1590-9d91b8db91ff" [ 1547.724859] env[67899]: _type = "Task" [ 1547.724859] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1547.737499] env[67899]: DEBUG oslo_vmware.api [None req-d00a6b81-c8df-494b-a8a0-988649b40bae tempest-ImagesOneServerTestJSON-886727362 tempest-ImagesOneServerTestJSON-886727362-project-member] Task: {'id': session[5298859c-589e-10f3-4f70-bf47a7ea371b]52682429-1cc4-4fe1-1590-9d91b8db91ff, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1547.783720] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-817f6f80-1cb7-455c-81df-97b9e82bf3f4 tempest-ServerAddressesNegativeTestJSON-374254023 tempest-ServerAddressesNegativeTestJSON-374254023-project-member] [instance: 6c4977f7-c53d-4c96-9028-86d7561f0d0d] Unregistered the VM {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1547.783947] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-817f6f80-1cb7-455c-81df-97b9e82bf3f4 tempest-ServerAddressesNegativeTestJSON-374254023 tempest-ServerAddressesNegativeTestJSON-374254023-project-member] [instance: 6c4977f7-c53d-4c96-9028-86d7561f0d0d] Deleting contents of the VM from datastore datastore1 {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1547.784147] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-817f6f80-1cb7-455c-81df-97b9e82bf3f4 tempest-ServerAddressesNegativeTestJSON-374254023 tempest-ServerAddressesNegativeTestJSON-374254023-project-member] Deleting the datastore file [datastore1] 6c4977f7-c53d-4c96-9028-86d7561f0d0d {{(pid=67899) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1547.784418] env[67899]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-6ee25f64-66e1-4b10-a234-64ad1758a58d {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1547.790343] env[67899]: DEBUG oslo_vmware.api [None req-817f6f80-1cb7-455c-81df-97b9e82bf3f4 tempest-ServerAddressesNegativeTestJSON-374254023 tempest-ServerAddressesNegativeTestJSON-374254023-project-member] Waiting for the task: (returnval){ [ 1547.790343] env[67899]: value = "task-3467975" [ 1547.790343] env[67899]: _type = "Task" [ 1547.790343] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1547.798419] env[67899]: DEBUG oslo_vmware.api [None req-817f6f80-1cb7-455c-81df-97b9e82bf3f4 tempest-ServerAddressesNegativeTestJSON-374254023 tempest-ServerAddressesNegativeTestJSON-374254023-project-member] Task: {'id': task-3467975, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1548.235567] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-d00a6b81-c8df-494b-a8a0-988649b40bae tempest-ImagesOneServerTestJSON-886727362 tempest-ImagesOneServerTestJSON-886727362-project-member] [instance: ec826735-4cc4-4847-8750-c5480e62134a] Preparing fetch location {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1548.235945] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-d00a6b81-c8df-494b-a8a0-988649b40bae tempest-ImagesOneServerTestJSON-886727362 tempest-ImagesOneServerTestJSON-886727362-project-member] Creating directory with path [datastore1] vmware_temp/5618d7cc-30d8-4304-927d-bd48b5b36d81/c655a05a-4a40-4b3f-b609-3ba8116ad90f {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1548.236072] env[67899]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-ffe151da-875a-4639-9dee-132437ebc7cd {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1548.247058] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-d00a6b81-c8df-494b-a8a0-988649b40bae tempest-ImagesOneServerTestJSON-886727362 tempest-ImagesOneServerTestJSON-886727362-project-member] Created directory with path [datastore1] vmware_temp/5618d7cc-30d8-4304-927d-bd48b5b36d81/c655a05a-4a40-4b3f-b609-3ba8116ad90f {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1548.247260] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-d00a6b81-c8df-494b-a8a0-988649b40bae tempest-ImagesOneServerTestJSON-886727362 tempest-ImagesOneServerTestJSON-886727362-project-member] [instance: ec826735-4cc4-4847-8750-c5480e62134a] Fetch image to [datastore1] vmware_temp/5618d7cc-30d8-4304-927d-bd48b5b36d81/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1548.247434] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-d00a6b81-c8df-494b-a8a0-988649b40bae tempest-ImagesOneServerTestJSON-886727362 tempest-ImagesOneServerTestJSON-886727362-project-member] [instance: ec826735-4cc4-4847-8750-c5480e62134a] Downloading image file data c655a05a-4a40-4b3f-b609-3ba8116ad90f to [datastore1] vmware_temp/5618d7cc-30d8-4304-927d-bd48b5b36d81/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk on the data store datastore1 {{(pid=67899) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1548.248215] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-40740b50-f1dc-4036-933b-c6b608d6cda2 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1548.254509] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1c6085b7-ae9c-4ccf-8811-93fe77803969 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1548.263216] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-abea1d3f-840c-4bb0-9a73-d8790deedce8 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1548.296667] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-1622c06b-bcbb-449b-ab44-6e5295efb50a {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1548.303473] env[67899]: DEBUG oslo_vmware.api [None req-817f6f80-1cb7-455c-81df-97b9e82bf3f4 tempest-ServerAddressesNegativeTestJSON-374254023 tempest-ServerAddressesNegativeTestJSON-374254023-project-member] Task: {'id': task-3467975, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.085435} completed successfully. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1548.304844] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-817f6f80-1cb7-455c-81df-97b9e82bf3f4 tempest-ServerAddressesNegativeTestJSON-374254023 tempest-ServerAddressesNegativeTestJSON-374254023-project-member] Deleted the datastore file {{(pid=67899) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1548.305046] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-817f6f80-1cb7-455c-81df-97b9e82bf3f4 tempest-ServerAddressesNegativeTestJSON-374254023 tempest-ServerAddressesNegativeTestJSON-374254023-project-member] [instance: 6c4977f7-c53d-4c96-9028-86d7561f0d0d] Deleted contents of the VM from datastore datastore1 {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1548.305223] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-817f6f80-1cb7-455c-81df-97b9e82bf3f4 tempest-ServerAddressesNegativeTestJSON-374254023 tempest-ServerAddressesNegativeTestJSON-374254023-project-member] [instance: 6c4977f7-c53d-4c96-9028-86d7561f0d0d] Instance destroyed {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1548.305396] env[67899]: INFO nova.compute.manager [None req-817f6f80-1cb7-455c-81df-97b9e82bf3f4 tempest-ServerAddressesNegativeTestJSON-374254023 tempest-ServerAddressesNegativeTestJSON-374254023-project-member] [instance: 6c4977f7-c53d-4c96-9028-86d7561f0d0d] Took 0.60 seconds to destroy the instance on the hypervisor. 
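The SearchDatastore_Task and DeleteDatastoreFile_Task entries above are produced by oslo.vmware's wait_for_task loop, which repeatedly polls the vCenter task until it leaves the running state, logging progress along the way. A minimal sketch of that polling pattern follows; get_task_info and the task-info dict shape are illustrative stand-ins, not oslo.vmware's actual API:

    import time

    class TaskFault(Exception):
        """Illustrative stand-in for oslo_vmware.exceptions.VimFaultException."""

    def wait_for_task(task_id, get_task_info, interval=0.5):
        # Poll until the task leaves the 'running' state, mirroring the
        # "progress is 0%" ... "completed successfully" entries above.
        while True:
            info = get_task_info(task_id)
            if info["state"] == "running":
                print(f"Task: {task_id} progress is {info['progress']}%.")
            elif info["state"] == "success":
                print(f"Task: {task_id} completed successfully.")
                return info
            else:  # error state: surface the fault, as _poll_task does
                raise TaskFault(info["error"])
            time.sleep(interval)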
[ 1548.307121] env[67899]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-641a1f77-bc31-446e-a147-9f5e61a5f389 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1548.309064] env[67899]: DEBUG nova.compute.claims [None req-817f6f80-1cb7-455c-81df-97b9e82bf3f4 tempest-ServerAddressesNegativeTestJSON-374254023 tempest-ServerAddressesNegativeTestJSON-374254023-project-member] [instance: 6c4977f7-c53d-4c96-9028-86d7561f0d0d] Aborting claim: {{(pid=67899) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1548.309244] env[67899]: DEBUG oslo_concurrency.lockutils [None req-817f6f80-1cb7-455c-81df-97b9e82bf3f4 tempest-ServerAddressesNegativeTestJSON-374254023 tempest-ServerAddressesNegativeTestJSON-374254023-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1548.309453] env[67899]: DEBUG oslo_concurrency.lockutils [None req-817f6f80-1cb7-455c-81df-97b9e82bf3f4 tempest-ServerAddressesNegativeTestJSON-374254023 tempest-ServerAddressesNegativeTestJSON-374254023-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1548.330823] env[67899]: DEBUG nova.virt.vmwareapi.images [None req-d00a6b81-c8df-494b-a8a0-988649b40bae tempest-ImagesOneServerTestJSON-886727362 tempest-ImagesOneServerTestJSON-886727362-project-member] [instance: ec826735-4cc4-4847-8750-c5480e62134a] Downloading image file data c655a05a-4a40-4b3f-b609-3ba8116ad90f to the data store datastore1 {{(pid=67899) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1548.383969] env[67899]: DEBUG oslo_vmware.rw_handles [None req-d00a6b81-c8df-494b-a8a0-988649b40bae tempest-ImagesOneServerTestJSON-886727362 tempest-ImagesOneServerTestJSON-886727362-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/5618d7cc-30d8-4304-927d-bd48b5b36d81/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67899) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1548.443063] env[67899]: DEBUG oslo_vmware.rw_handles [None req-d00a6b81-c8df-494b-a8a0-988649b40bae tempest-ImagesOneServerTestJSON-886727362 tempest-ImagesOneServerTestJSON-886727362-project-member] Completed reading data from the image iterator. {{(pid=67899) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1548.443252] env[67899]: DEBUG oslo_vmware.rw_handles [None req-d00a6b81-c8df-494b-a8a0-988649b40bae tempest-ImagesOneServerTestJSON-886727362 tempest-ImagesOneServerTestJSON-886727362-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/5618d7cc-30d8-4304-927d-bd48b5b36d81/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67899) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1548.592927] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9ec2fa55-12a8-458d-95d8-9fbad1d50bb1 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1548.600819] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-70aed11e-50e2-4f81-a854-fd8b23465f54 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1548.631100] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a1012165-6f18-4cf7-a91f-a0384bea29b9 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1548.638912] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2b3b4366-ce36-49cf-aac3-3bb1afd70bdf {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1548.652038] env[67899]: DEBUG nova.compute.provider_tree [None req-817f6f80-1cb7-455c-81df-97b9e82bf3f4 tempest-ServerAddressesNegativeTestJSON-374254023 tempest-ServerAddressesNegativeTestJSON-374254023-project-member] Inventory has not changed in ProviderTree for provider: fffa0b42-f65d-4394-a98c-0df038b9ed4b {{(pid=67899) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1548.663608] env[67899]: DEBUG nova.scheduler.client.report [None req-817f6f80-1cb7-455c-81df-97b9e82bf3f4 tempest-ServerAddressesNegativeTestJSON-374254023 tempest-ServerAddressesNegativeTestJSON-374254023-project-member] Inventory has not changed for provider fffa0b42-f65d-4394-a98c-0df038b9ed4b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67899) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1548.682056] env[67899]: DEBUG oslo_concurrency.lockutils [None req-817f6f80-1cb7-455c-81df-97b9e82bf3f4 tempest-ServerAddressesNegativeTestJSON-374254023 tempest-ServerAddressesNegativeTestJSON-374254023-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.372s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1548.682437] env[67899]: ERROR nova.compute.manager [None req-817f6f80-1cb7-455c-81df-97b9e82bf3f4 tempest-ServerAddressesNegativeTestJSON-374254023 tempest-ServerAddressesNegativeTestJSON-374254023-project-member] [instance: 6c4977f7-c53d-4c96-9028-86d7561f0d0d] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1548.682437] env[67899]: Faults: ['InvalidArgument'] [ 1548.682437] env[67899]: ERROR nova.compute.manager [instance: 6c4977f7-c53d-4c96-9028-86d7561f0d0d] Traceback (most recent call last): [ 1548.682437] env[67899]: ERROR nova.compute.manager [instance: 6c4977f7-c53d-4c96-9028-86d7561f0d0d] File "/opt/stack/nova/nova/compute/manager.py", line 
2615, in _build_and_run_instance [ 1548.682437] env[67899]: ERROR nova.compute.manager [instance: 6c4977f7-c53d-4c96-9028-86d7561f0d0d] self.driver.spawn(context, instance, image_meta, [ 1548.682437] env[67899]: ERROR nova.compute.manager [instance: 6c4977f7-c53d-4c96-9028-86d7561f0d0d] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1548.682437] env[67899]: ERROR nova.compute.manager [instance: 6c4977f7-c53d-4c96-9028-86d7561f0d0d] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1548.682437] env[67899]: ERROR nova.compute.manager [instance: 6c4977f7-c53d-4c96-9028-86d7561f0d0d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1548.682437] env[67899]: ERROR nova.compute.manager [instance: 6c4977f7-c53d-4c96-9028-86d7561f0d0d] self._fetch_image_if_missing(context, vi) [ 1548.682437] env[67899]: ERROR nova.compute.manager [instance: 6c4977f7-c53d-4c96-9028-86d7561f0d0d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1548.682437] env[67899]: ERROR nova.compute.manager [instance: 6c4977f7-c53d-4c96-9028-86d7561f0d0d] image_cache(vi, tmp_image_ds_loc) [ 1548.682437] env[67899]: ERROR nova.compute.manager [instance: 6c4977f7-c53d-4c96-9028-86d7561f0d0d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1548.682437] env[67899]: ERROR nova.compute.manager [instance: 6c4977f7-c53d-4c96-9028-86d7561f0d0d] vm_util.copy_virtual_disk( [ 1548.682437] env[67899]: ERROR nova.compute.manager [instance: 6c4977f7-c53d-4c96-9028-86d7561f0d0d] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1548.682437] env[67899]: ERROR nova.compute.manager [instance: 6c4977f7-c53d-4c96-9028-86d7561f0d0d] session._wait_for_task(vmdk_copy_task) [ 1548.682437] env[67899]: ERROR nova.compute.manager [instance: 6c4977f7-c53d-4c96-9028-86d7561f0d0d] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1548.682437] env[67899]: ERROR nova.compute.manager [instance: 6c4977f7-c53d-4c96-9028-86d7561f0d0d] return self.wait_for_task(task_ref) [ 1548.682437] env[67899]: ERROR nova.compute.manager [instance: 6c4977f7-c53d-4c96-9028-86d7561f0d0d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1548.682437] env[67899]: ERROR nova.compute.manager [instance: 6c4977f7-c53d-4c96-9028-86d7561f0d0d] return evt.wait() [ 1548.682437] env[67899]: ERROR nova.compute.manager [instance: 6c4977f7-c53d-4c96-9028-86d7561f0d0d] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1548.682437] env[67899]: ERROR nova.compute.manager [instance: 6c4977f7-c53d-4c96-9028-86d7561f0d0d] result = hub.switch() [ 1548.682437] env[67899]: ERROR nova.compute.manager [instance: 6c4977f7-c53d-4c96-9028-86d7561f0d0d] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1548.682437] env[67899]: ERROR nova.compute.manager [instance: 6c4977f7-c53d-4c96-9028-86d7561f0d0d] return self.greenlet.switch() [ 1548.682437] env[67899]: ERROR nova.compute.manager [instance: 6c4977f7-c53d-4c96-9028-86d7561f0d0d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1548.682437] env[67899]: ERROR nova.compute.manager [instance: 6c4977f7-c53d-4c96-9028-86d7561f0d0d] self.f(*self.args, **self.kw) [ 1548.682437] env[67899]: ERROR nova.compute.manager [instance: 
6c4977f7-c53d-4c96-9028-86d7561f0d0d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1548.682437] env[67899]: ERROR nova.compute.manager [instance: 6c4977f7-c53d-4c96-9028-86d7561f0d0d] raise exceptions.translate_fault(task_info.error) [ 1548.682437] env[67899]: ERROR nova.compute.manager [instance: 6c4977f7-c53d-4c96-9028-86d7561f0d0d] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1548.682437] env[67899]: ERROR nova.compute.manager [instance: 6c4977f7-c53d-4c96-9028-86d7561f0d0d] Faults: ['InvalidArgument'] [ 1548.682437] env[67899]: ERROR nova.compute.manager [instance: 6c4977f7-c53d-4c96-9028-86d7561f0d0d] [ 1548.683310] env[67899]: DEBUG nova.compute.utils [None req-817f6f80-1cb7-455c-81df-97b9e82bf3f4 tempest-ServerAddressesNegativeTestJSON-374254023 tempest-ServerAddressesNegativeTestJSON-374254023-project-member] [instance: 6c4977f7-c53d-4c96-9028-86d7561f0d0d] VimFaultException {{(pid=67899) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1548.685112] env[67899]: DEBUG nova.compute.manager [None req-817f6f80-1cb7-455c-81df-97b9e82bf3f4 tempest-ServerAddressesNegativeTestJSON-374254023 tempest-ServerAddressesNegativeTestJSON-374254023-project-member] [instance: 6c4977f7-c53d-4c96-9028-86d7561f0d0d] Build of instance 6c4977f7-c53d-4c96-9028-86d7561f0d0d was re-scheduled: A specified parameter was not correct: fileType [ 1548.685112] env[67899]: Faults: ['InvalidArgument'] {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1548.685112] env[67899]: DEBUG nova.compute.manager [None req-817f6f80-1cb7-455c-81df-97b9e82bf3f4 tempest-ServerAddressesNegativeTestJSON-374254023 tempest-ServerAddressesNegativeTestJSON-374254023-project-member] [instance: 6c4977f7-c53d-4c96-9028-86d7561f0d0d] Unplugging VIFs for instance {{(pid=67899) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1548.685354] env[67899]: DEBUG nova.compute.manager [None req-817f6f80-1cb7-455c-81df-97b9e82bf3f4 tempest-ServerAddressesNegativeTestJSON-374254023 tempest-ServerAddressesNegativeTestJSON-374254023-project-member] Virt driver does not provide unplug_vifs method, so it is not possible to determine if VIFs should be unplugged.
{{(pid=67899) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1548.685389] env[67899]: DEBUG nova.compute.manager [None req-817f6f80-1cb7-455c-81df-97b9e82bf3f4 tempest-ServerAddressesNegativeTestJSON-374254023 tempest-ServerAddressesNegativeTestJSON-374254023-project-member] [instance: 6c4977f7-c53d-4c96-9028-86d7561f0d0d] Deallocating network for instance {{(pid=67899) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1548.685538] env[67899]: DEBUG nova.network.neutron [None req-817f6f80-1cb7-455c-81df-97b9e82bf3f4 tempest-ServerAddressesNegativeTestJSON-374254023 tempest-ServerAddressesNegativeTestJSON-374254023-project-member] [instance: 6c4977f7-c53d-4c96-9028-86d7561f0d0d] deallocate_for_instance() {{(pid=67899) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1549.115600] env[67899]: DEBUG nova.network.neutron [None req-817f6f80-1cb7-455c-81df-97b9e82bf3f4 tempest-ServerAddressesNegativeTestJSON-374254023 tempest-ServerAddressesNegativeTestJSON-374254023-project-member] [instance: 6c4977f7-c53d-4c96-9028-86d7561f0d0d] Updating instance_info_cache with network_info: [] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1549.125615] env[67899]: INFO nova.compute.manager [None req-817f6f80-1cb7-455c-81df-97b9e82bf3f4 tempest-ServerAddressesNegativeTestJSON-374254023 tempest-ServerAddressesNegativeTestJSON-374254023-project-member] [instance: 6c4977f7-c53d-4c96-9028-86d7561f0d0d] Took 0.44 seconds to deallocate network for instance. [ 1549.214189] env[67899]: INFO nova.scheduler.client.report [None req-817f6f80-1cb7-455c-81df-97b9e82bf3f4 tempest-ServerAddressesNegativeTestJSON-374254023 tempest-ServerAddressesNegativeTestJSON-374254023-project-member] Deleted allocations for instance 6c4977f7-c53d-4c96-9028-86d7561f0d0d [ 1549.235848] env[67899]: DEBUG oslo_concurrency.lockutils [None req-817f6f80-1cb7-455c-81df-97b9e82bf3f4 tempest-ServerAddressesNegativeTestJSON-374254023 tempest-ServerAddressesNegativeTestJSON-374254023-project-member] Lock "6c4977f7-c53d-4c96-9028-86d7561f0d0d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 630.919s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1549.236935] env[67899]: DEBUG oslo_concurrency.lockutils [None req-5252463b-7de6-4981-a61b-250956fe0a1c tempest-ServerAddressesNegativeTestJSON-374254023 tempest-ServerAddressesNegativeTestJSON-374254023-project-member] Lock "6c4977f7-c53d-4c96-9028-86d7561f0d0d" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 435.124s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1549.237166] env[67899]: DEBUG oslo_concurrency.lockutils [None req-5252463b-7de6-4981-a61b-250956fe0a1c tempest-ServerAddressesNegativeTestJSON-374254023 tempest-ServerAddressesNegativeTestJSON-374254023-project-member] Acquiring lock "6c4977f7-c53d-4c96-9028-86d7561f0d0d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1549.237957] env[67899]: DEBUG oslo_concurrency.lockutils [None req-5252463b-7de6-4981-a61b-250956fe0a1c tempest-ServerAddressesNegativeTestJSON-374254023 tempest-ServerAddressesNegativeTestJSON-374254023-project-member] 
Lock "6c4977f7-c53d-4c96-9028-86d7561f0d0d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1549.237957] env[67899]: DEBUG oslo_concurrency.lockutils [None req-5252463b-7de6-4981-a61b-250956fe0a1c tempest-ServerAddressesNegativeTestJSON-374254023 tempest-ServerAddressesNegativeTestJSON-374254023-project-member] Lock "6c4977f7-c53d-4c96-9028-86d7561f0d0d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1549.239462] env[67899]: INFO nova.compute.manager [None req-5252463b-7de6-4981-a61b-250956fe0a1c tempest-ServerAddressesNegativeTestJSON-374254023 tempest-ServerAddressesNegativeTestJSON-374254023-project-member] [instance: 6c4977f7-c53d-4c96-9028-86d7561f0d0d] Terminating instance [ 1549.241142] env[67899]: DEBUG nova.compute.manager [None req-5252463b-7de6-4981-a61b-250956fe0a1c tempest-ServerAddressesNegativeTestJSON-374254023 tempest-ServerAddressesNegativeTestJSON-374254023-project-member] [instance: 6c4977f7-c53d-4c96-9028-86d7561f0d0d] Start destroying the instance on the hypervisor. {{(pid=67899) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1549.241330] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-5252463b-7de6-4981-a61b-250956fe0a1c tempest-ServerAddressesNegativeTestJSON-374254023 tempest-ServerAddressesNegativeTestJSON-374254023-project-member] [instance: 6c4977f7-c53d-4c96-9028-86d7561f0d0d] Destroying instance {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1549.241790] env[67899]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-aaaeced5-8ebc-4385-bf82-3e26327fccf9 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1549.248247] env[67899]: DEBUG nova.compute.manager [None req-0034688e-73b1-4561-afa0-7f613b171593 tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] [instance: addcc88a-6bb5-4a70-938e-49c0c79c8414] Starting instance... {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1549.253424] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-952161a6-3f91-4e6d-ba60-e1f708ae25e5 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1549.282710] env[67899]: WARNING nova.virt.vmwareapi.vmops [None req-5252463b-7de6-4981-a61b-250956fe0a1c tempest-ServerAddressesNegativeTestJSON-374254023 tempest-ServerAddressesNegativeTestJSON-374254023-project-member] [instance: 6c4977f7-c53d-4c96-9028-86d7561f0d0d] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 6c4977f7-c53d-4c96-9028-86d7561f0d0d could not be found. 
[ 1549.282908] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-5252463b-7de6-4981-a61b-250956fe0a1c tempest-ServerAddressesNegativeTestJSON-374254023 tempest-ServerAddressesNegativeTestJSON-374254023-project-member] [instance: 6c4977f7-c53d-4c96-9028-86d7561f0d0d] Instance destroyed {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1549.283096] env[67899]: INFO nova.compute.manager [None req-5252463b-7de6-4981-a61b-250956fe0a1c tempest-ServerAddressesNegativeTestJSON-374254023 tempest-ServerAddressesNegativeTestJSON-374254023-project-member] [instance: 6c4977f7-c53d-4c96-9028-86d7561f0d0d] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1549.283331] env[67899]: DEBUG oslo.service.loopingcall [None req-5252463b-7de6-4981-a61b-250956fe0a1c tempest-ServerAddressesNegativeTestJSON-374254023 tempest-ServerAddressesNegativeTestJSON-374254023-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=67899) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1549.285450] env[67899]: DEBUG nova.compute.manager [-] [instance: 6c4977f7-c53d-4c96-9028-86d7561f0d0d] Deallocating network for instance {{(pid=67899) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1549.285553] env[67899]: DEBUG nova.network.neutron [-] [instance: 6c4977f7-c53d-4c96-9028-86d7561f0d0d] deallocate_for_instance() {{(pid=67899) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1549.299939] env[67899]: DEBUG oslo_concurrency.lockutils [None req-0034688e-73b1-4561-afa0-7f613b171593 tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1549.300196] env[67899]: DEBUG oslo_concurrency.lockutils [None req-0034688e-73b1-4561-afa0-7f613b171593 tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1549.301612] env[67899]: INFO nova.compute.claims [None req-0034688e-73b1-4561-afa0-7f613b171593 tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] [instance: addcc88a-6bb5-4a70-938e-49c0c79c8414] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1549.310649] env[67899]: DEBUG nova.network.neutron [-] [instance: 6c4977f7-c53d-4c96-9028-86d7561f0d0d] Updating instance_info_cache with network_info: [] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1549.322698] env[67899]: INFO nova.compute.manager [-] [instance: 6c4977f7-c53d-4c96-9028-86d7561f0d0d] Took 0.04 seconds to deallocate network for instance. 
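The paired "compute_resources" acquire/release entries (instance_claim here, abort_instance_claim earlier) are oslo.concurrency's named-lock bookkeeping: each caller records how long it waited for the lock and how long it held it. A stdlib analogue of that pattern, with illustrative names rather than Nova's actual implementation:

    import threading
    import time
    from contextlib import contextmanager

    _locks = {}  # name -> threading.Lock, like a semaphore registry

    @contextmanager
    def timed_lock(name, caller):
        lock = _locks.setdefault(name, threading.Lock())
        t0 = time.monotonic()
        with lock:
            print(f'Lock "{name}" acquired by "{caller}" :: '
                  f'waited {time.monotonic() - t0:.3f}s')
            t1 = time.monotonic()
            try:
                yield
            finally:
                print(f'Lock "{name}" "released" by "{caller}" :: '
                      f'held {time.monotonic() - t1:.3f}s')

Usage mirroring the entries above would be: with timed_lock("compute_resources", "instance_claim"): do the claim.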
[ 1549.416723] env[67899]: DEBUG oslo_concurrency.lockutils [None req-5252463b-7de6-4981-a61b-250956fe0a1c tempest-ServerAddressesNegativeTestJSON-374254023 tempest-ServerAddressesNegativeTestJSON-374254023-project-member] Lock "6c4977f7-c53d-4c96-9028-86d7561f0d0d" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.179s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1549.417020] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "6c4977f7-c53d-4c96-9028-86d7561f0d0d" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 374.396s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1549.417217] env[67899]: INFO nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 6c4977f7-c53d-4c96-9028-86d7561f0d0d] During sync_power_state the instance has a pending task (deleting). Skip. [ 1549.417384] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "6c4977f7-c53d-4c96-9028-86d7561f0d0d" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1549.531199] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b1695548-1437-436a-8afb-3e075d4dc5d1 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1549.538687] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5c6d52cb-cee8-46a9-ab88-c5c2e2b99d93 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1549.569689] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6e88fad3-b127-48de-ad89-20619d7ba740 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1549.576735] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-659ab119-7f90-4b3b-9e7a-9b06d4535bdf {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1549.589526] env[67899]: DEBUG nova.compute.provider_tree [None req-0034688e-73b1-4561-afa0-7f613b171593 tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] Inventory has not changed in ProviderTree for provider: fffa0b42-f65d-4394-a98c-0df038b9ed4b {{(pid=67899) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1549.599846] env[67899]: DEBUG nova.scheduler.client.report [None req-0034688e-73b1-4561-afa0-7f613b171593 tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] Inventory has not changed for provider fffa0b42-f65d-4394-a98c-0df038b9ed4b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': 
{'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67899) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1549.613105] env[67899]: DEBUG oslo_concurrency.lockutils [None req-0034688e-73b1-4561-afa0-7f613b171593 tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.313s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1549.613590] env[67899]: DEBUG nova.compute.manager [None req-0034688e-73b1-4561-afa0-7f613b171593 tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] [instance: addcc88a-6bb5-4a70-938e-49c0c79c8414] Start building networks asynchronously for instance. {{(pid=67899) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1549.647202] env[67899]: DEBUG nova.compute.utils [None req-0034688e-73b1-4561-afa0-7f613b171593 tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] Using /dev/sd instead of None {{(pid=67899) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1549.648436] env[67899]: DEBUG nova.compute.manager [None req-0034688e-73b1-4561-afa0-7f613b171593 tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] [instance: addcc88a-6bb5-4a70-938e-49c0c79c8414] Allocating IP information in the background. {{(pid=67899) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1549.648610] env[67899]: DEBUG nova.network.neutron [None req-0034688e-73b1-4561-afa0-7f613b171593 tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] [instance: addcc88a-6bb5-4a70-938e-49c0c79c8414] allocate_for_instance() {{(pid=67899) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1549.657994] env[67899]: DEBUG nova.compute.manager [None req-0034688e-73b1-4561-afa0-7f613b171593 tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] [instance: addcc88a-6bb5-4a70-938e-49c0c79c8414] Start building block device mappings for instance. {{(pid=67899) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1549.721841] env[67899]: DEBUG nova.compute.manager [None req-0034688e-73b1-4561-afa0-7f613b171593 tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] [instance: addcc88a-6bb5-4a70-938e-49c0c79c8414] Start spawning the instance on the hypervisor. 
{{(pid=67899) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1549.748724] env[67899]: DEBUG nova.virt.hardware [None req-0034688e-73b1-4561-afa0-7f613b171593 tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T22:07:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T22:07:13Z,direct_url=,disk_format='vmdk',id=c655a05a-4a40-4b3f-b609-3ba8116ad90f,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c1aaa2970e964d7b86557399120d12c1',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T22:07:14Z,virtual_size=,visibility=), allow threads: False {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1549.748959] env[67899]: DEBUG nova.virt.hardware [None req-0034688e-73b1-4561-afa0-7f613b171593 tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] Flavor limits 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1549.749130] env[67899]: DEBUG nova.virt.hardware [None req-0034688e-73b1-4561-afa0-7f613b171593 tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] Image limits 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1549.749313] env[67899]: DEBUG nova.virt.hardware [None req-0034688e-73b1-4561-afa0-7f613b171593 tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] Flavor pref 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1549.749459] env[67899]: DEBUG nova.virt.hardware [None req-0034688e-73b1-4561-afa0-7f613b171593 tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] Image pref 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1549.749621] env[67899]: DEBUG nova.virt.hardware [None req-0034688e-73b1-4561-afa0-7f613b171593 tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1549.749854] env[67899]: DEBUG nova.virt.hardware [None req-0034688e-73b1-4561-afa0-7f613b171593 tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1549.750031] env[67899]: DEBUG nova.virt.hardware [None req-0034688e-73b1-4561-afa0-7f613b171593 tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67899) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1549.750208] 
env[67899]: DEBUG nova.virt.hardware [None req-0034688e-73b1-4561-afa0-7f613b171593 tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] Got 1 possible topologies {{(pid=67899) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1549.750370] env[67899]: DEBUG nova.virt.hardware [None req-0034688e-73b1-4561-afa0-7f613b171593 tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1549.750542] env[67899]: DEBUG nova.virt.hardware [None req-0034688e-73b1-4561-afa0-7f613b171593 tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1549.751435] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4e8166ba-1cae-4dbb-ab6f-5b9f3fc920b9 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1549.759675] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4ce48c11-bf53-4ed0-9b2f-27983547c72c {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1549.986599] env[67899]: DEBUG nova.policy [None req-0034688e-73b1-4561-afa0-7f613b171593 tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '453d1970bc284a2b82d64ecd7029b751', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '46629721d6034983bb0c3f58dcf6674c', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67899) authorize /opt/stack/nova/nova/policy.py:203}} [ 1550.334465] env[67899]: DEBUG nova.network.neutron [None req-0034688e-73b1-4561-afa0-7f613b171593 tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] [instance: addcc88a-6bb5-4a70-938e-49c0c79c8414] Successfully created port: 52319fa0-d2b0-4cc0-9560-dbed78c1dbf9 {{(pid=67899) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1551.021582] env[67899]: DEBUG nova.network.neutron [None req-0034688e-73b1-4561-afa0-7f613b171593 tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] [instance: addcc88a-6bb5-4a70-938e-49c0c79c8414] Successfully updated port: 52319fa0-d2b0-4cc0-9560-dbed78c1dbf9 {{(pid=67899) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1551.033468] env[67899]: DEBUG oslo_concurrency.lockutils [None req-0034688e-73b1-4561-afa0-7f613b171593 tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] Acquiring lock "refresh_cache-addcc88a-6bb5-4a70-938e-49c0c79c8414" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1551.033624] env[67899]: DEBUG oslo_concurrency.lockutils 
[None req-0034688e-73b1-4561-afa0-7f613b171593 tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] Acquired lock "refresh_cache-addcc88a-6bb5-4a70-938e-49c0c79c8414" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1551.033776] env[67899]: DEBUG nova.network.neutron [None req-0034688e-73b1-4561-afa0-7f613b171593 tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] [instance: addcc88a-6bb5-4a70-938e-49c0c79c8414] Building network info cache for instance {{(pid=67899) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1551.075367] env[67899]: DEBUG nova.network.neutron [None req-0034688e-73b1-4561-afa0-7f613b171593 tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] [instance: addcc88a-6bb5-4a70-938e-49c0c79c8414] Instance cache missing network info. {{(pid=67899) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1551.184662] env[67899]: DEBUG nova.compute.manager [req-1158da92-00e1-4aee-895b-d6d3e450d3b1 req-e8c225fb-acbd-4974-921f-7af210daa17e service nova] [instance: addcc88a-6bb5-4a70-938e-49c0c79c8414] Received event network-vif-plugged-52319fa0-d2b0-4cc0-9560-dbed78c1dbf9 {{(pid=67899) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1551.184834] env[67899]: DEBUG oslo_concurrency.lockutils [req-1158da92-00e1-4aee-895b-d6d3e450d3b1 req-e8c225fb-acbd-4974-921f-7af210daa17e service nova] Acquiring lock "addcc88a-6bb5-4a70-938e-49c0c79c8414-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1551.185059] env[67899]: DEBUG oslo_concurrency.lockutils [req-1158da92-00e1-4aee-895b-d6d3e450d3b1 req-e8c225fb-acbd-4974-921f-7af210daa17e service nova] Lock "addcc88a-6bb5-4a70-938e-49c0c79c8414-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1551.185226] env[67899]: DEBUG oslo_concurrency.lockutils [req-1158da92-00e1-4aee-895b-d6d3e450d3b1 req-e8c225fb-acbd-4974-921f-7af210daa17e service nova] Lock "addcc88a-6bb5-4a70-938e-49c0c79c8414-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1551.185402] env[67899]: DEBUG nova.compute.manager [req-1158da92-00e1-4aee-895b-d6d3e450d3b1 req-e8c225fb-acbd-4974-921f-7af210daa17e service nova] [instance: addcc88a-6bb5-4a70-938e-49c0c79c8414] No waiting events found dispatching network-vif-plugged-52319fa0-d2b0-4cc0-9560-dbed78c1dbf9 {{(pid=67899) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1551.185564] env[67899]: WARNING nova.compute.manager [req-1158da92-00e1-4aee-895b-d6d3e450d3b1 req-e8c225fb-acbd-4974-921f-7af210daa17e service nova] [instance: addcc88a-6bb5-4a70-938e-49c0c79c8414] Received unexpected event network-vif-plugged-52319fa0-d2b0-4cc0-9560-dbed78c1dbf9 for instance with vm_state building and task_state spawning. 
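The network-vif-plugged handling above shows the external-event pattern: Neutron notifies Nova about the port, Nova checks whether anything is registered as waiting for that (instance, event) pair, and logs the event as unexpected when nothing is — here because the instance is still building and never armed a waiter. A simplified sketch of the dispatch (structure is illustrative):

    import logging

    LOG = logging.getLogger(__name__)

    class InstanceEvents:
        def __init__(self):
            self._waiters = {}  # (instance_uuid, event_name) -> callback

        def expect(self, instance_uuid, event_name, callback):
            # Arm a waiter before the operation that triggers the event.
            self._waiters[(instance_uuid, event_name)] = callback

        def dispatch(self, instance_uuid, event_name):
            cb = self._waiters.pop((instance_uuid, event_name), None)
            if cb is None:
                LOG.warning("Received unexpected event %s for instance %s",
                            event_name, instance_uuid)
            else:
                cb()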
[ 1551.185720] env[67899]: DEBUG nova.compute.manager [req-1158da92-00e1-4aee-895b-d6d3e450d3b1 req-e8c225fb-acbd-4974-921f-7af210daa17e service nova] [instance: addcc88a-6bb5-4a70-938e-49c0c79c8414] Received event network-changed-52319fa0-d2b0-4cc0-9560-dbed78c1dbf9 {{(pid=67899) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1551.185871] env[67899]: DEBUG nova.compute.manager [req-1158da92-00e1-4aee-895b-d6d3e450d3b1 req-e8c225fb-acbd-4974-921f-7af210daa17e service nova] [instance: addcc88a-6bb5-4a70-938e-49c0c79c8414] Refreshing instance network info cache due to event network-changed-52319fa0-d2b0-4cc0-9560-dbed78c1dbf9. {{(pid=67899) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1551.186223] env[67899]: DEBUG oslo_concurrency.lockutils [req-1158da92-00e1-4aee-895b-d6d3e450d3b1 req-e8c225fb-acbd-4974-921f-7af210daa17e service nova] Acquiring lock "refresh_cache-addcc88a-6bb5-4a70-938e-49c0c79c8414" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1551.306194] env[67899]: DEBUG nova.network.neutron [None req-0034688e-73b1-4561-afa0-7f613b171593 tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] [instance: addcc88a-6bb5-4a70-938e-49c0c79c8414] Updating instance_info_cache with network_info: [{"id": "52319fa0-d2b0-4cc0-9560-dbed78c1dbf9", "address": "fa:16:3e:b4:9e:fa", "network": {"id": "9064d867-4c85-4a7c-911b-d14d3f805e8d", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-150447479-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "46629721d6034983bb0c3f58dcf6674c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "b98c49ac-0eb7-4311-aa8f-60581b2ce706", "external-id": "nsx-vlan-transportzone-184", "segmentation_id": 184, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap52319fa0-d2", "ovs_interfaceid": "52319fa0-d2b0-4cc0-9560-dbed78c1dbf9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1551.316446] env[67899]: DEBUG oslo_concurrency.lockutils [None req-0034688e-73b1-4561-afa0-7f613b171593 tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] Releasing lock "refresh_cache-addcc88a-6bb5-4a70-938e-49c0c79c8414" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1551.316720] env[67899]: DEBUG nova.compute.manager [None req-0034688e-73b1-4561-afa0-7f613b171593 tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] [instance: addcc88a-6bb5-4a70-938e-49c0c79c8414] Instance network_info: |[{"id": "52319fa0-d2b0-4cc0-9560-dbed78c1dbf9", "address": "fa:16:3e:b4:9e:fa", "network": {"id": "9064d867-4c85-4a7c-911b-d14d3f805e8d", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-150447479-network", "subnets": 
[{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "46629721d6034983bb0c3f58dcf6674c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "b98c49ac-0eb7-4311-aa8f-60581b2ce706", "external-id": "nsx-vlan-transportzone-184", "segmentation_id": 184, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap52319fa0-d2", "ovs_interfaceid": "52319fa0-d2b0-4cc0-9560-dbed78c1dbf9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67899) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1551.317014] env[67899]: DEBUG oslo_concurrency.lockutils [req-1158da92-00e1-4aee-895b-d6d3e450d3b1 req-e8c225fb-acbd-4974-921f-7af210daa17e service nova] Acquired lock "refresh_cache-addcc88a-6bb5-4a70-938e-49c0c79c8414" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1551.317196] env[67899]: DEBUG nova.network.neutron [req-1158da92-00e1-4aee-895b-d6d3e450d3b1 req-e8c225fb-acbd-4974-921f-7af210daa17e service nova] [instance: addcc88a-6bb5-4a70-938e-49c0c79c8414] Refreshing network info cache for port 52319fa0-d2b0-4cc0-9560-dbed78c1dbf9 {{(pid=67899) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1551.318209] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-0034688e-73b1-4561-afa0-7f613b171593 tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] [instance: addcc88a-6bb5-4a70-938e-49c0c79c8414] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:b4:9e:fa', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'b98c49ac-0eb7-4311-aa8f-60581b2ce706', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '52319fa0-d2b0-4cc0-9560-dbed78c1dbf9', 'vif_model': 'vmxnet3'}] {{(pid=67899) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1551.326190] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [None req-0034688e-73b1-4561-afa0-7f613b171593 tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] Creating folder: Project (46629721d6034983bb0c3f58dcf6674c). Parent ref: group-v692900. {{(pid=67899) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1551.329060] env[67899]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-e9a5badc-ba1a-481f-8822-0a6a771f5dd4 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1551.340646] env[67899]: INFO nova.virt.vmwareapi.vm_util [None req-0034688e-73b1-4561-afa0-7f613b171593 tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] Created folder: Project (46629721d6034983bb0c3f58dcf6674c) in parent group-v692900. 
[ 1551.340953] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [None req-0034688e-73b1-4561-afa0-7f613b171593 tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] Creating folder: Instances. Parent ref: group-v692992. {{(pid=67899) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1551.341070] env[67899]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-47387356-1a5d-455e-aa5b-d6a48f031704 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1551.350282] env[67899]: INFO nova.virt.vmwareapi.vm_util [None req-0034688e-73b1-4561-afa0-7f613b171593 tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] Created folder: Instances in parent group-v692992. [ 1551.350510] env[67899]: DEBUG oslo.service.loopingcall [None req-0034688e-73b1-4561-afa0-7f613b171593 tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67899) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1551.350709] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: addcc88a-6bb5-4a70-938e-49c0c79c8414] Creating VM on the ESX host {{(pid=67899) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1551.350908] env[67899]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-efa9a7c2-44c3-48c7-ac98-0a7ef492a44e {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1551.372011] env[67899]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1551.372011] env[67899]: value = "task-3467978" [ 1551.372011] env[67899]: _type = "Task" [ 1551.372011] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1551.379415] env[67899]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467978, 'name': CreateVM_Task} progress is 0%. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1551.601627] env[67899]: DEBUG nova.network.neutron [req-1158da92-00e1-4aee-895b-d6d3e450d3b1 req-e8c225fb-acbd-4974-921f-7af210daa17e service nova] [instance: addcc88a-6bb5-4a70-938e-49c0c79c8414] Updated VIF entry in instance network info cache for port 52319fa0-d2b0-4cc0-9560-dbed78c1dbf9. 
{{(pid=67899) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1551.602065] env[67899]: DEBUG nova.network.neutron [req-1158da92-00e1-4aee-895b-d6d3e450d3b1 req-e8c225fb-acbd-4974-921f-7af210daa17e service nova] [instance: addcc88a-6bb5-4a70-938e-49c0c79c8414] Updating instance_info_cache with network_info: [{"id": "52319fa0-d2b0-4cc0-9560-dbed78c1dbf9", "address": "fa:16:3e:b4:9e:fa", "network": {"id": "9064d867-4c85-4a7c-911b-d14d3f805e8d", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-150447479-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "46629721d6034983bb0c3f58dcf6674c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "b98c49ac-0eb7-4311-aa8f-60581b2ce706", "external-id": "nsx-vlan-transportzone-184", "segmentation_id": 184, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap52319fa0-d2", "ovs_interfaceid": "52319fa0-d2b0-4cc0-9560-dbed78c1dbf9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1551.614278] env[67899]: DEBUG oslo_concurrency.lockutils [req-1158da92-00e1-4aee-895b-d6d3e450d3b1 req-e8c225fb-acbd-4974-921f-7af210daa17e service nova] Releasing lock "refresh_cache-addcc88a-6bb5-4a70-938e-49c0c79c8414" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1551.883499] env[67899]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467978, 'name': CreateVM_Task, 'duration_secs': 0.273066} completed successfully. 
{{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1551.883659] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: addcc88a-6bb5-4a70-938e-49c0c79c8414] Created VM on the ESX host {{(pid=67899) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1551.884361] env[67899]: DEBUG oslo_concurrency.lockutils [None req-0034688e-73b1-4561-afa0-7f613b171593 tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1551.884531] env[67899]: DEBUG oslo_concurrency.lockutils [None req-0034688e-73b1-4561-afa0-7f613b171593 tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1551.884861] env[67899]: DEBUG oslo_concurrency.lockutils [None req-0034688e-73b1-4561-afa0-7f613b171593 tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1551.885162] env[67899]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-54b4a565-5fd9-4bc3-a783-c30e9bb8661c {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1551.889472] env[67899]: DEBUG oslo_vmware.api [None req-0034688e-73b1-4561-afa0-7f613b171593 tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] Waiting for the task: (returnval){ [ 1551.889472] env[67899]: value = "session[5298859c-589e-10f3-4f70-bf47a7ea371b]5292a177-4631-345f-dfb9-f265dbadaa1f" [ 1551.889472] env[67899]: _type = "Task" [ 1551.889472] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1551.896951] env[67899]: DEBUG oslo_vmware.api [None req-0034688e-73b1-4561-afa0-7f613b171593 tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] Task: {'id': session[5298859c-589e-10f3-4f70-bf47a7ea371b]5292a177-4631-345f-dfb9-f265dbadaa1f, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1552.400249] env[67899]: DEBUG oslo_concurrency.lockutils [None req-0034688e-73b1-4561-afa0-7f613b171593 tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1552.400529] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-0034688e-73b1-4561-afa0-7f613b171593 tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] [instance: addcc88a-6bb5-4a70-938e-49c0c79c8414] Processing image c655a05a-4a40-4b3f-b609-3ba8116ad90f {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1552.400700] env[67899]: DEBUG oslo_concurrency.lockutils [None req-0034688e-73b1-4561-afa0-7f613b171593 tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1584.458202] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1585.997040] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1585.997040] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Starting heal instance info cache {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1585.997040] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Rebuilding the list of instances to heal {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1586.018470] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: ec826735-4cc4-4847-8750-c5480e62134a] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1586.018639] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: c7ad553b-2149-4211-aee3-057ea83069f5] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1586.018770] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 6fda2654-4579-4b9a-a97c-97e0128fff14] Skipping network cache update for instance because it is Building. 
{{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1586.018896] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 7a82e877-8a39-4684-8b75-711b7bedddac] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1586.019026] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: dc7bf2b7-631d-4933-92db-1679ad823379] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1586.019150] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 8a157747-34e2-48f7-bf21-d17810122954] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1586.019268] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 03684169-e2c8-4cf5-8e79-b118725927f1] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1586.019385] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1586.019501] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: e179db1d-ee0c-4f47-a958-40dd69209d26] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1586.019617] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: addcc88a-6bb5-4a70-938e-49c0c79c8414] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1586.019734] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Didn't find any instances for network info cache update. 
{{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1587.996602] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1588.996615] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1588.996897] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1588.998028] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1589.996359] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1589.997029] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1589.997029] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=67899) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1595.996607] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager.update_available_resource {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1596.008896] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1596.009226] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1596.009409] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1596.009566] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67899) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1596.010712] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-69704861-7162-4cd8-9a9a-3eefa70a6ae9 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1596.019373] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b5109827-db5c-40d0-b3eb-005ed7cdd2eb {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1596.034395] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dd0903d4-cd90-41de-b70c-b0e2636caa38 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1596.040693] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-82d9c66f-acc7-42ce-beb0-c4270517561f {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1596.069770] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180936MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=67899) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1596.069901] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 
1596.070132] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1596.141290] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance ec826735-4cc4-4847-8750-c5480e62134a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1596.141458] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance c7ad553b-2149-4211-aee3-057ea83069f5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1596.141588] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 6fda2654-4579-4b9a-a97c-97e0128fff14 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1596.141711] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 7a82e877-8a39-4684-8b75-711b7bedddac actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1596.141828] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance dc7bf2b7-631d-4933-92db-1679ad823379 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1596.141945] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 8a157747-34e2-48f7-bf21-d17810122954 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1596.142072] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 03684169-e2c8-4cf5-8e79-b118725927f1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1596.142190] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 3a077713-f7a2-4a61-bb17-987af6a52c4a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1596.142326] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance e179db1d-ee0c-4f47-a958-40dd69209d26 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1596.142418] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance addcc88a-6bb5-4a70-938e-49c0c79c8414 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1596.156976] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance a6544af8-879d-4c45-bee4-8551b861fc66 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1596.171644] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance c56980f8-68e2-4501-a6a9-b713b208f895 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1596.185036] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 9b4a7c14-84dc-4222-a758-3f8f10e23b7a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1596.216765] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance cd4ae8d3-63d9-463d-9428-fa2c1e8d1679 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1596.217033] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67899) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1596.217177] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67899) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1596.405318] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-548323cc-ad9c-4079-8f28-1eb4c6c41dcb {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1596.413175] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7c9b0031-8121-4f62-8529-8ae9e00782d5 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1596.444417] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-34866b65-9370-4c6f-8f01-d9fc95eac462 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1596.452058] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-51b55c4c-8aa2-43ef-87ab-286276c6d27a {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1596.465125] env[67899]: DEBUG nova.compute.provider_tree [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Inventory has not changed in ProviderTree for provider: fffa0b42-f65d-4394-a98c-0df038b9ed4b {{(pid=67899) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1596.473756] env[67899]: DEBUG nova.scheduler.client.report [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Inventory has not changed for provider fffa0b42-f65d-4394-a98c-0df038b9ed4b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67899) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1596.487092] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67899) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1596.487327] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.417s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1597.184964] env[67899]: WARNING oslo_vmware.rw_handles [None 
req-d00a6b81-c8df-494b-a8a0-988649b40bae tempest-ImagesOneServerTestJSON-886727362 tempest-ImagesOneServerTestJSON-886727362-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1597.184964] env[67899]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1597.184964] env[67899]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1597.184964] env[67899]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1597.184964] env[67899]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1597.184964] env[67899]: ERROR oslo_vmware.rw_handles response.begin() [ 1597.184964] env[67899]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1597.184964] env[67899]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1597.184964] env[67899]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1597.184964] env[67899]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1597.184964] env[67899]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1597.184964] env[67899]: ERROR oslo_vmware.rw_handles [ 1597.184964] env[67899]: DEBUG nova.virt.vmwareapi.images [None req-d00a6b81-c8df-494b-a8a0-988649b40bae tempest-ImagesOneServerTestJSON-886727362 tempest-ImagesOneServerTestJSON-886727362-project-member] [instance: ec826735-4cc4-4847-8750-c5480e62134a] Downloaded image file data c655a05a-4a40-4b3f-b609-3ba8116ad90f to vmware_temp/5618d7cc-30d8-4304-927d-bd48b5b36d81/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk on the data store datastore1 {{(pid=67899) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1597.187531] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-d00a6b81-c8df-494b-a8a0-988649b40bae tempest-ImagesOneServerTestJSON-886727362 tempest-ImagesOneServerTestJSON-886727362-project-member] [instance: ec826735-4cc4-4847-8750-c5480e62134a] Caching image {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1597.187794] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [None req-d00a6b81-c8df-494b-a8a0-988649b40bae tempest-ImagesOneServerTestJSON-886727362 tempest-ImagesOneServerTestJSON-886727362-project-member] Copying Virtual Disk [datastore1] vmware_temp/5618d7cc-30d8-4304-927d-bd48b5b36d81/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk to [datastore1] vmware_temp/5618d7cc-30d8-4304-927d-bd48b5b36d81/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk {{(pid=67899) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1597.188132] env[67899]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-68959ee4-bba6-4fe8-8bd9-d9ca0cc594d7 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1597.195876] env[67899]: DEBUG oslo_vmware.api [None req-d00a6b81-c8df-494b-a8a0-988649b40bae tempest-ImagesOneServerTestJSON-886727362 tempest-ImagesOneServerTestJSON-886727362-project-member] Waiting for the task: (returnval){ [ 1597.195876] env[67899]: value = "task-3467979" [ 1597.195876] env[67899]: _type = "Task" [ 1597.195876] env[67899]: } to complete. 
{{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1597.204243] env[67899]: DEBUG oslo_vmware.api [None req-d00a6b81-c8df-494b-a8a0-988649b40bae tempest-ImagesOneServerTestJSON-886727362 tempest-ImagesOneServerTestJSON-886727362-project-member] Task: {'id': task-3467979, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1597.706079] env[67899]: DEBUG oslo_vmware.exceptions [None req-d00a6b81-c8df-494b-a8a0-988649b40bae tempest-ImagesOneServerTestJSON-886727362 tempest-ImagesOneServerTestJSON-886727362-project-member] Fault InvalidArgument not matched. {{(pid=67899) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1597.707043] env[67899]: DEBUG oslo_concurrency.lockutils [None req-d00a6b81-c8df-494b-a8a0-988649b40bae tempest-ImagesOneServerTestJSON-886727362 tempest-ImagesOneServerTestJSON-886727362-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1597.707043] env[67899]: ERROR nova.compute.manager [None req-d00a6b81-c8df-494b-a8a0-988649b40bae tempest-ImagesOneServerTestJSON-886727362 tempest-ImagesOneServerTestJSON-886727362-project-member] [instance: ec826735-4cc4-4847-8750-c5480e62134a] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1597.707043] env[67899]: Faults: ['InvalidArgument'] [ 1597.707043] env[67899]: ERROR nova.compute.manager [instance: ec826735-4cc4-4847-8750-c5480e62134a] Traceback (most recent call last): [ 1597.707043] env[67899]: ERROR nova.compute.manager [instance: ec826735-4cc4-4847-8750-c5480e62134a] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1597.707043] env[67899]: ERROR nova.compute.manager [instance: ec826735-4cc4-4847-8750-c5480e62134a] yield resources [ 1597.707043] env[67899]: ERROR nova.compute.manager [instance: ec826735-4cc4-4847-8750-c5480e62134a] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1597.707043] env[67899]: ERROR nova.compute.manager [instance: ec826735-4cc4-4847-8750-c5480e62134a] self.driver.spawn(context, instance, image_meta, [ 1597.707043] env[67899]: ERROR nova.compute.manager [instance: ec826735-4cc4-4847-8750-c5480e62134a] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1597.707043] env[67899]: ERROR nova.compute.manager [instance: ec826735-4cc4-4847-8750-c5480e62134a] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1597.707043] env[67899]: ERROR nova.compute.manager [instance: ec826735-4cc4-4847-8750-c5480e62134a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1597.707043] env[67899]: ERROR nova.compute.manager [instance: ec826735-4cc4-4847-8750-c5480e62134a] self._fetch_image_if_missing(context, vi) [ 1597.707043] env[67899]: ERROR nova.compute.manager [instance: ec826735-4cc4-4847-8750-c5480e62134a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1597.707043] env[67899]: ERROR nova.compute.manager [instance: ec826735-4cc4-4847-8750-c5480e62134a] image_cache(vi, tmp_image_ds_loc) [ 1597.707043] env[67899]: ERROR nova.compute.manager [instance: 
ec826735-4cc4-4847-8750-c5480e62134a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1597.707043] env[67899]: ERROR nova.compute.manager [instance: ec826735-4cc4-4847-8750-c5480e62134a] vm_util.copy_virtual_disk( [ 1597.707043] env[67899]: ERROR nova.compute.manager [instance: ec826735-4cc4-4847-8750-c5480e62134a] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1597.707043] env[67899]: ERROR nova.compute.manager [instance: ec826735-4cc4-4847-8750-c5480e62134a] session._wait_for_task(vmdk_copy_task) [ 1597.707043] env[67899]: ERROR nova.compute.manager [instance: ec826735-4cc4-4847-8750-c5480e62134a] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1597.707043] env[67899]: ERROR nova.compute.manager [instance: ec826735-4cc4-4847-8750-c5480e62134a] return self.wait_for_task(task_ref) [ 1597.707043] env[67899]: ERROR nova.compute.manager [instance: ec826735-4cc4-4847-8750-c5480e62134a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1597.707043] env[67899]: ERROR nova.compute.manager [instance: ec826735-4cc4-4847-8750-c5480e62134a] return evt.wait() [ 1597.707043] env[67899]: ERROR nova.compute.manager [instance: ec826735-4cc4-4847-8750-c5480e62134a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1597.707043] env[67899]: ERROR nova.compute.manager [instance: ec826735-4cc4-4847-8750-c5480e62134a] result = hub.switch() [ 1597.707043] env[67899]: ERROR nova.compute.manager [instance: ec826735-4cc4-4847-8750-c5480e62134a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1597.707043] env[67899]: ERROR nova.compute.manager [instance: ec826735-4cc4-4847-8750-c5480e62134a] return self.greenlet.switch() [ 1597.707043] env[67899]: ERROR nova.compute.manager [instance: ec826735-4cc4-4847-8750-c5480e62134a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1597.707043] env[67899]: ERROR nova.compute.manager [instance: ec826735-4cc4-4847-8750-c5480e62134a] self.f(*self.args, **self.kw) [ 1597.707043] env[67899]: ERROR nova.compute.manager [instance: ec826735-4cc4-4847-8750-c5480e62134a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1597.707043] env[67899]: ERROR nova.compute.manager [instance: ec826735-4cc4-4847-8750-c5480e62134a] raise exceptions.translate_fault(task_info.error) [ 1597.707043] env[67899]: ERROR nova.compute.manager [instance: ec826735-4cc4-4847-8750-c5480e62134a] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1597.707043] env[67899]: ERROR nova.compute.manager [instance: ec826735-4cc4-4847-8750-c5480e62134a] Faults: ['InvalidArgument'] [ 1597.707043] env[67899]: ERROR nova.compute.manager [instance: ec826735-4cc4-4847-8750-c5480e62134a] [ 1597.708163] env[67899]: INFO nova.compute.manager [None req-d00a6b81-c8df-494b-a8a0-988649b40bae tempest-ImagesOneServerTestJSON-886727362 tempest-ImagesOneServerTestJSON-886727362-project-member] [instance: ec826735-4cc4-4847-8750-c5480e62134a] Terminating instance [ 1597.708898] env[67899]: DEBUG oslo_concurrency.lockutils [None req-95565c18-9c93-4bdb-95f2-e13f15fb26fd tempest-ServerShowV247Test-219304978 tempest-ServerShowV247Test-219304978-project-member] Acquired lock "[datastore1] 
devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1597.709030] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-95565c18-9c93-4bdb-95f2-e13f15fb26fd tempest-ServerShowV247Test-219304978 tempest-ServerShowV247Test-219304978-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1597.709281] env[67899]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-94479e9c-5182-443d-a942-50ffa51f408e {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1597.711748] env[67899]: DEBUG nova.compute.manager [None req-d00a6b81-c8df-494b-a8a0-988649b40bae tempest-ImagesOneServerTestJSON-886727362 tempest-ImagesOneServerTestJSON-886727362-project-member] [instance: ec826735-4cc4-4847-8750-c5480e62134a] Start destroying the instance on the hypervisor. {{(pid=67899) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1597.711831] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-d00a6b81-c8df-494b-a8a0-988649b40bae tempest-ImagesOneServerTestJSON-886727362 tempest-ImagesOneServerTestJSON-886727362-project-member] [instance: ec826735-4cc4-4847-8750-c5480e62134a] Destroying instance {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1597.712640] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0aa2146e-78d7-4d38-9219-3d97f2dbc047 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1597.719816] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-d00a6b81-c8df-494b-a8a0-988649b40bae tempest-ImagesOneServerTestJSON-886727362 tempest-ImagesOneServerTestJSON-886727362-project-member] [instance: ec826735-4cc4-4847-8750-c5480e62134a] Unregistering the VM {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1597.719957] env[67899]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-ecfb0dd8-c388-4b81-8239-d3cf9268aa27 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1597.722385] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-95565c18-9c93-4bdb-95f2-e13f15fb26fd tempest-ServerShowV247Test-219304978 tempest-ServerShowV247Test-219304978-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1597.722559] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-95565c18-9c93-4bdb-95f2-e13f15fb26fd tempest-ServerShowV247Test-219304978 tempest-ServerShowV247Test-219304978-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=67899) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1597.723550] env[67899]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-3cbc2c74-2e08-474b-a310-2c9b51ee8094 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1597.728611] env[67899]: DEBUG oslo_vmware.api [None req-95565c18-9c93-4bdb-95f2-e13f15fb26fd tempest-ServerShowV247Test-219304978 tempest-ServerShowV247Test-219304978-project-member] Waiting for the task: (returnval){ [ 1597.728611] env[67899]: value = "session[5298859c-589e-10f3-4f70-bf47a7ea371b]52297def-9e6d-4d6d-3a76-7a5be2f1a47c" [ 1597.728611] env[67899]: _type = "Task" [ 1597.728611] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1597.736272] env[67899]: DEBUG oslo_vmware.api [None req-95565c18-9c93-4bdb-95f2-e13f15fb26fd tempest-ServerShowV247Test-219304978 tempest-ServerShowV247Test-219304978-project-member] Task: {'id': session[5298859c-589e-10f3-4f70-bf47a7ea371b]52297def-9e6d-4d6d-3a76-7a5be2f1a47c, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1597.797724] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-d00a6b81-c8df-494b-a8a0-988649b40bae tempest-ImagesOneServerTestJSON-886727362 tempest-ImagesOneServerTestJSON-886727362-project-member] [instance: ec826735-4cc4-4847-8750-c5480e62134a] Unregistered the VM {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1597.797943] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-d00a6b81-c8df-494b-a8a0-988649b40bae tempest-ImagesOneServerTestJSON-886727362 tempest-ImagesOneServerTestJSON-886727362-project-member] [instance: ec826735-4cc4-4847-8750-c5480e62134a] Deleting contents of the VM from datastore datastore1 {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1597.798136] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-d00a6b81-c8df-494b-a8a0-988649b40bae tempest-ImagesOneServerTestJSON-886727362 tempest-ImagesOneServerTestJSON-886727362-project-member] Deleting the datastore file [datastore1] ec826735-4cc4-4847-8750-c5480e62134a {{(pid=67899) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1597.798429] env[67899]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-eeef3da3-7aef-408c-afbb-5dc93f05f3c6 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1597.804753] env[67899]: DEBUG oslo_vmware.api [None req-d00a6b81-c8df-494b-a8a0-988649b40bae tempest-ImagesOneServerTestJSON-886727362 tempest-ImagesOneServerTestJSON-886727362-project-member] Waiting for the task: (returnval){ [ 1597.804753] env[67899]: value = "task-3467981" [ 1597.804753] env[67899]: _type = "Task" [ 1597.804753] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1597.812983] env[67899]: DEBUG oslo_vmware.api [None req-d00a6b81-c8df-494b-a8a0-988649b40bae tempest-ImagesOneServerTestJSON-886727362 tempest-ImagesOneServerTestJSON-886727362-project-member] Task: {'id': task-3467981, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1598.238982] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-95565c18-9c93-4bdb-95f2-e13f15fb26fd tempest-ServerShowV247Test-219304978 tempest-ServerShowV247Test-219304978-project-member] [instance: c7ad553b-2149-4211-aee3-057ea83069f5] Preparing fetch location {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1598.239501] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-95565c18-9c93-4bdb-95f2-e13f15fb26fd tempest-ServerShowV247Test-219304978 tempest-ServerShowV247Test-219304978-project-member] Creating directory with path [datastore1] vmware_temp/6712c44d-737e-4ac8-a963-b49e8d074c58/c655a05a-4a40-4b3f-b609-3ba8116ad90f {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1598.239501] env[67899]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-7b18f9b1-786f-42c5-8392-273c88a3f108 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1598.251036] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-95565c18-9c93-4bdb-95f2-e13f15fb26fd tempest-ServerShowV247Test-219304978 tempest-ServerShowV247Test-219304978-project-member] Created directory with path [datastore1] vmware_temp/6712c44d-737e-4ac8-a963-b49e8d074c58/c655a05a-4a40-4b3f-b609-3ba8116ad90f {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1598.251172] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-95565c18-9c93-4bdb-95f2-e13f15fb26fd tempest-ServerShowV247Test-219304978 tempest-ServerShowV247Test-219304978-project-member] [instance: c7ad553b-2149-4211-aee3-057ea83069f5] Fetch image to [datastore1] vmware_temp/6712c44d-737e-4ac8-a963-b49e8d074c58/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1598.251356] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-95565c18-9c93-4bdb-95f2-e13f15fb26fd tempest-ServerShowV247Test-219304978 tempest-ServerShowV247Test-219304978-project-member] [instance: c7ad553b-2149-4211-aee3-057ea83069f5] Downloading image file data c655a05a-4a40-4b3f-b609-3ba8116ad90f to [datastore1] vmware_temp/6712c44d-737e-4ac8-a963-b49e8d074c58/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk on the data store datastore1 {{(pid=67899) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1598.252117] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a72c9542-f9ca-4a6b-9f83-257e0693ad18 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1598.258709] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ac762198-f518-400c-aeac-9621d4ab9a75 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1598.268992] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8b7acc96-ab38-4473-b876-459b70ba19b1 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1598.299066] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c86bcc2e-8b79-45ad-9184-5956c6100c7f {{(pid=67899) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1598.304346] env[67899]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-f9149660-7565-40a4-826d-1d24a6805b76 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1598.313031] env[67899]: DEBUG oslo_vmware.api [None req-d00a6b81-c8df-494b-a8a0-988649b40bae tempest-ImagesOneServerTestJSON-886727362 tempest-ImagesOneServerTestJSON-886727362-project-member] Task: {'id': task-3467981, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.096608} completed successfully. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1598.313276] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-d00a6b81-c8df-494b-a8a0-988649b40bae tempest-ImagesOneServerTestJSON-886727362 tempest-ImagesOneServerTestJSON-886727362-project-member] Deleted the datastore file {{(pid=67899) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1598.313454] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-d00a6b81-c8df-494b-a8a0-988649b40bae tempest-ImagesOneServerTestJSON-886727362 tempest-ImagesOneServerTestJSON-886727362-project-member] [instance: ec826735-4cc4-4847-8750-c5480e62134a] Deleted contents of the VM from datastore datastore1 {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1598.313655] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-d00a6b81-c8df-494b-a8a0-988649b40bae tempest-ImagesOneServerTestJSON-886727362 tempest-ImagesOneServerTestJSON-886727362-project-member] [instance: ec826735-4cc4-4847-8750-c5480e62134a] Instance destroyed {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1598.313843] env[67899]: INFO nova.compute.manager [None req-d00a6b81-c8df-494b-a8a0-988649b40bae tempest-ImagesOneServerTestJSON-886727362 tempest-ImagesOneServerTestJSON-886727362-project-member] [instance: ec826735-4cc4-4847-8750-c5480e62134a] Took 0.60 seconds to destroy the instance on the hypervisor. 
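The "Task: {'id': task-3467979, 'name': CopyVirtualDisk_Task} progress is 0%" entries above come from oslo.vmware's wait_for_task, which starts a looping call that re-reads task_info until the task reports success or error; on error the vSphere fault is translated into an exception, which is how the "InvalidArgument: fileType" fault surfaced as VimFaultException in the traceback. A minimal, self-contained sketch of that poll-and-translate loop follows; fetch_task_info is a hypothetical stand-in for the real session call, not oslo.vmware's actual API.

import time

class VimFaultException(Exception):
    """Stand-in for oslo_vmware.exceptions.VimFaultException."""
    def __init__(self, fault_list, message):
        super().__init__(message)
        self.fault_list = fault_list

def wait_for_task(fetch_task_info, interval=0.5):
    # Poll task_info until the task leaves the running state, mirroring
    # the "progress is 0%" / "completed successfully" lines above.
    while True:
        info = fetch_task_info()
        if info['state'] == 'running':
            print(f"Task {info['key']} progress is {info.get('progress', 0)}%")
            time.sleep(interval)
        elif info['state'] == 'success':
            return info.get('result')
        else:
            # Error state: translate the fault into an exception, which is
            # what the compute manager traceback ultimately re-raises.
            raise VimFaultException(info['faults'], info['message'])

# Simulated task that fails the way CopyVirtualDisk_Task (task-3467979) does.
states = iter([
    {'key': 'task-3467979', 'state': 'running', 'progress': 0},
    {'key': 'task-3467979', 'state': 'error',
     'faults': ['InvalidArgument'],
     'message': 'A specified parameter was not correct: fileType'},
])
try:
    wait_for_task(lambda: next(states), interval=0.01)
except VimFaultException as exc:
    print('Faults:', exc.fault_list, '-', exc)

The DeleteDatastoreFile_Task entry just above, which completes with duration_secs 0.096608, is the success branch of the same loop.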
[ 1598.315950] env[67899]: DEBUG nova.compute.claims [None req-d00a6b81-c8df-494b-a8a0-988649b40bae tempest-ImagesOneServerTestJSON-886727362 tempest-ImagesOneServerTestJSON-886727362-project-member] [instance: ec826735-4cc4-4847-8750-c5480e62134a] Aborting claim: {{(pid=67899) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1598.316205] env[67899]: DEBUG oslo_concurrency.lockutils [None req-d00a6b81-c8df-494b-a8a0-988649b40bae tempest-ImagesOneServerTestJSON-886727362 tempest-ImagesOneServerTestJSON-886727362-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1598.316464] env[67899]: DEBUG oslo_concurrency.lockutils [None req-d00a6b81-c8df-494b-a8a0-988649b40bae tempest-ImagesOneServerTestJSON-886727362 tempest-ImagesOneServerTestJSON-886727362-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1598.326825] env[67899]: DEBUG nova.virt.vmwareapi.images [None req-95565c18-9c93-4bdb-95f2-e13f15fb26fd tempest-ServerShowV247Test-219304978 tempest-ServerShowV247Test-219304978-project-member] [instance: c7ad553b-2149-4211-aee3-057ea83069f5] Downloading image file data c655a05a-4a40-4b3f-b609-3ba8116ad90f to the data store datastore1 {{(pid=67899) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1598.377449] env[67899]: DEBUG oslo_vmware.rw_handles [None req-95565c18-9c93-4bdb-95f2-e13f15fb26fd tempest-ServerShowV247Test-219304978 tempest-ServerShowV247Test-219304978-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/6712c44d-737e-4ac8-a963-b49e8d074c58/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67899) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1598.448357] env[67899]: DEBUG oslo_vmware.rw_handles [None req-95565c18-9c93-4bdb-95f2-e13f15fb26fd tempest-ServerShowV247Test-219304978 tempest-ServerShowV247Test-219304978-project-member] Completed reading data from the image iterator. {{(pid=67899) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1598.448634] env[67899]: DEBUG oslo_vmware.rw_handles [None req-95565c18-9c93-4bdb-95f2-e13f15fb26fd tempest-ServerShowV247Test-219304978 tempest-ServerShowV247Test-219304978-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/6712c44d-737e-4ac8-a963-b49e8d074c58/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67899) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1598.588772] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8ba78467-1482-4ac7-8e57-d18aa388ad1d {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1598.596398] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-76fc91d6-d66c-4b68-9854-5b9e9de1273a {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1598.626915] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-03920315-d7a5-4d3b-9df8-24bcd000887f {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1598.633447] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e43ea5b5-05d1-44ae-83a8-275f8634d7e5 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1598.645997] env[67899]: DEBUG nova.compute.provider_tree [None req-d00a6b81-c8df-494b-a8a0-988649b40bae tempest-ImagesOneServerTestJSON-886727362 tempest-ImagesOneServerTestJSON-886727362-project-member] Inventory has not changed in ProviderTree for provider: fffa0b42-f65d-4394-a98c-0df038b9ed4b {{(pid=67899) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1598.654304] env[67899]: DEBUG nova.scheduler.client.report [None req-d00a6b81-c8df-494b-a8a0-988649b40bae tempest-ImagesOneServerTestJSON-886727362 tempest-ImagesOneServerTestJSON-886727362-project-member] Inventory has not changed for provider fffa0b42-f65d-4394-a98c-0df038b9ed4b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67899) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1598.669901] env[67899]: DEBUG oslo_concurrency.lockutils [None req-d00a6b81-c8df-494b-a8a0-988649b40bae tempest-ImagesOneServerTestJSON-886727362 tempest-ImagesOneServerTestJSON-886727362-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.353s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1598.670603] env[67899]: ERROR nova.compute.manager [None req-d00a6b81-c8df-494b-a8a0-988649b40bae tempest-ImagesOneServerTestJSON-886727362 tempest-ImagesOneServerTestJSON-886727362-project-member] [instance: ec826735-4cc4-4847-8750-c5480e62134a] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1598.670603] env[67899]: Faults: ['InvalidArgument'] [ 1598.670603] env[67899]: ERROR nova.compute.manager [instance: ec826735-4cc4-4847-8750-c5480e62134a] Traceback (most recent call last): [ 1598.670603] env[67899]: ERROR nova.compute.manager [instance: ec826735-4cc4-4847-8750-c5480e62134a] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1598.670603] env[67899]: 
ERROR nova.compute.manager [instance: ec826735-4cc4-4847-8750-c5480e62134a] self.driver.spawn(context, instance, image_meta, [ 1598.670603] env[67899]: ERROR nova.compute.manager [instance: ec826735-4cc4-4847-8750-c5480e62134a] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1598.670603] env[67899]: ERROR nova.compute.manager [instance: ec826735-4cc4-4847-8750-c5480e62134a] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1598.670603] env[67899]: ERROR nova.compute.manager [instance: ec826735-4cc4-4847-8750-c5480e62134a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1598.670603] env[67899]: ERROR nova.compute.manager [instance: ec826735-4cc4-4847-8750-c5480e62134a] self._fetch_image_if_missing(context, vi) [ 1598.670603] env[67899]: ERROR nova.compute.manager [instance: ec826735-4cc4-4847-8750-c5480e62134a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1598.670603] env[67899]: ERROR nova.compute.manager [instance: ec826735-4cc4-4847-8750-c5480e62134a] image_cache(vi, tmp_image_ds_loc) [ 1598.670603] env[67899]: ERROR nova.compute.manager [instance: ec826735-4cc4-4847-8750-c5480e62134a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1598.670603] env[67899]: ERROR nova.compute.manager [instance: ec826735-4cc4-4847-8750-c5480e62134a] vm_util.copy_virtual_disk( [ 1598.670603] env[67899]: ERROR nova.compute.manager [instance: ec826735-4cc4-4847-8750-c5480e62134a] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1598.670603] env[67899]: ERROR nova.compute.manager [instance: ec826735-4cc4-4847-8750-c5480e62134a] session._wait_for_task(vmdk_copy_task) [ 1598.670603] env[67899]: ERROR nova.compute.manager [instance: ec826735-4cc4-4847-8750-c5480e62134a] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1598.670603] env[67899]: ERROR nova.compute.manager [instance: ec826735-4cc4-4847-8750-c5480e62134a] return self.wait_for_task(task_ref) [ 1598.670603] env[67899]: ERROR nova.compute.manager [instance: ec826735-4cc4-4847-8750-c5480e62134a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1598.670603] env[67899]: ERROR nova.compute.manager [instance: ec826735-4cc4-4847-8750-c5480e62134a] return evt.wait() [ 1598.670603] env[67899]: ERROR nova.compute.manager [instance: ec826735-4cc4-4847-8750-c5480e62134a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1598.670603] env[67899]: ERROR nova.compute.manager [instance: ec826735-4cc4-4847-8750-c5480e62134a] result = hub.switch() [ 1598.670603] env[67899]: ERROR nova.compute.manager [instance: ec826735-4cc4-4847-8750-c5480e62134a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1598.670603] env[67899]: ERROR nova.compute.manager [instance: ec826735-4cc4-4847-8750-c5480e62134a] return self.greenlet.switch() [ 1598.670603] env[67899]: ERROR nova.compute.manager [instance: ec826735-4cc4-4847-8750-c5480e62134a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1598.670603] env[67899]: ERROR nova.compute.manager [instance: ec826735-4cc4-4847-8750-c5480e62134a] self.f(*self.args, **self.kw) [ 1598.670603] env[67899]: ERROR nova.compute.manager [instance: ec826735-4cc4-4847-8750-c5480e62134a] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1598.670603] env[67899]: ERROR nova.compute.manager [instance: ec826735-4cc4-4847-8750-c5480e62134a] raise exceptions.translate_fault(task_info.error) [ 1598.670603] env[67899]: ERROR nova.compute.manager [instance: ec826735-4cc4-4847-8750-c5480e62134a] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1598.670603] env[67899]: ERROR nova.compute.manager [instance: ec826735-4cc4-4847-8750-c5480e62134a] Faults: ['InvalidArgument'] [ 1598.670603] env[67899]: ERROR nova.compute.manager [instance: ec826735-4cc4-4847-8750-c5480e62134a] [ 1598.671535] env[67899]: DEBUG nova.compute.utils [None req-d00a6b81-c8df-494b-a8a0-988649b40bae tempest-ImagesOneServerTestJSON-886727362 tempest-ImagesOneServerTestJSON-886727362-project-member] [instance: ec826735-4cc4-4847-8750-c5480e62134a] VimFaultException {{(pid=67899) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1598.673082] env[67899]: DEBUG nova.compute.manager [None req-d00a6b81-c8df-494b-a8a0-988649b40bae tempest-ImagesOneServerTestJSON-886727362 tempest-ImagesOneServerTestJSON-886727362-project-member] [instance: ec826735-4cc4-4847-8750-c5480e62134a] Build of instance ec826735-4cc4-4847-8750-c5480e62134a was re-scheduled: A specified parameter was not correct: fileType [ 1598.673082] env[67899]: Faults: ['InvalidArgument'] {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1598.673479] env[67899]: DEBUG nova.compute.manager [None req-d00a6b81-c8df-494b-a8a0-988649b40bae tempest-ImagesOneServerTestJSON-886727362 tempest-ImagesOneServerTestJSON-886727362-project-member] [instance: ec826735-4cc4-4847-8750-c5480e62134a] Unplugging VIFs for instance {{(pid=67899) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1598.673653] env[67899]: DEBUG nova.compute.manager [None req-d00a6b81-c8df-494b-a8a0-988649b40bae tempest-ImagesOneServerTestJSON-886727362 tempest-ImagesOneServerTestJSON-886727362-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67899) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1598.673823] env[67899]: DEBUG nova.compute.manager [None req-d00a6b81-c8df-494b-a8a0-988649b40bae tempest-ImagesOneServerTestJSON-886727362 tempest-ImagesOneServerTestJSON-886727362-project-member] [instance: ec826735-4cc4-4847-8750-c5480e62134a] Deallocating network for instance {{(pid=67899) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1598.673985] env[67899]: DEBUG nova.network.neutron [None req-d00a6b81-c8df-494b-a8a0-988649b40bae tempest-ImagesOneServerTestJSON-886727362 tempest-ImagesOneServerTestJSON-886727362-project-member] [instance: ec826735-4cc4-4847-8750-c5480e62134a] deallocate_for_instance() {{(pid=67899) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1599.117474] env[67899]: DEBUG nova.network.neutron [None req-d00a6b81-c8df-494b-a8a0-988649b40bae tempest-ImagesOneServerTestJSON-886727362 tempest-ImagesOneServerTestJSON-886727362-project-member] [instance: ec826735-4cc4-4847-8750-c5480e62134a] Updating instance_info_cache with network_info: [] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1599.131965] env[67899]: INFO nova.compute.manager [None req-d00a6b81-c8df-494b-a8a0-988649b40bae tempest-ImagesOneServerTestJSON-886727362 tempest-ImagesOneServerTestJSON-886727362-project-member] [instance: ec826735-4cc4-4847-8750-c5480e62134a] Took 0.46 seconds to deallocate network for instance. [ 1599.226992] env[67899]: INFO nova.scheduler.client.report [None req-d00a6b81-c8df-494b-a8a0-988649b40bae tempest-ImagesOneServerTestJSON-886727362 tempest-ImagesOneServerTestJSON-886727362-project-member] Deleted allocations for instance ec826735-4cc4-4847-8750-c5480e62134a [ 1599.250135] env[67899]: DEBUG oslo_concurrency.lockutils [None req-d00a6b81-c8df-494b-a8a0-988649b40bae tempest-ImagesOneServerTestJSON-886727362 tempest-ImagesOneServerTestJSON-886727362-project-member] Lock "ec826735-4cc4-4847-8750-c5480e62134a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 633.278s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1599.251771] env[67899]: DEBUG oslo_concurrency.lockutils [None req-4776f33b-d108-4de6-b4ac-5203e1e14f45 tempest-ImagesOneServerTestJSON-886727362 tempest-ImagesOneServerTestJSON-886727362-project-member] Lock "ec826735-4cc4-4847-8750-c5480e62134a" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 437.295s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1599.253041] env[67899]: DEBUG oslo_concurrency.lockutils [None req-4776f33b-d108-4de6-b4ac-5203e1e14f45 tempest-ImagesOneServerTestJSON-886727362 tempest-ImagesOneServerTestJSON-886727362-project-member] Acquiring lock "ec826735-4cc4-4847-8750-c5480e62134a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1599.253041] env[67899]: DEBUG oslo_concurrency.lockutils [None req-4776f33b-d108-4de6-b4ac-5203e1e14f45 tempest-ImagesOneServerTestJSON-886727362 tempest-ImagesOneServerTestJSON-886727362-project-member] Lock "ec826735-4cc4-4847-8750-c5480e62134a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: 
waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1599.253041] env[67899]: DEBUG oslo_concurrency.lockutils [None req-4776f33b-d108-4de6-b4ac-5203e1e14f45 tempest-ImagesOneServerTestJSON-886727362 tempest-ImagesOneServerTestJSON-886727362-project-member] Lock "ec826735-4cc4-4847-8750-c5480e62134a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1599.254628] env[67899]: INFO nova.compute.manager [None req-4776f33b-d108-4de6-b4ac-5203e1e14f45 tempest-ImagesOneServerTestJSON-886727362 tempest-ImagesOneServerTestJSON-886727362-project-member] [instance: ec826735-4cc4-4847-8750-c5480e62134a] Terminating instance [ 1599.256498] env[67899]: DEBUG nova.compute.manager [None req-4776f33b-d108-4de6-b4ac-5203e1e14f45 tempest-ImagesOneServerTestJSON-886727362 tempest-ImagesOneServerTestJSON-886727362-project-member] [instance: ec826735-4cc4-4847-8750-c5480e62134a] Start destroying the instance on the hypervisor. {{(pid=67899) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1599.256687] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-4776f33b-d108-4de6-b4ac-5203e1e14f45 tempest-ImagesOneServerTestJSON-886727362 tempest-ImagesOneServerTestJSON-886727362-project-member] [instance: ec826735-4cc4-4847-8750-c5480e62134a] Destroying instance {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1599.257186] env[67899]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-34c9ac7a-772d-4e46-80d8-203a1c96a6c0 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1599.267352] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ce287e06-d00e-4b1b-ad00-45004ddca4b4 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1599.278169] env[67899]: DEBUG nova.compute.manager [None req-a9f9c866-17da-47e5-9277-2051f1ca187a tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: a6544af8-879d-4c45-bee4-8551b861fc66] Starting instance... {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1599.300981] env[67899]: WARNING nova.virt.vmwareapi.vmops [None req-4776f33b-d108-4de6-b4ac-5203e1e14f45 tempest-ImagesOneServerTestJSON-886727362 tempest-ImagesOneServerTestJSON-886727362-project-member] [instance: ec826735-4cc4-4847-8750-c5480e62134a] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance ec826735-4cc4-4847-8750-c5480e62134a could not be found. [ 1599.300981] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-4776f33b-d108-4de6-b4ac-5203e1e14f45 tempest-ImagesOneServerTestJSON-886727362 tempest-ImagesOneServerTestJSON-886727362-project-member] [instance: ec826735-4cc4-4847-8750-c5480e62134a] Instance destroyed {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1599.300981] env[67899]: INFO nova.compute.manager [None req-4776f33b-d108-4de6-b4ac-5203e1e14f45 tempest-ImagesOneServerTestJSON-886727362 tempest-ImagesOneServerTestJSON-886727362-project-member] [instance: ec826735-4cc4-4847-8750-c5480e62134a] Took 0.04 seconds to destroy the instance on the hypervisor. 
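Nearly every step above is bracketed by lockutils lines reporting how long a caller waited for and then held a named lock (633.278s held for the whole build attempt versus 0.000s for the events bookkeeping). A stdlib-only sketch of that instrumentation, assuming a simple in-process registry of named locks; the real helper in lockutils.py also supports external file-based locks, which this sketch does not attempt.

import contextlib
import threading
import time

_locks = {}

@contextlib.contextmanager
def traced_lock(name, caller):
    # Reproduce the Acquiring / acquired (waited Ns) / "released" (held Ns)
    # bookkeeping that lockutils prints around every named lock above.
    lock = _locks.setdefault(name, threading.Lock())
    print(f'Acquiring lock "{name}" by "{caller}"')
    start = time.monotonic()
    with lock:
        acquired = time.monotonic()
        print(f'Lock "{name}" acquired by "{caller}" :: '
              f'waited {acquired - start:.3f}s')
        try:
            yield
        finally:
            held = time.monotonic() - acquired
            print(f'Lock "{name}" "released" by "{caller}" :: held {held:.3f}s')

with traced_lock("compute_resources",
                 "nova.compute.resource_tracker.ResourceTracker.instance_claim"):
    time.sleep(0.05)  # stand-in for the claim bookkeeping done under the lock

The same wrapper explains the long waits above: the terminate request waited 437.295s on the instance lock simply because the failed build attempt held it for 633.278s.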
[ 1599.300981] env[67899]: DEBUG oslo.service.loopingcall [None req-4776f33b-d108-4de6-b4ac-5203e1e14f45 tempest-ImagesOneServerTestJSON-886727362 tempest-ImagesOneServerTestJSON-886727362-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=67899) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1599.300981] env[67899]: DEBUG nova.compute.manager [-] [instance: ec826735-4cc4-4847-8750-c5480e62134a] Deallocating network for instance {{(pid=67899) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1599.300981] env[67899]: DEBUG nova.network.neutron [-] [instance: ec826735-4cc4-4847-8750-c5480e62134a] deallocate_for_instance() {{(pid=67899) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1599.329783] env[67899]: DEBUG oslo_concurrency.lockutils [None req-a9f9c866-17da-47e5-9277-2051f1ca187a tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1599.330056] env[67899]: DEBUG oslo_concurrency.lockutils [None req-a9f9c866-17da-47e5-9277-2051f1ca187a tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1599.331507] env[67899]: INFO nova.compute.claims [None req-a9f9c866-17da-47e5-9277-2051f1ca187a tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: a6544af8-879d-4c45-bee4-8551b861fc66] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1599.334789] env[67899]: DEBUG nova.network.neutron [-] [instance: ec826735-4cc4-4847-8750-c5480e62134a] Updating instance_info_cache with network_info: [] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1599.342165] env[67899]: INFO nova.compute.manager [-] [instance: ec826735-4cc4-4847-8750-c5480e62134a] Took 0.04 seconds to deallocate network for instance. [ 1599.458286] env[67899]: DEBUG oslo_concurrency.lockutils [None req-4776f33b-d108-4de6-b4ac-5203e1e14f45 tempest-ImagesOneServerTestJSON-886727362 tempest-ImagesOneServerTestJSON-886727362-project-member] Lock "ec826735-4cc4-4847-8750-c5480e62134a" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.206s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1599.459990] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "ec826735-4cc4-4847-8750-c5480e62134a" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 424.439s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1599.460289] env[67899]: INFO nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: ec826735-4cc4-4847-8750-c5480e62134a] During sync_power_state the instance has a pending task (deleting). Skip. 
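The recurring "Acquiring lock ..." / "acquired ... waited Ns" / "released ... held Ns" triplets in these records (for "compute_resources", the per-instance UUID, and the "<uuid>-events" locks) are emitted by oslo.concurrency's lockutils wrapper, which times both the wait and the hold. A short sketch of the two entry points that produce them (lockutils.synchronized and lockutils.lock are real oslo.concurrency APIs; the function bodies here are placeholders):

    from oslo_concurrency import lockutils

    @lockutils.synchronized('compute_resources')
    def instance_claim(instance):
        # Serialized against every other caller holding 'compute_resources';
        # the wrapper logs how long this call waited for the lock and then
        # how long it held it.
        ...

    def clear_events_for_instance(instance_uuid):
        # Context-manager form, as used for the per-instance
        # "<uuid>-events" locks in the records above.
        with lockutils.lock(instance_uuid + '-events'):
            ...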
[ 1599.460499] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "ec826735-4cc4-4847-8750-c5480e62134a" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.001s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1599.581186] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-67977dce-ef96-49f0-98d7-e071100678ce {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1599.589344] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-104c567c-b684-4abe-a82f-0ed6f52e1305 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1599.619145] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5f557838-7f3a-4f09-b2ca-763dad775acd {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1599.625855] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9ec9dd72-e006-4e47-b219-711075a4fb9c {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1599.638304] env[67899]: DEBUG nova.compute.provider_tree [None req-a9f9c866-17da-47e5-9277-2051f1ca187a tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Inventory has not changed in ProviderTree for provider: fffa0b42-f65d-4394-a98c-0df038b9ed4b {{(pid=67899) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1599.648260] env[67899]: DEBUG nova.scheduler.client.report [None req-a9f9c866-17da-47e5-9277-2051f1ca187a tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Inventory has not changed for provider fffa0b42-f65d-4394-a98c-0df038b9ed4b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67899) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1599.661706] env[67899]: DEBUG oslo_concurrency.lockutils [None req-a9f9c866-17da-47e5-9277-2051f1ca187a tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.332s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1599.662258] env[67899]: DEBUG nova.compute.manager [None req-a9f9c866-17da-47e5-9277-2051f1ca187a tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: a6544af8-879d-4c45-bee4-8551b861fc66] Start building networks asynchronously for instance. 
{{(pid=67899) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1599.698121] env[67899]: DEBUG nova.compute.utils [None req-a9f9c866-17da-47e5-9277-2051f1ca187a tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Using /dev/sd instead of None {{(pid=67899) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1599.699356] env[67899]: DEBUG nova.compute.manager [None req-a9f9c866-17da-47e5-9277-2051f1ca187a tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: a6544af8-879d-4c45-bee4-8551b861fc66] Allocating IP information in the background. {{(pid=67899) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1599.699527] env[67899]: DEBUG nova.network.neutron [None req-a9f9c866-17da-47e5-9277-2051f1ca187a tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: a6544af8-879d-4c45-bee4-8551b861fc66] allocate_for_instance() {{(pid=67899) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1599.708457] env[67899]: DEBUG nova.compute.manager [None req-a9f9c866-17da-47e5-9277-2051f1ca187a tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: a6544af8-879d-4c45-bee4-8551b861fc66] Start building block device mappings for instance. {{(pid=67899) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1599.764304] env[67899]: DEBUG nova.policy [None req-a9f9c866-17da-47e5-9277-2051f1ca187a tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '27cd4ea8990b48be8c1f2455a264a858', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '288109a7b3bf4e3a9628184485e4679b', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67899) authorize /opt/stack/nova/nova/policy.py:203}} [ 1599.771826] env[67899]: DEBUG nova.compute.manager [None req-a9f9c866-17da-47e5-9277-2051f1ca187a tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: a6544af8-879d-4c45-bee4-8551b861fc66] Start spawning the instance on the hypervisor. 
{{(pid=67899) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1599.797545] env[67899]: DEBUG nova.virt.hardware [None req-a9f9c866-17da-47e5-9277-2051f1ca187a tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T22:07:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T22:07:13Z,direct_url=,disk_format='vmdk',id=c655a05a-4a40-4b3f-b609-3ba8116ad90f,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c1aaa2970e964d7b86557399120d12c1',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T22:07:14Z,virtual_size=,visibility=), allow threads: False {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1599.798073] env[67899]: DEBUG nova.virt.hardware [None req-a9f9c866-17da-47e5-9277-2051f1ca187a tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Flavor limits 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1599.798376] env[67899]: DEBUG nova.virt.hardware [None req-a9f9c866-17da-47e5-9277-2051f1ca187a tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Image limits 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1599.798713] env[67899]: DEBUG nova.virt.hardware [None req-a9f9c866-17da-47e5-9277-2051f1ca187a tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Flavor pref 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1599.799007] env[67899]: DEBUG nova.virt.hardware [None req-a9f9c866-17da-47e5-9277-2051f1ca187a tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Image pref 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1599.799313] env[67899]: DEBUG nova.virt.hardware [None req-a9f9c866-17da-47e5-9277-2051f1ca187a tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1599.799691] env[67899]: DEBUG nova.virt.hardware [None req-a9f9c866-17da-47e5-9277-2051f1ca187a tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1599.799991] env[67899]: DEBUG nova.virt.hardware [None req-a9f9c866-17da-47e5-9277-2051f1ca187a tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67899) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1599.800325] env[67899]: DEBUG 
nova.virt.hardware [None req-a9f9c866-17da-47e5-9277-2051f1ca187a tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Got 1 possible topologies {{(pid=67899) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1599.800637] env[67899]: DEBUG nova.virt.hardware [None req-a9f9c866-17da-47e5-9277-2051f1ca187a tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1599.800952] env[67899]: DEBUG nova.virt.hardware [None req-a9f9c866-17da-47e5-9277-2051f1ca187a tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1599.802315] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-734bc362-f2d8-4bc9-8618-e502d75b8e5d {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1599.813510] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0a5aba07-f84e-4367-91f3-123a9b21fbc2 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1600.268524] env[67899]: DEBUG nova.network.neutron [None req-a9f9c866-17da-47e5-9277-2051f1ca187a tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: a6544af8-879d-4c45-bee4-8551b861fc66] Successfully created port: ce3f7d29-2ce0-4338-bfa7-c5705e1b77ca {{(pid=67899) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1600.950993] env[67899]: DEBUG nova.network.neutron [None req-a9f9c866-17da-47e5-9277-2051f1ca187a tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: a6544af8-879d-4c45-bee4-8551b861fc66] Successfully updated port: ce3f7d29-2ce0-4338-bfa7-c5705e1b77ca {{(pid=67899) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1600.965058] env[67899]: DEBUG oslo_concurrency.lockutils [None req-a9f9c866-17da-47e5-9277-2051f1ca187a tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Acquiring lock "refresh_cache-a6544af8-879d-4c45-bee4-8551b861fc66" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1600.965058] env[67899]: DEBUG oslo_concurrency.lockutils [None req-a9f9c866-17da-47e5-9277-2051f1ca187a tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Acquired lock "refresh_cache-a6544af8-879d-4c45-bee4-8551b861fc66" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1600.965197] env[67899]: DEBUG nova.network.neutron [None req-a9f9c866-17da-47e5-9277-2051f1ca187a tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: a6544af8-879d-4c45-bee4-8551b861fc66] Building network info cache for instance {{(pid=67899) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1601.002959] env[67899]: DEBUG nova.network.neutron [None req-a9f9c866-17da-47e5-9277-2051f1ca187a 
tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: a6544af8-879d-4c45-bee4-8551b861fc66] Instance cache missing network info. {{(pid=67899) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1601.159918] env[67899]: DEBUG nova.compute.manager [req-dd9898ea-4e88-4721-92c4-72ae9234fe04 req-17c96e59-29d0-4ad0-9754-c7783ba5a2e6 service nova] [instance: a6544af8-879d-4c45-bee4-8551b861fc66] Received event network-vif-plugged-ce3f7d29-2ce0-4338-bfa7-c5705e1b77ca {{(pid=67899) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1601.160282] env[67899]: DEBUG oslo_concurrency.lockutils [req-dd9898ea-4e88-4721-92c4-72ae9234fe04 req-17c96e59-29d0-4ad0-9754-c7783ba5a2e6 service nova] Acquiring lock "a6544af8-879d-4c45-bee4-8551b861fc66-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1601.160531] env[67899]: DEBUG oslo_concurrency.lockutils [req-dd9898ea-4e88-4721-92c4-72ae9234fe04 req-17c96e59-29d0-4ad0-9754-c7783ba5a2e6 service nova] Lock "a6544af8-879d-4c45-bee4-8551b861fc66-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1601.160770] env[67899]: DEBUG oslo_concurrency.lockutils [req-dd9898ea-4e88-4721-92c4-72ae9234fe04 req-17c96e59-29d0-4ad0-9754-c7783ba5a2e6 service nova] Lock "a6544af8-879d-4c45-bee4-8551b861fc66-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1601.161018] env[67899]: DEBUG nova.compute.manager [req-dd9898ea-4e88-4721-92c4-72ae9234fe04 req-17c96e59-29d0-4ad0-9754-c7783ba5a2e6 service nova] [instance: a6544af8-879d-4c45-bee4-8551b861fc66] No waiting events found dispatching network-vif-plugged-ce3f7d29-2ce0-4338-bfa7-c5705e1b77ca {{(pid=67899) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1601.161248] env[67899]: WARNING nova.compute.manager [req-dd9898ea-4e88-4721-92c4-72ae9234fe04 req-17c96e59-29d0-4ad0-9754-c7783ba5a2e6 service nova] [instance: a6544af8-879d-4c45-bee4-8551b861fc66] Received unexpected event network-vif-plugged-ce3f7d29-2ce0-4338-bfa7-c5705e1b77ca for instance with vm_state building and task_state spawning. [ 1601.161429] env[67899]: DEBUG nova.compute.manager [req-dd9898ea-4e88-4721-92c4-72ae9234fe04 req-17c96e59-29d0-4ad0-9754-c7783ba5a2e6 service nova] [instance: a6544af8-879d-4c45-bee4-8551b861fc66] Received event network-changed-ce3f7d29-2ce0-4338-bfa7-c5705e1b77ca {{(pid=67899) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1601.161584] env[67899]: DEBUG nova.compute.manager [req-dd9898ea-4e88-4721-92c4-72ae9234fe04 req-17c96e59-29d0-4ad0-9754-c7783ba5a2e6 service nova] [instance: a6544af8-879d-4c45-bee4-8551b861fc66] Refreshing instance network info cache due to event network-changed-ce3f7d29-2ce0-4338-bfa7-c5705e1b77ca. 
{{(pid=67899) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1601.161809] env[67899]: DEBUG oslo_concurrency.lockutils [req-dd9898ea-4e88-4721-92c4-72ae9234fe04 req-17c96e59-29d0-4ad0-9754-c7783ba5a2e6 service nova] Acquiring lock "refresh_cache-a6544af8-879d-4c45-bee4-8551b861fc66" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1601.167211] env[67899]: DEBUG nova.network.neutron [None req-a9f9c866-17da-47e5-9277-2051f1ca187a tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: a6544af8-879d-4c45-bee4-8551b861fc66] Updating instance_info_cache with network_info: [{"id": "ce3f7d29-2ce0-4338-bfa7-c5705e1b77ca", "address": "fa:16:3e:6f:6d:73", "network": {"id": "26d51439-fee0-42d9-ac79-0e886ae3cf6e", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1360921386-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "288109a7b3bf4e3a9628184485e4679b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ac4015e0-e5e7-4b3f-8d8e-ef4501eea9aa", "external-id": "nsx-vlan-transportzone-132", "segmentation_id": 132, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapce3f7d29-2c", "ovs_interfaceid": "ce3f7d29-2ce0-4338-bfa7-c5705e1b77ca", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1601.180188] env[67899]: DEBUG oslo_concurrency.lockutils [None req-a9f9c866-17da-47e5-9277-2051f1ca187a tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Releasing lock "refresh_cache-a6544af8-879d-4c45-bee4-8551b861fc66" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1601.180523] env[67899]: DEBUG nova.compute.manager [None req-a9f9c866-17da-47e5-9277-2051f1ca187a tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: a6544af8-879d-4c45-bee4-8551b861fc66] Instance network_info: |[{"id": "ce3f7d29-2ce0-4338-bfa7-c5705e1b77ca", "address": "fa:16:3e:6f:6d:73", "network": {"id": "26d51439-fee0-42d9-ac79-0e886ae3cf6e", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1360921386-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "288109a7b3bf4e3a9628184485e4679b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ac4015e0-e5e7-4b3f-8d8e-ef4501eea9aa", "external-id": "nsx-vlan-transportzone-132", "segmentation_id": 132, 
"bound_drivers": {"0": "nsxv3"}}, "devname": "tapce3f7d29-2c", "ovs_interfaceid": "ce3f7d29-2ce0-4338-bfa7-c5705e1b77ca", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67899) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1601.180807] env[67899]: DEBUG oslo_concurrency.lockutils [req-dd9898ea-4e88-4721-92c4-72ae9234fe04 req-17c96e59-29d0-4ad0-9754-c7783ba5a2e6 service nova] Acquired lock "refresh_cache-a6544af8-879d-4c45-bee4-8551b861fc66" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1601.180982] env[67899]: DEBUG nova.network.neutron [req-dd9898ea-4e88-4721-92c4-72ae9234fe04 req-17c96e59-29d0-4ad0-9754-c7783ba5a2e6 service nova] [instance: a6544af8-879d-4c45-bee4-8551b861fc66] Refreshing network info cache for port ce3f7d29-2ce0-4338-bfa7-c5705e1b77ca {{(pid=67899) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1601.182086] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-a9f9c866-17da-47e5-9277-2051f1ca187a tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: a6544af8-879d-4c45-bee4-8551b861fc66] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:6f:6d:73', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'ac4015e0-e5e7-4b3f-8d8e-ef4501eea9aa', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'ce3f7d29-2ce0-4338-bfa7-c5705e1b77ca', 'vif_model': 'vmxnet3'}] {{(pid=67899) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1601.189552] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [None req-a9f9c866-17da-47e5-9277-2051f1ca187a tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Creating folder: Project (288109a7b3bf4e3a9628184485e4679b). Parent ref: group-v692900. {{(pid=67899) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1601.190464] env[67899]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-3a65c394-cab2-49b1-929b-7c4577179cb0 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1601.203898] env[67899]: INFO nova.virt.vmwareapi.vm_util [None req-a9f9c866-17da-47e5-9277-2051f1ca187a tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Created folder: Project (288109a7b3bf4e3a9628184485e4679b) in parent group-v692900. [ 1601.203898] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [None req-a9f9c866-17da-47e5-9277-2051f1ca187a tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Creating folder: Instances. Parent ref: group-v692995. {{(pid=67899) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1601.204061] env[67899]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-bc4e8733-97f9-4d07-9102-dae0ad375f5f {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1601.213315] env[67899]: INFO nova.virt.vmwareapi.vm_util [None req-a9f9c866-17da-47e5-9277-2051f1ca187a tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Created folder: Instances in parent group-v692995. 
[ 1601.213531] env[67899]: DEBUG oslo.service.loopingcall [None req-a9f9c866-17da-47e5-9277-2051f1ca187a tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67899) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1601.213708] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: a6544af8-879d-4c45-bee4-8551b861fc66] Creating VM on the ESX host {{(pid=67899) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1601.213895] env[67899]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-1e08e4db-1fb6-4252-9d29-bf06688c535f {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1601.233885] env[67899]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1601.233885] env[67899]: value = "task-3467984" [ 1601.233885] env[67899]: _type = "Task" [ 1601.233885] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1601.241168] env[67899]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467984, 'name': CreateVM_Task} progress is 0%. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1601.483424] env[67899]: DEBUG nova.network.neutron [req-dd9898ea-4e88-4721-92c4-72ae9234fe04 req-17c96e59-29d0-4ad0-9754-c7783ba5a2e6 service nova] [instance: a6544af8-879d-4c45-bee4-8551b861fc66] Updated VIF entry in instance network info cache for port ce3f7d29-2ce0-4338-bfa7-c5705e1b77ca. {{(pid=67899) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1601.483796] env[67899]: DEBUG nova.network.neutron [req-dd9898ea-4e88-4721-92c4-72ae9234fe04 req-17c96e59-29d0-4ad0-9754-c7783ba5a2e6 service nova] [instance: a6544af8-879d-4c45-bee4-8551b861fc66] Updating instance_info_cache with network_info: [{"id": "ce3f7d29-2ce0-4338-bfa7-c5705e1b77ca", "address": "fa:16:3e:6f:6d:73", "network": {"id": "26d51439-fee0-42d9-ac79-0e886ae3cf6e", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1360921386-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "288109a7b3bf4e3a9628184485e4679b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ac4015e0-e5e7-4b3f-8d8e-ef4501eea9aa", "external-id": "nsx-vlan-transportzone-132", "segmentation_id": 132, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapce3f7d29-2c", "ovs_interfaceid": "ce3f7d29-2ce0-4338-bfa7-c5705e1b77ca", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1601.495462] env[67899]: DEBUG oslo_concurrency.lockutils [req-dd9898ea-4e88-4721-92c4-72ae9234fe04 req-17c96e59-29d0-4ad0-9754-c7783ba5a2e6 service nova] Releasing lock "refresh_cache-a6544af8-879d-4c45-bee4-8551b861fc66" {{(pid=67899) lock 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1601.744746] env[67899]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467984, 'name': CreateVM_Task, 'duration_secs': 0.322831} completed successfully. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1601.744881] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: a6544af8-879d-4c45-bee4-8551b861fc66] Created VM on the ESX host {{(pid=67899) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1601.745589] env[67899]: DEBUG oslo_concurrency.lockutils [None req-a9f9c866-17da-47e5-9277-2051f1ca187a tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1601.745768] env[67899]: DEBUG oslo_concurrency.lockutils [None req-a9f9c866-17da-47e5-9277-2051f1ca187a tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1601.746317] env[67899]: DEBUG oslo_concurrency.lockutils [None req-a9f9c866-17da-47e5-9277-2051f1ca187a tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1601.746317] env[67899]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-446a2446-912a-4221-b353-47e88a928a9f {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1601.750773] env[67899]: DEBUG oslo_vmware.api [None req-a9f9c866-17da-47e5-9277-2051f1ca187a tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Waiting for the task: (returnval){ [ 1601.750773] env[67899]: value = "session[5298859c-589e-10f3-4f70-bf47a7ea371b]5262dae6-fa33-29b4-26b7-1d2c04da0913" [ 1601.750773] env[67899]: _type = "Task" [ 1601.750773] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1601.758893] env[67899]: DEBUG oslo_vmware.api [None req-a9f9c866-17da-47e5-9277-2051f1ca187a tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Task: {'id': session[5298859c-589e-10f3-4f70-bf47a7ea371b]5262dae6-fa33-29b4-26b7-1d2c04da0913, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1602.260965] env[67899]: DEBUG oslo_concurrency.lockutils [None req-a9f9c866-17da-47e5-9277-2051f1ca187a tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1602.261230] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-a9f9c866-17da-47e5-9277-2051f1ca187a tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: a6544af8-879d-4c45-bee4-8551b861fc66] Processing image c655a05a-4a40-4b3f-b609-3ba8116ad90f {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1602.261445] env[67899]: DEBUG oslo_concurrency.lockutils [None req-a9f9c866-17da-47e5-9277-2051f1ca187a tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1620.229588] env[67899]: DEBUG oslo_concurrency.lockutils [None req-be6d1b16-9fec-4200-8c0e-8930ce011ab6 tempest-ServerMetadataTestJSON-29351844 tempest-ServerMetadataTestJSON-29351844-project-member] Acquiring lock "e08f620d-63a0-45cb-99c6-d9d95c938b38" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1620.229908] env[67899]: DEBUG oslo_concurrency.lockutils [None req-be6d1b16-9fec-4200-8c0e-8930ce011ab6 tempest-ServerMetadataTestJSON-29351844 tempest-ServerMetadataTestJSON-29351844-project-member] Lock "e08f620d-63a0-45cb-99c6-d9d95c938b38" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1632.570121] env[67899]: DEBUG oslo_concurrency.lockutils [None req-eee1c1c4-6225-4c50-80b1-0adb0bc6bd79 tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Acquiring lock "e179db1d-ee0c-4f47-a958-40dd69209d26" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1636.220122] env[67899]: DEBUG oslo_concurrency.lockutils [None req-6aa5208c-9ac9-45f2-9084-015d9cab363a tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] Acquiring lock "addcc88a-6bb5-4a70-938e-49c0c79c8414" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1643.247014] env[67899]: DEBUG oslo_concurrency.lockutils [None req-17faa3a7-1a7c-45ab-ae27-d83f65917918 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Acquiring lock "a6544af8-879d-4c45-bee4-8551b861fc66" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67899) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1646.482641] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1647.203070] env[67899]: WARNING oslo_vmware.rw_handles [None req-95565c18-9c93-4bdb-95f2-e13f15fb26fd tempest-ServerShowV247Test-219304978 tempest-ServerShowV247Test-219304978-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1647.203070] env[67899]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1647.203070] env[67899]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1647.203070] env[67899]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1647.203070] env[67899]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1647.203070] env[67899]: ERROR oslo_vmware.rw_handles response.begin() [ 1647.203070] env[67899]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1647.203070] env[67899]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1647.203070] env[67899]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1647.203070] env[67899]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1647.203070] env[67899]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1647.203070] env[67899]: ERROR oslo_vmware.rw_handles [ 1647.203592] env[67899]: DEBUG nova.virt.vmwareapi.images [None req-95565c18-9c93-4bdb-95f2-e13f15fb26fd tempest-ServerShowV247Test-219304978 tempest-ServerShowV247Test-219304978-project-member] [instance: c7ad553b-2149-4211-aee3-057ea83069f5] Downloaded image file data c655a05a-4a40-4b3f-b609-3ba8116ad90f to vmware_temp/6712c44d-737e-4ac8-a963-b49e8d074c58/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk on the data store datastore1 {{(pid=67899) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1647.205357] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-95565c18-9c93-4bdb-95f2-e13f15fb26fd tempest-ServerShowV247Test-219304978 tempest-ServerShowV247Test-219304978-project-member] [instance: c7ad553b-2149-4211-aee3-057ea83069f5] Caching image {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1647.205598] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [None req-95565c18-9c93-4bdb-95f2-e13f15fb26fd tempest-ServerShowV247Test-219304978 tempest-ServerShowV247Test-219304978-project-member] Copying Virtual Disk [datastore1] vmware_temp/6712c44d-737e-4ac8-a963-b49e8d074c58/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk to [datastore1] vmware_temp/6712c44d-737e-4ac8-a963-b49e8d074c58/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk {{(pid=67899) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1647.205870] env[67899]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-0ca64c79-5136-4391-93a6-7f30985a6522 {{(pid=67899) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1647.213534] env[67899]: DEBUG oslo_vmware.api [None req-95565c18-9c93-4bdb-95f2-e13f15fb26fd tempest-ServerShowV247Test-219304978 tempest-ServerShowV247Test-219304978-project-member] Waiting for the task: (returnval){ [ 1647.213534] env[67899]: value = "task-3467985" [ 1647.213534] env[67899]: _type = "Task" [ 1647.213534] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1647.222436] env[67899]: DEBUG oslo_vmware.api [None req-95565c18-9c93-4bdb-95f2-e13f15fb26fd tempest-ServerShowV247Test-219304978 tempest-ServerShowV247Test-219304978-project-member] Task: {'id': task-3467985, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1647.724446] env[67899]: DEBUG oslo_vmware.exceptions [None req-95565c18-9c93-4bdb-95f2-e13f15fb26fd tempest-ServerShowV247Test-219304978 tempest-ServerShowV247Test-219304978-project-member] Fault InvalidArgument not matched. {{(pid=67899) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1647.724742] env[67899]: DEBUG oslo_concurrency.lockutils [None req-95565c18-9c93-4bdb-95f2-e13f15fb26fd tempest-ServerShowV247Test-219304978 tempest-ServerShowV247Test-219304978-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1647.725186] env[67899]: ERROR nova.compute.manager [None req-95565c18-9c93-4bdb-95f2-e13f15fb26fd tempest-ServerShowV247Test-219304978 tempest-ServerShowV247Test-219304978-project-member] [instance: c7ad553b-2149-4211-aee3-057ea83069f5] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1647.725186] env[67899]: Faults: ['InvalidArgument'] [ 1647.725186] env[67899]: ERROR nova.compute.manager [instance: c7ad553b-2149-4211-aee3-057ea83069f5] Traceback (most recent call last): [ 1647.725186] env[67899]: ERROR nova.compute.manager [instance: c7ad553b-2149-4211-aee3-057ea83069f5] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1647.725186] env[67899]: ERROR nova.compute.manager [instance: c7ad553b-2149-4211-aee3-057ea83069f5] yield resources [ 1647.725186] env[67899]: ERROR nova.compute.manager [instance: c7ad553b-2149-4211-aee3-057ea83069f5] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1647.725186] env[67899]: ERROR nova.compute.manager [instance: c7ad553b-2149-4211-aee3-057ea83069f5] self.driver.spawn(context, instance, image_meta, [ 1647.725186] env[67899]: ERROR nova.compute.manager [instance: c7ad553b-2149-4211-aee3-057ea83069f5] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1647.725186] env[67899]: ERROR nova.compute.manager [instance: c7ad553b-2149-4211-aee3-057ea83069f5] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1647.725186] env[67899]: ERROR nova.compute.manager [instance: c7ad553b-2149-4211-aee3-057ea83069f5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1647.725186] env[67899]: ERROR nova.compute.manager [instance: c7ad553b-2149-4211-aee3-057ea83069f5] self._fetch_image_if_missing(context, vi) [ 1647.725186] env[67899]: 
ERROR nova.compute.manager [instance: c7ad553b-2149-4211-aee3-057ea83069f5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1647.725186] env[67899]: ERROR nova.compute.manager [instance: c7ad553b-2149-4211-aee3-057ea83069f5] image_cache(vi, tmp_image_ds_loc) [ 1647.725186] env[67899]: ERROR nova.compute.manager [instance: c7ad553b-2149-4211-aee3-057ea83069f5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1647.725186] env[67899]: ERROR nova.compute.manager [instance: c7ad553b-2149-4211-aee3-057ea83069f5] vm_util.copy_virtual_disk( [ 1647.725186] env[67899]: ERROR nova.compute.manager [instance: c7ad553b-2149-4211-aee3-057ea83069f5] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1647.725186] env[67899]: ERROR nova.compute.manager [instance: c7ad553b-2149-4211-aee3-057ea83069f5] session._wait_for_task(vmdk_copy_task) [ 1647.725186] env[67899]: ERROR nova.compute.manager [instance: c7ad553b-2149-4211-aee3-057ea83069f5] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1647.725186] env[67899]: ERROR nova.compute.manager [instance: c7ad553b-2149-4211-aee3-057ea83069f5] return self.wait_for_task(task_ref) [ 1647.725186] env[67899]: ERROR nova.compute.manager [instance: c7ad553b-2149-4211-aee3-057ea83069f5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1647.725186] env[67899]: ERROR nova.compute.manager [instance: c7ad553b-2149-4211-aee3-057ea83069f5] return evt.wait() [ 1647.725186] env[67899]: ERROR nova.compute.manager [instance: c7ad553b-2149-4211-aee3-057ea83069f5] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1647.725186] env[67899]: ERROR nova.compute.manager [instance: c7ad553b-2149-4211-aee3-057ea83069f5] result = hub.switch() [ 1647.725186] env[67899]: ERROR nova.compute.manager [instance: c7ad553b-2149-4211-aee3-057ea83069f5] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1647.725186] env[67899]: ERROR nova.compute.manager [instance: c7ad553b-2149-4211-aee3-057ea83069f5] return self.greenlet.switch() [ 1647.725186] env[67899]: ERROR nova.compute.manager [instance: c7ad553b-2149-4211-aee3-057ea83069f5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1647.725186] env[67899]: ERROR nova.compute.manager [instance: c7ad553b-2149-4211-aee3-057ea83069f5] self.f(*self.args, **self.kw) [ 1647.725186] env[67899]: ERROR nova.compute.manager [instance: c7ad553b-2149-4211-aee3-057ea83069f5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1647.725186] env[67899]: ERROR nova.compute.manager [instance: c7ad553b-2149-4211-aee3-057ea83069f5] raise exceptions.translate_fault(task_info.error) [ 1647.725186] env[67899]: ERROR nova.compute.manager [instance: c7ad553b-2149-4211-aee3-057ea83069f5] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1647.725186] env[67899]: ERROR nova.compute.manager [instance: c7ad553b-2149-4211-aee3-057ea83069f5] Faults: ['InvalidArgument'] [ 1647.725186] env[67899]: ERROR nova.compute.manager [instance: c7ad553b-2149-4211-aee3-057ea83069f5] [ 1647.726211] env[67899]: INFO nova.compute.manager [None req-95565c18-9c93-4bdb-95f2-e13f15fb26fd tempest-ServerShowV247Test-219304978 
tempest-ServerShowV247Test-219304978-project-member] [instance: c7ad553b-2149-4211-aee3-057ea83069f5] Terminating instance [ 1647.726995] env[67899]: DEBUG oslo_concurrency.lockutils [None req-b4663fb3-1872-4a4f-8f96-67cc37309cba tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1647.727215] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-b4663fb3-1872-4a4f-8f96-67cc37309cba tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1647.727728] env[67899]: DEBUG oslo_concurrency.lockutils [None req-95565c18-9c93-4bdb-95f2-e13f15fb26fd tempest-ServerShowV247Test-219304978 tempest-ServerShowV247Test-219304978-project-member] Acquiring lock "refresh_cache-c7ad553b-2149-4211-aee3-057ea83069f5" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1647.727881] env[67899]: DEBUG oslo_concurrency.lockutils [None req-95565c18-9c93-4bdb-95f2-e13f15fb26fd tempest-ServerShowV247Test-219304978 tempest-ServerShowV247Test-219304978-project-member] Acquired lock "refresh_cache-c7ad553b-2149-4211-aee3-057ea83069f5" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1647.728050] env[67899]: DEBUG nova.network.neutron [None req-95565c18-9c93-4bdb-95f2-e13f15fb26fd tempest-ServerShowV247Test-219304978 tempest-ServerShowV247Test-219304978-project-member] [instance: c7ad553b-2149-4211-aee3-057ea83069f5] Building network info cache for instance {{(pid=67899) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1647.728926] env[67899]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-e78034e9-7721-46ae-b8e0-1116a5f5a474 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1647.738484] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-b4663fb3-1872-4a4f-8f96-67cc37309cba tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1647.738647] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-b4663fb3-1872-4a4f-8f96-67cc37309cba tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=67899) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1647.739618] env[67899]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-60ba35dc-40dd-4f39-9443-92b68cdfdd12 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1647.745058] env[67899]: DEBUG oslo_vmware.api [None req-b4663fb3-1872-4a4f-8f96-67cc37309cba tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Waiting for the task: (returnval){ [ 1647.745058] env[67899]: value = "session[5298859c-589e-10f3-4f70-bf47a7ea371b]523a984c-131a-0ce2-c6f7-bb885bb5fc32" [ 1647.745058] env[67899]: _type = "Task" [ 1647.745058] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1647.757609] env[67899]: DEBUG oslo_vmware.api [None req-b4663fb3-1872-4a4f-8f96-67cc37309cba tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Task: {'id': session[5298859c-589e-10f3-4f70-bf47a7ea371b]523a984c-131a-0ce2-c6f7-bb885bb5fc32, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1647.780712] env[67899]: DEBUG nova.network.neutron [None req-95565c18-9c93-4bdb-95f2-e13f15fb26fd tempest-ServerShowV247Test-219304978 tempest-ServerShowV247Test-219304978-project-member] [instance: c7ad553b-2149-4211-aee3-057ea83069f5] Instance cache missing network info. {{(pid=67899) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1647.858220] env[67899]: DEBUG nova.network.neutron [None req-95565c18-9c93-4bdb-95f2-e13f15fb26fd tempest-ServerShowV247Test-219304978 tempest-ServerShowV247Test-219304978-project-member] [instance: c7ad553b-2149-4211-aee3-057ea83069f5] Updating instance_info_cache with network_info: [] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1647.867049] env[67899]: DEBUG oslo_concurrency.lockutils [None req-95565c18-9c93-4bdb-95f2-e13f15fb26fd tempest-ServerShowV247Test-219304978 tempest-ServerShowV247Test-219304978-project-member] Releasing lock "refresh_cache-c7ad553b-2149-4211-aee3-057ea83069f5" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1647.867446] env[67899]: DEBUG nova.compute.manager [None req-95565c18-9c93-4bdb-95f2-e13f15fb26fd tempest-ServerShowV247Test-219304978 tempest-ServerShowV247Test-219304978-project-member] [instance: c7ad553b-2149-4211-aee3-057ea83069f5] Start destroying the instance on the hypervisor. 
{{(pid=67899) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}}
[ 1647.867637] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-95565c18-9c93-4bdb-95f2-e13f15fb26fd tempest-ServerShowV247Test-219304978 tempest-ServerShowV247Test-219304978-project-member] [instance: c7ad553b-2149-4211-aee3-057ea83069f5] Destroying instance {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 1647.868684] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-358f9550-3c49-4468-8f04-bc24a3820f19 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1647.876316] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-95565c18-9c93-4bdb-95f2-e13f15fb26fd tempest-ServerShowV247Test-219304978 tempest-ServerShowV247Test-219304978-project-member] [instance: c7ad553b-2149-4211-aee3-057ea83069f5] Unregistering the VM {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}}
[ 1647.876525] env[67899]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-699e82db-d461-44a8-bdf4-130ad40a8fae {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1647.906234] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-95565c18-9c93-4bdb-95f2-e13f15fb26fd tempest-ServerShowV247Test-219304978 tempest-ServerShowV247Test-219304978-project-member] [instance: c7ad553b-2149-4211-aee3-057ea83069f5] Unregistered the VM {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}}
[ 1647.906471] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-95565c18-9c93-4bdb-95f2-e13f15fb26fd tempest-ServerShowV247Test-219304978 tempest-ServerShowV247Test-219304978-project-member] [instance: c7ad553b-2149-4211-aee3-057ea83069f5] Deleting contents of the VM from datastore datastore1 {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}}
[ 1647.906607] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-95565c18-9c93-4bdb-95f2-e13f15fb26fd tempest-ServerShowV247Test-219304978 tempest-ServerShowV247Test-219304978-project-member] Deleting the datastore file [datastore1] c7ad553b-2149-4211-aee3-057ea83069f5 {{(pid=67899) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}}
[ 1647.906848] env[67899]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-f6d2f0f9-d6bb-456b-aff6-96d91a2748a6 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1647.912657] env[67899]: DEBUG oslo_vmware.api [None req-95565c18-9c93-4bdb-95f2-e13f15fb26fd tempest-ServerShowV247Test-219304978 tempest-ServerShowV247Test-219304978-project-member] Waiting for the task: (returnval){
[ 1647.912657] env[67899]: value = "task-3467987"
[ 1647.912657] env[67899]: _type = "Task"
[ 1647.912657] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1647.919886] env[67899]: DEBUG oslo_vmware.api [None req-95565c18-9c93-4bdb-95f2-e13f15fb26fd tempest-ServerShowV247Test-219304978 tempest-ServerShowV247Test-219304978-project-member] Task: {'id': task-3467987, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1647.995980] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1647.996152] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Starting heal instance info cache {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}}
[ 1647.996256] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Rebuilding the list of instances to heal {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}}
[ 1648.017838] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: c7ad553b-2149-4211-aee3-057ea83069f5] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1648.017983] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 6fda2654-4579-4b9a-a97c-97e0128fff14] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1648.018111] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 7a82e877-8a39-4684-8b75-711b7bedddac] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1648.018238] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: dc7bf2b7-631d-4933-92db-1679ad823379] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1648.018371] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 8a157747-34e2-48f7-bf21-d17810122954] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1648.018526] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 03684169-e2c8-4cf5-8e79-b118725927f1] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1648.018661] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1648.018781] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: e179db1d-ee0c-4f47-a958-40dd69209d26] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1648.018899] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: addcc88a-6bb5-4a70-938e-49c0c79c8414] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1648.019022] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: a6544af8-879d-4c45-bee4-8551b861fc66] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1648.019145] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Didn't find any instances for network info cache update. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}}
[ 1648.256073] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-b4663fb3-1872-4a4f-8f96-67cc37309cba tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 6fda2654-4579-4b9a-a97c-97e0128fff14] Preparing fetch location {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}}
[ 1648.256073] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-b4663fb3-1872-4a4f-8f96-67cc37309cba tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Creating directory with path [datastore1] vmware_temp/354fe54e-5e6f-4f5f-b069-6921da9fa028/c655a05a-4a40-4b3f-b609-3ba8116ad90f {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 1648.256073] env[67899]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-67adff2d-aacb-4101-abd8-3a194721b55b {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1648.267529] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-b4663fb3-1872-4a4f-8f96-67cc37309cba tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Created directory with path [datastore1] vmware_temp/354fe54e-5e6f-4f5f-b069-6921da9fa028/c655a05a-4a40-4b3f-b609-3ba8116ad90f {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 1648.267735] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-b4663fb3-1872-4a4f-8f96-67cc37309cba tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 6fda2654-4579-4b9a-a97c-97e0128fff14] Fetch image to [datastore1] vmware_temp/354fe54e-5e6f-4f5f-b069-6921da9fa028/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}}
[ 1648.267904] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-b4663fb3-1872-4a4f-8f96-67cc37309cba tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 6fda2654-4579-4b9a-a97c-97e0128fff14] Downloading image file data c655a05a-4a40-4b3f-b609-3ba8116ad90f to [datastore1] vmware_temp/354fe54e-5e6f-4f5f-b069-6921da9fa028/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk on the data store datastore1 {{(pid=67899) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}}
[ 1648.268688] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bc183d97-e086-4224-bb72-c7e9be78fb16 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1648.275289] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-23c9ccfe-9e65-4fd6-b1d4-81066d86e462 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1648.284249] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d458a0f3-a084-488f-8682-f18f3d4d456a {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1648.314743] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d1c7b91f-e443-4784-8ce8-2b82b469b4c7 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1648.319757] env[67899]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-babf6728-1537-475e-b157-1e70fe02d34a {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1648.339104] env[67899]: DEBUG nova.virt.vmwareapi.images [None req-b4663fb3-1872-4a4f-8f96-67cc37309cba tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 6fda2654-4579-4b9a-a97c-97e0128fff14] Downloading image file data c655a05a-4a40-4b3f-b609-3ba8116ad90f to the data store datastore1 {{(pid=67899) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}}
[ 1648.385227] env[67899]: DEBUG oslo_vmware.rw_handles [None req-b4663fb3-1872-4a4f-8f96-67cc37309cba tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/354fe54e-5e6f-4f5f-b069-6921da9fa028/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67899) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}}
[ 1648.444157] env[67899]: DEBUG oslo_vmware.rw_handles [None req-b4663fb3-1872-4a4f-8f96-67cc37309cba tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Completed reading data from the image iterator. {{(pid=67899) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}}
[ 1648.444375] env[67899]: DEBUG oslo_vmware.rw_handles [None req-b4663fb3-1872-4a4f-8f96-67cc37309cba tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/354fe54e-5e6f-4f5f-b069-6921da9fa028/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67899) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}}
[ 1648.447949] env[67899]: DEBUG oslo_vmware.api [None req-95565c18-9c93-4bdb-95f2-e13f15fb26fd tempest-ServerShowV247Test-219304978 tempest-ServerShowV247Test-219304978-project-member] Task: {'id': task-3467987, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.04222} completed successfully. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 1648.448203] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-95565c18-9c93-4bdb-95f2-e13f15fb26fd tempest-ServerShowV247Test-219304978 tempest-ServerShowV247Test-219304978-project-member] Deleted the datastore file {{(pid=67899) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}}
[ 1648.448397] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-95565c18-9c93-4bdb-95f2-e13f15fb26fd tempest-ServerShowV247Test-219304978 tempest-ServerShowV247Test-219304978-project-member] [instance: c7ad553b-2149-4211-aee3-057ea83069f5] Deleted contents of the VM from datastore datastore1 {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}}
[ 1648.448636] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-95565c18-9c93-4bdb-95f2-e13f15fb26fd tempest-ServerShowV247Test-219304978 tempest-ServerShowV247Test-219304978-project-member] [instance: c7ad553b-2149-4211-aee3-057ea83069f5] Instance destroyed {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 1648.448899] env[67899]: INFO nova.compute.manager [None req-95565c18-9c93-4bdb-95f2-e13f15fb26fd tempest-ServerShowV247Test-219304978 tempest-ServerShowV247Test-219304978-project-member] [instance: c7ad553b-2149-4211-aee3-057ea83069f5] Took 0.58 seconds to destroy the instance on the hypervisor.
[ 1648.449167] env[67899]: DEBUG oslo.service.loopingcall [None req-95565c18-9c93-4bdb-95f2-e13f15fb26fd tempest-ServerShowV247Test-219304978 tempest-ServerShowV247Test-219304978-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=67899) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 1648.449390] env[67899]: DEBUG nova.compute.manager [-] [instance: c7ad553b-2149-4211-aee3-057ea83069f5] Skipping network deallocation for instance since networking was not requested. {{(pid=67899) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 1648.452825] env[67899]: DEBUG nova.compute.claims [None req-95565c18-9c93-4bdb-95f2-e13f15fb26fd tempest-ServerShowV247Test-219304978 tempest-ServerShowV247Test-219304978-project-member] [instance: c7ad553b-2149-4211-aee3-057ea83069f5] Aborting claim: {{(pid=67899) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 1648.453028] env[67899]: DEBUG oslo_concurrency.lockutils [None req-95565c18-9c93-4bdb-95f2-e13f15fb26fd tempest-ServerShowV247Test-219304978 tempest-ServerShowV247Test-219304978-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1648.453259] env[67899]: DEBUG oslo_concurrency.lockutils [None req-95565c18-9c93-4bdb-95f2-e13f15fb26fd tempest-ServerShowV247Test-219304978 tempest-ServerShowV247Test-219304978-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1648.666375] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9e863113-6e7c-4ef9-9ac1-13a8031e2fcb {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1648.673950] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c0a48733-7985-4024-927d-95a56123aed1 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1648.702544] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-230c3683-4056-446e-b77f-fba9a085a7ca {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1648.709341] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d4784722-e931-491c-a850-c92452393474 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1648.723347] env[67899]: DEBUG nova.compute.provider_tree [None req-95565c18-9c93-4bdb-95f2-e13f15fb26fd tempest-ServerShowV247Test-219304978 tempest-ServerShowV247Test-219304978-project-member] Inventory has not changed in ProviderTree for provider: fffa0b42-f65d-4394-a98c-0df038b9ed4b {{(pid=67899) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1648.733714] env[67899]: DEBUG nova.scheduler.client.report [None req-95565c18-9c93-4bdb-95f2-e13f15fb26fd tempest-ServerShowV247Test-219304978 tempest-ServerShowV247Test-219304978-project-member] Inventory has not changed for provider fffa0b42-f65d-4394-a98c-0df038b9ed4b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67899) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1648.747842] env[67899]: DEBUG oslo_concurrency.lockutils [None req-95565c18-9c93-4bdb-95f2-e13f15fb26fd tempest-ServerShowV247Test-219304978 tempest-ServerShowV247Test-219304978-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.295s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1648.748402] env[67899]: ERROR nova.compute.manager [None req-95565c18-9c93-4bdb-95f2-e13f15fb26fd tempest-ServerShowV247Test-219304978 tempest-ServerShowV247Test-219304978-project-member] [instance: c7ad553b-2149-4211-aee3-057ea83069f5] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1648.748402] env[67899]: Faults: ['InvalidArgument']
[ 1648.748402] env[67899]: ERROR nova.compute.manager [instance: c7ad553b-2149-4211-aee3-057ea83069f5] Traceback (most recent call last):
[ 1648.748402] env[67899]: ERROR nova.compute.manager [instance: c7ad553b-2149-4211-aee3-057ea83069f5] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 1648.748402] env[67899]: ERROR nova.compute.manager [instance: c7ad553b-2149-4211-aee3-057ea83069f5] self.driver.spawn(context, instance, image_meta,
[ 1648.748402] env[67899]: ERROR nova.compute.manager [instance: c7ad553b-2149-4211-aee3-057ea83069f5] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 1648.748402] env[67899]: ERROR nova.compute.manager [instance: c7ad553b-2149-4211-aee3-057ea83069f5] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 1648.748402] env[67899]: ERROR nova.compute.manager [instance: c7ad553b-2149-4211-aee3-057ea83069f5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 1648.748402] env[67899]: ERROR nova.compute.manager [instance: c7ad553b-2149-4211-aee3-057ea83069f5] self._fetch_image_if_missing(context, vi)
[ 1648.748402] env[67899]: ERROR nova.compute.manager [instance: c7ad553b-2149-4211-aee3-057ea83069f5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 1648.748402] env[67899]: ERROR nova.compute.manager [instance: c7ad553b-2149-4211-aee3-057ea83069f5] image_cache(vi, tmp_image_ds_loc)
[ 1648.748402] env[67899]: ERROR nova.compute.manager [instance: c7ad553b-2149-4211-aee3-057ea83069f5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 1648.748402] env[67899]: ERROR nova.compute.manager [instance: c7ad553b-2149-4211-aee3-057ea83069f5] vm_util.copy_virtual_disk(
[ 1648.748402] env[67899]: ERROR nova.compute.manager [instance: c7ad553b-2149-4211-aee3-057ea83069f5] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 1648.748402] env[67899]: ERROR nova.compute.manager [instance: c7ad553b-2149-4211-aee3-057ea83069f5] session._wait_for_task(vmdk_copy_task)
[ 1648.748402] env[67899]: ERROR nova.compute.manager [instance: c7ad553b-2149-4211-aee3-057ea83069f5] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 1648.748402] env[67899]: ERROR nova.compute.manager [instance: c7ad553b-2149-4211-aee3-057ea83069f5] return self.wait_for_task(task_ref)
[ 1648.748402] env[67899]: ERROR nova.compute.manager [instance: c7ad553b-2149-4211-aee3-057ea83069f5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 1648.748402] env[67899]: ERROR nova.compute.manager [instance: c7ad553b-2149-4211-aee3-057ea83069f5] return evt.wait()
[ 1648.748402] env[67899]: ERROR nova.compute.manager [instance: c7ad553b-2149-4211-aee3-057ea83069f5] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 1648.748402] env[67899]: ERROR nova.compute.manager [instance: c7ad553b-2149-4211-aee3-057ea83069f5] result = hub.switch()
[ 1648.748402] env[67899]: ERROR nova.compute.manager [instance: c7ad553b-2149-4211-aee3-057ea83069f5] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 1648.748402] env[67899]: ERROR nova.compute.manager [instance: c7ad553b-2149-4211-aee3-057ea83069f5] return self.greenlet.switch()
[ 1648.748402] env[67899]: ERROR nova.compute.manager [instance: c7ad553b-2149-4211-aee3-057ea83069f5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 1648.748402] env[67899]: ERROR nova.compute.manager [instance: c7ad553b-2149-4211-aee3-057ea83069f5] self.f(*self.args, **self.kw)
[ 1648.748402] env[67899]: ERROR nova.compute.manager [instance: c7ad553b-2149-4211-aee3-057ea83069f5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 1648.748402] env[67899]: ERROR nova.compute.manager [instance: c7ad553b-2149-4211-aee3-057ea83069f5] raise exceptions.translate_fault(task_info.error)
[ 1648.748402] env[67899]: ERROR nova.compute.manager [instance: c7ad553b-2149-4211-aee3-057ea83069f5] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1648.748402] env[67899]: ERROR nova.compute.manager [instance: c7ad553b-2149-4211-aee3-057ea83069f5] Faults: ['InvalidArgument']
[ 1648.748402] env[67899]: ERROR nova.compute.manager [instance: c7ad553b-2149-4211-aee3-057ea83069f5]
[ 1648.749190] env[67899]: DEBUG nova.compute.utils [None req-95565c18-9c93-4bdb-95f2-e13f15fb26fd tempest-ServerShowV247Test-219304978 tempest-ServerShowV247Test-219304978-project-member] [instance: c7ad553b-2149-4211-aee3-057ea83069f5] VimFaultException {{(pid=67899) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 1648.750349] env[67899]: DEBUG nova.compute.manager [None req-95565c18-9c93-4bdb-95f2-e13f15fb26fd tempest-ServerShowV247Test-219304978 tempest-ServerShowV247Test-219304978-project-member] [instance: c7ad553b-2149-4211-aee3-057ea83069f5] Build of instance c7ad553b-2149-4211-aee3-057ea83069f5 was re-scheduled: A specified parameter was not correct: fileType
[ 1648.750349] env[67899]: Faults: ['InvalidArgument'] {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}}
[ 1648.750748] env[67899]: DEBUG nova.compute.manager [None req-95565c18-9c93-4bdb-95f2-e13f15fb26fd tempest-ServerShowV247Test-219304978 tempest-ServerShowV247Test-219304978-project-member] [instance: c7ad553b-2149-4211-aee3-057ea83069f5] Unplugging VIFs for instance {{(pid=67899) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}}
[ 1648.750963] env[67899]: DEBUG oslo_concurrency.lockutils [None req-95565c18-9c93-4bdb-95f2-e13f15fb26fd tempest-ServerShowV247Test-219304978 tempest-ServerShowV247Test-219304978-project-member] Acquiring lock "refresh_cache-c7ad553b-2149-4211-aee3-057ea83069f5" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1648.751120] env[67899]: DEBUG oslo_concurrency.lockutils [None req-95565c18-9c93-4bdb-95f2-e13f15fb26fd tempest-ServerShowV247Test-219304978 tempest-ServerShowV247Test-219304978-project-member] Acquired lock "refresh_cache-c7ad553b-2149-4211-aee3-057ea83069f5" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1648.751278] env[67899]: DEBUG nova.network.neutron [None req-95565c18-9c93-4bdb-95f2-e13f15fb26fd tempest-ServerShowV247Test-219304978 tempest-ServerShowV247Test-219304978-project-member] [instance: c7ad553b-2149-4211-aee3-057ea83069f5] Building network info cache for instance {{(pid=67899) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 1648.774430] env[67899]: DEBUG nova.network.neutron [None req-95565c18-9c93-4bdb-95f2-e13f15fb26fd tempest-ServerShowV247Test-219304978 tempest-ServerShowV247Test-219304978-project-member] [instance: c7ad553b-2149-4211-aee3-057ea83069f5] Instance cache missing network info. {{(pid=67899) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 1648.831132] env[67899]: DEBUG nova.network.neutron [None req-95565c18-9c93-4bdb-95f2-e13f15fb26fd tempest-ServerShowV247Test-219304978 tempest-ServerShowV247Test-219304978-project-member] [instance: c7ad553b-2149-4211-aee3-057ea83069f5] Updating instance_info_cache with network_info: [] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1648.839739] env[67899]: DEBUG oslo_concurrency.lockutils [None req-95565c18-9c93-4bdb-95f2-e13f15fb26fd tempest-ServerShowV247Test-219304978 tempest-ServerShowV247Test-219304978-project-member] Releasing lock "refresh_cache-c7ad553b-2149-4211-aee3-057ea83069f5" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1648.839935] env[67899]: DEBUG nova.compute.manager [None req-95565c18-9c93-4bdb-95f2-e13f15fb26fd tempest-ServerShowV247Test-219304978 tempest-ServerShowV247Test-219304978-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=67899) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}}
[ 1648.840131] env[67899]: DEBUG nova.compute.manager [None req-95565c18-9c93-4bdb-95f2-e13f15fb26fd tempest-ServerShowV247Test-219304978 tempest-ServerShowV247Test-219304978-project-member] [instance: c7ad553b-2149-4211-aee3-057ea83069f5] Skipping network deallocation for instance since networking was not requested. {{(pid=67899) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 1648.922230] env[67899]: INFO nova.scheduler.client.report [None req-95565c18-9c93-4bdb-95f2-e13f15fb26fd tempest-ServerShowV247Test-219304978 tempest-ServerShowV247Test-219304978-project-member] Deleted allocations for instance c7ad553b-2149-4211-aee3-057ea83069f5
[ 1648.941406] env[67899]: DEBUG oslo_concurrency.lockutils [None req-95565c18-9c93-4bdb-95f2-e13f15fb26fd tempest-ServerShowV247Test-219304978 tempest-ServerShowV247Test-219304978-project-member] Lock "c7ad553b-2149-4211-aee3-057ea83069f5" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 636.138s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1648.942503] env[67899]: DEBUG oslo_concurrency.lockutils [None req-d53afc97-050f-4980-aec2-ea7bdb003d01 tempest-ServerShowV247Test-219304978 tempest-ServerShowV247Test-219304978-project-member] Lock "c7ad553b-2149-4211-aee3-057ea83069f5" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 440.280s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1648.942722] env[67899]: DEBUG oslo_concurrency.lockutils [None req-d53afc97-050f-4980-aec2-ea7bdb003d01 tempest-ServerShowV247Test-219304978 tempest-ServerShowV247Test-219304978-project-member] Acquiring lock "c7ad553b-2149-4211-aee3-057ea83069f5-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1648.942926] env[67899]: DEBUG oslo_concurrency.lockutils [None req-d53afc97-050f-4980-aec2-ea7bdb003d01 tempest-ServerShowV247Test-219304978 tempest-ServerShowV247Test-219304978-project-member] Lock "c7ad553b-2149-4211-aee3-057ea83069f5-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1648.943102] env[67899]: DEBUG oslo_concurrency.lockutils [None req-d53afc97-050f-4980-aec2-ea7bdb003d01 tempest-ServerShowV247Test-219304978 tempest-ServerShowV247Test-219304978-project-member] Lock "c7ad553b-2149-4211-aee3-057ea83069f5-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1648.945147] env[67899]: INFO nova.compute.manager [None req-d53afc97-050f-4980-aec2-ea7bdb003d01 tempest-ServerShowV247Test-219304978 tempest-ServerShowV247Test-219304978-project-member] [instance: c7ad553b-2149-4211-aee3-057ea83069f5] Terminating instance
[ 1648.946594] env[67899]: DEBUG oslo_concurrency.lockutils [None req-d53afc97-050f-4980-aec2-ea7bdb003d01 tempest-ServerShowV247Test-219304978 tempest-ServerShowV247Test-219304978-project-member] Acquiring lock "refresh_cache-c7ad553b-2149-4211-aee3-057ea83069f5" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1648.946753] env[67899]: DEBUG oslo_concurrency.lockutils [None req-d53afc97-050f-4980-aec2-ea7bdb003d01 tempest-ServerShowV247Test-219304978 tempest-ServerShowV247Test-219304978-project-member] Acquired lock "refresh_cache-c7ad553b-2149-4211-aee3-057ea83069f5" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1648.946917] env[67899]: DEBUG nova.network.neutron [None req-d53afc97-050f-4980-aec2-ea7bdb003d01 tempest-ServerShowV247Test-219304978 tempest-ServerShowV247Test-219304978-project-member] [instance: c7ad553b-2149-4211-aee3-057ea83069f5] Building network info cache for instance {{(pid=67899) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 1648.953368] env[67899]: DEBUG nova.compute.manager [None req-a9f9c866-17da-47e5-9277-2051f1ca187a tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: c56980f8-68e2-4501-a6a9-b713b208f895] Starting instance... {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}}
[ 1648.974736] env[67899]: DEBUG nova.network.neutron [None req-d53afc97-050f-4980-aec2-ea7bdb003d01 tempest-ServerShowV247Test-219304978 tempest-ServerShowV247Test-219304978-project-member] [instance: c7ad553b-2149-4211-aee3-057ea83069f5] Instance cache missing network info. {{(pid=67899) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 1648.988795] env[67899]: DEBUG nova.compute.manager [None req-a9f9c866-17da-47e5-9277-2051f1ca187a tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: c56980f8-68e2-4501-a6a9-b713b208f895] Instance disappeared before build. {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}}
[ 1648.995726] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1649.009503] env[67899]: DEBUG oslo_concurrency.lockutils [None req-a9f9c866-17da-47e5-9277-2051f1ca187a tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Lock "c56980f8-68e2-4501-a6a9-b713b208f895" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 202.187s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1649.018147] env[67899]: DEBUG nova.compute.manager [None req-bb2aacf0-6d67-4744-ac3b-ad8f8e88f68e tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 9b4a7c14-84dc-4222-a758-3f8f10e23b7a] Starting instance... {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}}
[ 1649.033129] env[67899]: DEBUG nova.network.neutron [None req-d53afc97-050f-4980-aec2-ea7bdb003d01 tempest-ServerShowV247Test-219304978 tempest-ServerShowV247Test-219304978-project-member] [instance: c7ad553b-2149-4211-aee3-057ea83069f5] Updating instance_info_cache with network_info: [] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1649.040492] env[67899]: DEBUG oslo_concurrency.lockutils [None req-d53afc97-050f-4980-aec2-ea7bdb003d01 tempest-ServerShowV247Test-219304978 tempest-ServerShowV247Test-219304978-project-member] Releasing lock "refresh_cache-c7ad553b-2149-4211-aee3-057ea83069f5" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1649.040874] env[67899]: DEBUG nova.compute.manager [None req-d53afc97-050f-4980-aec2-ea7bdb003d01 tempest-ServerShowV247Test-219304978 tempest-ServerShowV247Test-219304978-project-member] [instance: c7ad553b-2149-4211-aee3-057ea83069f5] Start destroying the instance on the hypervisor. {{(pid=67899) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}}
[ 1649.041072] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-d53afc97-050f-4980-aec2-ea7bdb003d01 tempest-ServerShowV247Test-219304978 tempest-ServerShowV247Test-219304978-project-member] [instance: c7ad553b-2149-4211-aee3-057ea83069f5] Destroying instance {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 1649.041573] env[67899]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-4aaa46f8-1930-4588-886f-0bf484f3541e {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1649.053675] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7ae51200-3986-4fc9-a657-a64a6e1c0dbd {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1649.083944] env[67899]: WARNING nova.virt.vmwareapi.vmops [None req-d53afc97-050f-4980-aec2-ea7bdb003d01 tempest-ServerShowV247Test-219304978 tempest-ServerShowV247Test-219304978-project-member] [instance: c7ad553b-2149-4211-aee3-057ea83069f5] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance c7ad553b-2149-4211-aee3-057ea83069f5 could not be found.
[ 1649.084144] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-d53afc97-050f-4980-aec2-ea7bdb003d01 tempest-ServerShowV247Test-219304978 tempest-ServerShowV247Test-219304978-project-member] [instance: c7ad553b-2149-4211-aee3-057ea83069f5] Instance destroyed {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 1649.084321] env[67899]: INFO nova.compute.manager [None req-d53afc97-050f-4980-aec2-ea7bdb003d01 tempest-ServerShowV247Test-219304978 tempest-ServerShowV247Test-219304978-project-member] [instance: c7ad553b-2149-4211-aee3-057ea83069f5] Took 0.04 seconds to destroy the instance on the hypervisor.
[ 1649.084550] env[67899]: DEBUG oslo.service.loopingcall [None req-d53afc97-050f-4980-aec2-ea7bdb003d01 tempest-ServerShowV247Test-219304978 tempest-ServerShowV247Test-219304978-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=67899) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 1649.085397] env[67899]: DEBUG oslo_concurrency.lockutils [None req-bb2aacf0-6d67-4744-ac3b-ad8f8e88f68e tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1649.085616] env[67899]: DEBUG oslo_concurrency.lockutils [None req-bb2aacf0-6d67-4744-ac3b-ad8f8e88f68e tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1649.086989] env[67899]: INFO nova.compute.claims [None req-bb2aacf0-6d67-4744-ac3b-ad8f8e88f68e tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 9b4a7c14-84dc-4222-a758-3f8f10e23b7a] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 1649.089471] env[67899]: DEBUG nova.compute.manager [-] [instance: c7ad553b-2149-4211-aee3-057ea83069f5] Deallocating network for instance {{(pid=67899) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}}
[ 1649.089674] env[67899]: DEBUG nova.network.neutron [-] [instance: c7ad553b-2149-4211-aee3-057ea83069f5] deallocate_for_instance() {{(pid=67899) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 1649.204778] env[67899]: DEBUG neutronclient.v2_0.client [-] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=67899) _handle_fault_response /opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py:262}}
[ 1649.204990] env[67899]: ERROR nova.network.neutron [-] Neutron client was not able to generate a valid admin token, please verify Neutron admin credential located in nova.conf: neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}}
[ 1649.205509] env[67899]: ERROR oslo.service.loopingcall [-] Dynamic interval looping call 'oslo_service.loopingcall.RetryDecorator.__call__.._func' failed: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception.
[ 1649.205509] env[67899]: ERROR oslo.service.loopingcall Traceback (most recent call last):
[ 1649.205509] env[67899]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1649.205509] env[67899]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs)
[ 1649.205509] env[67899]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response
[ 1649.205509] env[67899]: ERROR oslo.service.loopingcall exception_handler_v20(status_code, error_body)
[ 1649.205509] env[67899]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20
[ 1649.205509] env[67899]: ERROR oslo.service.loopingcall raise client_exc(message=error_message,
[ 1649.205509] env[67899]: ERROR oslo.service.loopingcall neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}}
[ 1649.205509] env[67899]: ERROR oslo.service.loopingcall Neutron server returns request_ids: ['req-5f69e53c-6ca1-4a25-b2ea-a564dcfb27dd']
[ 1649.205509] env[67899]: ERROR oslo.service.loopingcall
[ 1649.205509] env[67899]: ERROR oslo.service.loopingcall During handling of the above exception, another exception occurred:
[ 1649.205509] env[67899]: ERROR oslo.service.loopingcall
[ 1649.205509] env[67899]: ERROR oslo.service.loopingcall Traceback (most recent call last):
[ 1649.205509] env[67899]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop
[ 1649.205509] env[67899]: ERROR oslo.service.loopingcall result = func(*self.args, **self.kw)
[ 1649.205509] env[67899]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func
[ 1649.205509] env[67899]: ERROR oslo.service.loopingcall result = f(*args, **kwargs)
[ 1649.205509] env[67899]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/compute/manager.py", line 3045, in _deallocate_network_with_retries
[ 1649.205509] env[67899]: ERROR oslo.service.loopingcall self._deallocate_network(
[ 1649.205509] env[67899]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/compute/manager.py", line 2265, in _deallocate_network
[ 1649.205509] env[67899]: ERROR oslo.service.loopingcall self.network_api.deallocate_for_instance(
[ 1649.205509] env[67899]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance
[ 1649.205509] env[67899]: ERROR oslo.service.loopingcall data = neutron.list_ports(**search_opts)
[ 1649.205509] env[67899]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1649.205509] env[67899]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs)
[ 1649.205509] env[67899]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports
[ 1649.205509] env[67899]: ERROR oslo.service.loopingcall return self.list('ports', self.ports_path, retrieve_all,
[ 1649.205509] env[67899]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1649.205509] env[67899]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs)
[ 1649.205509] env[67899]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list
[ 1649.205509] env[67899]: ERROR oslo.service.loopingcall for r in self._pagination(collection, path, **params):
[ 1649.205509] env[67899]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination
[ 1649.205509] env[67899]: ERROR oslo.service.loopingcall res = self.get(path, params=params)
[ 1649.205509] env[67899]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1649.205509] env[67899]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs)
[ 1649.205509] env[67899]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get
[ 1649.205509] env[67899]: ERROR oslo.service.loopingcall return self.retry_request("GET", action, body=body,
[ 1649.205509] env[67899]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1649.205509] env[67899]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs)
[ 1649.205509] env[67899]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request
[ 1649.205509] env[67899]: ERROR oslo.service.loopingcall return self.do_request(method, action, body=body,
[ 1649.205509] env[67899]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1649.205509] env[67899]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs)
[ 1649.205509] env[67899]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request
[ 1649.205509] env[67899]: ERROR oslo.service.loopingcall self._handle_fault_response(status_code, replybody, resp)
[ 1649.205509] env[67899]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper
[ 1649.205509] env[67899]: ERROR oslo.service.loopingcall raise exception.NeutronAdminCredentialConfigurationInvalid()
[ 1649.205509] env[67899]: ERROR oslo.service.loopingcall nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception.
[ 1649.205509] env[67899]: ERROR oslo.service.loopingcall
[ 1649.207631] env[67899]: ERROR nova.compute.manager [None req-d53afc97-050f-4980-aec2-ea7bdb003d01 tempest-ServerShowV247Test-219304978 tempest-ServerShowV247Test-219304978-project-member] [instance: c7ad553b-2149-4211-aee3-057ea83069f5] Failed to deallocate network for instance. Error: Networking client is experiencing an unauthorized exception.: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception.
[ 1649.238710] env[67899]: ERROR nova.compute.manager [None req-d53afc97-050f-4980-aec2-ea7bdb003d01 tempest-ServerShowV247Test-219304978 tempest-ServerShowV247Test-219304978-project-member] [instance: c7ad553b-2149-4211-aee3-057ea83069f5] Setting instance vm_state to ERROR: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception.
[ 1649.238710] env[67899]: ERROR nova.compute.manager [instance: c7ad553b-2149-4211-aee3-057ea83069f5] Traceback (most recent call last):
[ 1649.238710] env[67899]: ERROR nova.compute.manager [instance: c7ad553b-2149-4211-aee3-057ea83069f5] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1649.238710] env[67899]: ERROR nova.compute.manager [instance: c7ad553b-2149-4211-aee3-057ea83069f5] ret = obj(*args, **kwargs)
[ 1649.238710] env[67899]: ERROR nova.compute.manager [instance: c7ad553b-2149-4211-aee3-057ea83069f5] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response
[ 1649.238710] env[67899]: ERROR nova.compute.manager [instance: c7ad553b-2149-4211-aee3-057ea83069f5] exception_handler_v20(status_code, error_body)
[ 1649.238710] env[67899]: ERROR nova.compute.manager [instance: c7ad553b-2149-4211-aee3-057ea83069f5] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20
[ 1649.238710] env[67899]: ERROR nova.compute.manager [instance: c7ad553b-2149-4211-aee3-057ea83069f5] raise client_exc(message=error_message,
[ 1649.238710] env[67899]: ERROR nova.compute.manager [instance: c7ad553b-2149-4211-aee3-057ea83069f5] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}}
[ 1649.238710] env[67899]: ERROR nova.compute.manager [instance: c7ad553b-2149-4211-aee3-057ea83069f5] Neutron server returns request_ids: ['req-5f69e53c-6ca1-4a25-b2ea-a564dcfb27dd']
[ 1649.238710] env[67899]: ERROR nova.compute.manager [instance: c7ad553b-2149-4211-aee3-057ea83069f5]
[ 1649.238710] env[67899]: ERROR nova.compute.manager [instance: c7ad553b-2149-4211-aee3-057ea83069f5] During handling of the above exception, another exception occurred:
[ 1649.238710] env[67899]: ERROR nova.compute.manager [instance: c7ad553b-2149-4211-aee3-057ea83069f5]
[ 1649.238710] env[67899]: ERROR nova.compute.manager [instance: c7ad553b-2149-4211-aee3-057ea83069f5] Traceback (most recent call last):
[ 1649.238710] env[67899]: ERROR nova.compute.manager [instance: c7ad553b-2149-4211-aee3-057ea83069f5] File "/opt/stack/nova/nova/compute/manager.py", line 3315, in do_terminate_instance
[ 1649.238710] env[67899]: ERROR nova.compute.manager [instance: c7ad553b-2149-4211-aee3-057ea83069f5] self._delete_instance(context, instance, bdms)
[ 1649.238710] env[67899]: ERROR nova.compute.manager [instance: c7ad553b-2149-4211-aee3-057ea83069f5] File "/opt/stack/nova/nova/compute/manager.py", line 3250, in _delete_instance
[ 1649.238710] env[67899]: ERROR nova.compute.manager [instance: c7ad553b-2149-4211-aee3-057ea83069f5] self._shutdown_instance(context, instance, bdms)
[ 1649.238710] env[67899]: ERROR nova.compute.manager [instance: c7ad553b-2149-4211-aee3-057ea83069f5] File "/opt/stack/nova/nova/compute/manager.py", line 3144, in _shutdown_instance
[ 1649.238710] env[67899]: ERROR nova.compute.manager [instance: c7ad553b-2149-4211-aee3-057ea83069f5] self._try_deallocate_network(context, instance, requested_networks)
[ 1649.238710] env[67899]: ERROR nova.compute.manager [instance: c7ad553b-2149-4211-aee3-057ea83069f5] File "/opt/stack/nova/nova/compute/manager.py", line 3058, in _try_deallocate_network
[ 1649.238710] env[67899]: ERROR nova.compute.manager [instance: c7ad553b-2149-4211-aee3-057ea83069f5] with excutils.save_and_reraise_exception():
[ 1649.238710] env[67899]: ERROR nova.compute.manager [instance: c7ad553b-2149-4211-aee3-057ea83069f5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 1649.238710] env[67899]: ERROR nova.compute.manager [instance: c7ad553b-2149-4211-aee3-057ea83069f5] self.force_reraise()
[ 1649.238710] env[67899]: ERROR nova.compute.manager [instance: c7ad553b-2149-4211-aee3-057ea83069f5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 1649.238710] env[67899]: ERROR nova.compute.manager [instance: c7ad553b-2149-4211-aee3-057ea83069f5] raise self.value
[ 1649.238710] env[67899]: ERROR nova.compute.manager [instance: c7ad553b-2149-4211-aee3-057ea83069f5] File "/opt/stack/nova/nova/compute/manager.py", line 3056, in _try_deallocate_network
[ 1649.238710] env[67899]: ERROR nova.compute.manager [instance: c7ad553b-2149-4211-aee3-057ea83069f5] _deallocate_network_with_retries()
[ 1649.238710] env[67899]: ERROR nova.compute.manager [instance: c7ad553b-2149-4211-aee3-057ea83069f5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 436, in func
[ 1649.238710] env[67899]: ERROR nova.compute.manager [instance: c7ad553b-2149-4211-aee3-057ea83069f5] return evt.wait()
[ 1649.238710] env[67899]: ERROR nova.compute.manager [instance: c7ad553b-2149-4211-aee3-057ea83069f5] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 1649.238710] env[67899]: ERROR nova.compute.manager [instance: c7ad553b-2149-4211-aee3-057ea83069f5] result = hub.switch()
[ 1649.238710] env[67899]: ERROR nova.compute.manager [instance: c7ad553b-2149-4211-aee3-057ea83069f5] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 1649.238710] env[67899]: ERROR nova.compute.manager [instance: c7ad553b-2149-4211-aee3-057ea83069f5] return self.greenlet.switch()
[ 1649.238710] env[67899]: ERROR nova.compute.manager [instance: c7ad553b-2149-4211-aee3-057ea83069f5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop
[ 1649.238710] env[67899]: ERROR nova.compute.manager [instance: c7ad553b-2149-4211-aee3-057ea83069f5] result = func(*self.args, **self.kw)
[ 1649.238710] env[67899]: ERROR nova.compute.manager [instance: c7ad553b-2149-4211-aee3-057ea83069f5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func
[ 1649.240182] env[67899]: ERROR nova.compute.manager [instance: c7ad553b-2149-4211-aee3-057ea83069f5] result = f(*args, **kwargs)
[ 1649.240182] env[67899]: ERROR nova.compute.manager [instance: c7ad553b-2149-4211-aee3-057ea83069f5] File "/opt/stack/nova/nova/compute/manager.py", line 3045, in _deallocate_network_with_retries
[ 1649.240182] env[67899]: ERROR nova.compute.manager [instance: c7ad553b-2149-4211-aee3-057ea83069f5] self._deallocate_network(
[ 1649.240182] env[67899]: ERROR nova.compute.manager [instance: c7ad553b-2149-4211-aee3-057ea83069f5] File "/opt/stack/nova/nova/compute/manager.py", line 2265, in _deallocate_network
[ 1649.240182] env[67899]: ERROR nova.compute.manager [instance: c7ad553b-2149-4211-aee3-057ea83069f5] self.network_api.deallocate_for_instance(
[ 1649.240182] env[67899]: ERROR nova.compute.manager [instance: c7ad553b-2149-4211-aee3-057ea83069f5] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance
[ 1649.240182] env[67899]: ERROR nova.compute.manager [instance: c7ad553b-2149-4211-aee3-057ea83069f5] data = neutron.list_ports(**search_opts)
[ 1649.240182] env[67899]: ERROR nova.compute.manager [instance: c7ad553b-2149-4211-aee3-057ea83069f5] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1649.240182] env[67899]: ERROR nova.compute.manager [instance: c7ad553b-2149-4211-aee3-057ea83069f5] ret = obj(*args, **kwargs)
[ 1649.240182] env[67899]: ERROR nova.compute.manager [instance: c7ad553b-2149-4211-aee3-057ea83069f5] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports
[ 1649.240182] env[67899]: ERROR nova.compute.manager [instance: c7ad553b-2149-4211-aee3-057ea83069f5] return self.list('ports', self.ports_path, retrieve_all,
[ 1649.240182] env[67899]: ERROR nova.compute.manager [instance: c7ad553b-2149-4211-aee3-057ea83069f5] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1649.240182] env[67899]: ERROR nova.compute.manager [instance: c7ad553b-2149-4211-aee3-057ea83069f5] ret = obj(*args, **kwargs)
[ 1649.240182] env[67899]: ERROR nova.compute.manager [instance: c7ad553b-2149-4211-aee3-057ea83069f5] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list
[ 1649.240182] env[67899]: ERROR nova.compute.manager [instance: c7ad553b-2149-4211-aee3-057ea83069f5] for r in self._pagination(collection, path, **params):
[ 1649.240182] env[67899]: ERROR nova.compute.manager [instance: c7ad553b-2149-4211-aee3-057ea83069f5] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination
[ 1649.240182] env[67899]: ERROR nova.compute.manager [instance: c7ad553b-2149-4211-aee3-057ea83069f5] res = self.get(path, params=params)
[ 1649.240182] env[67899]: ERROR nova.compute.manager [instance: c7ad553b-2149-4211-aee3-057ea83069f5] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1649.240182] env[67899]: ERROR nova.compute.manager [instance: c7ad553b-2149-4211-aee3-057ea83069f5] ret = obj(*args, **kwargs)
[ 1649.240182] env[67899]: ERROR nova.compute.manager [instance: c7ad553b-2149-4211-aee3-057ea83069f5] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get
[ 1649.240182] env[67899]: ERROR nova.compute.manager [instance: c7ad553b-2149-4211-aee3-057ea83069f5] return self.retry_request("GET", action, body=body,
[ 1649.240182] env[67899]: ERROR nova.compute.manager [instance: c7ad553b-2149-4211-aee3-057ea83069f5] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1649.240182] env[67899]: ERROR nova.compute.manager [instance: c7ad553b-2149-4211-aee3-057ea83069f5] ret = obj(*args, **kwargs)
[ 1649.240182] env[67899]: ERROR nova.compute.manager [instance: c7ad553b-2149-4211-aee3-057ea83069f5] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request
[ 1649.240182] env[67899]: ERROR nova.compute.manager [instance: c7ad553b-2149-4211-aee3-057ea83069f5] return self.do_request(method, action, body=body,
[ 1649.240182] env[67899]: ERROR nova.compute.manager [instance: c7ad553b-2149-4211-aee3-057ea83069f5] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1649.240182] env[67899]: ERROR nova.compute.manager [instance: c7ad553b-2149-4211-aee3-057ea83069f5] ret = obj(*args, **kwargs)
[ 1649.240182] env[67899]: ERROR nova.compute.manager [instance: c7ad553b-2149-4211-aee3-057ea83069f5] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request
[ 1649.240182] env[67899]: ERROR nova.compute.manager [instance: c7ad553b-2149-4211-aee3-057ea83069f5] self._handle_fault_response(status_code, replybody, resp)
[ 1649.240182] env[67899]: ERROR nova.compute.manager [instance: c7ad553b-2149-4211-aee3-057ea83069f5] File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper
[ 1649.240182] env[67899]: ERROR nova.compute.manager [instance: c7ad553b-2149-4211-aee3-057ea83069f5] raise exception.NeutronAdminCredentialConfigurationInvalid()
[ 1649.240182] env[67899]: ERROR nova.compute.manager [instance: c7ad553b-2149-4211-aee3-057ea83069f5] nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception.
[ 1649.240182] env[67899]: ERROR nova.compute.manager [instance: c7ad553b-2149-4211-aee3-057ea83069f5]
[ 1649.266643] env[67899]: DEBUG oslo_concurrency.lockutils [None req-d53afc97-050f-4980-aec2-ea7bdb003d01 tempest-ServerShowV247Test-219304978 tempest-ServerShowV247Test-219304978-project-member] Lock "c7ad553b-2149-4211-aee3-057ea83069f5" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.324s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1649.277019] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7836f043-07be-4cae-b442-b9dad48a5cda {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1649.286907] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6665382f-a3ac-4e4d-932a-031f16549b37 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1649.319149] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6329c6fb-994f-46ae-983e-9e0e82649408 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1649.327228] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-07a3a870-e519-4f51-b1a2-731e22269ec4 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1649.340802] env[67899]: DEBUG nova.compute.provider_tree [None req-bb2aacf0-6d67-4744-ac3b-ad8f8e88f68e tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Inventory has not changed in ProviderTree for provider: fffa0b42-f65d-4394-a98c-0df038b9ed4b {{(pid=67899) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1649.344454] env[67899]: INFO nova.compute.manager [None req-d53afc97-050f-4980-aec2-ea7bdb003d01 tempest-ServerShowV247Test-219304978 tempest-ServerShowV247Test-219304978-project-member] [instance: c7ad553b-2149-4211-aee3-057ea83069f5] Successfully reverted task state from None on failure for instance.
[ 1649.347659] env[67899]: ERROR oslo_messaging.rpc.server [None req-d53afc97-050f-4980-aec2-ea7bdb003d01 tempest-ServerShowV247Test-219304978 tempest-ServerShowV247Test-219304978-project-member] Exception during message handling: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception.
[ 1649.347659] env[67899]: ERROR oslo_messaging.rpc.server Traceback (most recent call last):
[ 1649.347659] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1649.347659] env[67899]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs)
[ 1649.347659] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response
[ 1649.347659] env[67899]: ERROR oslo_messaging.rpc.server exception_handler_v20(status_code, error_body)
[ 1649.347659] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20
[ 1649.347659] env[67899]: ERROR oslo_messaging.rpc.server raise client_exc(message=error_message,
[ 1649.347659] env[67899]: ERROR oslo_messaging.rpc.server neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}}
[ 1649.347659] env[67899]: ERROR oslo_messaging.rpc.server Neutron server returns request_ids: ['req-5f69e53c-6ca1-4a25-b2ea-a564dcfb27dd']
[ 1649.347659] env[67899]: ERROR oslo_messaging.rpc.server
[ 1649.347659] env[67899]: ERROR oslo_messaging.rpc.server During handling of the above exception, another exception occurred:
[ 1649.347659] env[67899]: ERROR oslo_messaging.rpc.server
[ 1649.347659] env[67899]: ERROR oslo_messaging.rpc.server Traceback (most recent call last):
[ 1649.347659] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/server.py", line 165, in _process_incoming
[ 1649.347659] env[67899]: ERROR oslo_messaging.rpc.server res = self.dispatcher.dispatch(message)
[ 1649.347659] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/dispatcher.py", line 309, in dispatch
[ 1649.347659] env[67899]: ERROR oslo_messaging.rpc.server return self._do_dispatch(endpoint, method, ctxt, args)
[ 1649.347659] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/dispatcher.py", line 229, in _do_dispatch
[ 1649.347659] env[67899]: ERROR oslo_messaging.rpc.server result = func(ctxt, **new_args)
[ 1649.347659] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/exception_wrapper.py", line 65, in wrapped
[ 1649.347659] env[67899]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception():
[ 1649.347659] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 1649.347659] env[67899]: ERROR oslo_messaging.rpc.server self.force_reraise()
[ 1649.347659] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 1649.347659] env[67899]: ERROR oslo_messaging.rpc.server raise self.value
[ 1649.347659] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/exception_wrapper.py", line 63, in wrapped
[ 1649.347659] env[67899]: ERROR oslo_messaging.rpc.server return f(self, context, *args, **kw)
[ 1649.347659] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 166, in decorated_function
[ 1649.347659] env[67899]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception():
[ 1649.347659] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 1649.347659] env[67899]: ERROR oslo_messaging.rpc.server self.force_reraise()
[ 1649.347659] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 1649.347659] env[67899]: ERROR oslo_messaging.rpc.server raise self.value
[ 1649.347659] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 157, in decorated_function
[ 1649.347659] env[67899]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs)
[ 1649.347659] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/utils.py", line 1439, in decorated_function
[ 1649.347659] env[67899]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs)
[ 1649.347659] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 213, in decorated_function
[ 1649.347659] env[67899]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception():
[ 1649.347659] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 1649.347659] env[67899]: ERROR oslo_messaging.rpc.server self.force_reraise()
[ 1649.347659] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 1649.347659] env[67899]: ERROR oslo_messaging.rpc.server raise self.value
[ 1649.347659] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 203, in decorated_function
[ 1649.347659] env[67899]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs)
[ 1649.347659] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3327, in terminate_instance
[ 1649.347659] env[67899]: ERROR oslo_messaging.rpc.server do_terminate_instance(instance, bdms)
[ 1649.347659] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py", line 414, in inner
[ 1649.347659] env[67899]: ERROR oslo_messaging.rpc.server return f(*args, **kwargs)
[ 1649.347659] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3322, in do_terminate_instance
[ 1649.347659] env[67899]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception():
[ 1649.349602] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 1649.349602] env[67899]: ERROR oslo_messaging.rpc.server self.force_reraise()
[ 1649.349602] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 1649.349602] env[67899]: ERROR oslo_messaging.rpc.server raise self.value
[ 1649.349602] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3315, in do_terminate_instance
[ 1649.349602] env[67899]: ERROR oslo_messaging.rpc.server self._delete_instance(context, instance, bdms)
[ 1649.349602] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3250, in _delete_instance
[ 1649.349602] env[67899]: ERROR oslo_messaging.rpc.server self._shutdown_instance(context, instance, bdms)
[ 1649.349602] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3144, in _shutdown_instance
[ 1649.349602] env[67899]: ERROR oslo_messaging.rpc.server self._try_deallocate_network(context, instance, requested_networks)
[ 1649.349602] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3058, in _try_deallocate_network
[ 1649.349602] env[67899]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception():
[ 1649.349602] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 1649.349602] env[67899]: ERROR oslo_messaging.rpc.server self.force_reraise()
[ 1649.349602] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 1649.349602] env[67899]: ERROR oslo_messaging.rpc.server raise self.value
[ 1649.349602] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3056, in _try_deallocate_network
[ 1649.349602] env[67899]: ERROR oslo_messaging.rpc.server _deallocate_network_with_retries()
[ 1649.349602] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 436, in func
[ 1649.349602] env[67899]: ERROR oslo_messaging.rpc.server return evt.wait()
[ 1649.349602] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 1649.349602] env[67899]: ERROR oslo_messaging.rpc.server result = hub.switch()
[ 1649.349602] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 1649.349602] env[67899]: ERROR oslo_messaging.rpc.server return self.greenlet.switch()
[ 1649.349602] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop
[ 1649.349602] env[67899]: ERROR oslo_messaging.rpc.server result = func(*self.args, **self.kw)
[ 1649.349602] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func
[ 1649.349602] env[67899]: ERROR oslo_messaging.rpc.server result = f(*args, **kwargs)
[ 1649.349602] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3045, in _deallocate_network_with_retries
[ 1649.349602] env[67899]: ERROR oslo_messaging.rpc.server self._deallocate_network(
[ 1649.349602] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 2265, in _deallocate_network
[ 1649.349602] env[67899]: ERROR oslo_messaging.rpc.server self.network_api.deallocate_for_instance(
[ 1649.349602] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance
[ 1649.349602] env[67899]: ERROR oslo_messaging.rpc.server data = neutron.list_ports(**search_opts)
[ 1649.349602] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1649.349602] env[67899]: ERROR
oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1649.349602] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1649.349602] env[67899]: ERROR oslo_messaging.rpc.server return self.list('ports', self.ports_path, retrieve_all, [ 1649.349602] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1649.349602] env[67899]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1649.349602] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1649.349602] env[67899]: ERROR oslo_messaging.rpc.server for r in self._pagination(collection, path, **params): [ 1649.349602] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1649.349602] env[67899]: ERROR oslo_messaging.rpc.server res = self.get(path, params=params) [ 1649.349602] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1649.349602] env[67899]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1649.349602] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1649.349602] env[67899]: ERROR oslo_messaging.rpc.server return self.retry_request("GET", action, body=body, [ 1649.349602] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1649.349602] env[67899]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1649.349602] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1649.349602] env[67899]: ERROR oslo_messaging.rpc.server return self.do_request(method, action, body=body, [ 1649.349602] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1649.349602] env[67899]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1649.351633] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1649.351633] env[67899]: ERROR oslo_messaging.rpc.server self._handle_fault_response(status_code, replybody, resp) [ 1649.351633] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 1649.351633] env[67899]: ERROR oslo_messaging.rpc.server raise exception.NeutronAdminCredentialConfigurationInvalid() [ 1649.351633] env[67899]: ERROR oslo_messaging.rpc.server nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
[ 1649.351633] env[67899]: ERROR oslo_messaging.rpc.server [ 1649.351633] env[67899]: DEBUG nova.scheduler.client.report [None req-bb2aacf0-6d67-4744-ac3b-ad8f8e88f68e tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Inventory has not changed for provider fffa0b42-f65d-4394-a98c-0df038b9ed4b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67899) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1649.360146] env[67899]: DEBUG oslo_concurrency.lockutils [None req-bb2aacf0-6d67-4744-ac3b-ad8f8e88f68e tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.274s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1649.360620] env[67899]: DEBUG nova.compute.manager [None req-bb2aacf0-6d67-4744-ac3b-ad8f8e88f68e tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 9b4a7c14-84dc-4222-a758-3f8f10e23b7a] Start building networks asynchronously for instance. {{(pid=67899) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1649.391498] env[67899]: DEBUG nova.compute.utils [None req-bb2aacf0-6d67-4744-ac3b-ad8f8e88f68e tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Using /dev/sd instead of None {{(pid=67899) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1649.392663] env[67899]: DEBUG nova.compute.manager [None req-bb2aacf0-6d67-4744-ac3b-ad8f8e88f68e tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 9b4a7c14-84dc-4222-a758-3f8f10e23b7a] Allocating IP information in the background. {{(pid=67899) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1649.392826] env[67899]: DEBUG nova.network.neutron [None req-bb2aacf0-6d67-4744-ac3b-ad8f8e88f68e tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 9b4a7c14-84dc-4222-a758-3f8f10e23b7a] allocate_for_instance() {{(pid=67899) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1649.400763] env[67899]: DEBUG nova.compute.manager [None req-bb2aacf0-6d67-4744-ac3b-ad8f8e88f68e tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 9b4a7c14-84dc-4222-a758-3f8f10e23b7a] Start building block device mappings for instance. 
{{(pid=67899) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1649.447435] env[67899]: DEBUG nova.policy [None req-bb2aacf0-6d67-4744-ac3b-ad8f8e88f68e tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '8f6e68af5f7147f9a8080d720a834a56', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'a6ddbe6f15c6436197b1b073170d78cf', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67899) authorize /opt/stack/nova/nova/policy.py:203}} [ 1649.464082] env[67899]: DEBUG nova.compute.manager [None req-bb2aacf0-6d67-4744-ac3b-ad8f8e88f68e tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 9b4a7c14-84dc-4222-a758-3f8f10e23b7a] Start spawning the instance on the hypervisor. {{(pid=67899) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1649.484599] env[67899]: DEBUG nova.virt.hardware [None req-bb2aacf0-6d67-4744-ac3b-ad8f8e88f68e tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T22:07:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T22:07:13Z,direct_url=,disk_format='vmdk',id=c655a05a-4a40-4b3f-b609-3ba8116ad90f,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c1aaa2970e964d7b86557399120d12c1',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T22:07:14Z,virtual_size=,visibility=), allow threads: False {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1649.484838] env[67899]: DEBUG nova.virt.hardware [None req-bb2aacf0-6d67-4744-ac3b-ad8f8e88f68e tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Flavor limits 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1649.484993] env[67899]: DEBUG nova.virt.hardware [None req-bb2aacf0-6d67-4744-ac3b-ad8f8e88f68e tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Image limits 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1649.485196] env[67899]: DEBUG nova.virt.hardware [None req-bb2aacf0-6d67-4744-ac3b-ad8f8e88f68e tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Flavor pref 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1649.485343] env[67899]: DEBUG nova.virt.hardware [None req-bb2aacf0-6d67-4744-ac3b-ad8f8e88f68e tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Image pref 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1649.485489] env[67899]: DEBUG nova.virt.hardware [None req-bb2aacf0-6d67-4744-ac3b-ad8f8e88f68e tempest-ImagesTestJSON-610999726 
tempest-ImagesTestJSON-610999726-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1649.485697] env[67899]: DEBUG nova.virt.hardware [None req-bb2aacf0-6d67-4744-ac3b-ad8f8e88f68e tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1649.485858] env[67899]: DEBUG nova.virt.hardware [None req-bb2aacf0-6d67-4744-ac3b-ad8f8e88f68e tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67899) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1649.486040] env[67899]: DEBUG nova.virt.hardware [None req-bb2aacf0-6d67-4744-ac3b-ad8f8e88f68e tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Got 1 possible topologies {{(pid=67899) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1649.486195] env[67899]: DEBUG nova.virt.hardware [None req-bb2aacf0-6d67-4744-ac3b-ad8f8e88f68e tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1649.486368] env[67899]: DEBUG nova.virt.hardware [None req-bb2aacf0-6d67-4744-ac3b-ad8f8e88f68e tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1649.487220] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-beec8b87-92d1-4aa9-84d0-6cbbbd8fea94 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1649.496697] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f2c40e85-2954-4d5a-83ff-ff5946ec5b67 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1649.807146] env[67899]: DEBUG nova.network.neutron [None req-bb2aacf0-6d67-4744-ac3b-ad8f8e88f68e tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 9b4a7c14-84dc-4222-a758-3f8f10e23b7a] Successfully created port: 2b2328de-7485-4700-87ac-7693a2fc7b2d {{(pid=67899) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1649.996142] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1649.996403] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1649.996568] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running 
periodic task ComputeManager._instance_usage_audit {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1649.996715] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1650.562337] env[67899]: DEBUG nova.compute.manager [req-3cffa733-3ec8-4f37-b64e-cfafccd201ed req-a572e73e-4b7f-4b44-90ba-f20a135c9c18 service nova] [instance: 9b4a7c14-84dc-4222-a758-3f8f10e23b7a] Received event network-vif-plugged-2b2328de-7485-4700-87ac-7693a2fc7b2d {{(pid=67899) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1650.562561] env[67899]: DEBUG oslo_concurrency.lockutils [req-3cffa733-3ec8-4f37-b64e-cfafccd201ed req-a572e73e-4b7f-4b44-90ba-f20a135c9c18 service nova] Acquiring lock "9b4a7c14-84dc-4222-a758-3f8f10e23b7a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1650.562821] env[67899]: DEBUG oslo_concurrency.lockutils [req-3cffa733-3ec8-4f37-b64e-cfafccd201ed req-a572e73e-4b7f-4b44-90ba-f20a135c9c18 service nova] Lock "9b4a7c14-84dc-4222-a758-3f8f10e23b7a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1650.562994] env[67899]: DEBUG oslo_concurrency.lockutils [req-3cffa733-3ec8-4f37-b64e-cfafccd201ed req-a572e73e-4b7f-4b44-90ba-f20a135c9c18 service nova] Lock "9b4a7c14-84dc-4222-a758-3f8f10e23b7a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1650.563174] env[67899]: DEBUG nova.compute.manager [req-3cffa733-3ec8-4f37-b64e-cfafccd201ed req-a572e73e-4b7f-4b44-90ba-f20a135c9c18 service nova] [instance: 9b4a7c14-84dc-4222-a758-3f8f10e23b7a] No waiting events found dispatching network-vif-plugged-2b2328de-7485-4700-87ac-7693a2fc7b2d {{(pid=67899) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1650.563342] env[67899]: WARNING nova.compute.manager [req-3cffa733-3ec8-4f37-b64e-cfafccd201ed req-a572e73e-4b7f-4b44-90ba-f20a135c9c18 service nova] [instance: 9b4a7c14-84dc-4222-a758-3f8f10e23b7a] Received unexpected event network-vif-plugged-2b2328de-7485-4700-87ac-7693a2fc7b2d for instance with vm_state building and task_state spawning.
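
[editor's note] The NeutronAdminCredentialConfigurationInvalid tracebacks above all funnel through the same wrapper in nova/network/neutron.py (lines 196 and 212 in this tree): every neutronclient call is proxied, and on an admin client a 401 from neutron-server is taken to mean the [neutron] service credentials in nova.conf are wrong, so the wrapper converts it into a configuration error rather than retrying. A minimal, hypothetical sketch of that translation, with stand-in exception classes (this is a reconstruction from the traceback, not the nova source):

```python
# Hypothetical, simplified reconstruction of the 401 translation seen in the
# tracebacks above (nova/network/neutron.py lines 196 and 212 in this tree);
# NOT the actual nova source. Stand-in exception classes are defined locally.
import functools


class Unauthorized(Exception):
    """Stand-in for neutronclient.common.exceptions.Unauthorized (HTTP 401)."""


class NeutronAdminCredentialConfigurationInvalid(Exception):
    """Stand-in for the nova.exception class raised at neutron.py:212."""


def translated(func):
    """Proxy a neutronclient method the way the traceback shows.

    Each hop in the traceback alternates between a client method and this
    wrapper ("neutron.py, line 196, in wrapper: ret = obj(*args, **kwargs)");
    when the innermost call raises Unauthorized, the wrapper re-raises it as
    a nova-level configuration error instead of a transient network fault.
    """
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        try:
            return func(*args, **kwargs)   # neutron.py:196
        except Unauthorized:
            # neutron.py:212 -- admin credentials in nova.conf presumed bad
            raise NeutronAdminCredentialConfigurationInvalid()
    return wrapper


@translated
def list_ports(**search_opts):
    """Simulates the list_ports call deallocate_for_instance makes above."""
    raise Unauthorized()   # the 401 that neutron-server returned
```

This is why a transient Keystone/Neutron auth hiccup during _try_deallocate_network surfaces as a fatal RPC error on terminate_instance, as seen above.
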
[ 1650.649044] env[67899]: DEBUG oslo_concurrency.lockutils [None req-d7480f5c-af4c-48d1-b988-79e9468de4ba tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Acquiring lock "9b4a7c14-84dc-4222-a758-3f8f10e23b7a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1650.658951] env[67899]: DEBUG nova.network.neutron [None req-bb2aacf0-6d67-4744-ac3b-ad8f8e88f68e tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 9b4a7c14-84dc-4222-a758-3f8f10e23b7a] Successfully updated port: 2b2328de-7485-4700-87ac-7693a2fc7b2d {{(pid=67899) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1650.670513] env[67899]: DEBUG oslo_concurrency.lockutils [None req-bb2aacf0-6d67-4744-ac3b-ad8f8e88f68e tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Acquiring lock "refresh_cache-9b4a7c14-84dc-4222-a758-3f8f10e23b7a" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1650.670680] env[67899]: DEBUG oslo_concurrency.lockutils [None req-bb2aacf0-6d67-4744-ac3b-ad8f8e88f68e tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Acquired lock "refresh_cache-9b4a7c14-84dc-4222-a758-3f8f10e23b7a" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1650.670832] env[67899]: DEBUG nova.network.neutron [None req-bb2aacf0-6d67-4744-ac3b-ad8f8e88f68e tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 9b4a7c14-84dc-4222-a758-3f8f10e23b7a] Building network info cache for instance {{(pid=67899) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1650.706487] env[67899]: DEBUG nova.network.neutron [None req-bb2aacf0-6d67-4744-ac3b-ad8f8e88f68e tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 9b4a7c14-84dc-4222-a758-3f8f10e23b7a] Instance cache missing network info.
{{(pid=67899) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1650.903222] env[67899]: DEBUG nova.network.neutron [None req-bb2aacf0-6d67-4744-ac3b-ad8f8e88f68e tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 9b4a7c14-84dc-4222-a758-3f8f10e23b7a] Updating instance_info_cache with network_info: [{"id": "2b2328de-7485-4700-87ac-7693a2fc7b2d", "address": "fa:16:3e:7d:ba:e2", "network": {"id": "857be8e0-b3fa-4836-87d8-37b0af1d0354", "bridge": "br-int", "label": "tempest-ImagesTestJSON-566779850-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "a6ddbe6f15c6436197b1b073170d78cf", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "19598cc1-e105-4565-906a-09dde75e3fbe", "external-id": "nsx-vlan-transportzone-371", "segmentation_id": 371, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap2b2328de-74", "ovs_interfaceid": "2b2328de-7485-4700-87ac-7693a2fc7b2d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1650.916844] env[67899]: DEBUG oslo_concurrency.lockutils [None req-bb2aacf0-6d67-4744-ac3b-ad8f8e88f68e tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Releasing lock "refresh_cache-9b4a7c14-84dc-4222-a758-3f8f10e23b7a" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1650.917169] env[67899]: DEBUG nova.compute.manager [None req-bb2aacf0-6d67-4744-ac3b-ad8f8e88f68e tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 9b4a7c14-84dc-4222-a758-3f8f10e23b7a] Instance network_info: |[{"id": "2b2328de-7485-4700-87ac-7693a2fc7b2d", "address": "fa:16:3e:7d:ba:e2", "network": {"id": "857be8e0-b3fa-4836-87d8-37b0af1d0354", "bridge": "br-int", "label": "tempest-ImagesTestJSON-566779850-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "a6ddbe6f15c6436197b1b073170d78cf", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "19598cc1-e105-4565-906a-09dde75e3fbe", "external-id": "nsx-vlan-transportzone-371", "segmentation_id": 371, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap2b2328de-74", "ovs_interfaceid": "2b2328de-7485-4700-87ac-7693a2fc7b2d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67899) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1650.917555] env[67899]: DEBUG 
nova.virt.vmwareapi.vmops [None req-bb2aacf0-6d67-4744-ac3b-ad8f8e88f68e tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 9b4a7c14-84dc-4222-a758-3f8f10e23b7a] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:7d:ba:e2', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '19598cc1-e105-4565-906a-09dde75e3fbe', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '2b2328de-7485-4700-87ac-7693a2fc7b2d', 'vif_model': 'vmxnet3'}] {{(pid=67899) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1650.925336] env[67899]: DEBUG oslo.service.loopingcall [None req-bb2aacf0-6d67-4744-ac3b-ad8f8e88f68e tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67899) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1650.925824] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 9b4a7c14-84dc-4222-a758-3f8f10e23b7a] Creating VM on the ESX host {{(pid=67899) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1650.926076] env[67899]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-344c3b86-62d2-4831-84db-f5d7654424f0 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1650.946222] env[67899]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1650.946222] env[67899]: value = "task-3467988" [ 1650.946222] env[67899]: _type = "Task" [ 1650.946222] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1650.953754] env[67899]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467988, 'name': CreateVM_Task} progress is 0%. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1651.456664] env[67899]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467988, 'name': CreateVM_Task, 'duration_secs': 0.304638} completed successfully. 
{{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1651.456822] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 9b4a7c14-84dc-4222-a758-3f8f10e23b7a] Created VM on the ESX host {{(pid=67899) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1651.457417] env[67899]: DEBUG oslo_concurrency.lockutils [None req-bb2aacf0-6d67-4744-ac3b-ad8f8e88f68e tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1651.457582] env[67899]: DEBUG oslo_concurrency.lockutils [None req-bb2aacf0-6d67-4744-ac3b-ad8f8e88f68e tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1651.457893] env[67899]: DEBUG oslo_concurrency.lockutils [None req-bb2aacf0-6d67-4744-ac3b-ad8f8e88f68e tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1651.458150] env[67899]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-2017cec2-cc84-452e-9d56-4a4d61236888 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1651.462722] env[67899]: DEBUG oslo_vmware.api [None req-bb2aacf0-6d67-4744-ac3b-ad8f8e88f68e tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Waiting for the task: (returnval){ [ 1651.462722] env[67899]: value = "session[5298859c-589e-10f3-4f70-bf47a7ea371b]529d2328-5f4b-c6e6-86e3-36c04873028d" [ 1651.462722] env[67899]: _type = "Task" [ 1651.462722] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1651.470092] env[67899]: DEBUG oslo_vmware.api [None req-bb2aacf0-6d67-4744-ac3b-ad8f8e88f68e tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Task: {'id': session[5298859c-589e-10f3-4f70-bf47a7ea371b]529d2328-5f4b-c6e6-86e3-36c04873028d, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1651.973290] env[67899]: DEBUG oslo_concurrency.lockutils [None req-bb2aacf0-6d67-4744-ac3b-ad8f8e88f68e tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1651.973614] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-bb2aacf0-6d67-4744-ac3b-ad8f8e88f68e tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 9b4a7c14-84dc-4222-a758-3f8f10e23b7a] Processing image c655a05a-4a40-4b3f-b609-3ba8116ad90f {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1651.973727] env[67899]: DEBUG oslo_concurrency.lockutils [None req-bb2aacf0-6d67-4744-ac3b-ad8f8e88f68e tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1651.992342] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1652.016542] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1652.016795] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=67899) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1652.586952] env[67899]: DEBUG nova.compute.manager [req-0617d9f6-c786-478e-91eb-9da8fc440ed2 req-1ce891c6-0c84-44b1-9a66-0af546606de1 service nova] [instance: 9b4a7c14-84dc-4222-a758-3f8f10e23b7a] Received event network-changed-2b2328de-7485-4700-87ac-7693a2fc7b2d {{(pid=67899) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1652.587118] env[67899]: DEBUG nova.compute.manager [req-0617d9f6-c786-478e-91eb-9da8fc440ed2 req-1ce891c6-0c84-44b1-9a66-0af546606de1 service nova] [instance: 9b4a7c14-84dc-4222-a758-3f8f10e23b7a] Refreshing instance network info cache due to event network-changed-2b2328de-7485-4700-87ac-7693a2fc7b2d. 
{{(pid=67899) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1652.587328] env[67899]: DEBUG oslo_concurrency.lockutils [req-0617d9f6-c786-478e-91eb-9da8fc440ed2 req-1ce891c6-0c84-44b1-9a66-0af546606de1 service nova] Acquiring lock "refresh_cache-9b4a7c14-84dc-4222-a758-3f8f10e23b7a" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1652.587468] env[67899]: DEBUG oslo_concurrency.lockutils [req-0617d9f6-c786-478e-91eb-9da8fc440ed2 req-1ce891c6-0c84-44b1-9a66-0af546606de1 service nova] Acquired lock "refresh_cache-9b4a7c14-84dc-4222-a758-3f8f10e23b7a" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1652.587625] env[67899]: DEBUG nova.network.neutron [req-0617d9f6-c786-478e-91eb-9da8fc440ed2 req-1ce891c6-0c84-44b1-9a66-0af546606de1 service nova] [instance: 9b4a7c14-84dc-4222-a758-3f8f10e23b7a] Refreshing network info cache for port 2b2328de-7485-4700-87ac-7693a2fc7b2d {{(pid=67899) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1652.847585] env[67899]: DEBUG nova.network.neutron [req-0617d9f6-c786-478e-91eb-9da8fc440ed2 req-1ce891c6-0c84-44b1-9a66-0af546606de1 service nova] [instance: 9b4a7c14-84dc-4222-a758-3f8f10e23b7a] Updated VIF entry in instance network info cache for port 2b2328de-7485-4700-87ac-7693a2fc7b2d. {{(pid=67899) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1652.847952] env[67899]: DEBUG nova.network.neutron [req-0617d9f6-c786-478e-91eb-9da8fc440ed2 req-1ce891c6-0c84-44b1-9a66-0af546606de1 service nova] [instance: 9b4a7c14-84dc-4222-a758-3f8f10e23b7a] Updating instance_info_cache with network_info: [{"id": "2b2328de-7485-4700-87ac-7693a2fc7b2d", "address": "fa:16:3e:7d:ba:e2", "network": {"id": "857be8e0-b3fa-4836-87d8-37b0af1d0354", "bridge": "br-int", "label": "tempest-ImagesTestJSON-566779850-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "a6ddbe6f15c6436197b1b073170d78cf", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "19598cc1-e105-4565-906a-09dde75e3fbe", "external-id": "nsx-vlan-transportzone-371", "segmentation_id": 371, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap2b2328de-74", "ovs_interfaceid": "2b2328de-7485-4700-87ac-7693a2fc7b2d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1652.857193] env[67899]: DEBUG oslo_concurrency.lockutils [req-0617d9f6-c786-478e-91eb-9da8fc440ed2 req-1ce891c6-0c84-44b1-9a66-0af546606de1 service nova] Releasing lock "refresh_cache-9b4a7c14-84dc-4222-a758-3f8f10e23b7a" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1655.996825] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager.update_available_resource {{(pid=67899) run_periodic_tasks 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1656.007607] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1656.007834] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1656.007997] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1656.008168] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67899) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1656.009288] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e3c8e4ba-7a65-48fc-a370-f93bda0b70a5 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1656.018081] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f20840be-d4d6-44c8-bb1a-fbd18c3d6c46 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1656.031684] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8a1ef350-0fcb-470c-9129-8da0a33f9f09 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1656.037661] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1157c0cd-0517-4c57-8e20-b2d2c7062bb0 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1656.065404] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180885MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=67899) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1656.065547] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1656.065731] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67899) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1656.139076] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 6fda2654-4579-4b9a-a97c-97e0128fff14 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1656.139222] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 7a82e877-8a39-4684-8b75-711b7bedddac actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1656.139350] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance dc7bf2b7-631d-4933-92db-1679ad823379 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1656.139473] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 8a157747-34e2-48f7-bf21-d17810122954 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1656.139592] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 03684169-e2c8-4cf5-8e79-b118725927f1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1656.139709] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 3a077713-f7a2-4a61-bb17-987af6a52c4a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1656.139822] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance e179db1d-ee0c-4f47-a958-40dd69209d26 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1656.139934] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance addcc88a-6bb5-4a70-938e-49c0c79c8414 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1656.140062] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance a6544af8-879d-4c45-bee4-8551b861fc66 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1656.140176] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 9b4a7c14-84dc-4222-a758-3f8f10e23b7a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1656.150751] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance cd4ae8d3-63d9-463d-9428-fa2c1e8d1679 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1656.160375] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance e08f620d-63a0-45cb-99c6-d9d95c938b38 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1656.160597] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67899) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1656.160750] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67899) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1656.298241] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6e354e30-b476-4b88-9b45-77a62f491ff2 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1656.306194] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d58e8fc0-1185-47ab-bb6c-ebfedaefaf16 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1656.336294] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ada28ab2-b04a-45fd-9631-ffe8738c4154 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1656.343694] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-68500fa6-88dd-4588-9533-4a8da04c89eb {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1656.363776] env[67899]: DEBUG nova.compute.provider_tree [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Inventory has not changed in ProviderTree for provider: fffa0b42-f65d-4394-a98c-0df038b9ed4b {{(pid=67899) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1656.373589] env[67899]: DEBUG nova.scheduler.client.report [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Inventory has not changed for provider fffa0b42-f65d-4394-a98c-0df038b9ed4b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67899) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1656.390461] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67899) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1656.390781] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.325s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1680.698264] env[67899]: DEBUG oslo_concurrency.lockutils [None 
req-a0855fe1-c9c8-4cb7-b7b5-b5ff6b519a81 tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] Acquiring lock "77ac61b9-48cc-4ae8-81e7-273841f7b42d" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1680.698603] env[67899]: DEBUG oslo_concurrency.lockutils [None req-a0855fe1-c9c8-4cb7-b7b5-b5ff6b519a81 tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] Lock "77ac61b9-48cc-4ae8-81e7-273841f7b42d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1696.473578] env[67899]: WARNING oslo_vmware.rw_handles [None req-b4663fb3-1872-4a4f-8f96-67cc37309cba tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1696.473578] env[67899]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1696.473578] env[67899]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1696.473578] env[67899]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1696.473578] env[67899]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1696.473578] env[67899]: ERROR oslo_vmware.rw_handles response.begin() [ 1696.473578] env[67899]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1696.473578] env[67899]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1696.473578] env[67899]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1696.473578] env[67899]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1696.473578] env[67899]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1696.473578] env[67899]: ERROR oslo_vmware.rw_handles [ 1696.476038] env[67899]: DEBUG nova.virt.vmwareapi.images [None req-b4663fb3-1872-4a4f-8f96-67cc37309cba tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 6fda2654-4579-4b9a-a97c-97e0128fff14] Downloaded image file data c655a05a-4a40-4b3f-b609-3ba8116ad90f to vmware_temp/354fe54e-5e6f-4f5f-b069-6921da9fa028/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk on the data store datastore1 {{(pid=67899) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1696.476134] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-b4663fb3-1872-4a4f-8f96-67cc37309cba tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 6fda2654-4579-4b9a-a97c-97e0128fff14] Caching image {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1696.476395] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [None req-b4663fb3-1872-4a4f-8f96-67cc37309cba tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Copying Virtual Disk [datastore1] vmware_temp/354fe54e-5e6f-4f5f-b069-6921da9fa028/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk to [datastore1]
vmware_temp/354fe54e-5e6f-4f5f-b069-6921da9fa028/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk {{(pid=67899) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1696.476751] env[67899]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-f359b43b-26e8-4ba6-9d0a-19cfe4addbc7 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1696.485164] env[67899]: DEBUG oslo_vmware.api [None req-b4663fb3-1872-4a4f-8f96-67cc37309cba tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Waiting for the task: (returnval){ [ 1696.485164] env[67899]: value = "task-3467989" [ 1696.485164] env[67899]: _type = "Task" [ 1696.485164] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1696.492966] env[67899]: DEBUG oslo_vmware.api [None req-b4663fb3-1872-4a4f-8f96-67cc37309cba tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Task: {'id': task-3467989, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1696.995925] env[67899]: DEBUG oslo_vmware.exceptions [None req-b4663fb3-1872-4a4f-8f96-67cc37309cba tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Fault InvalidArgument not matched. {{(pid=67899) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1696.996160] env[67899]: DEBUG oslo_concurrency.lockutils [None req-b4663fb3-1872-4a4f-8f96-67cc37309cba tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1696.996712] env[67899]: ERROR nova.compute.manager [None req-b4663fb3-1872-4a4f-8f96-67cc37309cba tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 6fda2654-4579-4b9a-a97c-97e0128fff14] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1696.996712] env[67899]: Faults: ['InvalidArgument'] [ 1696.996712] env[67899]: ERROR nova.compute.manager [instance: 6fda2654-4579-4b9a-a97c-97e0128fff14] Traceback (most recent call last): [ 1696.996712] env[67899]: ERROR nova.compute.manager [instance: 6fda2654-4579-4b9a-a97c-97e0128fff14] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1696.996712] env[67899]: ERROR nova.compute.manager [instance: 6fda2654-4579-4b9a-a97c-97e0128fff14] yield resources [ 1696.996712] env[67899]: ERROR nova.compute.manager [instance: 6fda2654-4579-4b9a-a97c-97e0128fff14] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1696.996712] env[67899]: ERROR nova.compute.manager [instance: 6fda2654-4579-4b9a-a97c-97e0128fff14] self.driver.spawn(context, instance, image_meta, [ 1696.996712] env[67899]: ERROR nova.compute.manager [instance: 6fda2654-4579-4b9a-a97c-97e0128fff14] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1696.996712] env[67899]: ERROR nova.compute.manager [instance: 6fda2654-4579-4b9a-a97c-97e0128fff14] self._vmops.spawn(context, instance, 
image_meta, injected_files, [ 1696.996712] env[67899]: ERROR nova.compute.manager [instance: 6fda2654-4579-4b9a-a97c-97e0128fff14] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1696.996712] env[67899]: ERROR nova.compute.manager [instance: 6fda2654-4579-4b9a-a97c-97e0128fff14] self._fetch_image_if_missing(context, vi) [ 1696.996712] env[67899]: ERROR nova.compute.manager [instance: 6fda2654-4579-4b9a-a97c-97e0128fff14] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1696.996712] env[67899]: ERROR nova.compute.manager [instance: 6fda2654-4579-4b9a-a97c-97e0128fff14] image_cache(vi, tmp_image_ds_loc) [ 1696.996712] env[67899]: ERROR nova.compute.manager [instance: 6fda2654-4579-4b9a-a97c-97e0128fff14] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1696.996712] env[67899]: ERROR nova.compute.manager [instance: 6fda2654-4579-4b9a-a97c-97e0128fff14] vm_util.copy_virtual_disk( [ 1696.996712] env[67899]: ERROR nova.compute.manager [instance: 6fda2654-4579-4b9a-a97c-97e0128fff14] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1696.996712] env[67899]: ERROR nova.compute.manager [instance: 6fda2654-4579-4b9a-a97c-97e0128fff14] session._wait_for_task(vmdk_copy_task) [ 1696.996712] env[67899]: ERROR nova.compute.manager [instance: 6fda2654-4579-4b9a-a97c-97e0128fff14] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1696.996712] env[67899]: ERROR nova.compute.manager [instance: 6fda2654-4579-4b9a-a97c-97e0128fff14] return self.wait_for_task(task_ref) [ 1696.996712] env[67899]: ERROR nova.compute.manager [instance: 6fda2654-4579-4b9a-a97c-97e0128fff14] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1696.996712] env[67899]: ERROR nova.compute.manager [instance: 6fda2654-4579-4b9a-a97c-97e0128fff14] return evt.wait() [ 1696.996712] env[67899]: ERROR nova.compute.manager [instance: 6fda2654-4579-4b9a-a97c-97e0128fff14] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1696.996712] env[67899]: ERROR nova.compute.manager [instance: 6fda2654-4579-4b9a-a97c-97e0128fff14] result = hub.switch() [ 1696.996712] env[67899]: ERROR nova.compute.manager [instance: 6fda2654-4579-4b9a-a97c-97e0128fff14] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1696.996712] env[67899]: ERROR nova.compute.manager [instance: 6fda2654-4579-4b9a-a97c-97e0128fff14] return self.greenlet.switch() [ 1696.996712] env[67899]: ERROR nova.compute.manager [instance: 6fda2654-4579-4b9a-a97c-97e0128fff14] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1696.996712] env[67899]: ERROR nova.compute.manager [instance: 6fda2654-4579-4b9a-a97c-97e0128fff14] self.f(*self.args, **self.kw) [ 1696.996712] env[67899]: ERROR nova.compute.manager [instance: 6fda2654-4579-4b9a-a97c-97e0128fff14] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1696.996712] env[67899]: ERROR nova.compute.manager [instance: 6fda2654-4579-4b9a-a97c-97e0128fff14] raise exceptions.translate_fault(task_info.error) [ 1696.996712] env[67899]: ERROR nova.compute.manager [instance: 6fda2654-4579-4b9a-a97c-97e0128fff14] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1696.996712] 
env[67899]: ERROR nova.compute.manager [instance: 6fda2654-4579-4b9a-a97c-97e0128fff14] Faults: ['InvalidArgument'] [ 1696.996712] env[67899]: ERROR nova.compute.manager [instance: 6fda2654-4579-4b9a-a97c-97e0128fff14] [ 1696.997631] env[67899]: INFO nova.compute.manager [None req-b4663fb3-1872-4a4f-8f96-67cc37309cba tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 6fda2654-4579-4b9a-a97c-97e0128fff14] Terminating instance [ 1696.999051] env[67899]: DEBUG oslo_concurrency.lockutils [None req-e1b9224c-7aca-4d45-b950-f391f0b228c2 tempest-ServersNegativeTestJSON-421907278 tempest-ServersNegativeTestJSON-421907278-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1696.999051] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-e1b9224c-7aca-4d45-b950-f391f0b228c2 tempest-ServersNegativeTestJSON-421907278 tempest-ServersNegativeTestJSON-421907278-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1696.999051] env[67899]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-75bca810-2aaf-47e8-a3ae-8258de23b333 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1697.001158] env[67899]: DEBUG nova.compute.manager [None req-b4663fb3-1872-4a4f-8f96-67cc37309cba tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 6fda2654-4579-4b9a-a97c-97e0128fff14] Start destroying the instance on the hypervisor. 
{{(pid=67899) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1697.001348] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-b4663fb3-1872-4a4f-8f96-67cc37309cba tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 6fda2654-4579-4b9a-a97c-97e0128fff14] Destroying instance {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1697.002055] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fe1a078f-3db5-4248-b03d-1035c04dec55 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1697.008944] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-b4663fb3-1872-4a4f-8f96-67cc37309cba tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 6fda2654-4579-4b9a-a97c-97e0128fff14] Unregistering the VM {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1697.009148] env[67899]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-952f926d-a461-4217-bf2f-8613674c2748 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1697.011182] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-e1b9224c-7aca-4d45-b950-f391f0b228c2 tempest-ServersNegativeTestJSON-421907278 tempest-ServersNegativeTestJSON-421907278-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1697.011355] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-e1b9224c-7aca-4d45-b950-f391f0b228c2 tempest-ServersNegativeTestJSON-421907278 tempest-ServersNegativeTestJSON-421907278-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67899) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1697.012284] env[67899]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-f3d67662-cc9b-4b8a-83b8-1c702341ebf0 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1697.016788] env[67899]: DEBUG oslo_vmware.api [None req-e1b9224c-7aca-4d45-b950-f391f0b228c2 tempest-ServersNegativeTestJSON-421907278 tempest-ServersNegativeTestJSON-421907278-project-member] Waiting for the task: (returnval){ [ 1697.016788] env[67899]: value = "session[5298859c-589e-10f3-4f70-bf47a7ea371b]52aff850-038e-0125-92f2-72b1002cdb62" [ 1697.016788] env[67899]: _type = "Task" [ 1697.016788] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1697.029064] env[67899]: DEBUG oslo_vmware.api [None req-e1b9224c-7aca-4d45-b950-f391f0b228c2 tempest-ServersNegativeTestJSON-421907278 tempest-ServersNegativeTestJSON-421907278-project-member] Task: {'id': session[5298859c-589e-10f3-4f70-bf47a7ea371b]52aff850-038e-0125-92f2-72b1002cdb62, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1697.077662] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-b4663fb3-1872-4a4f-8f96-67cc37309cba tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 6fda2654-4579-4b9a-a97c-97e0128fff14] Unregistered the VM {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1697.077882] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-b4663fb3-1872-4a4f-8f96-67cc37309cba tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 6fda2654-4579-4b9a-a97c-97e0128fff14] Deleting contents of the VM from datastore datastore1 {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1697.078177] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-b4663fb3-1872-4a4f-8f96-67cc37309cba tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Deleting the datastore file [datastore1] 6fda2654-4579-4b9a-a97c-97e0128fff14 {{(pid=67899) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1697.078329] env[67899]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-7d0c1187-029b-46c7-80b7-f4e3f5d5e7a5 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1697.084057] env[67899]: DEBUG oslo_vmware.api [None req-b4663fb3-1872-4a4f-8f96-67cc37309cba tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Waiting for the task: (returnval){ [ 1697.084057] env[67899]: value = "task-3467991" [ 1697.084057] env[67899]: _type = "Task" [ 1697.084057] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1697.091330] env[67899]: DEBUG oslo_vmware.api [None req-b4663fb3-1872-4a4f-8f96-67cc37309cba tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Task: {'id': task-3467991, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1697.527118] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-e1b9224c-7aca-4d45-b950-f391f0b228c2 tempest-ServersNegativeTestJSON-421907278 tempest-ServersNegativeTestJSON-421907278-project-member] [instance: 7a82e877-8a39-4684-8b75-711b7bedddac] Preparing fetch location {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1697.527396] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-e1b9224c-7aca-4d45-b950-f391f0b228c2 tempest-ServersNegativeTestJSON-421907278 tempest-ServersNegativeTestJSON-421907278-project-member] Creating directory with path [datastore1] vmware_temp/4a886641-03f5-4fd4-aa60-d233d54d7293/c655a05a-4a40-4b3f-b609-3ba8116ad90f {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1697.527573] env[67899]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-f6871511-ea4f-4843-9b54-9eb8c3ce5449 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1697.539278] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-e1b9224c-7aca-4d45-b950-f391f0b228c2 tempest-ServersNegativeTestJSON-421907278 tempest-ServersNegativeTestJSON-421907278-project-member] Created directory with path [datastore1] vmware_temp/4a886641-03f5-4fd4-aa60-d233d54d7293/c655a05a-4a40-4b3f-b609-3ba8116ad90f {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1697.539464] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-e1b9224c-7aca-4d45-b950-f391f0b228c2 tempest-ServersNegativeTestJSON-421907278 tempest-ServersNegativeTestJSON-421907278-project-member] [instance: 7a82e877-8a39-4684-8b75-711b7bedddac] Fetch image to [datastore1] vmware_temp/4a886641-03f5-4fd4-aa60-d233d54d7293/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1697.539632] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-e1b9224c-7aca-4d45-b950-f391f0b228c2 tempest-ServersNegativeTestJSON-421907278 tempest-ServersNegativeTestJSON-421907278-project-member] [instance: 7a82e877-8a39-4684-8b75-711b7bedddac] Downloading image file data c655a05a-4a40-4b3f-b609-3ba8116ad90f to [datastore1] vmware_temp/4a886641-03f5-4fd4-aa60-d233d54d7293/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk on the data store datastore1 {{(pid=67899) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1697.540362] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c43d19a7-5f4b-44c5-926d-78d2f0f83a1a {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1697.546636] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0b583c14-3db7-4734-bf00-7ffd89f3373f {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1697.555520] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-26a1995c-831b-4b8d-b716-e1f9568fda6c {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1697.589219] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-5475ce33-d913-4360-b22d-820fb3f2f6ff {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1697.596147] env[67899]: DEBUG oslo_vmware.api [None req-b4663fb3-1872-4a4f-8f96-67cc37309cba tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Task: {'id': task-3467991, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.077528} completed successfully. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1697.597354] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-b4663fb3-1872-4a4f-8f96-67cc37309cba tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Deleted the datastore file {{(pid=67899) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1697.597543] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-b4663fb3-1872-4a4f-8f96-67cc37309cba tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 6fda2654-4579-4b9a-a97c-97e0128fff14] Deleted contents of the VM from datastore datastore1 {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1697.597709] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-b4663fb3-1872-4a4f-8f96-67cc37309cba tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 6fda2654-4579-4b9a-a97c-97e0128fff14] Instance destroyed {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1697.597877] env[67899]: INFO nova.compute.manager [None req-b4663fb3-1872-4a4f-8f96-67cc37309cba tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 6fda2654-4579-4b9a-a97c-97e0128fff14] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 1697.599788] env[67899]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-27f8ffd6-af9e-4797-916d-4b85ff53a8ce {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1697.601493] env[67899]: DEBUG nova.compute.claims [None req-b4663fb3-1872-4a4f-8f96-67cc37309cba tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 6fda2654-4579-4b9a-a97c-97e0128fff14] Aborting claim: {{(pid=67899) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 1697.601658] env[67899]: DEBUG oslo_concurrency.lockutils [None req-b4663fb3-1872-4a4f-8f96-67cc37309cba tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1697.601887] env[67899]: DEBUG oslo_concurrency.lockutils [None req-b4663fb3-1872-4a4f-8f96-67cc37309cba tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1697.624966] env[67899]: DEBUG nova.virt.vmwareapi.images [None req-e1b9224c-7aca-4d45-b950-f391f0b228c2 tempest-ServersNegativeTestJSON-421907278 tempest-ServersNegativeTestJSON-421907278-project-member] [instance: 7a82e877-8a39-4684-8b75-711b7bedddac] Downloading image file data c655a05a-4a40-4b3f-b609-3ba8116ad90f to the data store datastore1 {{(pid=67899) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}}
[ 1697.678607] env[67899]: DEBUG oslo_vmware.rw_handles [None req-e1b9224c-7aca-4d45-b950-f391f0b228c2 tempest-ServersNegativeTestJSON-421907278 tempest-ServersNegativeTestJSON-421907278-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/4a886641-03f5-4fd4-aa60-d233d54d7293/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67899) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}}
[ 1697.748468] env[67899]: DEBUG oslo_vmware.rw_handles [None req-e1b9224c-7aca-4d45-b950-f391f0b228c2 tempest-ServersNegativeTestJSON-421907278 tempest-ServersNegativeTestJSON-421907278-project-member] Completed reading data from the image iterator. {{(pid=67899) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}}
[ 1697.748689] env[67899]: DEBUG oslo_vmware.rw_handles [None req-e1b9224c-7aca-4d45-b950-f391f0b228c2 tempest-ServersNegativeTestJSON-421907278 tempest-ServersNegativeTestJSON-421907278-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/4a886641-03f5-4fd4-aa60-d233d54d7293/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67899) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}}
[ 1697.862540] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-80734abf-8dfc-47b8-bf4b-f708ac26514b {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1697.870395] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3948a8ab-f44b-4678-9187-10f2d7c39f91 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1697.899212] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a75436d3-5ad0-45df-b3df-769ad3cc60a3 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1697.906354] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8f577810-052d-418e-a167-b10c9f56dcbf {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1697.918930] env[67899]: DEBUG nova.compute.provider_tree [None req-b4663fb3-1872-4a4f-8f96-67cc37309cba tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Inventory has not changed in ProviderTree for provider: fffa0b42-f65d-4394-a98c-0df038b9ed4b {{(pid=67899) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1697.928095] env[67899]: DEBUG nova.scheduler.client.report [None req-b4663fb3-1872-4a4f-8f96-67cc37309cba tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Inventory has not changed for provider fffa0b42-f65d-4394-a98c-0df038b9ed4b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67899) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1697.942613] env[67899]: DEBUG oslo_concurrency.lockutils [None req-b4663fb3-1872-4a4f-8f96-67cc37309cba tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.341s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1697.943158] env[67899]: ERROR nova.compute.manager [None req-b4663fb3-1872-4a4f-8f96-67cc37309cba tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 6fda2654-4579-4b9a-a97c-97e0128fff14] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1697.943158] env[67899]: Faults: ['InvalidArgument']
[ 1697.943158] env[67899]: ERROR nova.compute.manager [instance: 6fda2654-4579-4b9a-a97c-97e0128fff14] Traceback (most recent call last):
[ 1697.943158] env[67899]: ERROR nova.compute.manager [instance: 6fda2654-4579-4b9a-a97c-97e0128fff14] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 1697.943158] env[67899]: ERROR nova.compute.manager [instance: 6fda2654-4579-4b9a-a97c-97e0128fff14] self.driver.spawn(context, instance, image_meta,
[ 1697.943158] env[67899]: ERROR nova.compute.manager [instance: 6fda2654-4579-4b9a-a97c-97e0128fff14] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 1697.943158] env[67899]: ERROR nova.compute.manager [instance: 6fda2654-4579-4b9a-a97c-97e0128fff14] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 1697.943158] env[67899]: ERROR nova.compute.manager [instance: 6fda2654-4579-4b9a-a97c-97e0128fff14] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 1697.943158] env[67899]: ERROR nova.compute.manager [instance: 6fda2654-4579-4b9a-a97c-97e0128fff14] self._fetch_image_if_missing(context, vi)
[ 1697.943158] env[67899]: ERROR nova.compute.manager [instance: 6fda2654-4579-4b9a-a97c-97e0128fff14] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 1697.943158] env[67899]: ERROR nova.compute.manager [instance: 6fda2654-4579-4b9a-a97c-97e0128fff14] image_cache(vi, tmp_image_ds_loc)
[ 1697.943158] env[67899]: ERROR nova.compute.manager [instance: 6fda2654-4579-4b9a-a97c-97e0128fff14] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 1697.943158] env[67899]: ERROR nova.compute.manager [instance: 6fda2654-4579-4b9a-a97c-97e0128fff14] vm_util.copy_virtual_disk(
[ 1697.943158] env[67899]: ERROR nova.compute.manager [instance: 6fda2654-4579-4b9a-a97c-97e0128fff14] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 1697.943158] env[67899]: ERROR nova.compute.manager [instance: 6fda2654-4579-4b9a-a97c-97e0128fff14] session._wait_for_task(vmdk_copy_task)
[ 1697.943158] env[67899]: ERROR nova.compute.manager [instance: 6fda2654-4579-4b9a-a97c-97e0128fff14] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 1697.943158] env[67899]: ERROR nova.compute.manager [instance: 6fda2654-4579-4b9a-a97c-97e0128fff14] return self.wait_for_task(task_ref)
[ 1697.943158] env[67899]: ERROR nova.compute.manager [instance: 6fda2654-4579-4b9a-a97c-97e0128fff14] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 1697.943158] env[67899]: ERROR nova.compute.manager [instance: 6fda2654-4579-4b9a-a97c-97e0128fff14] return evt.wait()
[ 1697.943158] env[67899]: ERROR nova.compute.manager [instance: 6fda2654-4579-4b9a-a97c-97e0128fff14] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 1697.943158] env[67899]: ERROR nova.compute.manager [instance: 6fda2654-4579-4b9a-a97c-97e0128fff14] result = hub.switch()
[ 1697.943158] env[67899]: ERROR nova.compute.manager [instance: 6fda2654-4579-4b9a-a97c-97e0128fff14] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 1697.943158] env[67899]: ERROR nova.compute.manager [instance: 6fda2654-4579-4b9a-a97c-97e0128fff14] return self.greenlet.switch()
[ 1697.943158] env[67899]: ERROR nova.compute.manager [instance: 6fda2654-4579-4b9a-a97c-97e0128fff14] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 1697.943158] env[67899]: ERROR nova.compute.manager [instance: 6fda2654-4579-4b9a-a97c-97e0128fff14] self.f(*self.args, **self.kw)
[ 1697.943158] env[67899]: ERROR nova.compute.manager [instance: 6fda2654-4579-4b9a-a97c-97e0128fff14] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 1697.943158] env[67899]: ERROR nova.compute.manager [instance: 6fda2654-4579-4b9a-a97c-97e0128fff14] raise exceptions.translate_fault(task_info.error)
[ 1697.943158] env[67899]: ERROR nova.compute.manager [instance: 6fda2654-4579-4b9a-a97c-97e0128fff14] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1697.943158] env[67899]: ERROR nova.compute.manager [instance: 6fda2654-4579-4b9a-a97c-97e0128fff14] Faults: ['InvalidArgument']
[ 1697.943158] env[67899]: ERROR nova.compute.manager [instance: 6fda2654-4579-4b9a-a97c-97e0128fff14]
[ 1697.944151] env[67899]: DEBUG nova.compute.utils [None req-b4663fb3-1872-4a4f-8f96-67cc37309cba tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 6fda2654-4579-4b9a-a97c-97e0128fff14] VimFaultException {{(pid=67899) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 1697.945276] env[67899]: DEBUG nova.compute.manager [None req-b4663fb3-1872-4a4f-8f96-67cc37309cba tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 6fda2654-4579-4b9a-a97c-97e0128fff14] Build of instance 6fda2654-4579-4b9a-a97c-97e0128fff14 was re-scheduled: A specified parameter was not correct: fileType
[ 1697.945276] env[67899]: Faults: ['InvalidArgument'] {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}}
[ 1697.945643] env[67899]: DEBUG nova.compute.manager [None req-b4663fb3-1872-4a4f-8f96-67cc37309cba tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 6fda2654-4579-4b9a-a97c-97e0128fff14] Unplugging VIFs for instance {{(pid=67899) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}}
[ 1697.945812] env[67899]: DEBUG nova.compute.manager [None req-b4663fb3-1872-4a4f-8f96-67cc37309cba tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Virt driver does not provide unplug_vifs method, so it is not possible to determine if VIFs should be unplugged. {{(pid=67899) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}}
[ 1697.945980] env[67899]: DEBUG nova.compute.manager [None req-b4663fb3-1872-4a4f-8f96-67cc37309cba tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 6fda2654-4579-4b9a-a97c-97e0128fff14] Deallocating network for instance {{(pid=67899) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}}
[ 1697.946157] env[67899]: DEBUG nova.network.neutron [None req-b4663fb3-1872-4a4f-8f96-67cc37309cba tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 6fda2654-4579-4b9a-a97c-97e0128fff14] deallocate_for_instance() {{(pid=67899) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 1698.199417] env[67899]: DEBUG nova.network.neutron [None req-b4663fb3-1872-4a4f-8f96-67cc37309cba tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 6fda2654-4579-4b9a-a97c-97e0128fff14] Updating instance_info_cache with network_info: [] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1698.209708] env[67899]: INFO nova.compute.manager [None req-b4663fb3-1872-4a4f-8f96-67cc37309cba tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 6fda2654-4579-4b9a-a97c-97e0128fff14] Took 0.26 seconds to deallocate network for instance.
[ 1698.306244] env[67899]: INFO nova.scheduler.client.report [None req-b4663fb3-1872-4a4f-8f96-67cc37309cba tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Deleted allocations for instance 6fda2654-4579-4b9a-a97c-97e0128fff14 [ 1698.336350] env[67899]: DEBUG oslo_concurrency.lockutils [None req-b4663fb3-1872-4a4f-8f96-67cc37309cba tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Lock "6fda2654-4579-4b9a-a97c-97e0128fff14" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 636.815s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1698.337584] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3abc834b-6c2f-4d2e-a4b5-b5e436cdb5dc tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Lock "6fda2654-4579-4b9a-a97c-97e0128fff14" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 440.881s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1698.337860] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3abc834b-6c2f-4d2e-a4b5-b5e436cdb5dc tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Acquiring lock "6fda2654-4579-4b9a-a97c-97e0128fff14-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1698.338115] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3abc834b-6c2f-4d2e-a4b5-b5e436cdb5dc tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Lock "6fda2654-4579-4b9a-a97c-97e0128fff14-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1698.338387] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3abc834b-6c2f-4d2e-a4b5-b5e436cdb5dc tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Lock "6fda2654-4579-4b9a-a97c-97e0128fff14-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1698.340980] env[67899]: INFO nova.compute.manager [None req-3abc834b-6c2f-4d2e-a4b5-b5e436cdb5dc tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 6fda2654-4579-4b9a-a97c-97e0128fff14] Terminating instance [ 1698.342671] env[67899]: DEBUG nova.compute.manager [None req-3abc834b-6c2f-4d2e-a4b5-b5e436cdb5dc tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 6fda2654-4579-4b9a-a97c-97e0128fff14] Start destroying the instance on the hypervisor. 
{{(pid=67899) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1698.342862] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-3abc834b-6c2f-4d2e-a4b5-b5e436cdb5dc tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 6fda2654-4579-4b9a-a97c-97e0128fff14] Destroying instance {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1698.343544] env[67899]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-4ab12396-80a7-42f5-9f42-560a64b0ee9d {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1698.352033] env[67899]: DEBUG nova.compute.manager [None req-c8d97b42-c6d1-4386-8857-f061e595e961 tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] [instance: cd4ae8d3-63d9-463d-9428-fa2c1e8d1679] Starting instance... {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1698.358666] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8b221597-5dd2-4866-906e-5a77d8359769 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1698.388769] env[67899]: WARNING nova.virt.vmwareapi.vmops [None req-3abc834b-6c2f-4d2e-a4b5-b5e436cdb5dc tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 6fda2654-4579-4b9a-a97c-97e0128fff14] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 6fda2654-4579-4b9a-a97c-97e0128fff14 could not be found. [ 1698.388989] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-3abc834b-6c2f-4d2e-a4b5-b5e436cdb5dc tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 6fda2654-4579-4b9a-a97c-97e0128fff14] Instance destroyed {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1698.389183] env[67899]: INFO nova.compute.manager [None req-3abc834b-6c2f-4d2e-a4b5-b5e436cdb5dc tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 6fda2654-4579-4b9a-a97c-97e0128fff14] Took 0.05 seconds to destroy the instance on the hypervisor. [ 1698.389418] env[67899]: DEBUG oslo.service.loopingcall [None req-3abc834b-6c2f-4d2e-a4b5-b5e436cdb5dc tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=67899) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1698.389829] env[67899]: DEBUG nova.compute.manager [None req-c8d97b42-c6d1-4386-8857-f061e595e961 tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] [instance: cd4ae8d3-63d9-463d-9428-fa2c1e8d1679] Instance disappeared before build. 
{{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1698.391111] env[67899]: DEBUG nova.compute.manager [-] [instance: 6fda2654-4579-4b9a-a97c-97e0128fff14] Deallocating network for instance {{(pid=67899) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1698.391315] env[67899]: DEBUG nova.network.neutron [-] [instance: 6fda2654-4579-4b9a-a97c-97e0128fff14] deallocate_for_instance() {{(pid=67899) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1698.409664] env[67899]: DEBUG oslo_concurrency.lockutils [None req-c8d97b42-c6d1-4386-8857-f061e595e961 tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] Lock "cd4ae8d3-63d9-463d-9428-fa2c1e8d1679" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 214.600s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1698.413649] env[67899]: DEBUG nova.network.neutron [-] [instance: 6fda2654-4579-4b9a-a97c-97e0128fff14] Updating instance_info_cache with network_info: [] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1698.419795] env[67899]: DEBUG nova.compute.manager [None req-be6d1b16-9fec-4200-8c0e-8930ce011ab6 tempest-ServerMetadataTestJSON-29351844 tempest-ServerMetadataTestJSON-29351844-project-member] [instance: e08f620d-63a0-45cb-99c6-d9d95c938b38] Starting instance... {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1698.423342] env[67899]: INFO nova.compute.manager [-] [instance: 6fda2654-4579-4b9a-a97c-97e0128fff14] Took 0.03 seconds to deallocate network for instance. [ 1698.471040] env[67899]: DEBUG oslo_concurrency.lockutils [None req-be6d1b16-9fec-4200-8c0e-8930ce011ab6 tempest-ServerMetadataTestJSON-29351844 tempest-ServerMetadataTestJSON-29351844-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1698.471362] env[67899]: DEBUG oslo_concurrency.lockutils [None req-be6d1b16-9fec-4200-8c0e-8930ce011ab6 tempest-ServerMetadataTestJSON-29351844 tempest-ServerMetadataTestJSON-29351844-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1698.472728] env[67899]: INFO nova.compute.claims [None req-be6d1b16-9fec-4200-8c0e-8930ce011ab6 tempest-ServerMetadataTestJSON-29351844 tempest-ServerMetadataTestJSON-29351844-project-member] [instance: e08f620d-63a0-45cb-99c6-d9d95c938b38] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1698.511170] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3abc834b-6c2f-4d2e-a4b5-b5e436cdb5dc tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Lock "6fda2654-4579-4b9a-a97c-97e0128fff14" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.174s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1698.655924] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4b4642a4-5b7d-4bf2-b5ae-5b05df180b00 {{(pid=67899) 
request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1698.663829] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f5180db0-072f-4a07-bf7e-ef80853fbab5 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1698.695681] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d18d0e95-ad48-47b1-854f-d214ca58d09f {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1698.703755] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4d2856ea-59d0-459a-b429-ea6a978a7044 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1698.716940] env[67899]: DEBUG nova.compute.provider_tree [None req-be6d1b16-9fec-4200-8c0e-8930ce011ab6 tempest-ServerMetadataTestJSON-29351844 tempest-ServerMetadataTestJSON-29351844-project-member] Inventory has not changed in ProviderTree for provider: fffa0b42-f65d-4394-a98c-0df038b9ed4b {{(pid=67899) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1698.726775] env[67899]: DEBUG nova.scheduler.client.report [None req-be6d1b16-9fec-4200-8c0e-8930ce011ab6 tempest-ServerMetadataTestJSON-29351844 tempest-ServerMetadataTestJSON-29351844-project-member] Inventory has not changed for provider fffa0b42-f65d-4394-a98c-0df038b9ed4b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67899) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1698.742311] env[67899]: DEBUG oslo_concurrency.lockutils [None req-be6d1b16-9fec-4200-8c0e-8930ce011ab6 tempest-ServerMetadataTestJSON-29351844 tempest-ServerMetadataTestJSON-29351844-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.271s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1698.742824] env[67899]: DEBUG nova.compute.manager [None req-be6d1b16-9fec-4200-8c0e-8930ce011ab6 tempest-ServerMetadataTestJSON-29351844 tempest-ServerMetadataTestJSON-29351844-project-member] [instance: e08f620d-63a0-45cb-99c6-d9d95c938b38] Start building networks asynchronously for instance. {{(pid=67899) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1698.777605] env[67899]: DEBUG nova.compute.utils [None req-be6d1b16-9fec-4200-8c0e-8930ce011ab6 tempest-ServerMetadataTestJSON-29351844 tempest-ServerMetadataTestJSON-29351844-project-member] Using /dev/sd instead of None {{(pid=67899) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1698.779167] env[67899]: DEBUG nova.compute.manager [None req-be6d1b16-9fec-4200-8c0e-8930ce011ab6 tempest-ServerMetadataTestJSON-29351844 tempest-ServerMetadataTestJSON-29351844-project-member] [instance: e08f620d-63a0-45cb-99c6-d9d95c938b38] Allocating IP information in the background. 
{{(pid=67899) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1698.779571] env[67899]: DEBUG nova.network.neutron [None req-be6d1b16-9fec-4200-8c0e-8930ce011ab6 tempest-ServerMetadataTestJSON-29351844 tempest-ServerMetadataTestJSON-29351844-project-member] [instance: e08f620d-63a0-45cb-99c6-d9d95c938b38] allocate_for_instance() {{(pid=67899) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1698.787300] env[67899]: DEBUG nova.compute.manager [None req-be6d1b16-9fec-4200-8c0e-8930ce011ab6 tempest-ServerMetadataTestJSON-29351844 tempest-ServerMetadataTestJSON-29351844-project-member] [instance: e08f620d-63a0-45cb-99c6-d9d95c938b38] Start building block device mappings for instance. {{(pid=67899) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1698.836112] env[67899]: DEBUG nova.policy [None req-be6d1b16-9fec-4200-8c0e-8930ce011ab6 tempest-ServerMetadataTestJSON-29351844 tempest-ServerMetadataTestJSON-29351844-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '21cba59607644f22982fcbcaeef05690', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'eeb2fb9bab0547a798142e2144174ac8', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67899) authorize /opt/stack/nova/nova/policy.py:203}} [ 1698.847536] env[67899]: DEBUG nova.compute.manager [None req-be6d1b16-9fec-4200-8c0e-8930ce011ab6 tempest-ServerMetadataTestJSON-29351844 tempest-ServerMetadataTestJSON-29351844-project-member] [instance: e08f620d-63a0-45cb-99c6-d9d95c938b38] Start spawning the instance on the hypervisor. 
{{(pid=67899) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1698.872361] env[67899]: DEBUG nova.virt.hardware [None req-be6d1b16-9fec-4200-8c0e-8930ce011ab6 tempest-ServerMetadataTestJSON-29351844 tempest-ServerMetadataTestJSON-29351844-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T22:07:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T22:07:13Z,direct_url=,disk_format='vmdk',id=c655a05a-4a40-4b3f-b609-3ba8116ad90f,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c1aaa2970e964d7b86557399120d12c1',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T22:07:14Z,virtual_size=,visibility=), allow threads: False {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1698.872678] env[67899]: DEBUG nova.virt.hardware [None req-be6d1b16-9fec-4200-8c0e-8930ce011ab6 tempest-ServerMetadataTestJSON-29351844 tempest-ServerMetadataTestJSON-29351844-project-member] Flavor limits 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1698.872753] env[67899]: DEBUG nova.virt.hardware [None req-be6d1b16-9fec-4200-8c0e-8930ce011ab6 tempest-ServerMetadataTestJSON-29351844 tempest-ServerMetadataTestJSON-29351844-project-member] Image limits 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1698.872930] env[67899]: DEBUG nova.virt.hardware [None req-be6d1b16-9fec-4200-8c0e-8930ce011ab6 tempest-ServerMetadataTestJSON-29351844 tempest-ServerMetadataTestJSON-29351844-project-member] Flavor pref 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1698.873096] env[67899]: DEBUG nova.virt.hardware [None req-be6d1b16-9fec-4200-8c0e-8930ce011ab6 tempest-ServerMetadataTestJSON-29351844 tempest-ServerMetadataTestJSON-29351844-project-member] Image pref 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1698.873248] env[67899]: DEBUG nova.virt.hardware [None req-be6d1b16-9fec-4200-8c0e-8930ce011ab6 tempest-ServerMetadataTestJSON-29351844 tempest-ServerMetadataTestJSON-29351844-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1698.873464] env[67899]: DEBUG nova.virt.hardware [None req-be6d1b16-9fec-4200-8c0e-8930ce011ab6 tempest-ServerMetadataTestJSON-29351844 tempest-ServerMetadataTestJSON-29351844-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1698.873624] env[67899]: DEBUG nova.virt.hardware [None req-be6d1b16-9fec-4200-8c0e-8930ce011ab6 tempest-ServerMetadataTestJSON-29351844 tempest-ServerMetadataTestJSON-29351844-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67899) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1698.873790] env[67899]: DEBUG nova.virt.hardware [None 
req-be6d1b16-9fec-4200-8c0e-8930ce011ab6 tempest-ServerMetadataTestJSON-29351844 tempest-ServerMetadataTestJSON-29351844-project-member] Got 1 possible topologies {{(pid=67899) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1698.873955] env[67899]: DEBUG nova.virt.hardware [None req-be6d1b16-9fec-4200-8c0e-8930ce011ab6 tempest-ServerMetadataTestJSON-29351844 tempest-ServerMetadataTestJSON-29351844-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1698.874143] env[67899]: DEBUG nova.virt.hardware [None req-be6d1b16-9fec-4200-8c0e-8930ce011ab6 tempest-ServerMetadataTestJSON-29351844 tempest-ServerMetadataTestJSON-29351844-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1698.874989] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-10f4460c-4186-4b13-8795-3d316b7cf256 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1698.883298] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eb8bf8f9-0903-4d1d-8c3a-9c84e1534bc1 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1699.112947] env[67899]: DEBUG nova.network.neutron [None req-be6d1b16-9fec-4200-8c0e-8930ce011ab6 tempest-ServerMetadataTestJSON-29351844 tempest-ServerMetadataTestJSON-29351844-project-member] [instance: e08f620d-63a0-45cb-99c6-d9d95c938b38] Successfully created port: 7314ab8b-4c7d-4667-87e3-2d92f1b0a646 {{(pid=67899) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1699.932900] env[67899]: DEBUG nova.network.neutron [None req-be6d1b16-9fec-4200-8c0e-8930ce011ab6 tempest-ServerMetadataTestJSON-29351844 tempest-ServerMetadataTestJSON-29351844-project-member] [instance: e08f620d-63a0-45cb-99c6-d9d95c938b38] Successfully updated port: 7314ab8b-4c7d-4667-87e3-2d92f1b0a646 {{(pid=67899) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1699.944463] env[67899]: DEBUG oslo_concurrency.lockutils [None req-be6d1b16-9fec-4200-8c0e-8930ce011ab6 tempest-ServerMetadataTestJSON-29351844 tempest-ServerMetadataTestJSON-29351844-project-member] Acquiring lock "refresh_cache-e08f620d-63a0-45cb-99c6-d9d95c938b38" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1699.944631] env[67899]: DEBUG oslo_concurrency.lockutils [None req-be6d1b16-9fec-4200-8c0e-8930ce011ab6 tempest-ServerMetadataTestJSON-29351844 tempest-ServerMetadataTestJSON-29351844-project-member] Acquired lock "refresh_cache-e08f620d-63a0-45cb-99c6-d9d95c938b38" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1699.944766] env[67899]: DEBUG nova.network.neutron [None req-be6d1b16-9fec-4200-8c0e-8930ce011ab6 tempest-ServerMetadataTestJSON-29351844 tempest-ServerMetadataTestJSON-29351844-project-member] [instance: e08f620d-63a0-45cb-99c6-d9d95c938b38] Building network info cache for instance {{(pid=67899) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1699.983413] env[67899]: DEBUG nova.network.neutron [None req-be6d1b16-9fec-4200-8c0e-8930ce011ab6 tempest-ServerMetadataTestJSON-29351844 
tempest-ServerMetadataTestJSON-29351844-project-member] [instance: e08f620d-63a0-45cb-99c6-d9d95c938b38] Instance cache missing network info. {{(pid=67899) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1700.133139] env[67899]: DEBUG nova.network.neutron [None req-be6d1b16-9fec-4200-8c0e-8930ce011ab6 tempest-ServerMetadataTestJSON-29351844 tempest-ServerMetadataTestJSON-29351844-project-member] [instance: e08f620d-63a0-45cb-99c6-d9d95c938b38] Updating instance_info_cache with network_info: [{"id": "7314ab8b-4c7d-4667-87e3-2d92f1b0a646", "address": "fa:16:3e:f6:ed:7e", "network": {"id": "da79c4d4-e5eb-440d-b258-c8c821e9a5ee", "bridge": "br-int", "label": "tempest-ServerMetadataTestJSON-1268348802-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "eeb2fb9bab0547a798142e2144174ac8", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "0f096917-a0cf-4add-a9d2-23ca1c723b3b", "external-id": "nsx-vlan-transportzone-894", "segmentation_id": 894, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap7314ab8b-4c", "ovs_interfaceid": "7314ab8b-4c7d-4667-87e3-2d92f1b0a646", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1700.144462] env[67899]: DEBUG oslo_concurrency.lockutils [None req-be6d1b16-9fec-4200-8c0e-8930ce011ab6 tempest-ServerMetadataTestJSON-29351844 tempest-ServerMetadataTestJSON-29351844-project-member] Releasing lock "refresh_cache-e08f620d-63a0-45cb-99c6-d9d95c938b38" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1700.144751] env[67899]: DEBUG nova.compute.manager [None req-be6d1b16-9fec-4200-8c0e-8930ce011ab6 tempest-ServerMetadataTestJSON-29351844 tempest-ServerMetadataTestJSON-29351844-project-member] [instance: e08f620d-63a0-45cb-99c6-d9d95c938b38] Instance network_info: |[{"id": "7314ab8b-4c7d-4667-87e3-2d92f1b0a646", "address": "fa:16:3e:f6:ed:7e", "network": {"id": "da79c4d4-e5eb-440d-b258-c8c821e9a5ee", "bridge": "br-int", "label": "tempest-ServerMetadataTestJSON-1268348802-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "eeb2fb9bab0547a798142e2144174ac8", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "0f096917-a0cf-4add-a9d2-23ca1c723b3b", "external-id": "nsx-vlan-transportzone-894", "segmentation_id": 894, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap7314ab8b-4c", "ovs_interfaceid": "7314ab8b-4c7d-4667-87e3-2d92f1b0a646", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", 
"profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67899) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1700.145162] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-be6d1b16-9fec-4200-8c0e-8930ce011ab6 tempest-ServerMetadataTestJSON-29351844 tempest-ServerMetadataTestJSON-29351844-project-member] [instance: e08f620d-63a0-45cb-99c6-d9d95c938b38] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:f6:ed:7e', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '0f096917-a0cf-4add-a9d2-23ca1c723b3b', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '7314ab8b-4c7d-4667-87e3-2d92f1b0a646', 'vif_model': 'vmxnet3'}] {{(pid=67899) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1700.152897] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [None req-be6d1b16-9fec-4200-8c0e-8930ce011ab6 tempest-ServerMetadataTestJSON-29351844 tempest-ServerMetadataTestJSON-29351844-project-member] Creating folder: Project (eeb2fb9bab0547a798142e2144174ac8). Parent ref: group-v692900. {{(pid=67899) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1700.153457] env[67899]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-ec23ab4d-93c9-4022-a57a-98fb6d8ea7b3 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1700.165555] env[67899]: INFO nova.virt.vmwareapi.vm_util [None req-be6d1b16-9fec-4200-8c0e-8930ce011ab6 tempest-ServerMetadataTestJSON-29351844 tempest-ServerMetadataTestJSON-29351844-project-member] Created folder: Project (eeb2fb9bab0547a798142e2144174ac8) in parent group-v692900. [ 1700.165735] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [None req-be6d1b16-9fec-4200-8c0e-8930ce011ab6 tempest-ServerMetadataTestJSON-29351844 tempest-ServerMetadataTestJSON-29351844-project-member] Creating folder: Instances. Parent ref: group-v692999. {{(pid=67899) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1700.165949] env[67899]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-1d148297-8ff0-4ea5-bc70-249b71d4f343 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1700.174633] env[67899]: INFO nova.virt.vmwareapi.vm_util [None req-be6d1b16-9fec-4200-8c0e-8930ce011ab6 tempest-ServerMetadataTestJSON-29351844 tempest-ServerMetadataTestJSON-29351844-project-member] Created folder: Instances in parent group-v692999. [ 1700.174862] env[67899]: DEBUG oslo.service.loopingcall [None req-be6d1b16-9fec-4200-8c0e-8930ce011ab6 tempest-ServerMetadataTestJSON-29351844 tempest-ServerMetadataTestJSON-29351844-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=67899) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1700.175050] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: e08f620d-63a0-45cb-99c6-d9d95c938b38] Creating VM on the ESX host {{(pid=67899) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1700.175241] env[67899]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-b2e83d79-5bd0-4b4a-adf1-718592960bab {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1700.193883] env[67899]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1700.193883] env[67899]: value = "task-3467994" [ 1700.193883] env[67899]: _type = "Task" [ 1700.193883] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1700.201362] env[67899]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467994, 'name': CreateVM_Task} progress is 0%. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1700.241871] env[67899]: DEBUG nova.compute.manager [req-989209ed-6f3a-4f20-81b5-c5763253aaee req-fc37d23b-38f5-4e2d-9048-6a0ecbe8bc0e service nova] [instance: e08f620d-63a0-45cb-99c6-d9d95c938b38] Received event network-vif-plugged-7314ab8b-4c7d-4667-87e3-2d92f1b0a646 {{(pid=67899) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1700.242106] env[67899]: DEBUG oslo_concurrency.lockutils [req-989209ed-6f3a-4f20-81b5-c5763253aaee req-fc37d23b-38f5-4e2d-9048-6a0ecbe8bc0e service nova] Acquiring lock "e08f620d-63a0-45cb-99c6-d9d95c938b38-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1700.242355] env[67899]: DEBUG oslo_concurrency.lockutils [req-989209ed-6f3a-4f20-81b5-c5763253aaee req-fc37d23b-38f5-4e2d-9048-6a0ecbe8bc0e service nova] Lock "e08f620d-63a0-45cb-99c6-d9d95c938b38-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1700.242531] env[67899]: DEBUG oslo_concurrency.lockutils [req-989209ed-6f3a-4f20-81b5-c5763253aaee req-fc37d23b-38f5-4e2d-9048-6a0ecbe8bc0e service nova] Lock "e08f620d-63a0-45cb-99c6-d9d95c938b38-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1700.242706] env[67899]: DEBUG nova.compute.manager [req-989209ed-6f3a-4f20-81b5-c5763253aaee req-fc37d23b-38f5-4e2d-9048-6a0ecbe8bc0e service nova] [instance: e08f620d-63a0-45cb-99c6-d9d95c938b38] No waiting events found dispatching network-vif-plugged-7314ab8b-4c7d-4667-87e3-2d92f1b0a646 {{(pid=67899) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1700.242872] env[67899]: WARNING nova.compute.manager [req-989209ed-6f3a-4f20-81b5-c5763253aaee req-fc37d23b-38f5-4e2d-9048-6a0ecbe8bc0e service nova] [instance: e08f620d-63a0-45cb-99c6-d9d95c938b38] Received unexpected event network-vif-plugged-7314ab8b-4c7d-4667-87e3-2d92f1b0a646 for instance with vm_state building and task_state spawning. 
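The "Waiting for the task: ... progress is 0%" sequence above is oslo.vmware's wait_for_task driving a fixed-interval looping call around _poll_task (the loopingcall frames appear in the traceback further below). A minimal sketch of that polling pattern, written against the equivalent oslo.service API rather than oslo.vmware's internal copy; fetch_progress() is a hypothetical stand-in for reading TaskInfo from vCenter, not an oslo.vmware call:

    from oslo_service import loopingcall

    def wait_for_task(fetch_progress, poll_interval=0.5):
        def _poll():
            # fetch_progress() stands in for querying TaskInfo via the
            # vSphere API; assume it returns e.g.
            # {'state': 'running', 'progress': 40}.
            info = fetch_progress()
            if info['state'] == 'success':
                # Stops the loop; start().wait() returns this value.
                raise loopingcall.LoopingCallDone(info)
            if info['state'] == 'error':
                # Surfaces to the caller as the task's fault, analogous to
                # raise exceptions.translate_fault(task_info.error) in
                # oslo.vmware's _poll_task.
                raise RuntimeError('task failed: %s' % info.get('error'))

        timer = loopingcall.FixedIntervalLoopingCall(_poll)
        return timer.start(interval=poll_interval).wait()
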
[ 1700.243173] env[67899]: DEBUG nova.compute.manager [req-989209ed-6f3a-4f20-81b5-c5763253aaee req-fc37d23b-38f5-4e2d-9048-6a0ecbe8bc0e service nova] [instance: e08f620d-63a0-45cb-99c6-d9d95c938b38] Received event network-changed-7314ab8b-4c7d-4667-87e3-2d92f1b0a646 {{(pid=67899) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1700.243370] env[67899]: DEBUG nova.compute.manager [req-989209ed-6f3a-4f20-81b5-c5763253aaee req-fc37d23b-38f5-4e2d-9048-6a0ecbe8bc0e service nova] [instance: e08f620d-63a0-45cb-99c6-d9d95c938b38] Refreshing instance network info cache due to event network-changed-7314ab8b-4c7d-4667-87e3-2d92f1b0a646. {{(pid=67899) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1700.243560] env[67899]: DEBUG oslo_concurrency.lockutils [req-989209ed-6f3a-4f20-81b5-c5763253aaee req-fc37d23b-38f5-4e2d-9048-6a0ecbe8bc0e service nova] Acquiring lock "refresh_cache-e08f620d-63a0-45cb-99c6-d9d95c938b38" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1700.243697] env[67899]: DEBUG oslo_concurrency.lockutils [req-989209ed-6f3a-4f20-81b5-c5763253aaee req-fc37d23b-38f5-4e2d-9048-6a0ecbe8bc0e service nova] Acquired lock "refresh_cache-e08f620d-63a0-45cb-99c6-d9d95c938b38" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1700.243850] env[67899]: DEBUG nova.network.neutron [req-989209ed-6f3a-4f20-81b5-c5763253aaee req-fc37d23b-38f5-4e2d-9048-6a0ecbe8bc0e service nova] [instance: e08f620d-63a0-45cb-99c6-d9d95c938b38] Refreshing network info cache for port 7314ab8b-4c7d-4667-87e3-2d92f1b0a646 {{(pid=67899) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1700.524701] env[67899]: DEBUG nova.network.neutron [req-989209ed-6f3a-4f20-81b5-c5763253aaee req-fc37d23b-38f5-4e2d-9048-6a0ecbe8bc0e service nova] [instance: e08f620d-63a0-45cb-99c6-d9d95c938b38] Updated VIF entry in instance network info cache for port 7314ab8b-4c7d-4667-87e3-2d92f1b0a646. 
{{(pid=67899) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1700.525077] env[67899]: DEBUG nova.network.neutron [req-989209ed-6f3a-4f20-81b5-c5763253aaee req-fc37d23b-38f5-4e2d-9048-6a0ecbe8bc0e service nova] [instance: e08f620d-63a0-45cb-99c6-d9d95c938b38] Updating instance_info_cache with network_info: [{"id": "7314ab8b-4c7d-4667-87e3-2d92f1b0a646", "address": "fa:16:3e:f6:ed:7e", "network": {"id": "da79c4d4-e5eb-440d-b258-c8c821e9a5ee", "bridge": "br-int", "label": "tempest-ServerMetadataTestJSON-1268348802-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "eeb2fb9bab0547a798142e2144174ac8", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "0f096917-a0cf-4add-a9d2-23ca1c723b3b", "external-id": "nsx-vlan-transportzone-894", "segmentation_id": 894, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap7314ab8b-4c", "ovs_interfaceid": "7314ab8b-4c7d-4667-87e3-2d92f1b0a646", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1700.534986] env[67899]: DEBUG oslo_concurrency.lockutils [req-989209ed-6f3a-4f20-81b5-c5763253aaee req-fc37d23b-38f5-4e2d-9048-6a0ecbe8bc0e service nova] Releasing lock "refresh_cache-e08f620d-63a0-45cb-99c6-d9d95c938b38" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1700.704233] env[67899]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467994, 'name': CreateVM_Task, 'duration_secs': 0.280466} completed successfully. 
{{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1700.704434] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: e08f620d-63a0-45cb-99c6-d9d95c938b38] Created VM on the ESX host {{(pid=67899) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1700.705030] env[67899]: DEBUG oslo_concurrency.lockutils [None req-be6d1b16-9fec-4200-8c0e-8930ce011ab6 tempest-ServerMetadataTestJSON-29351844 tempest-ServerMetadataTestJSON-29351844-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1700.705204] env[67899]: DEBUG oslo_concurrency.lockutils [None req-be6d1b16-9fec-4200-8c0e-8930ce011ab6 tempest-ServerMetadataTestJSON-29351844 tempest-ServerMetadataTestJSON-29351844-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1700.705537] env[67899]: DEBUG oslo_concurrency.lockutils [None req-be6d1b16-9fec-4200-8c0e-8930ce011ab6 tempest-ServerMetadataTestJSON-29351844 tempest-ServerMetadataTestJSON-29351844-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1700.705784] env[67899]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-ea1bdd83-cf2c-4c57-92e3-b8914a95c20f {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1700.710438] env[67899]: DEBUG oslo_vmware.api [None req-be6d1b16-9fec-4200-8c0e-8930ce011ab6 tempest-ServerMetadataTestJSON-29351844 tempest-ServerMetadataTestJSON-29351844-project-member] Waiting for the task: (returnval){ [ 1700.710438] env[67899]: value = "session[5298859c-589e-10f3-4f70-bf47a7ea371b]52ee4ce7-fc5a-c952-be06-78b836558054" [ 1700.710438] env[67899]: _type = "Task" [ 1700.710438] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1700.717933] env[67899]: DEBUG oslo_vmware.api [None req-be6d1b16-9fec-4200-8c0e-8930ce011ab6 tempest-ServerMetadataTestJSON-29351844 tempest-ServerMetadataTestJSON-29351844-project-member] Task: {'id': session[5298859c-589e-10f3-4f70-bf47a7ea371b]52ee4ce7-fc5a-c952-be06-78b836558054, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1701.221287] env[67899]: DEBUG oslo_concurrency.lockutils [None req-be6d1b16-9fec-4200-8c0e-8930ce011ab6 tempest-ServerMetadataTestJSON-29351844 tempest-ServerMetadataTestJSON-29351844-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1701.221650] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-be6d1b16-9fec-4200-8c0e-8930ce011ab6 tempest-ServerMetadataTestJSON-29351844 tempest-ServerMetadataTestJSON-29351844-project-member] [instance: e08f620d-63a0-45cb-99c6-d9d95c938b38] Processing image c655a05a-4a40-4b3f-b609-3ba8116ad90f {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1701.221697] env[67899]: DEBUG oslo_concurrency.lockutils [None req-be6d1b16-9fec-4200-8c0e-8930ce011ab6 tempest-ServerMetadataTestJSON-29351844 tempest-ServerMetadataTestJSON-29351844-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1707.386615] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1709.996406] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1709.996685] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Starting heal instance info cache {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1709.996685] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Rebuilding the list of instances to heal {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1710.018418] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 7a82e877-8a39-4684-8b75-711b7bedddac] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1710.018572] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: dc7bf2b7-631d-4933-92db-1679ad823379] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1710.018705] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 8a157747-34e2-48f7-bf21-d17810122954] Skipping network cache update for instance because it is Building. 
{{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1710.018834] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 03684169-e2c8-4cf5-8e79-b118725927f1] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1710.018993] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1710.019161] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: e179db1d-ee0c-4f47-a958-40dd69209d26] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1710.019286] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: addcc88a-6bb5-4a70-938e-49c0c79c8414] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1710.019406] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: a6544af8-879d-4c45-bee4-8551b861fc66] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1710.019526] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 9b4a7c14-84dc-4222-a758-3f8f10e23b7a] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1710.019643] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: e08f620d-63a0-45cb-99c6-d9d95c938b38] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1710.019764] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Didn't find any instances for network info cache update. 
{{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1710.020240] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1710.020427] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1710.996729] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1711.995951] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1711.996225] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1712.997020] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1712.997412] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=67899) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1715.996324] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager.update_available_resource {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1716.007680] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1716.007903] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1716.008083] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1716.008245] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67899) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1716.009378] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aaa54699-ded7-487a-af84-1c24d3387cbe {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1716.018026] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-36b62c0f-cf6a-458b-bd79-4500efd6fc92 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1716.033889] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-86259aa5-e90f-4f1b-89eb-5bbcc2b30382 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1716.040218] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1da2febb-6884-4eeb-8c32-c78f087bec5a {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1716.068388] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180937MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=67899) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1716.068531] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 
1716.070046] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1716.137912] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 7a82e877-8a39-4684-8b75-711b7bedddac actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1716.138093] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance dc7bf2b7-631d-4933-92db-1679ad823379 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1716.138229] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 8a157747-34e2-48f7-bf21-d17810122954 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1716.138354] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 03684169-e2c8-4cf5-8e79-b118725927f1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1716.138466] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 3a077713-f7a2-4a61-bb17-987af6a52c4a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1716.138584] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance e179db1d-ee0c-4f47-a958-40dd69209d26 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1716.138699] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance addcc88a-6bb5-4a70-938e-49c0c79c8414 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1716.138812] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance a6544af8-879d-4c45-bee4-8551b861fc66 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1716.138954] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 9b4a7c14-84dc-4222-a758-3f8f10e23b7a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1716.139089] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance e08f620d-63a0-45cb-99c6-d9d95c938b38 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1716.149580] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 77ac61b9-48cc-4ae8-81e7-273841f7b42d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1716.149796] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67899) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1716.149943] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67899) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1716.275394] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-29f21954-6828-43f6-8efc-a4f76061a395 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1716.282710] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5e55d3e5-fecb-415d-a2d5-5d68ffa7b2cc {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1716.313434] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3d4ff748-c447-423d-a088-e0e0ddf65760 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1716.320466] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ef76bde6-9c7e-4e8b-82b0-4434678adc9b {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1716.333552] env[67899]: DEBUG nova.compute.provider_tree [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Inventory has not changed in ProviderTree for provider: fffa0b42-f65d-4394-a98c-0df038b9ed4b {{(pid=67899) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1716.341822] env[67899]: DEBUG nova.scheduler.client.report [None 
req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Inventory has not changed for provider fffa0b42-f65d-4394-a98c-0df038b9ed4b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67899) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1716.357191] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67899) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1716.357445] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.289s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1747.247209] env[67899]: WARNING oslo_vmware.rw_handles [None req-e1b9224c-7aca-4d45-b950-f391f0b228c2 tempest-ServersNegativeTestJSON-421907278 tempest-ServersNegativeTestJSON-421907278-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1747.247209] env[67899]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1747.247209] env[67899]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1747.247209] env[67899]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1747.247209] env[67899]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1747.247209] env[67899]: ERROR oslo_vmware.rw_handles response.begin() [ 1747.247209] env[67899]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1747.247209] env[67899]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1747.247209] env[67899]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1747.247209] env[67899]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1747.247209] env[67899]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1747.247209] env[67899]: ERROR oslo_vmware.rw_handles [ 1747.247884] env[67899]: DEBUG nova.virt.vmwareapi.images [None req-e1b9224c-7aca-4d45-b950-f391f0b228c2 tempest-ServersNegativeTestJSON-421907278 tempest-ServersNegativeTestJSON-421907278-project-member] [instance: 7a82e877-8a39-4684-8b75-711b7bedddac] Downloaded image file data c655a05a-4a40-4b3f-b609-3ba8116ad90f to vmware_temp/4a886641-03f5-4fd4-aa60-d233d54d7293/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk on the data store datastore1 {{(pid=67899) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1747.249451] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-e1b9224c-7aca-4d45-b950-f391f0b228c2 tempest-ServersNegativeTestJSON-421907278 tempest-ServersNegativeTestJSON-421907278-project-member] [instance: 
7a82e877-8a39-4684-8b75-711b7bedddac] Caching image {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1747.249686] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [None req-e1b9224c-7aca-4d45-b950-f391f0b228c2 tempest-ServersNegativeTestJSON-421907278 tempest-ServersNegativeTestJSON-421907278-project-member] Copying Virtual Disk [datastore1] vmware_temp/4a886641-03f5-4fd4-aa60-d233d54d7293/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk to [datastore1] vmware_temp/4a886641-03f5-4fd4-aa60-d233d54d7293/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk {{(pid=67899) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1747.249968] env[67899]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-135393ad-27e4-43b2-914a-45d5466b4ec2 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1747.257894] env[67899]: DEBUG oslo_vmware.api [None req-e1b9224c-7aca-4d45-b950-f391f0b228c2 tempest-ServersNegativeTestJSON-421907278 tempest-ServersNegativeTestJSON-421907278-project-member] Waiting for the task: (returnval){ [ 1747.257894] env[67899]: value = "task-3467995" [ 1747.257894] env[67899]: _type = "Task" [ 1747.257894] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1747.265139] env[67899]: DEBUG oslo_vmware.api [None req-e1b9224c-7aca-4d45-b950-f391f0b228c2 tempest-ServersNegativeTestJSON-421907278 tempest-ServersNegativeTestJSON-421907278-project-member] Task: {'id': task-3467995, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1747.768901] env[67899]: DEBUG oslo_vmware.exceptions [None req-e1b9224c-7aca-4d45-b950-f391f0b228c2 tempest-ServersNegativeTestJSON-421907278 tempest-ServersNegativeTestJSON-421907278-project-member] Fault InvalidArgument not matched. 
{{(pid=67899) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1747.769158] env[67899]: DEBUG oslo_concurrency.lockutils [None req-e1b9224c-7aca-4d45-b950-f391f0b228c2 tempest-ServersNegativeTestJSON-421907278 tempest-ServersNegativeTestJSON-421907278-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1747.769695] env[67899]: ERROR nova.compute.manager [None req-e1b9224c-7aca-4d45-b950-f391f0b228c2 tempest-ServersNegativeTestJSON-421907278 tempest-ServersNegativeTestJSON-421907278-project-member] [instance: 7a82e877-8a39-4684-8b75-711b7bedddac] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1747.769695] env[67899]: Faults: ['InvalidArgument'] [ 1747.769695] env[67899]: ERROR nova.compute.manager [instance: 7a82e877-8a39-4684-8b75-711b7bedddac] Traceback (most recent call last): [ 1747.769695] env[67899]: ERROR nova.compute.manager [instance: 7a82e877-8a39-4684-8b75-711b7bedddac] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1747.769695] env[67899]: ERROR nova.compute.manager [instance: 7a82e877-8a39-4684-8b75-711b7bedddac] yield resources [ 1747.769695] env[67899]: ERROR nova.compute.manager [instance: 7a82e877-8a39-4684-8b75-711b7bedddac] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1747.769695] env[67899]: ERROR nova.compute.manager [instance: 7a82e877-8a39-4684-8b75-711b7bedddac] self.driver.spawn(context, instance, image_meta, [ 1747.769695] env[67899]: ERROR nova.compute.manager [instance: 7a82e877-8a39-4684-8b75-711b7bedddac] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1747.769695] env[67899]: ERROR nova.compute.manager [instance: 7a82e877-8a39-4684-8b75-711b7bedddac] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1747.769695] env[67899]: ERROR nova.compute.manager [instance: 7a82e877-8a39-4684-8b75-711b7bedddac] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1747.769695] env[67899]: ERROR nova.compute.manager [instance: 7a82e877-8a39-4684-8b75-711b7bedddac] self._fetch_image_if_missing(context, vi) [ 1747.769695] env[67899]: ERROR nova.compute.manager [instance: 7a82e877-8a39-4684-8b75-711b7bedddac] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1747.769695] env[67899]: ERROR nova.compute.manager [instance: 7a82e877-8a39-4684-8b75-711b7bedddac] image_cache(vi, tmp_image_ds_loc) [ 1747.769695] env[67899]: ERROR nova.compute.manager [instance: 7a82e877-8a39-4684-8b75-711b7bedddac] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1747.769695] env[67899]: ERROR nova.compute.manager [instance: 7a82e877-8a39-4684-8b75-711b7bedddac] vm_util.copy_virtual_disk( [ 1747.769695] env[67899]: ERROR nova.compute.manager [instance: 7a82e877-8a39-4684-8b75-711b7bedddac] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1747.769695] env[67899]: ERROR nova.compute.manager [instance: 7a82e877-8a39-4684-8b75-711b7bedddac] session._wait_for_task(vmdk_copy_task) [ 1747.769695] env[67899]: ERROR nova.compute.manager [instance: 7a82e877-8a39-4684-8b75-711b7bedddac] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1747.769695] env[67899]: ERROR nova.compute.manager [instance: 7a82e877-8a39-4684-8b75-711b7bedddac] return self.wait_for_task(task_ref) [ 1747.769695] env[67899]: ERROR nova.compute.manager [instance: 7a82e877-8a39-4684-8b75-711b7bedddac] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1747.769695] env[67899]: ERROR nova.compute.manager [instance: 7a82e877-8a39-4684-8b75-711b7bedddac] return evt.wait() [ 1747.769695] env[67899]: ERROR nova.compute.manager [instance: 7a82e877-8a39-4684-8b75-711b7bedddac] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1747.769695] env[67899]: ERROR nova.compute.manager [instance: 7a82e877-8a39-4684-8b75-711b7bedddac] result = hub.switch() [ 1747.769695] env[67899]: ERROR nova.compute.manager [instance: 7a82e877-8a39-4684-8b75-711b7bedddac] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1747.769695] env[67899]: ERROR nova.compute.manager [instance: 7a82e877-8a39-4684-8b75-711b7bedddac] return self.greenlet.switch() [ 1747.769695] env[67899]: ERROR nova.compute.manager [instance: 7a82e877-8a39-4684-8b75-711b7bedddac] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1747.769695] env[67899]: ERROR nova.compute.manager [instance: 7a82e877-8a39-4684-8b75-711b7bedddac] self.f(*self.args, **self.kw) [ 1747.769695] env[67899]: ERROR nova.compute.manager [instance: 7a82e877-8a39-4684-8b75-711b7bedddac] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1747.769695] env[67899]: ERROR nova.compute.manager [instance: 7a82e877-8a39-4684-8b75-711b7bedddac] raise exceptions.translate_fault(task_info.error) [ 1747.769695] env[67899]: ERROR nova.compute.manager [instance: 7a82e877-8a39-4684-8b75-711b7bedddac] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1747.769695] env[67899]: ERROR nova.compute.manager [instance: 7a82e877-8a39-4684-8b75-711b7bedddac] Faults: ['InvalidArgument'] [ 1747.769695] env[67899]: ERROR nova.compute.manager [instance: 7a82e877-8a39-4684-8b75-711b7bedddac] [ 1747.771084] env[67899]: INFO nova.compute.manager [None req-e1b9224c-7aca-4d45-b950-f391f0b228c2 tempest-ServersNegativeTestJSON-421907278 tempest-ServersNegativeTestJSON-421907278-project-member] [instance: 7a82e877-8a39-4684-8b75-711b7bedddac] Terminating instance [ 1747.771548] env[67899]: DEBUG oslo_concurrency.lockutils [None req-93cc0406-1f51-47a1-9589-0404788b4e93 tempest-MigrationsAdminTest-102068815 tempest-MigrationsAdminTest-102068815-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1747.771747] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-93cc0406-1f51-47a1-9589-0404788b4e93 tempest-MigrationsAdminTest-102068815 tempest-MigrationsAdminTest-102068815-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1747.771978] env[67899]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-66b42c3d-834f-4051-ad12-5d6cbb2b4db7 {{(pid=67899) 
request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1747.774102] env[67899]: DEBUG nova.compute.manager [None req-e1b9224c-7aca-4d45-b950-f391f0b228c2 tempest-ServersNegativeTestJSON-421907278 tempest-ServersNegativeTestJSON-421907278-project-member] [instance: 7a82e877-8a39-4684-8b75-711b7bedddac] Start destroying the instance on the hypervisor. {{(pid=67899) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1747.774296] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-e1b9224c-7aca-4d45-b950-f391f0b228c2 tempest-ServersNegativeTestJSON-421907278 tempest-ServersNegativeTestJSON-421907278-project-member] [instance: 7a82e877-8a39-4684-8b75-711b7bedddac] Destroying instance {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1747.774978] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b1dca22a-774e-429e-a8fa-d2a49eef75fb {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1747.781323] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-e1b9224c-7aca-4d45-b950-f391f0b228c2 tempest-ServersNegativeTestJSON-421907278 tempest-ServersNegativeTestJSON-421907278-project-member] [instance: 7a82e877-8a39-4684-8b75-711b7bedddac] Unregistering the VM {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1747.781610] env[67899]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-25e1232f-d574-42e1-bbec-8a63623d9877 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1747.783708] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-93cc0406-1f51-47a1-9589-0404788b4e93 tempest-MigrationsAdminTest-102068815 tempest-MigrationsAdminTest-102068815-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1747.783877] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-93cc0406-1f51-47a1-9589-0404788b4e93 tempest-MigrationsAdminTest-102068815 tempest-MigrationsAdminTest-102068815-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67899) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1747.784863] env[67899]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-961cd013-bccd-4212-bede-81390b4a3a93 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1747.790684] env[67899]: DEBUG oslo_vmware.api [None req-93cc0406-1f51-47a1-9589-0404788b4e93 tempest-MigrationsAdminTest-102068815 tempest-MigrationsAdminTest-102068815-project-member] Waiting for the task: (returnval){ [ 1747.790684] env[67899]: value = "session[5298859c-589e-10f3-4f70-bf47a7ea371b]523a96f5-7427-8db4-f672-fd4d039836ab" [ 1747.790684] env[67899]: _type = "Task" [ 1747.790684] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1747.797738] env[67899]: DEBUG oslo_vmware.api [None req-93cc0406-1f51-47a1-9589-0404788b4e93 tempest-MigrationsAdminTest-102068815 tempest-MigrationsAdminTest-102068815-project-member] Task: {'id': session[5298859c-589e-10f3-4f70-bf47a7ea371b]523a96f5-7427-8db4-f672-fd4d039836ab, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1747.849813] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-e1b9224c-7aca-4d45-b950-f391f0b228c2 tempest-ServersNegativeTestJSON-421907278 tempest-ServersNegativeTestJSON-421907278-project-member] [instance: 7a82e877-8a39-4684-8b75-711b7bedddac] Unregistered the VM {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1747.850046] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-e1b9224c-7aca-4d45-b950-f391f0b228c2 tempest-ServersNegativeTestJSON-421907278 tempest-ServersNegativeTestJSON-421907278-project-member] [instance: 7a82e877-8a39-4684-8b75-711b7bedddac] Deleting contents of the VM from datastore datastore1 {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1747.850233] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-e1b9224c-7aca-4d45-b950-f391f0b228c2 tempest-ServersNegativeTestJSON-421907278 tempest-ServersNegativeTestJSON-421907278-project-member] Deleting the datastore file [datastore1] 7a82e877-8a39-4684-8b75-711b7bedddac {{(pid=67899) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1747.850495] env[67899]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-34ac3ef8-c7b4-417b-af42-e7a7ab87af27 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1747.857592] env[67899]: DEBUG oslo_vmware.api [None req-e1b9224c-7aca-4d45-b950-f391f0b228c2 tempest-ServersNegativeTestJSON-421907278 tempest-ServersNegativeTestJSON-421907278-project-member] Waiting for the task: (returnval){ [ 1747.857592] env[67899]: value = "task-3467997" [ 1747.857592] env[67899]: _type = "Task" [ 1747.857592] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1747.864910] env[67899]: DEBUG oslo_vmware.api [None req-e1b9224c-7aca-4d45-b950-f391f0b228c2 tempest-ServersNegativeTestJSON-421907278 tempest-ServersNegativeTestJSON-421907278-project-member] Task: {'id': task-3467997, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1748.300457] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-93cc0406-1f51-47a1-9589-0404788b4e93 tempest-MigrationsAdminTest-102068815 tempest-MigrationsAdminTest-102068815-project-member] [instance: 8a157747-34e2-48f7-bf21-d17810122954] Preparing fetch location {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1748.300713] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-93cc0406-1f51-47a1-9589-0404788b4e93 tempest-MigrationsAdminTest-102068815 tempest-MigrationsAdminTest-102068815-project-member] Creating directory with path [datastore1] vmware_temp/955d6e1a-e789-44e5-9a78-ccb2bfd11b7e/c655a05a-4a40-4b3f-b609-3ba8116ad90f {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1748.300949] env[67899]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-7174b5ef-dec4-4941-92dd-1d7b9bb003c1 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1748.312798] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-93cc0406-1f51-47a1-9589-0404788b4e93 tempest-MigrationsAdminTest-102068815 tempest-MigrationsAdminTest-102068815-project-member] Created directory with path [datastore1] vmware_temp/955d6e1a-e789-44e5-9a78-ccb2bfd11b7e/c655a05a-4a40-4b3f-b609-3ba8116ad90f {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1748.312990] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-93cc0406-1f51-47a1-9589-0404788b4e93 tempest-MigrationsAdminTest-102068815 tempest-MigrationsAdminTest-102068815-project-member] [instance: 8a157747-34e2-48f7-bf21-d17810122954] Fetch image to [datastore1] vmware_temp/955d6e1a-e789-44e5-9a78-ccb2bfd11b7e/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1748.313179] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-93cc0406-1f51-47a1-9589-0404788b4e93 tempest-MigrationsAdminTest-102068815 tempest-MigrationsAdminTest-102068815-project-member] [instance: 8a157747-34e2-48f7-bf21-d17810122954] Downloading image file data c655a05a-4a40-4b3f-b609-3ba8116ad90f to [datastore1] vmware_temp/955d6e1a-e789-44e5-9a78-ccb2bfd11b7e/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk on the data store datastore1 {{(pid=67899) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1748.313906] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4ac3bdc3-ec94-4d49-8e34-9056d60ab26d {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1748.320308] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8ce258fe-0293-4849-929e-22e2428e5fdb {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1748.329110] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cd6015cc-49f4-4144-9d9d-aaaf04fae717 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1748.363204] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d66cdf34-dce2-4765-b78e-ad932d87ba1b {{(pid=67899) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1748.369888] env[67899]: DEBUG oslo_vmware.api [None req-e1b9224c-7aca-4d45-b950-f391f0b228c2 tempest-ServersNegativeTestJSON-421907278 tempest-ServersNegativeTestJSON-421907278-project-member] Task: {'id': task-3467997, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.083405} completed successfully. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1748.371268] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-e1b9224c-7aca-4d45-b950-f391f0b228c2 tempest-ServersNegativeTestJSON-421907278 tempest-ServersNegativeTestJSON-421907278-project-member] Deleted the datastore file {{(pid=67899) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1748.371459] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-e1b9224c-7aca-4d45-b950-f391f0b228c2 tempest-ServersNegativeTestJSON-421907278 tempest-ServersNegativeTestJSON-421907278-project-member] [instance: 7a82e877-8a39-4684-8b75-711b7bedddac] Deleted contents of the VM from datastore datastore1 {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1748.371633] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-e1b9224c-7aca-4d45-b950-f391f0b228c2 tempest-ServersNegativeTestJSON-421907278 tempest-ServersNegativeTestJSON-421907278-project-member] [instance: 7a82e877-8a39-4684-8b75-711b7bedddac] Instance destroyed {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1748.371805] env[67899]: INFO nova.compute.manager [None req-e1b9224c-7aca-4d45-b950-f391f0b228c2 tempest-ServersNegativeTestJSON-421907278 tempest-ServersNegativeTestJSON-421907278-project-member] [instance: 7a82e877-8a39-4684-8b75-711b7bedddac] Took 0.60 seconds to destroy the instance on the hypervisor. 
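The Acquiring/acquired/released lock lines throughout this log are emitted by oslo.concurrency's lockutils. Both forms seen here, the "compute_resources" decorator and the per-instance "refresh_cache-<uuid>" context manager, reduce to the following sketch; the function bodies are placeholders, not Nova's resource tracker:

    from oslo_concurrency import lockutils

    @lockutils.synchronized('compute_resources')
    def abort_instance_claim(instance_uuid):
        # Serialized with every other 'compute_resources' holder in this
        # process, e.g. the _update_available_resource audit above.
        print('aborting claim for %s' % instance_uuid)

    def refresh_cache(instance_uuid):
        # Explicit context-manager form with a derived lock name, matching
        # the "refresh_cache-<uuid>" acquire/release pairs above.
        with lockutils.lock('refresh_cache-%s' % instance_uuid):
            print('refreshing network info cache for %s' % instance_uuid)
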
[ 1748.373534] env[67899]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-1e2665c5-5fc5-44fc-be00-6b921d69264b {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1748.375347] env[67899]: DEBUG nova.compute.claims [None req-e1b9224c-7aca-4d45-b950-f391f0b228c2 tempest-ServersNegativeTestJSON-421907278 tempest-ServersNegativeTestJSON-421907278-project-member] [instance: 7a82e877-8a39-4684-8b75-711b7bedddac] Aborting claim: {{(pid=67899) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1748.375521] env[67899]: DEBUG oslo_concurrency.lockutils [None req-e1b9224c-7aca-4d45-b950-f391f0b228c2 tempest-ServersNegativeTestJSON-421907278 tempest-ServersNegativeTestJSON-421907278-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1748.375733] env[67899]: DEBUG oslo_concurrency.lockutils [None req-e1b9224c-7aca-4d45-b950-f391f0b228c2 tempest-ServersNegativeTestJSON-421907278 tempest-ServersNegativeTestJSON-421907278-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1748.397508] env[67899]: DEBUG nova.virt.vmwareapi.images [None req-93cc0406-1f51-47a1-9589-0404788b4e93 tempest-MigrationsAdminTest-102068815 tempest-MigrationsAdminTest-102068815-project-member] [instance: 8a157747-34e2-48f7-bf21-d17810122954] Downloading image file data c655a05a-4a40-4b3f-b609-3ba8116ad90f to the data store datastore1 {{(pid=67899) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1748.535469] env[67899]: DEBUG oslo_concurrency.lockutils [None req-93cc0406-1f51-47a1-9589-0404788b4e93 tempest-MigrationsAdminTest-102068815 tempest-MigrationsAdminTest-102068815-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1748.536275] env[67899]: ERROR nova.compute.manager [None req-93cc0406-1f51-47a1-9589-0404788b4e93 tempest-MigrationsAdminTest-102068815 tempest-MigrationsAdminTest-102068815-project-member] [instance: 8a157747-34e2-48f7-bf21-d17810122954] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image c655a05a-4a40-4b3f-b609-3ba8116ad90f. 
[ 1748.536275] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] Traceback (most recent call last): [ 1748.536275] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1748.536275] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1748.536275] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1748.536275] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] result = getattr(controller, method)(*args, **kwargs) [ 1748.536275] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get [ 1748.536275] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] return self._get(image_id) [ 1748.536275] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner [ 1748.536275] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1748.536275] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get [ 1748.536275] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] resp, body = self.http_client.get(url, headers=header) [ 1748.536275] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] File "/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 395, in get [ 1748.536275] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] return self.request(url, 'GET', **kwargs) [ 1748.536275] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request [ 1748.536275] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] return self._handle_response(resp) [ 1748.536275] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1748.536275] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] raise exc.from_response(resp, resp.content) [ 1748.536275] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 1748.536275] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] [ 1748.536275] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] During handling of the above exception, another exception occurred: [ 1748.536275] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] [ 1748.536275] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] Traceback (most recent call last): [ 1748.536275] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1748.536275] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] yield resources [ 1748.536275] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1748.536275] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] self.driver.spawn(context, instance, image_meta, [ 1748.536275] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1748.536275] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1748.536275] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1748.536275] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] self._fetch_image_if_missing(context, vi) [ 1748.536275] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1748.536275] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] image_fetch(context, vi, tmp_image_ds_loc) [ 1748.536275] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1748.536275] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] images.fetch_image( [ 1748.536275] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1748.536275] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] metadata = IMAGE_API.get(context, image_ref) [ 1748.537414] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1748.537414] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] return session.show(context, image_id, [ 1748.537414] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1748.537414] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] _reraise_translated_image_exception(image_id) [ 1748.537414] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] File 
"/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1748.537414] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] raise new_exc.with_traceback(exc_trace) [ 1748.537414] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1748.537414] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1748.537414] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1748.537414] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] result = getattr(controller, method)(*args, **kwargs) [ 1748.537414] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get [ 1748.537414] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] return self._get(image_id) [ 1748.537414] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner [ 1748.537414] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1748.537414] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get [ 1748.537414] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] resp, body = self.http_client.get(url, headers=header) [ 1748.537414] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] File "/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 395, in get [ 1748.537414] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] return self.request(url, 'GET', **kwargs) [ 1748.537414] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request [ 1748.537414] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] return self._handle_response(resp) [ 1748.537414] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1748.537414] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] raise exc.from_response(resp, resp.content) [ 1748.537414] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] nova.exception.ImageNotAuthorized: Not authorized for image c655a05a-4a40-4b3f-b609-3ba8116ad90f. 
[ 1748.537414] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] [ 1748.537414] env[67899]: INFO nova.compute.manager [None req-93cc0406-1f51-47a1-9589-0404788b4e93 tempest-MigrationsAdminTest-102068815 tempest-MigrationsAdminTest-102068815-project-member] [instance: 8a157747-34e2-48f7-bf21-d17810122954] Terminating instance [ 1748.538187] env[67899]: DEBUG oslo_concurrency.lockutils [None req-2236b4c1-ddc4-48e2-b219-bf8992f19847 tempest-InstanceActionsV221TestJSON-802466451 tempest-InstanceActionsV221TestJSON-802466451-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1748.538344] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-2236b4c1-ddc4-48e2-b219-bf8992f19847 tempest-InstanceActionsV221TestJSON-802466451 tempest-InstanceActionsV221TestJSON-802466451-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1748.538986] env[67899]: DEBUG nova.compute.manager [None req-93cc0406-1f51-47a1-9589-0404788b4e93 tempest-MigrationsAdminTest-102068815 tempest-MigrationsAdminTest-102068815-project-member] [instance: 8a157747-34e2-48f7-bf21-d17810122954] Start destroying the instance on the hypervisor. {{(pid=67899) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1748.539197] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-93cc0406-1f51-47a1-9589-0404788b4e93 tempest-MigrationsAdminTest-102068815 tempest-MigrationsAdminTest-102068815-project-member] [instance: 8a157747-34e2-48f7-bf21-d17810122954] Destroying instance {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1748.541733] env[67899]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-da278427-47ca-403a-b46f-7e32aecfaca3 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1748.544511] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c62c633d-f780-4af3-8c53-25555a514c96 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1748.552111] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-93cc0406-1f51-47a1-9589-0404788b4e93 tempest-MigrationsAdminTest-102068815 tempest-MigrationsAdminTest-102068815-project-member] [instance: 8a157747-34e2-48f7-bf21-d17810122954] Unregistering the VM {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1748.552324] env[67899]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-2bbfdcef-e169-4b6d-b094-e3aa2d2003dc {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1748.554630] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-2236b4c1-ddc4-48e2-b219-bf8992f19847 tempest-InstanceActionsV221TestJSON-802466451 tempest-InstanceActionsV221TestJSON-802466451-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1748.554719] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-2236b4c1-ddc4-48e2-b219-bf8992f19847 tempest-InstanceActionsV221TestJSON-802466451 
tempest-InstanceActionsV221TestJSON-802466451-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67899) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1748.556025] env[67899]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-1e055f99-6d39-43cb-b256-234fde3de68c {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1748.562766] env[67899]: DEBUG oslo_vmware.api [None req-2236b4c1-ddc4-48e2-b219-bf8992f19847 tempest-InstanceActionsV221TestJSON-802466451 tempest-InstanceActionsV221TestJSON-802466451-project-member] Waiting for the task: (returnval){ [ 1748.562766] env[67899]: value = "session[5298859c-589e-10f3-4f70-bf47a7ea371b]526801ce-25dc-c70f-433f-0fcb04e308fe" [ 1748.562766] env[67899]: _type = "Task" [ 1748.562766] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1748.570352] env[67899]: DEBUG oslo_vmware.api [None req-2236b4c1-ddc4-48e2-b219-bf8992f19847 tempest-InstanceActionsV221TestJSON-802466451 tempest-InstanceActionsV221TestJSON-802466451-project-member] Task: {'id': session[5298859c-589e-10f3-4f70-bf47a7ea371b]526801ce-25dc-c70f-433f-0fcb04e308fe, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1748.571787] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-37a5184f-5be6-4cdd-8b1f-e069507d7a4b {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1748.578034] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-006ff8a6-23c6-4f83-966e-d36661b8b10c {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1748.608060] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3d70ef8c-0354-4cc5-9bf5-a8d5d4d19691 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1748.615096] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-28f6bbcf-ba9e-44da-b565-a5efee924205 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1748.620509] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-93cc0406-1f51-47a1-9589-0404788b4e93 tempest-MigrationsAdminTest-102068815 tempest-MigrationsAdminTest-102068815-project-member] [instance: 8a157747-34e2-48f7-bf21-d17810122954] Unregistered the VM {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1748.620710] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-93cc0406-1f51-47a1-9589-0404788b4e93 tempest-MigrationsAdminTest-102068815 tempest-MigrationsAdminTest-102068815-project-member] [instance: 8a157747-34e2-48f7-bf21-d17810122954] Deleting contents of the VM from datastore datastore1 {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1748.620882] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-93cc0406-1f51-47a1-9589-0404788b4e93 tempest-MigrationsAdminTest-102068815 tempest-MigrationsAdminTest-102068815-project-member] Deleting the datastore file [datastore1] 
8a157747-34e2-48f7-bf21-d17810122954 {{(pid=67899) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1748.621484] env[67899]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-dcb1e8f5-3e83-4846-a98e-e30a15a4a79c {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1748.630602] env[67899]: DEBUG nova.compute.provider_tree [None req-e1b9224c-7aca-4d45-b950-f391f0b228c2 tempest-ServersNegativeTestJSON-421907278 tempest-ServersNegativeTestJSON-421907278-project-member] Inventory has not changed in ProviderTree for provider: fffa0b42-f65d-4394-a98c-0df038b9ed4b {{(pid=67899) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1748.635569] env[67899]: DEBUG oslo_vmware.api [None req-93cc0406-1f51-47a1-9589-0404788b4e93 tempest-MigrationsAdminTest-102068815 tempest-MigrationsAdminTest-102068815-project-member] Waiting for the task: (returnval){ [ 1748.635569] env[67899]: value = "task-3467999" [ 1748.635569] env[67899]: _type = "Task" [ 1748.635569] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1748.639373] env[67899]: DEBUG nova.scheduler.client.report [None req-e1b9224c-7aca-4d45-b950-f391f0b228c2 tempest-ServersNegativeTestJSON-421907278 tempest-ServersNegativeTestJSON-421907278-project-member] Inventory has not changed for provider fffa0b42-f65d-4394-a98c-0df038b9ed4b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67899) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1748.646994] env[67899]: DEBUG oslo_vmware.api [None req-93cc0406-1f51-47a1-9589-0404788b4e93 tempest-MigrationsAdminTest-102068815 tempest-MigrationsAdminTest-102068815-project-member] Task: {'id': task-3467999, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1748.655681] env[67899]: DEBUG oslo_concurrency.lockutils [None req-e1b9224c-7aca-4d45-b950-f391f0b228c2 tempest-ServersNegativeTestJSON-421907278 tempest-ServersNegativeTestJSON-421907278-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.280s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1748.656265] env[67899]: ERROR nova.compute.manager [None req-e1b9224c-7aca-4d45-b950-f391f0b228c2 tempest-ServersNegativeTestJSON-421907278 tempest-ServersNegativeTestJSON-421907278-project-member] [instance: 7a82e877-8a39-4684-8b75-711b7bedddac] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1748.656265] env[67899]: Faults: ['InvalidArgument'] [ 1748.656265] env[67899]: ERROR nova.compute.manager [instance: 7a82e877-8a39-4684-8b75-711b7bedddac] Traceback (most recent call last): [ 1748.656265] env[67899]: ERROR nova.compute.manager [instance: 7a82e877-8a39-4684-8b75-711b7bedddac] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1748.656265] env[67899]: ERROR nova.compute.manager [instance: 7a82e877-8a39-4684-8b75-711b7bedddac] self.driver.spawn(context, instance, image_meta, [ 1748.656265] env[67899]: ERROR nova.compute.manager [instance: 7a82e877-8a39-4684-8b75-711b7bedddac] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1748.656265] env[67899]: ERROR nova.compute.manager [instance: 7a82e877-8a39-4684-8b75-711b7bedddac] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1748.656265] env[67899]: ERROR nova.compute.manager [instance: 7a82e877-8a39-4684-8b75-711b7bedddac] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1748.656265] env[67899]: ERROR nova.compute.manager [instance: 7a82e877-8a39-4684-8b75-711b7bedddac] self._fetch_image_if_missing(context, vi) [ 1748.656265] env[67899]: ERROR nova.compute.manager [instance: 7a82e877-8a39-4684-8b75-711b7bedddac] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1748.656265] env[67899]: ERROR nova.compute.manager [instance: 7a82e877-8a39-4684-8b75-711b7bedddac] image_cache(vi, tmp_image_ds_loc) [ 1748.656265] env[67899]: ERROR nova.compute.manager [instance: 7a82e877-8a39-4684-8b75-711b7bedddac] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1748.656265] env[67899]: ERROR nova.compute.manager [instance: 7a82e877-8a39-4684-8b75-711b7bedddac] vm_util.copy_virtual_disk( [ 1748.656265] env[67899]: ERROR nova.compute.manager [instance: 7a82e877-8a39-4684-8b75-711b7bedddac] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1748.656265] env[67899]: ERROR nova.compute.manager [instance: 7a82e877-8a39-4684-8b75-711b7bedddac] session._wait_for_task(vmdk_copy_task) [ 1748.656265] env[67899]: ERROR nova.compute.manager [instance: 7a82e877-8a39-4684-8b75-711b7bedddac] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1748.656265] env[67899]: ERROR nova.compute.manager [instance: 7a82e877-8a39-4684-8b75-711b7bedddac] return self.wait_for_task(task_ref) [ 1748.656265] env[67899]: ERROR nova.compute.manager [instance: 7a82e877-8a39-4684-8b75-711b7bedddac] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1748.656265] env[67899]: ERROR nova.compute.manager [instance: 7a82e877-8a39-4684-8b75-711b7bedddac] return evt.wait() [ 1748.656265] env[67899]: ERROR nova.compute.manager [instance: 7a82e877-8a39-4684-8b75-711b7bedddac] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1748.656265] env[67899]: ERROR nova.compute.manager [instance: 7a82e877-8a39-4684-8b75-711b7bedddac] result = hub.switch() [ 1748.656265] env[67899]: ERROR nova.compute.manager [instance: 7a82e877-8a39-4684-8b75-711b7bedddac] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1748.656265] env[67899]: ERROR nova.compute.manager [instance: 7a82e877-8a39-4684-8b75-711b7bedddac] return self.greenlet.switch() [ 1748.656265] env[67899]: ERROR nova.compute.manager [instance: 7a82e877-8a39-4684-8b75-711b7bedddac] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1748.656265] env[67899]: ERROR nova.compute.manager [instance: 7a82e877-8a39-4684-8b75-711b7bedddac] self.f(*self.args, **self.kw) [ 1748.656265] env[67899]: ERROR nova.compute.manager [instance: 7a82e877-8a39-4684-8b75-711b7bedddac] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1748.656265] env[67899]: ERROR nova.compute.manager [instance: 7a82e877-8a39-4684-8b75-711b7bedddac] raise exceptions.translate_fault(task_info.error) [ 1748.656265] env[67899]: ERROR nova.compute.manager [instance: 7a82e877-8a39-4684-8b75-711b7bedddac] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1748.656265] env[67899]: ERROR nova.compute.manager [instance: 7a82e877-8a39-4684-8b75-711b7bedddac] Faults: ['InvalidArgument'] [ 1748.656265] env[67899]: ERROR nova.compute.manager [instance: 7a82e877-8a39-4684-8b75-711b7bedddac] [ 1748.657362] env[67899]: DEBUG nova.compute.utils [None req-e1b9224c-7aca-4d45-b950-f391f0b228c2 tempest-ServersNegativeTestJSON-421907278 tempest-ServersNegativeTestJSON-421907278-project-member] [instance: 7a82e877-8a39-4684-8b75-711b7bedddac] VimFaultException {{(pid=67899) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1748.658609] env[67899]: DEBUG nova.compute.manager [None req-e1b9224c-7aca-4d45-b950-f391f0b228c2 tempest-ServersNegativeTestJSON-421907278 tempest-ServersNegativeTestJSON-421907278-project-member] [instance: 7a82e877-8a39-4684-8b75-711b7bedddac] Build of instance 7a82e877-8a39-4684-8b75-711b7bedddac was re-scheduled: A specified parameter was not correct: fileType [ 1748.658609] env[67899]: Faults: ['InvalidArgument'] {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1748.659046] env[67899]: DEBUG nova.compute.manager [None req-e1b9224c-7aca-4d45-b950-f391f0b228c2 tempest-ServersNegativeTestJSON-421907278 tempest-ServersNegativeTestJSON-421907278-project-member] [instance: 7a82e877-8a39-4684-8b75-711b7bedddac] Unplugging VIFs for instance {{(pid=67899) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1748.659277] env[67899]: DEBUG nova.compute.manager [None req-e1b9224c-7aca-4d45-b950-f391f0b228c2 tempest-ServersNegativeTestJSON-421907278 tempest-ServersNegativeTestJSON-421907278-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67899) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1748.659487] env[67899]: DEBUG nova.compute.manager [None req-e1b9224c-7aca-4d45-b950-f391f0b228c2 tempest-ServersNegativeTestJSON-421907278 tempest-ServersNegativeTestJSON-421907278-project-member] [instance: 7a82e877-8a39-4684-8b75-711b7bedddac] Deallocating network for instance {{(pid=67899) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1748.659679] env[67899]: DEBUG nova.network.neutron [None req-e1b9224c-7aca-4d45-b950-f391f0b228c2 tempest-ServersNegativeTestJSON-421907278 tempest-ServersNegativeTestJSON-421907278-project-member] [instance: 7a82e877-8a39-4684-8b75-711b7bedddac] deallocate_for_instance() {{(pid=67899) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1748.965184] env[67899]: DEBUG nova.network.neutron [None req-e1b9224c-7aca-4d45-b950-f391f0b228c2 tempest-ServersNegativeTestJSON-421907278 tempest-ServersNegativeTestJSON-421907278-project-member] [instance: 7a82e877-8a39-4684-8b75-711b7bedddac] Updating instance_info_cache with network_info: [] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1748.977584] env[67899]: INFO nova.compute.manager [None req-e1b9224c-7aca-4d45-b950-f391f0b228c2 tempest-ServersNegativeTestJSON-421907278 tempest-ServersNegativeTestJSON-421907278-project-member] [instance: 7a82e877-8a39-4684-8b75-711b7bedddac] Took 0.32 seconds to deallocate network for instance. [ 1749.073630] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-2236b4c1-ddc4-48e2-b219-bf8992f19847 tempest-InstanceActionsV221TestJSON-802466451 tempest-InstanceActionsV221TestJSON-802466451-project-member] [instance: dc7bf2b7-631d-4933-92db-1679ad823379] Preparing fetch location {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1749.073630] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-2236b4c1-ddc4-48e2-b219-bf8992f19847 tempest-InstanceActionsV221TestJSON-802466451 tempest-InstanceActionsV221TestJSON-802466451-project-member] Creating directory with path [datastore1] vmware_temp/32177577-8bfa-4363-870d-123c6cc4a300/c655a05a-4a40-4b3f-b609-3ba8116ad90f {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1749.073767] env[67899]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-13ae61a8-fcde-4d94-8908-142d93653adf {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1749.077990] env[67899]: INFO nova.scheduler.client.report [None req-e1b9224c-7aca-4d45-b950-f391f0b228c2 tempest-ServersNegativeTestJSON-421907278 tempest-ServersNegativeTestJSON-421907278-project-member] Deleted allocations for instance 7a82e877-8a39-4684-8b75-711b7bedddac [ 1749.086998] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-2236b4c1-ddc4-48e2-b219-bf8992f19847 tempest-InstanceActionsV221TestJSON-802466451 tempest-InstanceActionsV221TestJSON-802466451-project-member] Created directory with path [datastore1] vmware_temp/32177577-8bfa-4363-870d-123c6cc4a300/c655a05a-4a40-4b3f-b609-3ba8116ad90f {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1749.086998] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-2236b4c1-ddc4-48e2-b219-bf8992f19847 tempest-InstanceActionsV221TestJSON-802466451 tempest-InstanceActionsV221TestJSON-802466451-project-member] [instance: dc7bf2b7-631d-4933-92db-1679ad823379] Fetch image to [datastore1] 
vmware_temp/32177577-8bfa-4363-870d-123c6cc4a300/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1749.086998] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-2236b4c1-ddc4-48e2-b219-bf8992f19847 tempest-InstanceActionsV221TestJSON-802466451 tempest-InstanceActionsV221TestJSON-802466451-project-member] [instance: dc7bf2b7-631d-4933-92db-1679ad823379] Downloading image file data c655a05a-4a40-4b3f-b609-3ba8116ad90f to [datastore1] vmware_temp/32177577-8bfa-4363-870d-123c6cc4a300/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk on the data store datastore1 {{(pid=67899) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1749.086998] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-36f645fb-08f7-4f98-ace9-a3eb175fc8c8 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1749.095551] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7844ce1f-34d2-4451-aa83-ce0a2e3ecfc9 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1749.105037] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-795515de-8af9-400c-9eea-b8921b1f223d {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1749.109435] env[67899]: DEBUG oslo_concurrency.lockutils [None req-e1b9224c-7aca-4d45-b950-f391f0b228c2 tempest-ServersNegativeTestJSON-421907278 tempest-ServersNegativeTestJSON-421907278-project-member] Lock "7a82e877-8a39-4684-8b75-711b7bedddac" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 580.761s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1749.110867] env[67899]: DEBUG oslo_concurrency.lockutils [None req-776b3004-f999-4da3-8d13-d0ff4fb667e0 tempest-ServersNegativeTestJSON-421907278 tempest-ServersNegativeTestJSON-421907278-project-member] Lock "7a82e877-8a39-4684-8b75-711b7bedddac" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 385.696s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1749.111107] env[67899]: DEBUG oslo_concurrency.lockutils [None req-776b3004-f999-4da3-8d13-d0ff4fb667e0 tempest-ServersNegativeTestJSON-421907278 tempest-ServersNegativeTestJSON-421907278-project-member] Acquiring lock "7a82e877-8a39-4684-8b75-711b7bedddac-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1749.111335] env[67899]: DEBUG oslo_concurrency.lockutils [None req-776b3004-f999-4da3-8d13-d0ff4fb667e0 tempest-ServersNegativeTestJSON-421907278 tempest-ServersNegativeTestJSON-421907278-project-member] Lock "7a82e877-8a39-4684-8b75-711b7bedddac-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1749.111499] env[67899]: DEBUG oslo_concurrency.lockutils [None 
req-776b3004-f999-4da3-8d13-d0ff4fb667e0 tempest-ServersNegativeTestJSON-421907278 tempest-ServersNegativeTestJSON-421907278-project-member] Lock "7a82e877-8a39-4684-8b75-711b7bedddac-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1749.113919] env[67899]: INFO nova.compute.manager [None req-776b3004-f999-4da3-8d13-d0ff4fb667e0 tempest-ServersNegativeTestJSON-421907278 tempest-ServersNegativeTestJSON-421907278-project-member] [instance: 7a82e877-8a39-4684-8b75-711b7bedddac] Terminating instance [ 1749.140912] env[67899]: DEBUG nova.compute.manager [None req-776b3004-f999-4da3-8d13-d0ff4fb667e0 tempest-ServersNegativeTestJSON-421907278 tempest-ServersNegativeTestJSON-421907278-project-member] [instance: 7a82e877-8a39-4684-8b75-711b7bedddac] Start destroying the instance on the hypervisor. {{(pid=67899) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1749.141133] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-776b3004-f999-4da3-8d13-d0ff4fb667e0 tempest-ServersNegativeTestJSON-421907278 tempest-ServersNegativeTestJSON-421907278-project-member] [instance: 7a82e877-8a39-4684-8b75-711b7bedddac] Destroying instance {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1749.141492] env[67899]: DEBUG nova.compute.manager [None req-a0855fe1-c9c8-4cb7-b7b5-b5ff6b519a81 tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] [instance: 77ac61b9-48cc-4ae8-81e7-273841f7b42d] Starting instance... {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1749.147186] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-41256218-708b-4d0d-8724-9878b41333a6 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1749.150282] env[67899]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-fd415ce0-9817-41d5-8fc9-fabcf049511a {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1749.157684] env[67899]: DEBUG oslo_vmware.api [None req-93cc0406-1f51-47a1-9589-0404788b4e93 tempest-MigrationsAdminTest-102068815 tempest-MigrationsAdminTest-102068815-project-member] Task: {'id': task-3467999, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.076846} completed successfully. 
{{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1749.159685] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-93cc0406-1f51-47a1-9589-0404788b4e93 tempest-MigrationsAdminTest-102068815 tempest-MigrationsAdminTest-102068815-project-member] Deleted the datastore file {{(pid=67899) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1749.159869] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-93cc0406-1f51-47a1-9589-0404788b4e93 tempest-MigrationsAdminTest-102068815 tempest-MigrationsAdminTest-102068815-project-member] [instance: 8a157747-34e2-48f7-bf21-d17810122954] Deleted contents of the VM from datastore datastore1 {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1749.160049] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-93cc0406-1f51-47a1-9589-0404788b4e93 tempest-MigrationsAdminTest-102068815 tempest-MigrationsAdminTest-102068815-project-member] [instance: 8a157747-34e2-48f7-bf21-d17810122954] Instance destroyed {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1749.160329] env[67899]: INFO nova.compute.manager [None req-93cc0406-1f51-47a1-9589-0404788b4e93 tempest-MigrationsAdminTest-102068815 tempest-MigrationsAdminTest-102068815-project-member] [instance: 8a157747-34e2-48f7-bf21-d17810122954] Took 0.62 seconds to destroy the instance on the hypervisor. [ 1749.162240] env[67899]: DEBUG nova.compute.claims [None req-93cc0406-1f51-47a1-9589-0404788b4e93 tempest-MigrationsAdminTest-102068815 tempest-MigrationsAdminTest-102068815-project-member] [instance: 8a157747-34e2-48f7-bf21-d17810122954] Aborting claim: {{(pid=67899) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1749.162406] env[67899]: DEBUG oslo_concurrency.lockutils [None req-93cc0406-1f51-47a1-9589-0404788b4e93 tempest-MigrationsAdminTest-102068815 tempest-MigrationsAdminTest-102068815-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1749.162573] env[67899]: DEBUG oslo_concurrency.lockutils [None req-93cc0406-1f51-47a1-9589-0404788b4e93 tempest-MigrationsAdminTest-102068815 tempest-MigrationsAdminTest-102068815-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1749.167538] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-33601aec-1fbf-43f0-97be-dc8c6dcc793a {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1749.177920] env[67899]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-92e2b7a5-3c0d-46c7-a420-893e72455250 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1749.200009] env[67899]: WARNING nova.virt.vmwareapi.vmops [None req-776b3004-f999-4da3-8d13-d0ff4fb667e0 tempest-ServersNegativeTestJSON-421907278 tempest-ServersNegativeTestJSON-421907278-project-member] [instance: 7a82e877-8a39-4684-8b75-711b7bedddac] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 7a82e877-8a39-4684-8b75-711b7bedddac could not be found. 
[ 1749.200295] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-776b3004-f999-4da3-8d13-d0ff4fb667e0 tempest-ServersNegativeTestJSON-421907278 tempest-ServersNegativeTestJSON-421907278-project-member] [instance: 7a82e877-8a39-4684-8b75-711b7bedddac] Instance destroyed {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1749.200428] env[67899]: INFO nova.compute.manager [None req-776b3004-f999-4da3-8d13-d0ff4fb667e0 tempest-ServersNegativeTestJSON-421907278 tempest-ServersNegativeTestJSON-421907278-project-member] [instance: 7a82e877-8a39-4684-8b75-711b7bedddac] Took 0.06 seconds to destroy the instance on the hypervisor. [ 1749.200689] env[67899]: DEBUG oslo.service.loopingcall [None req-776b3004-f999-4da3-8d13-d0ff4fb667e0 tempest-ServersNegativeTestJSON-421907278 tempest-ServersNegativeTestJSON-421907278-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=67899) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1749.201819] env[67899]: DEBUG oslo_concurrency.lockutils [None req-a0855fe1-c9c8-4cb7-b7b5-b5ff6b519a81 tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1749.203322] env[67899]: DEBUG nova.compute.manager [-] [instance: 7a82e877-8a39-4684-8b75-711b7bedddac] Deallocating network for instance {{(pid=67899) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1749.203433] env[67899]: DEBUG nova.network.neutron [-] [instance: 7a82e877-8a39-4684-8b75-711b7bedddac] deallocate_for_instance() {{(pid=67899) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1749.205335] env[67899]: DEBUG nova.virt.vmwareapi.images [None req-2236b4c1-ddc4-48e2-b219-bf8992f19847 tempest-InstanceActionsV221TestJSON-802466451 tempest-InstanceActionsV221TestJSON-802466451-project-member] [instance: dc7bf2b7-631d-4933-92db-1679ad823379] Downloading image file data c655a05a-4a40-4b3f-b609-3ba8116ad90f to the data store datastore1 {{(pid=67899) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1749.263135] env[67899]: DEBUG oslo_vmware.rw_handles [None req-2236b4c1-ddc4-48e2-b219-bf8992f19847 tempest-InstanceActionsV221TestJSON-802466451 tempest-InstanceActionsV221TestJSON-802466451-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/32177577-8bfa-4363-870d-123c6cc4a300/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67899) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1749.326758] env[67899]: DEBUG oslo_vmware.rw_handles [None req-2236b4c1-ddc4-48e2-b219-bf8992f19847 tempest-InstanceActionsV221TestJSON-802466451 tempest-InstanceActionsV221TestJSON-802466451-project-member] Completed reading data from the image iterator. 
{{(pid=67899) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1749.327083] env[67899]: DEBUG oslo_vmware.rw_handles [None req-2236b4c1-ddc4-48e2-b219-bf8992f19847 tempest-InstanceActionsV221TestJSON-802466451 tempest-InstanceActionsV221TestJSON-802466451-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/32177577-8bfa-4363-870d-123c6cc4a300/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67899) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1749.406162] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0c04cfdd-7824-4496-b35f-71fcac76138b {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1749.414132] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2986998f-7047-44c1-aef6-a8eaeb7ddb84 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1749.448416] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ec88a0cf-c1df-4af4-97a4-9b69cd580b6d {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1749.455756] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-96f952da-eac9-473e-9c27-cabc5b6d5248 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1749.460820] env[67899]: DEBUG nova.network.neutron [-] [instance: 7a82e877-8a39-4684-8b75-711b7bedddac] Updating instance_info_cache with network_info: [] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1749.471989] env[67899]: DEBUG nova.compute.provider_tree [None req-93cc0406-1f51-47a1-9589-0404788b4e93 tempest-MigrationsAdminTest-102068815 tempest-MigrationsAdminTest-102068815-project-member] Inventory has not changed in ProviderTree for provider: fffa0b42-f65d-4394-a98c-0df038b9ed4b {{(pid=67899) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1749.473097] env[67899]: INFO nova.compute.manager [-] [instance: 7a82e877-8a39-4684-8b75-711b7bedddac] Took 0.27 seconds to deallocate network for instance. 
[ 1749.478792] env[67899]: DEBUG nova.scheduler.client.report [None req-93cc0406-1f51-47a1-9589-0404788b4e93 tempest-MigrationsAdminTest-102068815 tempest-MigrationsAdminTest-102068815-project-member] Inventory has not changed for provider fffa0b42-f65d-4394-a98c-0df038b9ed4b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67899) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1749.490865] env[67899]: DEBUG oslo_concurrency.lockutils [None req-93cc0406-1f51-47a1-9589-0404788b4e93 tempest-MigrationsAdminTest-102068815 tempest-MigrationsAdminTest-102068815-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.328s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1749.491657] env[67899]: ERROR nova.compute.manager [None req-93cc0406-1f51-47a1-9589-0404788b4e93 tempest-MigrationsAdminTest-102068815 tempest-MigrationsAdminTest-102068815-project-member] [instance: 8a157747-34e2-48f7-bf21-d17810122954] Failed to build and run instance: nova.exception.ImageNotAuthorized: Not authorized for image c655a05a-4a40-4b3f-b609-3ba8116ad90f. [ 1749.491657] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] Traceback (most recent call last): [ 1749.491657] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1749.491657] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1749.491657] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1749.491657] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] result = getattr(controller, method)(*args, **kwargs) [ 1749.491657] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get [ 1749.491657] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] return self._get(image_id) [ 1749.491657] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner [ 1749.491657] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1749.491657] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get [ 1749.491657] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] resp, body = self.http_client.get(url, headers=header) [ 1749.491657] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 395, in get [ 1749.491657] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] return self.request(url, 'GET', **kwargs) [ 1749.491657] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request [ 1749.491657] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] return self._handle_response(resp) [ 1749.491657] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1749.491657] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] raise exc.from_response(resp, resp.content) [ 1749.491657] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. [ 1749.491657] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] [ 1749.491657] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] During handling of the above exception, another exception occurred: [ 1749.491657] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] [ 1749.491657] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] Traceback (most recent call last): [ 1749.491657] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1749.491657] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] self.driver.spawn(context, instance, image_meta, [ 1749.491657] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1749.491657] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1749.491657] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1749.491657] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] self._fetch_image_if_missing(context, vi) [ 1749.491657] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1749.491657] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] image_fetch(context, vi, tmp_image_ds_loc) [ 1749.491657] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1749.491657] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] images.fetch_image( [ 1749.491657] env[67899]: ERROR 
nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1749.491657] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] metadata = IMAGE_API.get(context, image_ref) [ 1749.491657] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1749.491657] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] return session.show(context, image_id, [ 1749.492831] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1749.492831] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] _reraise_translated_image_exception(image_id) [ 1749.492831] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1749.492831] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] raise new_exc.with_traceback(exc_trace) [ 1749.492831] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1749.492831] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1749.492831] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1749.492831] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] result = getattr(controller, method)(*args, **kwargs) [ 1749.492831] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get [ 1749.492831] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] return self._get(image_id) [ 1749.492831] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner [ 1749.492831] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1749.492831] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get [ 1749.492831] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] resp, body = self.http_client.get(url, headers=header) [ 1749.492831] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] File "/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 395, in get [ 1749.492831] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] return self.request(url, 'GET', **kwargs) [ 1749.492831] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request [ 1749.492831] 
env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] return self._handle_response(resp) [ 1749.492831] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1749.492831] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] raise exc.from_response(resp, resp.content) [ 1749.492831] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] nova.exception.ImageNotAuthorized: Not authorized for image c655a05a-4a40-4b3f-b609-3ba8116ad90f. [ 1749.492831] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] [ 1749.492831] env[67899]: DEBUG nova.compute.utils [None req-93cc0406-1f51-47a1-9589-0404788b4e93 tempest-MigrationsAdminTest-102068815 tempest-MigrationsAdminTest-102068815-project-member] [instance: 8a157747-34e2-48f7-bf21-d17810122954] Not authorized for image c655a05a-4a40-4b3f-b609-3ba8116ad90f. {{(pid=67899) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1749.493578] env[67899]: DEBUG oslo_concurrency.lockutils [None req-a0855fe1-c9c8-4cb7-b7b5-b5ff6b519a81 tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.292s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1749.494710] env[67899]: INFO nova.compute.claims [None req-a0855fe1-c9c8-4cb7-b7b5-b5ff6b519a81 tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] [instance: 77ac61b9-48cc-4ae8-81e7-273841f7b42d] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1749.497410] env[67899]: DEBUG nova.compute.manager [None req-93cc0406-1f51-47a1-9589-0404788b4e93 tempest-MigrationsAdminTest-102068815 tempest-MigrationsAdminTest-102068815-project-member] [instance: 8a157747-34e2-48f7-bf21-d17810122954] Build of instance 8a157747-34e2-48f7-bf21-d17810122954 was re-scheduled: Not authorized for image c655a05a-4a40-4b3f-b609-3ba8116ad90f. {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1749.497870] env[67899]: DEBUG nova.compute.manager [None req-93cc0406-1f51-47a1-9589-0404788b4e93 tempest-MigrationsAdminTest-102068815 tempest-MigrationsAdminTest-102068815-project-member] [instance: 8a157747-34e2-48f7-bf21-d17810122954] Unplugging VIFs for instance {{(pid=67899) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1749.498076] env[67899]: DEBUG nova.compute.manager [None req-93cc0406-1f51-47a1-9589-0404788b4e93 tempest-MigrationsAdminTest-102068815 tempest-MigrationsAdminTest-102068815-project-member] Virt driver does not provide unplug_vifs method, so it is not possible to determine if VIFs should be unplugged. 
{{(pid=67899) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1749.498247] env[67899]: DEBUG nova.compute.manager [None req-93cc0406-1f51-47a1-9589-0404788b4e93 tempest-MigrationsAdminTest-102068815 tempest-MigrationsAdminTest-102068815-project-member] [instance: 8a157747-34e2-48f7-bf21-d17810122954] Deallocating network for instance {{(pid=67899) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1749.498416] env[67899]: DEBUG nova.network.neutron [None req-93cc0406-1f51-47a1-9589-0404788b4e93 tempest-MigrationsAdminTest-102068815 tempest-MigrationsAdminTest-102068815-project-member] [instance: 8a157747-34e2-48f7-bf21-d17810122954] deallocate_for_instance() {{(pid=67899) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1749.582199] env[67899]: DEBUG oslo_concurrency.lockutils [None req-776b3004-f999-4da3-8d13-d0ff4fb667e0 tempest-ServersNegativeTestJSON-421907278 tempest-ServersNegativeTestJSON-421907278-project-member] Lock "7a82e877-8a39-4684-8b75-711b7bedddac" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.471s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1749.612616] env[67899]: DEBUG neutronclient.v2_0.client [None req-93cc0406-1f51-47a1-9589-0404788b4e93 tempest-MigrationsAdminTest-102068815 tempest-MigrationsAdminTest-102068815-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=67899) _handle_fault_response /opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py:262}} [ 1749.613896] env[67899]: ERROR nova.compute.manager [None req-93cc0406-1f51-47a1-9589-0404788b4e93 tempest-MigrationsAdminTest-102068815 tempest-MigrationsAdminTest-102068815-project-member] [instance: 8a157747-34e2-48f7-bf21-d17810122954] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized. 
[ 1749.613896] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] Traceback (most recent call last): [ 1749.613896] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1749.613896] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1749.613896] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1749.613896] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] result = getattr(controller, method)(*args, **kwargs) [ 1749.613896] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get [ 1749.613896] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] return self._get(image_id) [ 1749.613896] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner [ 1749.613896] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1749.613896] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get [ 1749.613896] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] resp, body = self.http_client.get(url, headers=header) [ 1749.613896] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] File "/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 395, in get [ 1749.613896] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] return self.request(url, 'GET', **kwargs) [ 1749.613896] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request [ 1749.613896] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] return self._handle_response(resp) [ 1749.613896] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1749.613896] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] raise exc.from_response(resp, resp.content) [ 1749.613896] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 1749.613896] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] [ 1749.613896] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] During handling of the above exception, another exception occurred: [ 1749.613896] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] [ 1749.613896] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] Traceback (most recent call last): [ 1749.613896] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1749.613896] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] self.driver.spawn(context, instance, image_meta, [ 1749.613896] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1749.613896] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1749.613896] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1749.613896] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] self._fetch_image_if_missing(context, vi) [ 1749.613896] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1749.613896] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] image_fetch(context, vi, tmp_image_ds_loc) [ 1749.613896] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1749.613896] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] images.fetch_image( [ 1749.613896] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1749.613896] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] metadata = IMAGE_API.get(context, image_ref) [ 1749.613896] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1749.613896] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] return session.show(context, image_id, [ 1749.615031] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1749.615031] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] _reraise_translated_image_exception(image_id) [ 1749.615031] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1749.615031] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] raise new_exc.with_traceback(exc_trace) [ 1749.615031] env[67899]: ERROR nova.compute.manager [instance: 
8a157747-34e2-48f7-bf21-d17810122954] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1749.615031] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1749.615031] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1749.615031] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] result = getattr(controller, method)(*args, **kwargs) [ 1749.615031] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get [ 1749.615031] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] return self._get(image_id) [ 1749.615031] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner [ 1749.615031] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1749.615031] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get [ 1749.615031] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] resp, body = self.http_client.get(url, headers=header) [ 1749.615031] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] File "/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 395, in get [ 1749.615031] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] return self.request(url, 'GET', **kwargs) [ 1749.615031] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request [ 1749.615031] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] return self._handle_response(resp) [ 1749.615031] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1749.615031] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] raise exc.from_response(resp, resp.content) [ 1749.615031] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] nova.exception.ImageNotAuthorized: Not authorized for image c655a05a-4a40-4b3f-b609-3ba8116ad90f. 
[ 1749.615031] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] [ 1749.615031] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] During handling of the above exception, another exception occurred: [ 1749.615031] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] [ 1749.615031] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] Traceback (most recent call last): [ 1749.615031] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] File "/opt/stack/nova/nova/compute/manager.py", line 2430, in _do_build_and_run_instance [ 1749.615031] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] self._build_and_run_instance(context, instance, image, [ 1749.615031] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] File "/opt/stack/nova/nova/compute/manager.py", line 2722, in _build_and_run_instance [ 1749.615031] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] raise exception.RescheduledException( [ 1749.615031] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] nova.exception.RescheduledException: Build of instance 8a157747-34e2-48f7-bf21-d17810122954 was re-scheduled: Not authorized for image c655a05a-4a40-4b3f-b609-3ba8116ad90f. [ 1749.615031] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] [ 1749.615031] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] During handling of the above exception, another exception occurred: [ 1749.615031] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] [ 1749.615031] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] Traceback (most recent call last): [ 1749.615031] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1749.615031] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] ret = obj(*args, **kwargs) [ 1749.615031] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1749.615031] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] exception_handler_v20(status_code, error_body) [ 1749.615031] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1749.616198] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] raise client_exc(message=error_message, [ 1749.616198] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1749.616198] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] Neutron server returns request_ids: ['req-2c924690-1053-4b07-b264-c578b52ff2a7'] [ 1749.616198] env[67899]: ERROR nova.compute.manager [instance: 
8a157747-34e2-48f7-bf21-d17810122954] [ 1749.616198] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] During handling of the above exception, another exception occurred: [ 1749.616198] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] [ 1749.616198] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] Traceback (most recent call last): [ 1749.616198] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] File "/opt/stack/nova/nova/compute/manager.py", line 3019, in _cleanup_allocated_networks [ 1749.616198] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] self._deallocate_network(context, instance, requested_networks) [ 1749.616198] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] File "/opt/stack/nova/nova/compute/manager.py", line 2265, in _deallocate_network [ 1749.616198] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] self.network_api.deallocate_for_instance( [ 1749.616198] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1749.616198] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] data = neutron.list_ports(**search_opts) [ 1749.616198] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1749.616198] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] ret = obj(*args, **kwargs) [ 1749.616198] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1749.616198] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] return self.list('ports', self.ports_path, retrieve_all, [ 1749.616198] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1749.616198] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] ret = obj(*args, **kwargs) [ 1749.616198] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1749.616198] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] for r in self._pagination(collection, path, **params): [ 1749.616198] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1749.616198] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] res = self.get(path, params=params) [ 1749.616198] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1749.616198] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] ret = obj(*args, **kwargs) [ 1749.616198] env[67899]: ERROR nova.compute.manager [instance: 
8a157747-34e2-48f7-bf21-d17810122954] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1749.616198] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] return self.retry_request("GET", action, body=body, [ 1749.616198] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1749.616198] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] ret = obj(*args, **kwargs) [ 1749.616198] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1749.616198] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] return self.do_request(method, action, body=body, [ 1749.616198] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1749.616198] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] ret = obj(*args, **kwargs) [ 1749.616198] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1749.616198] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] self._handle_fault_response(status_code, replybody, resp) [ 1749.616198] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper [ 1749.616198] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] raise exception.Unauthorized() [ 1749.616198] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] nova.exception.Unauthorized: Not authorized. 
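The repeated `wrapper` frames at nova/network/neutron.py:196 in the traceback above come from a decorator that translates neutronclient auth failures into Nova's own exceptions: a rejected user token becomes nova.exception.Unauthorized (the line-204 frame), while a rejection of the service's admin token becomes NeutronAdminCredentialConfigurationInvalid (the line-212 frame that appears further down). A minimal sketch of that translation pattern follows; the class and attribute names are stand-ins for illustration, not Nova's real ones.

    import functools

    class NeutronUnauthorized(Exception):
        """Stand-in for neutronclient.common.exceptions.Unauthorized."""

    class NovaUnauthorized(Exception):
        """Stand-in for nova.exception.Unauthorized."""

    class AdminCredentialInvalid(Exception):
        """Stand-in for nova.exception.NeutronAdminCredentialConfigurationInvalid."""

    def translate_neutron_exceptions(func):
        # Every wrapped client call surfaces in tracebacks as this one
        # frame; "ret = obj(*args, **kwargs)" above is the same idea.
        @functools.wraps(func)
        def wrapper(self, *args, **kwargs):
            try:
                return func(self, *args, **kwargs)
            except NeutronUnauthorized:
                if self.is_admin_client:
                    # The service's own credentials were refused: a
                    # deployment/configuration problem, not a user error.
                    raise AdminCredentialInvalid()
                # The user's token expired or was revoked mid-operation.
                raise NovaUnauthorized()
        return wrapper

    class NeutronClientShim:
        """Hypothetical wrapped client, only to exercise the decorator."""
        def __init__(self, is_admin_client):
            self.is_admin_client = is_admin_client

        @translate_neutron_exceptions
        def list_ports(self, **search_opts):
            raise NeutronUnauthorized("401 from neutron")

    # NeutronClientShim(False).list_ports() -> NovaUnauthorized
    # NeutronClientShim(True).list_ports()  -> AdminCredentialInvalid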
[ 1749.616198] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] [ 1749.689547] env[67899]: INFO nova.scheduler.client.report [None req-93cc0406-1f51-47a1-9589-0404788b4e93 tempest-MigrationsAdminTest-102068815 tempest-MigrationsAdminTest-102068815-project-member] Deleted allocations for instance 8a157747-34e2-48f7-bf21-d17810122954 [ 1749.698261] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-210ebe17-ade2-483c-ac9f-b6c9c20898c3 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1749.709161] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c82cc9b0-cd43-4644-87ee-0d72bd91b47f {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1749.714421] env[67899]: DEBUG oslo_concurrency.lockutils [None req-93cc0406-1f51-47a1-9589-0404788b4e93 tempest-MigrationsAdminTest-102068815 tempest-MigrationsAdminTest-102068815-project-member] Lock "8a157747-34e2-48f7-bf21-d17810122954" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 525.859s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1749.740658] env[67899]: DEBUG oslo_concurrency.lockutils [None req-ece956f7-ae8e-482f-a329-67ac356bc66c tempest-MigrationsAdminTest-102068815 tempest-MigrationsAdminTest-102068815-project-member] Lock "8a157747-34e2-48f7-bf21-d17810122954" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 330.009s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1749.740907] env[67899]: DEBUG oslo_concurrency.lockutils [None req-ece956f7-ae8e-482f-a329-67ac356bc66c tempest-MigrationsAdminTest-102068815 tempest-MigrationsAdminTest-102068815-project-member] Acquiring lock "8a157747-34e2-48f7-bf21-d17810122954-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1749.741132] env[67899]: DEBUG oslo_concurrency.lockutils [None req-ece956f7-ae8e-482f-a329-67ac356bc66c tempest-MigrationsAdminTest-102068815 tempest-MigrationsAdminTest-102068815-project-member] Lock "8a157747-34e2-48f7-bf21-d17810122954-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1749.741326] env[67899]: DEBUG oslo_concurrency.lockutils [None req-ece956f7-ae8e-482f-a329-67ac356bc66c tempest-MigrationsAdminTest-102068815 tempest-MigrationsAdminTest-102068815-project-member] Lock "8a157747-34e2-48f7-bf21-d17810122954-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1749.743333] env[67899]: INFO nova.compute.manager [None req-ece956f7-ae8e-482f-a329-67ac356bc66c tempest-MigrationsAdminTest-102068815 tempest-MigrationsAdminTest-102068815-project-member] [instance: 8a157747-34e2-48f7-bf21-d17810122954] Terminating instance [ 1749.745046] env[67899]: DEBUG nova.compute.manager [None 
req-ece956f7-ae8e-482f-a329-67ac356bc66c tempest-MigrationsAdminTest-102068815 tempest-MigrationsAdminTest-102068815-project-member] [instance: 8a157747-34e2-48f7-bf21-d17810122954] Start destroying the instance on the hypervisor. {{(pid=67899) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1749.745244] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-ece956f7-ae8e-482f-a329-67ac356bc66c tempest-MigrationsAdminTest-102068815 tempest-MigrationsAdminTest-102068815-project-member] [instance: 8a157747-34e2-48f7-bf21-d17810122954] Destroying instance {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1749.745960] env[67899]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-2b6dbd81-bc6a-4660-8a33-e060bf2e9d3e {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1749.748710] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-661774d5-abfd-4dd7-9a7e-b6812ab63528 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1749.757031] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8b8ffd3d-94c3-40aa-9a07-77995daa96c5 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1749.763769] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5f6a652c-99e8-4612-9326-e543c3560ab7 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1749.783164] env[67899]: DEBUG nova.compute.provider_tree [None req-a0855fe1-c9c8-4cb7-b7b5-b5ff6b519a81 tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] Inventory has not changed in ProviderTree for provider: fffa0b42-f65d-4394-a98c-0df038b9ed4b {{(pid=67899) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1749.794170] env[67899]: WARNING nova.virt.vmwareapi.vmops [None req-ece956f7-ae8e-482f-a329-67ac356bc66c tempest-MigrationsAdminTest-102068815 tempest-MigrationsAdminTest-102068815-project-member] [instance: 8a157747-34e2-48f7-bf21-d17810122954] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 8a157747-34e2-48f7-bf21-d17810122954 could not be found. [ 1749.794370] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-ece956f7-ae8e-482f-a329-67ac356bc66c tempest-MigrationsAdminTest-102068815 tempest-MigrationsAdminTest-102068815-project-member] [instance: 8a157747-34e2-48f7-bf21-d17810122954] Instance destroyed {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1749.794547] env[67899]: INFO nova.compute.manager [None req-ece956f7-ae8e-482f-a329-67ac356bc66c tempest-MigrationsAdminTest-102068815 tempest-MigrationsAdminTest-102068815-project-member] [instance: 8a157747-34e2-48f7-bf21-d17810122954] Took 0.05 seconds to destroy the instance on the hypervisor. [ 1749.794784] env[67899]: DEBUG oslo.service.loopingcall [None req-ece956f7-ae8e-482f-a329-67ac356bc66c tempest-MigrationsAdminTest-102068815 tempest-MigrationsAdminTest-102068815-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=67899) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1749.795563] env[67899]: DEBUG nova.scheduler.client.report [None req-a0855fe1-c9c8-4cb7-b7b5-b5ff6b519a81 tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] Inventory has not changed for provider fffa0b42-f65d-4394-a98c-0df038b9ed4b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67899) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1749.798654] env[67899]: DEBUG nova.compute.manager [-] [instance: 8a157747-34e2-48f7-bf21-d17810122954] Deallocating network for instance {{(pid=67899) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1749.798881] env[67899]: DEBUG nova.network.neutron [-] [instance: 8a157747-34e2-48f7-bf21-d17810122954] deallocate_for_instance() {{(pid=67899) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1749.810834] env[67899]: DEBUG oslo_concurrency.lockutils [None req-a0855fe1-c9c8-4cb7-b7b5-b5ff6b519a81 tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.317s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1749.811363] env[67899]: DEBUG nova.compute.manager [None req-a0855fe1-c9c8-4cb7-b7b5-b5ff6b519a81 tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] [instance: 77ac61b9-48cc-4ae8-81e7-273841f7b42d] Start building networks asynchronously for instance. {{(pid=67899) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1749.848691] env[67899]: DEBUG nova.compute.utils [None req-a0855fe1-c9c8-4cb7-b7b5-b5ff6b519a81 tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] Using /dev/sd instead of None {{(pid=67899) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1749.850367] env[67899]: DEBUG nova.compute.manager [None req-a0855fe1-c9c8-4cb7-b7b5-b5ff6b519a81 tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] [instance: 77ac61b9-48cc-4ae8-81e7-273841f7b42d] Allocating IP information in the background. {{(pid=67899) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1749.850637] env[67899]: DEBUG nova.network.neutron [None req-a0855fe1-c9c8-4cb7-b7b5-b5ff6b519a81 tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] [instance: 77ac61b9-48cc-4ae8-81e7-273841f7b42d] allocate_for_instance() {{(pid=67899) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1749.860997] env[67899]: DEBUG nova.compute.manager [None req-a0855fe1-c9c8-4cb7-b7b5-b5ff6b519a81 tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] [instance: 77ac61b9-48cc-4ae8-81e7-273841f7b42d] Start building block device mappings for instance. 
{{(pid=67899) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1749.907572] env[67899]: DEBUG neutronclient.v2_0.client [-] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=67899) _handle_fault_response /opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py:262}} [ 1749.907812] env[67899]: ERROR nova.network.neutron [-] Neutron client was not able to generate a valid admin token, please verify Neutron admin credential located in nova.conf: neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1749.908339] env[67899]: ERROR oslo.service.loopingcall [-] Dynamic interval looping call 'oslo_service.loopingcall.RetryDecorator.__call__.._func' failed: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1749.908339] env[67899]: ERROR oslo.service.loopingcall Traceback (most recent call last): [ 1749.908339] env[67899]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1749.908339] env[67899]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1749.908339] env[67899]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1749.908339] env[67899]: ERROR oslo.service.loopingcall exception_handler_v20(status_code, error_body) [ 1749.908339] env[67899]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1749.908339] env[67899]: ERROR oslo.service.loopingcall raise client_exc(message=error_message, [ 1749.908339] env[67899]: ERROR oslo.service.loopingcall neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1749.908339] env[67899]: ERROR oslo.service.loopingcall Neutron server returns request_ids: ['req-2900cc81-55fe-46f6-8640-15c525e80151'] [ 1749.908339] env[67899]: ERROR oslo.service.loopingcall [ 1749.908339] env[67899]: ERROR oslo.service.loopingcall During handling of the above exception, another exception occurred: [ 1749.908339] env[67899]: ERROR oslo.service.loopingcall [ 1749.908339] env[67899]: ERROR oslo.service.loopingcall Traceback (most recent call last): [ 1749.908339] env[67899]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 1749.908339] env[67899]: ERROR oslo.service.loopingcall result = func(*self.args, **self.kw) [ 1749.908339] env[67899]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 1749.908339] env[67899]: ERROR oslo.service.loopingcall result = f(*args, **kwargs) [ 1749.908339] env[67899]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/compute/manager.py", line 3045, in _deallocate_network_with_retries [ 1749.908339] env[67899]: ERROR oslo.service.loopingcall self._deallocate_network( [ 1749.908339] env[67899]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/compute/manager.py", line 2265, in _deallocate_network [ 1749.908339] env[67899]: ERROR oslo.service.loopingcall 
self.network_api.deallocate_for_instance( [ 1749.908339] env[67899]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1749.908339] env[67899]: ERROR oslo.service.loopingcall data = neutron.list_ports(**search_opts) [ 1749.908339] env[67899]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1749.908339] env[67899]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1749.908339] env[67899]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1749.908339] env[67899]: ERROR oslo.service.loopingcall return self.list('ports', self.ports_path, retrieve_all, [ 1749.908339] env[67899]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1749.908339] env[67899]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1749.908339] env[67899]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1749.908339] env[67899]: ERROR oslo.service.loopingcall for r in self._pagination(collection, path, **params): [ 1749.908339] env[67899]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1749.908339] env[67899]: ERROR oslo.service.loopingcall res = self.get(path, params=params) [ 1749.908339] env[67899]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1749.908339] env[67899]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1749.908339] env[67899]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1749.908339] env[67899]: ERROR oslo.service.loopingcall return self.retry_request("GET", action, body=body, [ 1749.908339] env[67899]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1749.908339] env[67899]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1749.908339] env[67899]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1749.908339] env[67899]: ERROR oslo.service.loopingcall return self.do_request(method, action, body=body, [ 1749.908339] env[67899]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1749.908339] env[67899]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1749.908339] env[67899]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1749.908339] env[67899]: ERROR oslo.service.loopingcall self._handle_fault_response(status_code, replybody, resp) [ 1749.908339] env[67899]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 1749.908339] env[67899]: ERROR oslo.service.loopingcall raise exception.NeutronAdminCredentialConfigurationInvalid() [ 1749.908339] env[67899]: ERROR oslo.service.loopingcall nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
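The "Dynamic interval looping call 'oslo_service.loopingcall.RetryDecorator.__call__.._func' failed" record above is oslo.service's retry helper giving up: _deallocate_network_with_retries runs the deallocation under a RetryDecorator, but only the exception types listed in the decorator are retried, and NeutronAdminCredentialConfigurationInvalid is not one of them, so it escapes on the first pass. A small self-contained example of the same helper, with a toy function and exception, assuming oslo.service is installed as it is in this venv:

    from oslo_service import loopingcall

    class TransientError(Exception):
        """Toy retriable error for the example."""

    calls = {"n": 0}

    # Retry up to 3 times, sleeping 1s, then 2s (capped at 5s) between
    # attempts. Only TransientError is retried; any other exception
    # propagates immediately, exactly like the credential error above.
    @loopingcall.RetryDecorator(max_retry_count=3, inc_sleep_time=1,
                                max_sleep_time=5,
                                exceptions=(TransientError,))
    def deallocate():
        calls["n"] += 1
        if calls["n"] < 3:
            raise TransientError("neutron not reachable yet")
        return "deallocated"

    print(deallocate())  # fails twice, then prints "deallocated"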
[ 1749.908339] env[67899]: ERROR oslo.service.loopingcall [ 1749.910051] env[67899]: ERROR nova.compute.manager [None req-ece956f7-ae8e-482f-a329-67ac356bc66c tempest-MigrationsAdminTest-102068815 tempest-MigrationsAdminTest-102068815-project-member] [instance: 8a157747-34e2-48f7-bf21-d17810122954] Failed to deallocate network for instance. Error: Networking client is experiencing an unauthorized exception.: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1749.925020] env[67899]: DEBUG nova.compute.manager [None req-a0855fe1-c9c8-4cb7-b7b5-b5ff6b519a81 tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] [instance: 77ac61b9-48cc-4ae8-81e7-273841f7b42d] Start spawning the instance on the hypervisor. {{(pid=67899) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1749.937545] env[67899]: ERROR nova.compute.manager [None req-ece956f7-ae8e-482f-a329-67ac356bc66c tempest-MigrationsAdminTest-102068815 tempest-MigrationsAdminTest-102068815-project-member] [instance: 8a157747-34e2-48f7-bf21-d17810122954] Setting instance vm_state to ERROR: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1749.937545] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] Traceback (most recent call last): [ 1749.937545] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1749.937545] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] ret = obj(*args, **kwargs) [ 1749.937545] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1749.937545] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] exception_handler_v20(status_code, error_body) [ 1749.937545] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1749.937545] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] raise client_exc(message=error_message, [ 1749.937545] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1749.937545] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] Neutron server returns request_ids: ['req-2900cc81-55fe-46f6-8640-15c525e80151'] [ 1749.937545] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] [ 1749.937545] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] During handling of the above exception, another exception occurred: [ 1749.937545] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] [ 1749.937545] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] Traceback (most recent call last): [ 1749.937545] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] File 
"/opt/stack/nova/nova/compute/manager.py", line 3315, in do_terminate_instance [ 1749.937545] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] self._delete_instance(context, instance, bdms) [ 1749.937545] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] File "/opt/stack/nova/nova/compute/manager.py", line 3250, in _delete_instance [ 1749.937545] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] self._shutdown_instance(context, instance, bdms) [ 1749.937545] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] File "/opt/stack/nova/nova/compute/manager.py", line 3144, in _shutdown_instance [ 1749.937545] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] self._try_deallocate_network(context, instance, requested_networks) [ 1749.937545] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] File "/opt/stack/nova/nova/compute/manager.py", line 3058, in _try_deallocate_network [ 1749.937545] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] with excutils.save_and_reraise_exception(): [ 1749.937545] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1749.937545] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] self.force_reraise() [ 1749.937545] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1749.937545] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] raise self.value [ 1749.937545] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] File "/opt/stack/nova/nova/compute/manager.py", line 3056, in _try_deallocate_network [ 1749.937545] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] _deallocate_network_with_retries() [ 1749.937545] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 436, in func [ 1749.937545] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] return evt.wait() [ 1749.937545] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1749.937545] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] result = hub.switch() [ 1749.937545] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1749.937545] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] return self.greenlet.switch() [ 1749.937545] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 1749.937545] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] result = 
func(*self.args, **self.kw) [ 1749.937545] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 1749.938972] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] result = f(*args, **kwargs) [ 1749.938972] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] File "/opt/stack/nova/nova/compute/manager.py", line 3045, in _deallocate_network_with_retries [ 1749.938972] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] self._deallocate_network( [ 1749.938972] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] File "/opt/stack/nova/nova/compute/manager.py", line 2265, in _deallocate_network [ 1749.938972] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] self.network_api.deallocate_for_instance( [ 1749.938972] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1749.938972] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] data = neutron.list_ports(**search_opts) [ 1749.938972] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1749.938972] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] ret = obj(*args, **kwargs) [ 1749.938972] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1749.938972] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] return self.list('ports', self.ports_path, retrieve_all, [ 1749.938972] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1749.938972] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] ret = obj(*args, **kwargs) [ 1749.938972] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1749.938972] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] for r in self._pagination(collection, path, **params): [ 1749.938972] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1749.938972] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] res = self.get(path, params=params) [ 1749.938972] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1749.938972] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] ret = obj(*args, **kwargs) [ 1749.938972] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 
356, in get [ 1749.938972] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] return self.retry_request("GET", action, body=body, [ 1749.938972] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1749.938972] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] ret = obj(*args, **kwargs) [ 1749.938972] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1749.938972] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] return self.do_request(method, action, body=body, [ 1749.938972] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1749.938972] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] ret = obj(*args, **kwargs) [ 1749.938972] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1749.938972] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] self._handle_fault_response(status_code, replybody, resp) [ 1749.938972] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 1749.938972] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] raise exception.NeutronAdminCredentialConfigurationInvalid() [ 1749.938972] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
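The excutils frames in the traceback above (manager.py:3058 together with the __exit__/force_reraise pair) are Nova's standard cleanup idiom: oslo_utils.excutils.save_and_reraise_exception runs extra handling inside an except block and re-raises the original exception when the with block exits, which is why "raise self.value" keeps appearing between the real frames. A minimal illustration, where notify_failure is a made-up placeholder:

    from oslo_utils import excutils

    def notify_failure():
        print("recording failure before re-raising")

    def shutdown_instance():
        try:
            raise RuntimeError("deallocate failed")
        except RuntimeError:
            with excutils.save_and_reraise_exception():
                # Best-effort handling goes here; leaving the block
                # re-raises the saved RuntimeError automatically (the
                # force_reraise()/"raise self.value" frames above).
                notify_failure()

    # shutdown_instance() prints the message, then RuntimeError propagates.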
[ 1749.938972] env[67899]: ERROR nova.compute.manager [instance: 8a157747-34e2-48f7-bf21-d17810122954] [ 1749.941940] env[67899]: DEBUG nova.policy [None req-a0855fe1-c9c8-4cb7-b7b5-b5ff6b519a81 tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '061d2e2c56824c0886656625babbf20f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '93f5a8c99daa4c85bd8edffb5c6dd338', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67899) authorize /opt/stack/nova/nova/policy.py:203}} [ 1749.953704] env[67899]: DEBUG nova.virt.hardware [None req-a0855fe1-c9c8-4cb7-b7b5-b5ff6b519a81 tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T22:07:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T22:07:13Z,direct_url=,disk_format='vmdk',id=c655a05a-4a40-4b3f-b609-3ba8116ad90f,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c1aaa2970e964d7b86557399120d12c1',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T22:07:14Z,virtual_size=,visibility=), allow threads: False {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1749.953936] env[67899]: DEBUG nova.virt.hardware [None req-a0855fe1-c9c8-4cb7-b7b5-b5ff6b519a81 tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] Flavor limits 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1749.954103] env[67899]: DEBUG nova.virt.hardware [None req-a0855fe1-c9c8-4cb7-b7b5-b5ff6b519a81 tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] Image limits 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1749.954287] env[67899]: DEBUG nova.virt.hardware [None req-a0855fe1-c9c8-4cb7-b7b5-b5ff6b519a81 tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] Flavor pref 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1749.954433] env[67899]: DEBUG nova.virt.hardware [None req-a0855fe1-c9c8-4cb7-b7b5-b5ff6b519a81 tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] Image pref 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1749.954577] env[67899]: DEBUG nova.virt.hardware [None req-a0855fe1-c9c8-4cb7-b7b5-b5ff6b519a81 tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1749.954781] env[67899]: DEBUG nova.virt.hardware [None req-a0855fe1-c9c8-4cb7-b7b5-b5ff6b519a81 tempest-ServersTestJSON-400587867 
tempest-ServersTestJSON-400587867-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1749.954940] env[67899]: DEBUG nova.virt.hardware [None req-a0855fe1-c9c8-4cb7-b7b5-b5ff6b519a81 tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67899) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1749.955135] env[67899]: DEBUG nova.virt.hardware [None req-a0855fe1-c9c8-4cb7-b7b5-b5ff6b519a81 tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] Got 1 possible topologies {{(pid=67899) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1749.955329] env[67899]: DEBUG nova.virt.hardware [None req-a0855fe1-c9c8-4cb7-b7b5-b5ff6b519a81 tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1749.955519] env[67899]: DEBUG nova.virt.hardware [None req-a0855fe1-c9c8-4cb7-b7b5-b5ff6b519a81 tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1749.956411] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7d928674-5549-4872-ba4d-e5978b6390c3 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1749.964369] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9d59cc6f-5627-4181-9e5f-a06f54d45a0b {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1749.970862] env[67899]: DEBUG oslo_concurrency.lockutils [None req-ece956f7-ae8e-482f-a329-67ac356bc66c tempest-MigrationsAdminTest-102068815 tempest-MigrationsAdminTest-102068815-project-member] Lock "8a157747-34e2-48f7-bf21-d17810122954" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.230s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1750.023989] env[67899]: INFO nova.compute.manager [None req-ece956f7-ae8e-482f-a329-67ac356bc66c tempest-MigrationsAdminTest-102068815 tempest-MigrationsAdminTest-102068815-project-member] [instance: 8a157747-34e2-48f7-bf21-d17810122954] Successfully reverted task state from None on failure for instance. [ 1750.027811] env[67899]: ERROR oslo_messaging.rpc.server [None req-ece956f7-ae8e-482f-a329-67ac356bc66c tempest-MigrationsAdminTest-102068815 tempest-MigrationsAdminTest-102068815-project-member] Exception during message handling: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
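(The oslo_messaging.rpc.server traceback for the record above continues below.) The nova.virt.hardware DEBUG records above show the topology selection for the m1.nano flavor: with vcpus=1 and no flavor or image limits, the only (sockets, cores, threads) factorization is 1:1:1, hence "Got 1 possible topologies". The arithmetic is an enumeration of factorizations of the vCPU count under the per-dimension maxima; the sketch below reproduces that result but is not Nova's actual implementation:

    def possible_topologies(vcpus, max_sockets, max_cores, max_threads):
        """Enumerate (sockets, cores, threads) triples whose product is
        vcpus and which respect the per-dimension maxima."""
        topologies = []
        for sockets in (d for d in range(1, vcpus + 1) if vcpus % d == 0):
            remaining = vcpus // sockets
            for cores in (d for d in range(1, remaining + 1)
                          if remaining % d == 0):
                threads = remaining // cores
                if (sockets <= max_sockets and cores <= max_cores
                        and threads <= max_threads):
                    topologies.append((sockets, cores, threads))
        return topologies

    # Matches the log: one candidate for a single-vCPU flavor.
    print(possible_topologies(1, 65536, 65536, 65536))   # [(1, 1, 1)]
    # A 4-vCPU flavor would yield six candidates, e.g. (1, 2, 2), (4, 1, 1).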
[ 1750.027811] env[67899]: ERROR oslo_messaging.rpc.server Traceback (most recent call last): [ 1750.027811] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1750.027811] env[67899]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1750.027811] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1750.027811] env[67899]: ERROR oslo_messaging.rpc.server exception_handler_v20(status_code, error_body) [ 1750.027811] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1750.027811] env[67899]: ERROR oslo_messaging.rpc.server raise client_exc(message=error_message, [ 1750.027811] env[67899]: ERROR oslo_messaging.rpc.server neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1750.027811] env[67899]: ERROR oslo_messaging.rpc.server Neutron server returns request_ids: ['req-2900cc81-55fe-46f6-8640-15c525e80151'] [ 1750.027811] env[67899]: ERROR oslo_messaging.rpc.server [ 1750.027811] env[67899]: ERROR oslo_messaging.rpc.server During handling of the above exception, another exception occurred: [ 1750.027811] env[67899]: ERROR oslo_messaging.rpc.server [ 1750.027811] env[67899]: ERROR oslo_messaging.rpc.server Traceback (most recent call last): [ 1750.027811] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/server.py", line 165, in _process_incoming [ 1750.027811] env[67899]: ERROR oslo_messaging.rpc.server res = self.dispatcher.dispatch(message) [ 1750.027811] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/dispatcher.py", line 309, in dispatch [ 1750.027811] env[67899]: ERROR oslo_messaging.rpc.server return self._do_dispatch(endpoint, method, ctxt, args) [ 1750.027811] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/dispatcher.py", line 229, in _do_dispatch [ 1750.027811] env[67899]: ERROR oslo_messaging.rpc.server result = func(ctxt, **new_args) [ 1750.027811] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/exception_wrapper.py", line 65, in wrapped [ 1750.027811] env[67899]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1750.027811] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1750.027811] env[67899]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1750.027811] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1750.027811] env[67899]: ERROR oslo_messaging.rpc.server raise self.value [ 1750.027811] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/exception_wrapper.py", line 63, in wrapped [ 1750.027811] env[67899]: ERROR oslo_messaging.rpc.server return f(self, context, *args, **kw) [ 1750.027811] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 166, in decorated_function [ 1750.027811] env[67899]: ERROR oslo_messaging.rpc.server with 
excutils.save_and_reraise_exception(): [ 1750.027811] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1750.027811] env[67899]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1750.027811] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1750.027811] env[67899]: ERROR oslo_messaging.rpc.server raise self.value [ 1750.027811] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 157, in decorated_function [ 1750.027811] env[67899]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) [ 1750.027811] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/utils.py", line 1439, in decorated_function [ 1750.027811] env[67899]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) [ 1750.027811] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 213, in decorated_function [ 1750.027811] env[67899]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1750.027811] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1750.027811] env[67899]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1750.027811] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1750.027811] env[67899]: ERROR oslo_messaging.rpc.server raise self.value [ 1750.027811] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 203, in decorated_function [ 1750.027811] env[67899]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) [ 1750.027811] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3327, in terminate_instance [ 1750.027811] env[67899]: ERROR oslo_messaging.rpc.server do_terminate_instance(instance, bdms) [ 1750.027811] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 1750.027811] env[67899]: ERROR oslo_messaging.rpc.server return f(*args, **kwargs) [ 1750.027811] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3322, in do_terminate_instance [ 1750.027811] env[67899]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1750.029602] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1750.029602] env[67899]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1750.029602] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1750.029602] env[67899]: ERROR oslo_messaging.rpc.server raise self.value [ 1750.029602] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3315, in do_terminate_instance [ 1750.029602] env[67899]: ERROR oslo_messaging.rpc.server self._delete_instance(context, instance, bdms) [ 1750.029602] env[67899]: ERROR oslo_messaging.rpc.server File 
"/opt/stack/nova/nova/compute/manager.py", line 3250, in _delete_instance [ 1750.029602] env[67899]: ERROR oslo_messaging.rpc.server self._shutdown_instance(context, instance, bdms) [ 1750.029602] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3144, in _shutdown_instance [ 1750.029602] env[67899]: ERROR oslo_messaging.rpc.server self._try_deallocate_network(context, instance, requested_networks) [ 1750.029602] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3058, in _try_deallocate_network [ 1750.029602] env[67899]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1750.029602] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1750.029602] env[67899]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1750.029602] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1750.029602] env[67899]: ERROR oslo_messaging.rpc.server raise self.value [ 1750.029602] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3056, in _try_deallocate_network [ 1750.029602] env[67899]: ERROR oslo_messaging.rpc.server _deallocate_network_with_retries() [ 1750.029602] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 436, in func [ 1750.029602] env[67899]: ERROR oslo_messaging.rpc.server return evt.wait() [ 1750.029602] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1750.029602] env[67899]: ERROR oslo_messaging.rpc.server result = hub.switch() [ 1750.029602] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1750.029602] env[67899]: ERROR oslo_messaging.rpc.server return self.greenlet.switch() [ 1750.029602] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 1750.029602] env[67899]: ERROR oslo_messaging.rpc.server result = func(*self.args, **self.kw) [ 1750.029602] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 1750.029602] env[67899]: ERROR oslo_messaging.rpc.server result = f(*args, **kwargs) [ 1750.029602] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3045, in _deallocate_network_with_retries [ 1750.029602] env[67899]: ERROR oslo_messaging.rpc.server self._deallocate_network( [ 1750.029602] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 2265, in _deallocate_network [ 1750.029602] env[67899]: ERROR oslo_messaging.rpc.server self.network_api.deallocate_for_instance( [ 1750.029602] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1750.029602] env[67899]: ERROR oslo_messaging.rpc.server data = neutron.list_ports(**search_opts) [ 1750.029602] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1750.029602] env[67899]: ERROR 
oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1750.029602] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1750.029602] env[67899]: ERROR oslo_messaging.rpc.server return self.list('ports', self.ports_path, retrieve_all, [ 1750.029602] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1750.029602] env[67899]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1750.029602] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1750.029602] env[67899]: ERROR oslo_messaging.rpc.server for r in self._pagination(collection, path, **params): [ 1750.029602] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1750.029602] env[67899]: ERROR oslo_messaging.rpc.server res = self.get(path, params=params) [ 1750.029602] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1750.029602] env[67899]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1750.029602] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1750.029602] env[67899]: ERROR oslo_messaging.rpc.server return self.retry_request("GET", action, body=body, [ 1750.029602] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1750.029602] env[67899]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1750.029602] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1750.029602] env[67899]: ERROR oslo_messaging.rpc.server return self.do_request(method, action, body=body, [ 1750.029602] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1750.029602] env[67899]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1750.031404] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1750.031404] env[67899]: ERROR oslo_messaging.rpc.server self._handle_fault_response(status_code, replybody, resp) [ 1750.031404] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 1750.031404] env[67899]: ERROR oslo_messaging.rpc.server raise exception.NeutronAdminCredentialConfigurationInvalid() [ 1750.031404] env[67899]: ERROR oslo_messaging.rpc.server nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
[ 1750.031404] env[67899]: ERROR oslo_messaging.rpc.server [ 1750.231712] env[67899]: DEBUG nova.network.neutron [None req-a0855fe1-c9c8-4cb7-b7b5-b5ff6b519a81 tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] [instance: 77ac61b9-48cc-4ae8-81e7-273841f7b42d] Successfully created port: a6d9f354-8142-4a32-8eb1-0ab4a2193383 {{(pid=67899) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1750.965146] env[67899]: DEBUG nova.network.neutron [None req-a0855fe1-c9c8-4cb7-b7b5-b5ff6b519a81 tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] [instance: 77ac61b9-48cc-4ae8-81e7-273841f7b42d] Successfully updated port: a6d9f354-8142-4a32-8eb1-0ab4a2193383 {{(pid=67899) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1750.975264] env[67899]: DEBUG oslo_concurrency.lockutils [None req-a0855fe1-c9c8-4cb7-b7b5-b5ff6b519a81 tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] Acquiring lock "refresh_cache-77ac61b9-48cc-4ae8-81e7-273841f7b42d" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1750.975413] env[67899]: DEBUG oslo_concurrency.lockutils [None req-a0855fe1-c9c8-4cb7-b7b5-b5ff6b519a81 tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] Acquired lock "refresh_cache-77ac61b9-48cc-4ae8-81e7-273841f7b42d" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1750.975561] env[67899]: DEBUG nova.network.neutron [None req-a0855fe1-c9c8-4cb7-b7b5-b5ff6b519a81 tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] [instance: 77ac61b9-48cc-4ae8-81e7-273841f7b42d] Building network info cache for instance {{(pid=67899) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1751.010718] env[67899]: DEBUG nova.compute.manager [req-e1feffcf-f67b-4ef6-8d12-6f96cc6d17bb req-338b7b83-dd29-4d0d-acad-fc2423c51707 service nova] [instance: 77ac61b9-48cc-4ae8-81e7-273841f7b42d] Received event network-vif-plugged-a6d9f354-8142-4a32-8eb1-0ab4a2193383 {{(pid=67899) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1751.010943] env[67899]: DEBUG oslo_concurrency.lockutils [req-e1feffcf-f67b-4ef6-8d12-6f96cc6d17bb req-338b7b83-dd29-4d0d-acad-fc2423c51707 service nova] Acquiring lock "77ac61b9-48cc-4ae8-81e7-273841f7b42d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1751.011162] env[67899]: DEBUG oslo_concurrency.lockutils [req-e1feffcf-f67b-4ef6-8d12-6f96cc6d17bb req-338b7b83-dd29-4d0d-acad-fc2423c51707 service nova] Lock "77ac61b9-48cc-4ae8-81e7-273841f7b42d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1751.011524] env[67899]: DEBUG oslo_concurrency.lockutils [req-e1feffcf-f67b-4ef6-8d12-6f96cc6d17bb req-338b7b83-dd29-4d0d-acad-fc2423c51707 service nova] Lock "77ac61b9-48cc-4ae8-81e7-273841f7b42d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1751.011733] env[67899]: DEBUG nova.compute.manager 
[req-e1feffcf-f67b-4ef6-8d12-6f96cc6d17bb req-338b7b83-dd29-4d0d-acad-fc2423c51707 service nova] [instance: 77ac61b9-48cc-4ae8-81e7-273841f7b42d] No waiting events found dispatching network-vif-plugged-a6d9f354-8142-4a32-8eb1-0ab4a2193383 {{(pid=67899) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1751.011908] env[67899]: WARNING nova.compute.manager [req-e1feffcf-f67b-4ef6-8d12-6f96cc6d17bb req-338b7b83-dd29-4d0d-acad-fc2423c51707 service nova] [instance: 77ac61b9-48cc-4ae8-81e7-273841f7b42d] Received unexpected event network-vif-plugged-a6d9f354-8142-4a32-8eb1-0ab4a2193383 for instance with vm_state building and task_state spawning. [ 1751.012087] env[67899]: DEBUG nova.compute.manager [req-e1feffcf-f67b-4ef6-8d12-6f96cc6d17bb req-338b7b83-dd29-4d0d-acad-fc2423c51707 service nova] [instance: 77ac61b9-48cc-4ae8-81e7-273841f7b42d] Received event network-changed-a6d9f354-8142-4a32-8eb1-0ab4a2193383 {{(pid=67899) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1751.012276] env[67899]: DEBUG nova.compute.manager [req-e1feffcf-f67b-4ef6-8d12-6f96cc6d17bb req-338b7b83-dd29-4d0d-acad-fc2423c51707 service nova] [instance: 77ac61b9-48cc-4ae8-81e7-273841f7b42d] Refreshing instance network info cache due to event network-changed-a6d9f354-8142-4a32-8eb1-0ab4a2193383. {{(pid=67899) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1751.012801] env[67899]: DEBUG oslo_concurrency.lockutils [req-e1feffcf-f67b-4ef6-8d12-6f96cc6d17bb req-338b7b83-dd29-4d0d-acad-fc2423c51707 service nova] Acquiring lock "refresh_cache-77ac61b9-48cc-4ae8-81e7-273841f7b42d" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1751.035730] env[67899]: DEBUG nova.network.neutron [None req-a0855fe1-c9c8-4cb7-b7b5-b5ff6b519a81 tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] [instance: 77ac61b9-48cc-4ae8-81e7-273841f7b42d] Instance cache missing network info. 
{{(pid=67899) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1751.186992] env[67899]: DEBUG nova.network.neutron [None req-a0855fe1-c9c8-4cb7-b7b5-b5ff6b519a81 tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] [instance: 77ac61b9-48cc-4ae8-81e7-273841f7b42d] Updating instance_info_cache with network_info: [{"id": "a6d9f354-8142-4a32-8eb1-0ab4a2193383", "address": "fa:16:3e:2c:2f:10", "network": {"id": "9f5d5406-2587-426e-93b4-3d172c8ac117", "bridge": "br-int", "label": "tempest-ServersTestJSON-457264438-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "93f5a8c99daa4c85bd8edffb5c6dd338", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "abcf0d10-3f3f-45dc-923e-1c78766e2dad", "external-id": "nsx-vlan-transportzone-405", "segmentation_id": 405, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa6d9f354-81", "ovs_interfaceid": "a6d9f354-8142-4a32-8eb1-0ab4a2193383", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1751.197020] env[67899]: DEBUG oslo_concurrency.lockutils [None req-a0855fe1-c9c8-4cb7-b7b5-b5ff6b519a81 tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] Releasing lock "refresh_cache-77ac61b9-48cc-4ae8-81e7-273841f7b42d" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1751.197318] env[67899]: DEBUG nova.compute.manager [None req-a0855fe1-c9c8-4cb7-b7b5-b5ff6b519a81 tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] [instance: 77ac61b9-48cc-4ae8-81e7-273841f7b42d] Instance network_info: |[{"id": "a6d9f354-8142-4a32-8eb1-0ab4a2193383", "address": "fa:16:3e:2c:2f:10", "network": {"id": "9f5d5406-2587-426e-93b4-3d172c8ac117", "bridge": "br-int", "label": "tempest-ServersTestJSON-457264438-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "93f5a8c99daa4c85bd8edffb5c6dd338", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "abcf0d10-3f3f-45dc-923e-1c78766e2dad", "external-id": "nsx-vlan-transportzone-405", "segmentation_id": 405, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa6d9f354-81", "ovs_interfaceid": "a6d9f354-8142-4a32-8eb1-0ab4a2193383", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67899) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1751.197633] env[67899]: DEBUG 
oslo_concurrency.lockutils [req-e1feffcf-f67b-4ef6-8d12-6f96cc6d17bb req-338b7b83-dd29-4d0d-acad-fc2423c51707 service nova] Acquired lock "refresh_cache-77ac61b9-48cc-4ae8-81e7-273841f7b42d" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1751.197813] env[67899]: DEBUG nova.network.neutron [req-e1feffcf-f67b-4ef6-8d12-6f96cc6d17bb req-338b7b83-dd29-4d0d-acad-fc2423c51707 service nova] [instance: 77ac61b9-48cc-4ae8-81e7-273841f7b42d] Refreshing network info cache for port a6d9f354-8142-4a32-8eb1-0ab4a2193383 {{(pid=67899) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1751.199164] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-a0855fe1-c9c8-4cb7-b7b5-b5ff6b519a81 tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] [instance: 77ac61b9-48cc-4ae8-81e7-273841f7b42d] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:2c:2f:10', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'abcf0d10-3f3f-45dc-923e-1c78766e2dad', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'a6d9f354-8142-4a32-8eb1-0ab4a2193383', 'vif_model': 'vmxnet3'}] {{(pid=67899) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1751.206077] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [None req-a0855fe1-c9c8-4cb7-b7b5-b5ff6b519a81 tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] Creating folder: Project (93f5a8c99daa4c85bd8edffb5c6dd338). Parent ref: group-v692900. {{(pid=67899) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1751.206968] env[67899]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-e1ff1af0-fd05-4b81-b1dc-e70aed297de3 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1751.220451] env[67899]: INFO nova.virt.vmwareapi.vm_util [None req-a0855fe1-c9c8-4cb7-b7b5-b5ff6b519a81 tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] Created folder: Project (93f5a8c99daa4c85bd8edffb5c6dd338) in parent group-v692900. [ 1751.220641] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [None req-a0855fe1-c9c8-4cb7-b7b5-b5ff6b519a81 tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] Creating folder: Instances. Parent ref: group-v693002. {{(pid=67899) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1751.220873] env[67899]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-9ec767d5-eece-40fb-9ab0-e1af624d78c4 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1751.230032] env[67899]: INFO nova.virt.vmwareapi.vm_util [None req-a0855fe1-c9c8-4cb7-b7b5-b5ff6b519a81 tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] Created folder: Instances in parent group-v693002. [ 1751.230032] env[67899]: DEBUG oslo.service.loopingcall [None req-a0855fe1-c9c8-4cb7-b7b5-b5ff6b519a81 tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=67899) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1751.230215] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 77ac61b9-48cc-4ae8-81e7-273841f7b42d] Creating VM on the ESX host {{(pid=67899) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1751.230285] env[67899]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-8239f838-c3bc-4f27-90f3-bae8699fc3f2 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1751.251623] env[67899]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1751.251623] env[67899]: value = "task-3468002" [ 1751.251623] env[67899]: _type = "Task" [ 1751.251623] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1751.259811] env[67899]: DEBUG oslo_vmware.api [-] Task: {'id': task-3468002, 'name': CreateVM_Task} progress is 0%. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1751.452551] env[67899]: DEBUG nova.network.neutron [req-e1feffcf-f67b-4ef6-8d12-6f96cc6d17bb req-338b7b83-dd29-4d0d-acad-fc2423c51707 service nova] [instance: 77ac61b9-48cc-4ae8-81e7-273841f7b42d] Updated VIF entry in instance network info cache for port a6d9f354-8142-4a32-8eb1-0ab4a2193383. {{(pid=67899) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1751.452954] env[67899]: DEBUG nova.network.neutron [req-e1feffcf-f67b-4ef6-8d12-6f96cc6d17bb req-338b7b83-dd29-4d0d-acad-fc2423c51707 service nova] [instance: 77ac61b9-48cc-4ae8-81e7-273841f7b42d] Updating instance_info_cache with network_info: [{"id": "a6d9f354-8142-4a32-8eb1-0ab4a2193383", "address": "fa:16:3e:2c:2f:10", "network": {"id": "9f5d5406-2587-426e-93b4-3d172c8ac117", "bridge": "br-int", "label": "tempest-ServersTestJSON-457264438-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "93f5a8c99daa4c85bd8edffb5c6dd338", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "abcf0d10-3f3f-45dc-923e-1c78766e2dad", "external-id": "nsx-vlan-transportzone-405", "segmentation_id": 405, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa6d9f354-81", "ovs_interfaceid": "a6d9f354-8142-4a32-8eb1-0ab4a2193383", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1751.463408] env[67899]: DEBUG oslo_concurrency.lockutils [req-e1feffcf-f67b-4ef6-8d12-6f96cc6d17bb req-338b7b83-dd29-4d0d-acad-fc2423c51707 service nova] Releasing lock "refresh_cache-77ac61b9-48cc-4ae8-81e7-273841f7b42d" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1751.761425] env[67899]: DEBUG oslo_vmware.api [-] Task: {'id': task-3468002, 'name': CreateVM_Task, 'duration_secs': 0.311837} completed successfully. 
{{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1751.761643] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 77ac61b9-48cc-4ae8-81e7-273841f7b42d] Created VM on the ESX host {{(pid=67899) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1751.762359] env[67899]: DEBUG oslo_concurrency.lockutils [None req-a0855fe1-c9c8-4cb7-b7b5-b5ff6b519a81 tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1751.762536] env[67899]: DEBUG oslo_concurrency.lockutils [None req-a0855fe1-c9c8-4cb7-b7b5-b5ff6b519a81 tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1751.762855] env[67899]: DEBUG oslo_concurrency.lockutils [None req-a0855fe1-c9c8-4cb7-b7b5-b5ff6b519a81 tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1751.763120] env[67899]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-859c31b5-c294-4079-bbc0-6e40d2d0c6a9 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1751.767387] env[67899]: DEBUG oslo_vmware.api [None req-a0855fe1-c9c8-4cb7-b7b5-b5ff6b519a81 tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] Waiting for the task: (returnval){ [ 1751.767387] env[67899]: value = "session[5298859c-589e-10f3-4f70-bf47a7ea371b]52d2da3a-4b4a-ee25-bc52-0085f18acd4a" [ 1751.767387] env[67899]: _type = "Task" [ 1751.767387] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1751.774620] env[67899]: DEBUG oslo_vmware.api [None req-a0855fe1-c9c8-4cb7-b7b5-b5ff6b519a81 tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] Task: {'id': session[5298859c-589e-10f3-4f70-bf47a7ea371b]52d2da3a-4b4a-ee25-bc52-0085f18acd4a, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1752.276720] env[67899]: DEBUG oslo_concurrency.lockutils [None req-a0855fe1-c9c8-4cb7-b7b5-b5ff6b519a81 tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1752.277102] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-a0855fe1-c9c8-4cb7-b7b5-b5ff6b519a81 tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] [instance: 77ac61b9-48cc-4ae8-81e7-273841f7b42d] Processing image c655a05a-4a40-4b3f-b609-3ba8116ad90f {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1752.277179] env[67899]: DEBUG oslo_concurrency.lockutils [None req-a0855fe1-c9c8-4cb7-b7b5-b5ff6b519a81 tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1757.996865] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1768.000459] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1769.998111] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1769.998111] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Starting heal instance info cache {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1769.998111] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Rebuilding the list of instances to heal {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1770.018252] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: dc7bf2b7-631d-4933-92db-1679ad823379] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1770.018465] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 03684169-e2c8-4cf5-8e79-b118725927f1] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1770.018534] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] Skipping network cache update for instance because it is Building. 
{{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1770.018647] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: e179db1d-ee0c-4f47-a958-40dd69209d26] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1770.018771] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: addcc88a-6bb5-4a70-938e-49c0c79c8414] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1770.018893] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: a6544af8-879d-4c45-bee4-8551b861fc66] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1770.019033] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 9b4a7c14-84dc-4222-a758-3f8f10e23b7a] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1770.019162] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: e08f620d-63a0-45cb-99c6-d9d95c938b38] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1770.019283] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 77ac61b9-48cc-4ae8-81e7-273841f7b42d] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1770.019423] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Didn't find any instances for network info cache update. 
{{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1770.019900] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1771.997752] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1771.997752] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1772.997995] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1773.997253] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1774.997160] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1774.997448] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=67899) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1774.997558] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1774.997686] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Cleaning up deleted instances {{(pid=67899) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11198}} [ 1775.006599] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] There are 0 instances to clean {{(pid=67899) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11207}} [ 1775.971135] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._sync_power_states {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1775.990324] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Getting list of instances from cluster (obj){ [ 1775.990324] env[67899]: value = "domain-c8" [ 1775.990324] env[67899]: _type = "ClusterComputeResource" [ 1775.990324] env[67899]: } {{(pid=67899) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 1775.991872] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-02fec67e-25b0-41fa-9f77-65fc786eb598 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1776.009051] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Got total of 9 instances {{(pid=67899) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 1776.009051] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Triggering sync for uuid dc7bf2b7-631d-4933-92db-1679ad823379 {{(pid=67899) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1776.009051] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Triggering sync for uuid 03684169-e2c8-4cf5-8e79-b118725927f1 {{(pid=67899) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1776.009051] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Triggering sync for uuid 3a077713-f7a2-4a61-bb17-987af6a52c4a {{(pid=67899) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1776.009051] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Triggering sync for uuid e179db1d-ee0c-4f47-a958-40dd69209d26 {{(pid=67899) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1776.009051] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Triggering sync for uuid addcc88a-6bb5-4a70-938e-49c0c79c8414 {{(pid=67899) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1776.009527] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Triggering sync for uuid a6544af8-879d-4c45-bee4-8551b861fc66 {{(pid=67899) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1776.009527] 
env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Triggering sync for uuid 9b4a7c14-84dc-4222-a758-3f8f10e23b7a {{(pid=67899) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1776.009527] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Triggering sync for uuid e08f620d-63a0-45cb-99c6-d9d95c938b38 {{(pid=67899) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1776.009619] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Triggering sync for uuid 77ac61b9-48cc-4ae8-81e7-273841f7b42d {{(pid=67899) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1776.009968] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Acquiring lock "dc7bf2b7-631d-4933-92db-1679ad823379" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1776.010153] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Acquiring lock "03684169-e2c8-4cf5-8e79-b118725927f1" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1776.010366] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Acquiring lock "3a077713-f7a2-4a61-bb17-987af6a52c4a" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1776.010564] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Acquiring lock "e179db1d-ee0c-4f47-a958-40dd69209d26" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1776.010774] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Acquiring lock "addcc88a-6bb5-4a70-938e-49c0c79c8414" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1776.010966] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Acquiring lock "a6544af8-879d-4c45-bee4-8551b861fc66" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1776.011172] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Acquiring lock "9b4a7c14-84dc-4222-a758-3f8f10e23b7a" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1776.011365] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Acquiring lock 
"e08f620d-63a0-45cb-99c6-d9d95c938b38" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1776.011598] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Acquiring lock "77ac61b9-48cc-4ae8-81e7-273841f7b42d" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1776.011764] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager.update_available_resource {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1776.024238] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1776.024396] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1776.024560] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1776.024711] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67899) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1776.025752] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d2761f07-4011-4dae-b3e0-d1c2e1ae4233 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1776.034203] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9cfe1819-5da4-4790-99a7-6b5d607d21a6 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1776.047875] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4e61412d-a4b8-4983-a2ca-26ca88aab3e4 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1776.054175] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c11b892c-e806-493b-8682-33cc5ec533a1 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1776.083782] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Hypervisor/Node resource view: 
name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180939MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=67899) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1776.083941] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1776.084155] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1776.227006] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance dc7bf2b7-631d-4933-92db-1679ad823379 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1776.227204] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 03684169-e2c8-4cf5-8e79-b118725927f1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1776.227348] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 3a077713-f7a2-4a61-bb17-987af6a52c4a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1776.227474] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance e179db1d-ee0c-4f47-a958-40dd69209d26 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1776.227594] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance addcc88a-6bb5-4a70-938e-49c0c79c8414 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1776.227709] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance a6544af8-879d-4c45-bee4-8551b861fc66 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1776.227823] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 9b4a7c14-84dc-4222-a758-3f8f10e23b7a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1776.227934] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance e08f620d-63a0-45cb-99c6-d9d95c938b38 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1776.228060] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 77ac61b9-48cc-4ae8-81e7-273841f7b42d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1776.228307] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Total usable vcpus: 48, total allocated vcpus: 9 {{(pid=67899) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1776.228451] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1664MB phys_disk=200GB used_disk=9GB total_vcpus=48 used_vcpus=9 pci_stats=[] {{(pid=67899) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1776.243712] env[67899]: DEBUG nova.scheduler.client.report [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Refreshing inventories for resource provider fffa0b42-f65d-4394-a98c-0df038b9ed4b {{(pid=67899) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} [ 1776.256056] env[67899]: DEBUG nova.scheduler.client.report [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Updating ProviderTree inventory for provider fffa0b42-f65d-4394-a98c-0df038b9ed4b from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67899) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} [ 1776.256237] env[67899]: DEBUG nova.compute.provider_tree [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Updating inventory in ProviderTree for provider fffa0b42-f65d-4394-a98c-0df038b9ed4b with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} 
{{(pid=67899) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 1776.266712] env[67899]: DEBUG nova.scheduler.client.report [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Refreshing aggregate associations for resource provider fffa0b42-f65d-4394-a98c-0df038b9ed4b, aggregates: None {{(pid=67899) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} [ 1776.283525] env[67899]: DEBUG nova.scheduler.client.report [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Refreshing trait associations for resource provider fffa0b42-f65d-4394-a98c-0df038b9ed4b, traits: COMPUTE_NODE,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SAME_HOST_COLD_MIGRATE,COMPUTE_IMAGE_TYPE_VMDK,COMPUTE_IMAGE_TYPE_ISO {{(pid=67899) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} [ 1776.400116] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3e135688-bc10-42b3-94ff-6874a5ab6a4d {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1776.408267] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-639a9d34-8e54-4f34-9010-e2849de2dc2f {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1776.438189] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2ec29f3b-1a4a-4e3c-81a5-314aaf8b6f5e {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1776.446456] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5070ce63-3604-48c9-84da-f91f6525bf58 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1776.459507] env[67899]: DEBUG nova.compute.provider_tree [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Inventory has not changed in ProviderTree for provider: fffa0b42-f65d-4394-a98c-0df038b9ed4b {{(pid=67899) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1776.469560] env[67899]: DEBUG nova.scheduler.client.report [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Inventory has not changed for provider fffa0b42-f65d-4394-a98c-0df038b9ed4b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67899) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1776.483533] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67899) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1776.483681] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.400s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 
1777.504789] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1777.996324] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1777.996503] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Cleaning up deleted instances with incomplete migration {{(pid=67899) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11236}} [ 1796.668926] env[67899]: WARNING oslo_vmware.rw_handles [None req-2236b4c1-ddc4-48e2-b219-bf8992f19847 tempest-InstanceActionsV221TestJSON-802466451 tempest-InstanceActionsV221TestJSON-802466451-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1796.668926] env[67899]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1796.668926] env[67899]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1796.668926] env[67899]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1796.668926] env[67899]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1796.668926] env[67899]: ERROR oslo_vmware.rw_handles response.begin() [ 1796.668926] env[67899]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1796.668926] env[67899]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1796.668926] env[67899]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1796.668926] env[67899]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1796.668926] env[67899]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1796.668926] env[67899]: ERROR oslo_vmware.rw_handles [ 1796.669680] env[67899]: DEBUG nova.virt.vmwareapi.images [None req-2236b4c1-ddc4-48e2-b219-bf8992f19847 tempest-InstanceActionsV221TestJSON-802466451 tempest-InstanceActionsV221TestJSON-802466451-project-member] [instance: dc7bf2b7-631d-4933-92db-1679ad823379] Downloaded image file data c655a05a-4a40-4b3f-b609-3ba8116ad90f to vmware_temp/32177577-8bfa-4363-870d-123c6cc4a300/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk on the data store datastore1 {{(pid=67899) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1796.671318] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-2236b4c1-ddc4-48e2-b219-bf8992f19847 tempest-InstanceActionsV221TestJSON-802466451 tempest-InstanceActionsV221TestJSON-802466451-project-member] [instance: dc7bf2b7-631d-4933-92db-1679ad823379] Caching image {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1796.671532] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [None req-2236b4c1-ddc4-48e2-b219-bf8992f19847 tempest-InstanceActionsV221TestJSON-802466451 tempest-InstanceActionsV221TestJSON-802466451-project-member] Copying Virtual Disk [datastore1] 
vmware_temp/32177577-8bfa-4363-870d-123c6cc4a300/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk to [datastore1] vmware_temp/32177577-8bfa-4363-870d-123c6cc4a300/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk {{(pid=67899) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1796.671858] env[67899]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-95138451-e469-4743-b3a8-e9703dd9f917 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1796.680479] env[67899]: DEBUG oslo_vmware.api [None req-2236b4c1-ddc4-48e2-b219-bf8992f19847 tempest-InstanceActionsV221TestJSON-802466451 tempest-InstanceActionsV221TestJSON-802466451-project-member] Waiting for the task: (returnval){ [ 1796.680479] env[67899]: value = "task-3468003" [ 1796.680479] env[67899]: _type = "Task" [ 1796.680479] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1796.688582] env[67899]: DEBUG oslo_vmware.api [None req-2236b4c1-ddc4-48e2-b219-bf8992f19847 tempest-InstanceActionsV221TestJSON-802466451 tempest-InstanceActionsV221TestJSON-802466451-project-member] Task: {'id': task-3468003, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1797.191610] env[67899]: DEBUG oslo_vmware.exceptions [None req-2236b4c1-ddc4-48e2-b219-bf8992f19847 tempest-InstanceActionsV221TestJSON-802466451 tempest-InstanceActionsV221TestJSON-802466451-project-member] Fault InvalidArgument not matched. {{(pid=67899) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1797.191921] env[67899]: DEBUG oslo_concurrency.lockutils [None req-2236b4c1-ddc4-48e2-b219-bf8992f19847 tempest-InstanceActionsV221TestJSON-802466451 tempest-InstanceActionsV221TestJSON-802466451-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1797.192494] env[67899]: ERROR nova.compute.manager [None req-2236b4c1-ddc4-48e2-b219-bf8992f19847 tempest-InstanceActionsV221TestJSON-802466451 tempest-InstanceActionsV221TestJSON-802466451-project-member] [instance: dc7bf2b7-631d-4933-92db-1679ad823379] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1797.192494] env[67899]: Faults: ['InvalidArgument'] [ 1797.192494] env[67899]: ERROR nova.compute.manager [instance: dc7bf2b7-631d-4933-92db-1679ad823379] Traceback (most recent call last): [ 1797.192494] env[67899]: ERROR nova.compute.manager [instance: dc7bf2b7-631d-4933-92db-1679ad823379] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1797.192494] env[67899]: ERROR nova.compute.manager [instance: dc7bf2b7-631d-4933-92db-1679ad823379] yield resources [ 1797.192494] env[67899]: ERROR nova.compute.manager [instance: dc7bf2b7-631d-4933-92db-1679ad823379] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1797.192494] env[67899]: ERROR nova.compute.manager [instance: dc7bf2b7-631d-4933-92db-1679ad823379] self.driver.spawn(context, instance, image_meta, [ 1797.192494] env[67899]: ERROR nova.compute.manager [instance: 
dc7bf2b7-631d-4933-92db-1679ad823379] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1797.192494] env[67899]: ERROR nova.compute.manager [instance: dc7bf2b7-631d-4933-92db-1679ad823379] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1797.192494] env[67899]: ERROR nova.compute.manager [instance: dc7bf2b7-631d-4933-92db-1679ad823379] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1797.192494] env[67899]: ERROR nova.compute.manager [instance: dc7bf2b7-631d-4933-92db-1679ad823379] self._fetch_image_if_missing(context, vi) [ 1797.192494] env[67899]: ERROR nova.compute.manager [instance: dc7bf2b7-631d-4933-92db-1679ad823379] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1797.192494] env[67899]: ERROR nova.compute.manager [instance: dc7bf2b7-631d-4933-92db-1679ad823379] image_cache(vi, tmp_image_ds_loc) [ 1797.192494] env[67899]: ERROR nova.compute.manager [instance: dc7bf2b7-631d-4933-92db-1679ad823379] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1797.192494] env[67899]: ERROR nova.compute.manager [instance: dc7bf2b7-631d-4933-92db-1679ad823379] vm_util.copy_virtual_disk( [ 1797.192494] env[67899]: ERROR nova.compute.manager [instance: dc7bf2b7-631d-4933-92db-1679ad823379] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1797.192494] env[67899]: ERROR nova.compute.manager [instance: dc7bf2b7-631d-4933-92db-1679ad823379] session._wait_for_task(vmdk_copy_task) [ 1797.192494] env[67899]: ERROR nova.compute.manager [instance: dc7bf2b7-631d-4933-92db-1679ad823379] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1797.192494] env[67899]: ERROR nova.compute.manager [instance: dc7bf2b7-631d-4933-92db-1679ad823379] return self.wait_for_task(task_ref) [ 1797.192494] env[67899]: ERROR nova.compute.manager [instance: dc7bf2b7-631d-4933-92db-1679ad823379] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1797.192494] env[67899]: ERROR nova.compute.manager [instance: dc7bf2b7-631d-4933-92db-1679ad823379] return evt.wait() [ 1797.192494] env[67899]: ERROR nova.compute.manager [instance: dc7bf2b7-631d-4933-92db-1679ad823379] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1797.192494] env[67899]: ERROR nova.compute.manager [instance: dc7bf2b7-631d-4933-92db-1679ad823379] result = hub.switch() [ 1797.192494] env[67899]: ERROR nova.compute.manager [instance: dc7bf2b7-631d-4933-92db-1679ad823379] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1797.192494] env[67899]: ERROR nova.compute.manager [instance: dc7bf2b7-631d-4933-92db-1679ad823379] return self.greenlet.switch() [ 1797.192494] env[67899]: ERROR nova.compute.manager [instance: dc7bf2b7-631d-4933-92db-1679ad823379] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1797.192494] env[67899]: ERROR nova.compute.manager [instance: dc7bf2b7-631d-4933-92db-1679ad823379] self.f(*self.args, **self.kw) [ 1797.192494] env[67899]: ERROR nova.compute.manager [instance: dc7bf2b7-631d-4933-92db-1679ad823379] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1797.192494] env[67899]: ERROR nova.compute.manager [instance: dc7bf2b7-631d-4933-92db-1679ad823379] raise 
exceptions.translate_fault(task_info.error) [ 1797.192494] env[67899]: ERROR nova.compute.manager [instance: dc7bf2b7-631d-4933-92db-1679ad823379] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1797.192494] env[67899]: ERROR nova.compute.manager [instance: dc7bf2b7-631d-4933-92db-1679ad823379] Faults: ['InvalidArgument'] [ 1797.192494] env[67899]: ERROR nova.compute.manager [instance: dc7bf2b7-631d-4933-92db-1679ad823379] [ 1797.193719] env[67899]: INFO nova.compute.manager [None req-2236b4c1-ddc4-48e2-b219-bf8992f19847 tempest-InstanceActionsV221TestJSON-802466451 tempest-InstanceActionsV221TestJSON-802466451-project-member] [instance: dc7bf2b7-631d-4933-92db-1679ad823379] Terminating instance [ 1797.194313] env[67899]: DEBUG oslo_concurrency.lockutils [None req-4f3b59ba-f200-4682-b584-0c61035dfcc2 tempest-SecurityGroupsTestJSON-311274219 tempest-SecurityGroupsTestJSON-311274219-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1797.194517] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-4f3b59ba-f200-4682-b584-0c61035dfcc2 tempest-SecurityGroupsTestJSON-311274219 tempest-SecurityGroupsTestJSON-311274219-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1797.194750] env[67899]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-82fdf041-1903-425d-b4ec-4a91235d0ee6 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1797.196870] env[67899]: DEBUG nova.compute.manager [None req-2236b4c1-ddc4-48e2-b219-bf8992f19847 tempest-InstanceActionsV221TestJSON-802466451 tempest-InstanceActionsV221TestJSON-802466451-project-member] [instance: dc7bf2b7-631d-4933-92db-1679ad823379] Start destroying the instance on the hypervisor. 
{{(pid=67899) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1797.197074] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-2236b4c1-ddc4-48e2-b219-bf8992f19847 tempest-InstanceActionsV221TestJSON-802466451 tempest-InstanceActionsV221TestJSON-802466451-project-member] [instance: dc7bf2b7-631d-4933-92db-1679ad823379] Destroying instance {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1797.197776] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7f9f70f8-70c8-4fd1-8b2c-2721edf4bb44 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1797.204437] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-2236b4c1-ddc4-48e2-b219-bf8992f19847 tempest-InstanceActionsV221TestJSON-802466451 tempest-InstanceActionsV221TestJSON-802466451-project-member] [instance: dc7bf2b7-631d-4933-92db-1679ad823379] Unregistering the VM {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1797.204645] env[67899]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-4ddebd39-e883-4caa-9211-0e50340a622e {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1797.206684] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-4f3b59ba-f200-4682-b584-0c61035dfcc2 tempest-SecurityGroupsTestJSON-311274219 tempest-SecurityGroupsTestJSON-311274219-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1797.206856] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-4f3b59ba-f200-4682-b584-0c61035dfcc2 tempest-SecurityGroupsTestJSON-311274219 tempest-SecurityGroupsTestJSON-311274219-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67899) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1797.207760] env[67899]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-746e85e9-744f-4d3f-8b80-71a267983c5e {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1797.212230] env[67899]: DEBUG oslo_vmware.api [None req-4f3b59ba-f200-4682-b584-0c61035dfcc2 tempest-SecurityGroupsTestJSON-311274219 tempest-SecurityGroupsTestJSON-311274219-project-member] Waiting for the task: (returnval){ [ 1797.212230] env[67899]: value = "session[5298859c-589e-10f3-4f70-bf47a7ea371b]52e90197-23c1-97bd-316d-9c6d73f0d0c8" [ 1797.212230] env[67899]: _type = "Task" [ 1797.212230] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1797.219163] env[67899]: DEBUG oslo_vmware.api [None req-4f3b59ba-f200-4682-b584-0c61035dfcc2 tempest-SecurityGroupsTestJSON-311274219 tempest-SecurityGroupsTestJSON-311274219-project-member] Task: {'id': session[5298859c-589e-10f3-4f70-bf47a7ea371b]52e90197-23c1-97bd-316d-9c6d73f0d0c8, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1797.274594] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-2236b4c1-ddc4-48e2-b219-bf8992f19847 tempest-InstanceActionsV221TestJSON-802466451 tempest-InstanceActionsV221TestJSON-802466451-project-member] [instance: dc7bf2b7-631d-4933-92db-1679ad823379] Unregistered the VM {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1797.274815] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-2236b4c1-ddc4-48e2-b219-bf8992f19847 tempest-InstanceActionsV221TestJSON-802466451 tempest-InstanceActionsV221TestJSON-802466451-project-member] [instance: dc7bf2b7-631d-4933-92db-1679ad823379] Deleting contents of the VM from datastore datastore1 {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1797.274990] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-2236b4c1-ddc4-48e2-b219-bf8992f19847 tempest-InstanceActionsV221TestJSON-802466451 tempest-InstanceActionsV221TestJSON-802466451-project-member] Deleting the datastore file [datastore1] dc7bf2b7-631d-4933-92db-1679ad823379 {{(pid=67899) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1797.275272] env[67899]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-eb44fdf4-4d59-4a1e-8a25-7f711d7de8cd {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1797.281011] env[67899]: DEBUG oslo_vmware.api [None req-2236b4c1-ddc4-48e2-b219-bf8992f19847 tempest-InstanceActionsV221TestJSON-802466451 tempest-InstanceActionsV221TestJSON-802466451-project-member] Waiting for the task: (returnval){ [ 1797.281011] env[67899]: value = "task-3468005" [ 1797.281011] env[67899]: _type = "Task" [ 1797.281011] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1797.288315] env[67899]: DEBUG oslo_vmware.api [None req-2236b4c1-ddc4-48e2-b219-bf8992f19847 tempest-InstanceActionsV221TestJSON-802466451 tempest-InstanceActionsV221TestJSON-802466451-project-member] Task: {'id': task-3468005, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1797.722256] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-4f3b59ba-f200-4682-b584-0c61035dfcc2 tempest-SecurityGroupsTestJSON-311274219 tempest-SecurityGroupsTestJSON-311274219-project-member] [instance: 03684169-e2c8-4cf5-8e79-b118725927f1] Preparing fetch location {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1797.722641] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-4f3b59ba-f200-4682-b584-0c61035dfcc2 tempest-SecurityGroupsTestJSON-311274219 tempest-SecurityGroupsTestJSON-311274219-project-member] Creating directory with path [datastore1] vmware_temp/275afc77-8e43-45f4-b1cd-31b808e54e50/c655a05a-4a40-4b3f-b609-3ba8116ad90f {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1797.722778] env[67899]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-cfbf417f-f6a6-4101-8c64-344da66dd915 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1797.734972] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-4f3b59ba-f200-4682-b584-0c61035dfcc2 tempest-SecurityGroupsTestJSON-311274219 tempest-SecurityGroupsTestJSON-311274219-project-member] Created directory with path [datastore1] vmware_temp/275afc77-8e43-45f4-b1cd-31b808e54e50/c655a05a-4a40-4b3f-b609-3ba8116ad90f {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1797.735194] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-4f3b59ba-f200-4682-b584-0c61035dfcc2 tempest-SecurityGroupsTestJSON-311274219 tempest-SecurityGroupsTestJSON-311274219-project-member] [instance: 03684169-e2c8-4cf5-8e79-b118725927f1] Fetch image to [datastore1] vmware_temp/275afc77-8e43-45f4-b1cd-31b808e54e50/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1797.735338] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-4f3b59ba-f200-4682-b584-0c61035dfcc2 tempest-SecurityGroupsTestJSON-311274219 tempest-SecurityGroupsTestJSON-311274219-project-member] [instance: 03684169-e2c8-4cf5-8e79-b118725927f1] Downloading image file data c655a05a-4a40-4b3f-b609-3ba8116ad90f to [datastore1] vmware_temp/275afc77-8e43-45f4-b1cd-31b808e54e50/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk on the data store datastore1 {{(pid=67899) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1797.736088] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2b10244c-8ea0-49fb-829c-c279688c5681 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1797.742682] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2b4e514b-f807-4c7c-90b2-28f9c76e31f6 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1797.751584] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2aa304ad-c648-4403-9765-53e56e6a3b75 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1797.784918] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-65e11da1-f07d-485c-8f96-ec2bea47ad6e 
{{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1797.791524] env[67899]: DEBUG oslo_vmware.api [None req-2236b4c1-ddc4-48e2-b219-bf8992f19847 tempest-InstanceActionsV221TestJSON-802466451 tempest-InstanceActionsV221TestJSON-802466451-project-member] Task: {'id': task-3468005, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.075566} completed successfully. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1797.792879] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-2236b4c1-ddc4-48e2-b219-bf8992f19847 tempest-InstanceActionsV221TestJSON-802466451 tempest-InstanceActionsV221TestJSON-802466451-project-member] Deleted the datastore file {{(pid=67899) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1797.793080] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-2236b4c1-ddc4-48e2-b219-bf8992f19847 tempest-InstanceActionsV221TestJSON-802466451 tempest-InstanceActionsV221TestJSON-802466451-project-member] [instance: dc7bf2b7-631d-4933-92db-1679ad823379] Deleted contents of the VM from datastore datastore1 {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1797.793250] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-2236b4c1-ddc4-48e2-b219-bf8992f19847 tempest-InstanceActionsV221TestJSON-802466451 tempest-InstanceActionsV221TestJSON-802466451-project-member] [instance: dc7bf2b7-631d-4933-92db-1679ad823379] Instance destroyed {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1797.793419] env[67899]: INFO nova.compute.manager [None req-2236b4c1-ddc4-48e2-b219-bf8992f19847 tempest-InstanceActionsV221TestJSON-802466451 tempest-InstanceActionsV221TestJSON-802466451-project-member] [instance: dc7bf2b7-631d-4933-92db-1679ad823379] Took 0.60 seconds to destroy the instance on the hypervisor. 
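The failure sequence above is oslo.vmware's poll-and-translate pattern: wait_for_task() re-reads the task state until it reaches a terminal value (the "progress is 0%" entries), and an error state is mapped to a specific exception class when one matches, otherwise to the generic VimFaultException ("Fault InvalidArgument not matched" is that fallback). Below is a minimal sketch of the pattern, not oslo.vmware's actual code; the dict-shaped task info and the exception class here are simplified stand-ins:

    import time

    class VimFaultException(Exception):
        # Stand-in for oslo_vmware.exceptions.VimFaultException.
        def __init__(self, fault_list, message):
            super().__init__(message)
            self.fault_list = fault_list

    def wait_for_task(get_task_info, interval=0.5):
        # Poll until the task leaves its transient states, as the
        # _poll_task() entries above do.
        while True:
            info = get_task_info()  # e.g. {'state': 'running', 'progress': 0}
            if info['state'] == 'success':
                return info.get('result')
            if info['state'] == 'error':
                # No specific fault class matched InvalidArgument, so the
                # generic wrapper is raised with the fault names attached.
                raise VimFaultException(info.get('faults', []),
                                        info.get('message', 'task failed'))
            time.sleep(interval)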
[ 1797.795104] env[67899]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-5b566d83-a980-4827-bd86-24fdf02693a3 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1797.796943] env[67899]: DEBUG nova.compute.claims [None req-2236b4c1-ddc4-48e2-b219-bf8992f19847 tempest-InstanceActionsV221TestJSON-802466451 tempest-InstanceActionsV221TestJSON-802466451-project-member] [instance: dc7bf2b7-631d-4933-92db-1679ad823379] Aborting claim: {{(pid=67899) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1797.797127] env[67899]: DEBUG oslo_concurrency.lockutils [None req-2236b4c1-ddc4-48e2-b219-bf8992f19847 tempest-InstanceActionsV221TestJSON-802466451 tempest-InstanceActionsV221TestJSON-802466451-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1797.797340] env[67899]: DEBUG oslo_concurrency.lockutils [None req-2236b4c1-ddc4-48e2-b219-bf8992f19847 tempest-InstanceActionsV221TestJSON-802466451 tempest-InstanceActionsV221TestJSON-802466451-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1797.819126] env[67899]: DEBUG nova.virt.vmwareapi.images [None req-4f3b59ba-f200-4682-b584-0c61035dfcc2 tempest-SecurityGroupsTestJSON-311274219 tempest-SecurityGroupsTestJSON-311274219-project-member] [instance: 03684169-e2c8-4cf5-8e79-b118725927f1] Downloading image file data c655a05a-4a40-4b3f-b609-3ba8116ad90f to the data store datastore1 {{(pid=67899) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1797.962349] env[67899]: DEBUG oslo_vmware.rw_handles [None req-4f3b59ba-f200-4682-b584-0c61035dfcc2 tempest-SecurityGroupsTestJSON-311274219 tempest-SecurityGroupsTestJSON-311274219-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/275afc77-8e43-45f4-b1cd-31b808e54e50/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67899) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1798.017747] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8140c140-3180-4935-8181-04050505418b {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1798.022444] env[67899]: DEBUG oslo_vmware.rw_handles [None req-4f3b59ba-f200-4682-b584-0c61035dfcc2 tempest-SecurityGroupsTestJSON-311274219 tempest-SecurityGroupsTestJSON-311274219-project-member] Completed reading data from the image iterator. {{(pid=67899) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1798.022636] env[67899]: DEBUG oslo_vmware.rw_handles [None req-4f3b59ba-f200-4682-b584-0c61035dfcc2 tempest-SecurityGroupsTestJSON-311274219 tempest-SecurityGroupsTestJSON-311274219-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/275afc77-8e43-45f4-b1cd-31b808e54e50/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67899) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1798.026269] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6984abf9-85fe-4de8-a63b-6f4c4b86842a {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1798.056027] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cd7ab585-ea79-4163-ae0f-1d6b407bfd17 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1798.061633] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-62822c85-6cc8-4242-967c-2a548fca26a0 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1798.076136] env[67899]: DEBUG nova.compute.provider_tree [None req-2236b4c1-ddc4-48e2-b219-bf8992f19847 tempest-InstanceActionsV221TestJSON-802466451 tempest-InstanceActionsV221TestJSON-802466451-project-member] Inventory has not changed in ProviderTree for provider: fffa0b42-f65d-4394-a98c-0df038b9ed4b {{(pid=67899) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1798.083446] env[67899]: DEBUG nova.scheduler.client.report [None req-2236b4c1-ddc4-48e2-b219-bf8992f19847 tempest-InstanceActionsV221TestJSON-802466451 tempest-InstanceActionsV221TestJSON-802466451-project-member] Inventory has not changed for provider fffa0b42-f65d-4394-a98c-0df038b9ed4b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67899) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1798.098098] env[67899]: DEBUG oslo_concurrency.lockutils [None req-2236b4c1-ddc4-48e2-b219-bf8992f19847 tempest-InstanceActionsV221TestJSON-802466451 tempest-InstanceActionsV221TestJSON-802466451-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.301s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1798.098623] env[67899]: ERROR nova.compute.manager [None req-2236b4c1-ddc4-48e2-b219-bf8992f19847 tempest-InstanceActionsV221TestJSON-802466451 tempest-InstanceActionsV221TestJSON-802466451-project-member] [instance: dc7bf2b7-631d-4933-92db-1679ad823379] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1798.098623] env[67899]: Faults: ['InvalidArgument'] [ 1798.098623] env[67899]: ERROR nova.compute.manager [instance: dc7bf2b7-631d-4933-92db-1679ad823379] Traceback (most recent call last): [ 1798.098623] env[67899]: ERROR nova.compute.manager [instance: dc7bf2b7-631d-4933-92db-1679ad823379] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1798.098623] env[67899]: ERROR nova.compute.manager [instance: dc7bf2b7-631d-4933-92db-1679ad823379] self.driver.spawn(context, instance, image_meta, [ 1798.098623] env[67899]: ERROR nova.compute.manager [instance: dc7bf2b7-631d-4933-92db-1679ad823379] File 
"/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1798.098623] env[67899]: ERROR nova.compute.manager [instance: dc7bf2b7-631d-4933-92db-1679ad823379] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1798.098623] env[67899]: ERROR nova.compute.manager [instance: dc7bf2b7-631d-4933-92db-1679ad823379] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1798.098623] env[67899]: ERROR nova.compute.manager [instance: dc7bf2b7-631d-4933-92db-1679ad823379] self._fetch_image_if_missing(context, vi) [ 1798.098623] env[67899]: ERROR nova.compute.manager [instance: dc7bf2b7-631d-4933-92db-1679ad823379] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1798.098623] env[67899]: ERROR nova.compute.manager [instance: dc7bf2b7-631d-4933-92db-1679ad823379] image_cache(vi, tmp_image_ds_loc) [ 1798.098623] env[67899]: ERROR nova.compute.manager [instance: dc7bf2b7-631d-4933-92db-1679ad823379] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1798.098623] env[67899]: ERROR nova.compute.manager [instance: dc7bf2b7-631d-4933-92db-1679ad823379] vm_util.copy_virtual_disk( [ 1798.098623] env[67899]: ERROR nova.compute.manager [instance: dc7bf2b7-631d-4933-92db-1679ad823379] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1798.098623] env[67899]: ERROR nova.compute.manager [instance: dc7bf2b7-631d-4933-92db-1679ad823379] session._wait_for_task(vmdk_copy_task) [ 1798.098623] env[67899]: ERROR nova.compute.manager [instance: dc7bf2b7-631d-4933-92db-1679ad823379] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1798.098623] env[67899]: ERROR nova.compute.manager [instance: dc7bf2b7-631d-4933-92db-1679ad823379] return self.wait_for_task(task_ref) [ 1798.098623] env[67899]: ERROR nova.compute.manager [instance: dc7bf2b7-631d-4933-92db-1679ad823379] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1798.098623] env[67899]: ERROR nova.compute.manager [instance: dc7bf2b7-631d-4933-92db-1679ad823379] return evt.wait() [ 1798.098623] env[67899]: ERROR nova.compute.manager [instance: dc7bf2b7-631d-4933-92db-1679ad823379] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1798.098623] env[67899]: ERROR nova.compute.manager [instance: dc7bf2b7-631d-4933-92db-1679ad823379] result = hub.switch() [ 1798.098623] env[67899]: ERROR nova.compute.manager [instance: dc7bf2b7-631d-4933-92db-1679ad823379] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1798.098623] env[67899]: ERROR nova.compute.manager [instance: dc7bf2b7-631d-4933-92db-1679ad823379] return self.greenlet.switch() [ 1798.098623] env[67899]: ERROR nova.compute.manager [instance: dc7bf2b7-631d-4933-92db-1679ad823379] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1798.098623] env[67899]: ERROR nova.compute.manager [instance: dc7bf2b7-631d-4933-92db-1679ad823379] self.f(*self.args, **self.kw) [ 1798.098623] env[67899]: ERROR nova.compute.manager [instance: dc7bf2b7-631d-4933-92db-1679ad823379] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1798.098623] env[67899]: ERROR nova.compute.manager [instance: dc7bf2b7-631d-4933-92db-1679ad823379] raise exceptions.translate_fault(task_info.error) [ 
1798.098623] env[67899]: ERROR nova.compute.manager [instance: dc7bf2b7-631d-4933-92db-1679ad823379] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1798.098623] env[67899]: ERROR nova.compute.manager [instance: dc7bf2b7-631d-4933-92db-1679ad823379] Faults: ['InvalidArgument'] [ 1798.098623] env[67899]: ERROR nova.compute.manager [instance: dc7bf2b7-631d-4933-92db-1679ad823379] [ 1798.099705] env[67899]: DEBUG nova.compute.utils [None req-2236b4c1-ddc4-48e2-b219-bf8992f19847 tempest-InstanceActionsV221TestJSON-802466451 tempest-InstanceActionsV221TestJSON-802466451-project-member] [instance: dc7bf2b7-631d-4933-92db-1679ad823379] VimFaultException {{(pid=67899) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1798.100893] env[67899]: DEBUG nova.compute.manager [None req-2236b4c1-ddc4-48e2-b219-bf8992f19847 tempest-InstanceActionsV221TestJSON-802466451 tempest-InstanceActionsV221TestJSON-802466451-project-member] [instance: dc7bf2b7-631d-4933-92db-1679ad823379] Build of instance dc7bf2b7-631d-4933-92db-1679ad823379 was re-scheduled: A specified parameter was not correct: fileType [ 1798.100893] env[67899]: Faults: ['InvalidArgument'] {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1798.101325] env[67899]: DEBUG nova.compute.manager [None req-2236b4c1-ddc4-48e2-b219-bf8992f19847 tempest-InstanceActionsV221TestJSON-802466451 tempest-InstanceActionsV221TestJSON-802466451-project-member] [instance: dc7bf2b7-631d-4933-92db-1679ad823379] Unplugging VIFs for instance {{(pid=67899) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1798.101525] env[67899]: DEBUG nova.compute.manager [None req-2236b4c1-ddc4-48e2-b219-bf8992f19847 tempest-InstanceActionsV221TestJSON-802466451 tempest-InstanceActionsV221TestJSON-802466451-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=67899) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1798.101864] env[67899]: DEBUG nova.compute.manager [None req-2236b4c1-ddc4-48e2-b219-bf8992f19847 tempest-InstanceActionsV221TestJSON-802466451 tempest-InstanceActionsV221TestJSON-802466451-project-member] [instance: dc7bf2b7-631d-4933-92db-1679ad823379] Deallocating network for instance {{(pid=67899) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1798.102076] env[67899]: DEBUG nova.network.neutron [None req-2236b4c1-ddc4-48e2-b219-bf8992f19847 tempest-InstanceActionsV221TestJSON-802466451 tempest-InstanceActionsV221TestJSON-802466451-project-member] [instance: dc7bf2b7-631d-4933-92db-1679ad823379] deallocate_for_instance() {{(pid=67899) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1798.425696] env[67899]: DEBUG nova.network.neutron [None req-2236b4c1-ddc4-48e2-b219-bf8992f19847 tempest-InstanceActionsV221TestJSON-802466451 tempest-InstanceActionsV221TestJSON-802466451-project-member] [instance: dc7bf2b7-631d-4933-92db-1679ad823379] Updating instance_info_cache with network_info: [] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1798.444417] env[67899]: INFO nova.compute.manager [None req-2236b4c1-ddc4-48e2-b219-bf8992f19847 tempest-InstanceActionsV221TestJSON-802466451 tempest-InstanceActionsV221TestJSON-802466451-project-member] [instance: dc7bf2b7-631d-4933-92db-1679ad823379] Took 0.34 seconds to deallocate network for instance. 
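The entries around the "was re-scheduled" message show the standard cleanup after a failed build: the resource claim is aborted (the "compute_resources" lock entries), networking is torn down, and the request is handed back to the scheduler rather than failed outright. A compressed sketch of that failure path, with every collaborator passed in as a stand-in callable rather than Nova's real methods:

    def locked_do_build_and_run(instance, spawn, abort_claim,
                                deallocate_network, reschedule):
        # Simplified shape of the path in the entries above: a spawn error
        # aborts the placement claim, deallocates the network, and returns
        # the instance to the scheduler for another attempt.
        try:
            spawn(instance)
        except Exception as exc:
            abort_claim(instance)
            deallocate_network(instance)
            reschedule(instance, reason=str(exc))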
[ 1798.543615] env[67899]: INFO nova.scheduler.client.report [None req-2236b4c1-ddc4-48e2-b219-bf8992f19847 tempest-InstanceActionsV221TestJSON-802466451 tempest-InstanceActionsV221TestJSON-802466451-project-member] Deleted allocations for instance dc7bf2b7-631d-4933-92db-1679ad823379 [ 1798.571593] env[67899]: DEBUG oslo_concurrency.lockutils [None req-2236b4c1-ddc4-48e2-b219-bf8992f19847 tempest-InstanceActionsV221TestJSON-802466451 tempest-InstanceActionsV221TestJSON-802466451-project-member] Lock "dc7bf2b7-631d-4933-92db-1679ad823379" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 575.737s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1798.572046] env[67899]: DEBUG oslo_concurrency.lockutils [None req-83531eca-9f13-4643-a930-67182badfa53 tempest-InstanceActionsV221TestJSON-802466451 tempest-InstanceActionsV221TestJSON-802466451-project-member] Lock "dc7bf2b7-631d-4933-92db-1679ad823379" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 379.940s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1798.572258] env[67899]: DEBUG oslo_concurrency.lockutils [None req-83531eca-9f13-4643-a930-67182badfa53 tempest-InstanceActionsV221TestJSON-802466451 tempest-InstanceActionsV221TestJSON-802466451-project-member] Acquiring lock "dc7bf2b7-631d-4933-92db-1679ad823379-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1798.572550] env[67899]: DEBUG oslo_concurrency.lockutils [None req-83531eca-9f13-4643-a930-67182badfa53 tempest-InstanceActionsV221TestJSON-802466451 tempest-InstanceActionsV221TestJSON-802466451-project-member] Lock "dc7bf2b7-631d-4933-92db-1679ad823379-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1798.572672] env[67899]: DEBUG oslo_concurrency.lockutils [None req-83531eca-9f13-4643-a930-67182badfa53 tempest-InstanceActionsV221TestJSON-802466451 tempest-InstanceActionsV221TestJSON-802466451-project-member] Lock "dc7bf2b7-631d-4933-92db-1679ad823379-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1798.575792] env[67899]: INFO nova.compute.manager [None req-83531eca-9f13-4643-a930-67182badfa53 tempest-InstanceActionsV221TestJSON-802466451 tempest-InstanceActionsV221TestJSON-802466451-project-member] [instance: dc7bf2b7-631d-4933-92db-1679ad823379] Terminating instance [ 1798.577179] env[67899]: DEBUG nova.compute.manager [None req-83531eca-9f13-4643-a930-67182badfa53 tempest-InstanceActionsV221TestJSON-802466451 tempest-InstanceActionsV221TestJSON-802466451-project-member] [instance: dc7bf2b7-631d-4933-92db-1679ad823379] Start destroying the instance on the hypervisor.
{{(pid=67899) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1798.577357] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-83531eca-9f13-4643-a930-67182badfa53 tempest-InstanceActionsV221TestJSON-802466451 tempest-InstanceActionsV221TestJSON-802466451-project-member] [instance: dc7bf2b7-631d-4933-92db-1679ad823379] Destroying instance {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1798.577826] env[67899]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-4bd65220-f303-4964-8c23-5c183c0939d0 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1798.587241] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4fde48f5-9968-4092-8b3d-612dead917e4 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1798.617274] env[67899]: WARNING nova.virt.vmwareapi.vmops [None req-83531eca-9f13-4643-a930-67182badfa53 tempest-InstanceActionsV221TestJSON-802466451 tempest-InstanceActionsV221TestJSON-802466451-project-member] [instance: dc7bf2b7-631d-4933-92db-1679ad823379] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance dc7bf2b7-631d-4933-92db-1679ad823379 could not be found. [ 1798.617481] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-83531eca-9f13-4643-a930-67182badfa53 tempest-InstanceActionsV221TestJSON-802466451 tempest-InstanceActionsV221TestJSON-802466451-project-member] [instance: dc7bf2b7-631d-4933-92db-1679ad823379] Instance destroyed {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1798.617657] env[67899]: INFO nova.compute.manager [None req-83531eca-9f13-4643-a930-67182badfa53 tempest-InstanceActionsV221TestJSON-802466451 tempest-InstanceActionsV221TestJSON-802466451-project-member] [instance: dc7bf2b7-631d-4933-92db-1679ad823379] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1798.617894] env[67899]: DEBUG oslo.service.loopingcall [None req-83531eca-9f13-4643-a930-67182badfa53 tempest-InstanceActionsV221TestJSON-802466451 tempest-InstanceActionsV221TestJSON-802466451-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=67899) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1798.618128] env[67899]: DEBUG nova.compute.manager [-] [instance: dc7bf2b7-631d-4933-92db-1679ad823379] Deallocating network for instance {{(pid=67899) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1798.618225] env[67899]: DEBUG nova.network.neutron [-] [instance: dc7bf2b7-631d-4933-92db-1679ad823379] deallocate_for_instance() {{(pid=67899) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1798.641058] env[67899]: DEBUG nova.network.neutron [-] [instance: dc7bf2b7-631d-4933-92db-1679ad823379] Updating instance_info_cache with network_info: [] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1798.649242] env[67899]: INFO nova.compute.manager [-] [instance: dc7bf2b7-631d-4933-92db-1679ad823379] Took 0.03 seconds to deallocate network for instance.
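Every "Inventory has not changed" entry in this log carries the same provider inventory, and the claim in the build attempt that follows is checked against it. In the placement model the schedulable capacity per resource class is (total - reserved) * allocation_ratio, with max_unit capping what a single allocation may take. A worked example using the exact numbers from this log:

    # Inventory as reported for provider fffa0b42-f65d-4394-a98c-0df038b9ed4b.
    INVENTORY = {
        'VCPU':      {'total': 48,     'reserved': 0,   'allocation_ratio': 4.0, 'max_unit': 16},
        'MEMORY_MB': {'total': 196590, 'reserved': 512, 'allocation_ratio': 1.0, 'max_unit': 65530},
        'DISK_GB':   {'total': 400,    'reserved': 0,   'allocation_ratio': 1.0, 'max_unit': 94},
    }

    def capacity(inventory):
        # Placement's capacity rule: (total - reserved) * allocation_ratio.
        return {rc: int((v['total'] - v['reserved']) * v['allocation_ratio'])
                for rc, v in inventory.items()}

    print(capacity(INVENTORY))
    # {'VCPU': 192, 'MEMORY_MB': 196078, 'DISK_GB': 400}

Against 192 schedulable VCPUs with only 9 allocated, the m1.nano claim in the next entries (1 VCPU, 128 MB, 1 GB) succeeds immediately.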
[ 1798.760985] env[67899]: DEBUG oslo_concurrency.lockutils [None req-83531eca-9f13-4643-a930-67182badfa53 tempest-InstanceActionsV221TestJSON-802466451 tempest-InstanceActionsV221TestJSON-802466451-project-member] Lock "dc7bf2b7-631d-4933-92db-1679ad823379" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.189s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1798.761860] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "dc7bf2b7-631d-4933-92db-1679ad823379" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 22.752s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1798.762056] env[67899]: INFO nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: dc7bf2b7-631d-4933-92db-1679ad823379] During sync_power_state the instance has a pending task (deleting). Skip. [ 1798.762232] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "dc7bf2b7-631d-4933-92db-1679ad823379" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.001s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1815.900441] env[67899]: DEBUG oslo_concurrency.lockutils [None req-35d58b2d-3e71-44d3-ba5e-8cd3f7388243 tempest-ServerMetadataTestJSON-29351844 tempest-ServerMetadataTestJSON-29351844-project-member] Acquiring lock "e08f620d-63a0-45cb-99c6-d9d95c938b38" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1828.003089] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1828.831371] env[67899]: DEBUG oslo_concurrency.lockutils [None req-2401ba2f-a600-4d1b-93b2-2a833ea4d05b tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Acquiring lock "a9ef96da-fcfd-4fb5-bbb1-5178111a8a62" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1828.831612] env[67899]: DEBUG oslo_concurrency.lockutils [None req-2401ba2f-a600-4d1b-93b2-2a833ea4d05b tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Lock "a9ef96da-fcfd-4fb5-bbb1-5178111a8a62" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1828.842159] env[67899]: DEBUG nova.compute.manager [None req-2401ba2f-a600-4d1b-93b2-2a833ea4d05b tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: a9ef96da-fcfd-4fb5-bbb1-5178111a8a62] Starting instance...
{{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1828.890515] env[67899]: DEBUG oslo_concurrency.lockutils [None req-2401ba2f-a600-4d1b-93b2-2a833ea4d05b tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1828.890710] env[67899]: DEBUG oslo_concurrency.lockutils [None req-2401ba2f-a600-4d1b-93b2-2a833ea4d05b tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1828.892123] env[67899]: INFO nova.compute.claims [None req-2401ba2f-a600-4d1b-93b2-2a833ea4d05b tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: a9ef96da-fcfd-4fb5-bbb1-5178111a8a62] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1829.041945] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c83e885a-1b3b-485c-bad8-7f2f640ccc3e {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1829.049047] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e8816169-498b-4c29-b6ff-8c24b164b3fc {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1829.078310] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a99cfe7c-edf6-435d-bb33-dfa005d4519f {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1829.084769] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-08fdbad9-4e35-4ad6-a576-30b188e0ddd3 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1829.098193] env[67899]: DEBUG nova.compute.provider_tree [None req-2401ba2f-a600-4d1b-93b2-2a833ea4d05b tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Inventory has not changed in ProviderTree for provider: fffa0b42-f65d-4394-a98c-0df038b9ed4b {{(pid=67899) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1829.106669] env[67899]: DEBUG nova.scheduler.client.report [None req-2401ba2f-a600-4d1b-93b2-2a833ea4d05b tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Inventory has not changed for provider fffa0b42-f65d-4394-a98c-0df038b9ed4b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67899) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1829.119490] env[67899]: DEBUG oslo_concurrency.lockutils [None 
req-2401ba2f-a600-4d1b-93b2-2a833ea4d05b tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.229s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1829.119948] env[67899]: DEBUG nova.compute.manager [None req-2401ba2f-a600-4d1b-93b2-2a833ea4d05b tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: a9ef96da-fcfd-4fb5-bbb1-5178111a8a62] Start building networks asynchronously for instance. {{(pid=67899) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1829.149396] env[67899]: DEBUG nova.compute.utils [None req-2401ba2f-a600-4d1b-93b2-2a833ea4d05b tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Using /dev/sd instead of None {{(pid=67899) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1829.150763] env[67899]: DEBUG nova.compute.manager [None req-2401ba2f-a600-4d1b-93b2-2a833ea4d05b tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: a9ef96da-fcfd-4fb5-bbb1-5178111a8a62] Allocating IP information in the background. {{(pid=67899) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1829.150928] env[67899]: DEBUG nova.network.neutron [None req-2401ba2f-a600-4d1b-93b2-2a833ea4d05b tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: a9ef96da-fcfd-4fb5-bbb1-5178111a8a62] allocate_for_instance() {{(pid=67899) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1829.160097] env[67899]: DEBUG nova.compute.manager [None req-2401ba2f-a600-4d1b-93b2-2a833ea4d05b tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: a9ef96da-fcfd-4fb5-bbb1-5178111a8a62] Start building block device mappings for instance. {{(pid=67899) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1829.213593] env[67899]: DEBUG nova.policy [None req-2401ba2f-a600-4d1b-93b2-2a833ea4d05b tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a5206226ca404a07b10db199a6436504', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'bdf895619b34412fb20488318e170d23', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67899) authorize /opt/stack/nova/nova/policy.py:203}} [ 1829.227239] env[67899]: DEBUG nova.compute.manager [None req-2401ba2f-a600-4d1b-93b2-2a833ea4d05b tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: a9ef96da-fcfd-4fb5-bbb1-5178111a8a62] Start spawning the instance on the hypervisor. 
{{(pid=67899) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1829.248234] env[67899]: DEBUG nova.virt.hardware [None req-2401ba2f-a600-4d1b-93b2-2a833ea4d05b tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T22:07:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T22:07:13Z,direct_url=,disk_format='vmdk',id=c655a05a-4a40-4b3f-b609-3ba8116ad90f,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c1aaa2970e964d7b86557399120d12c1',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T22:07:14Z,virtual_size=,visibility=), allow threads: False {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1829.248486] env[67899]: DEBUG nova.virt.hardware [None req-2401ba2f-a600-4d1b-93b2-2a833ea4d05b tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Flavor limits 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1829.248641] env[67899]: DEBUG nova.virt.hardware [None req-2401ba2f-a600-4d1b-93b2-2a833ea4d05b tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Image limits 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1829.248818] env[67899]: DEBUG nova.virt.hardware [None req-2401ba2f-a600-4d1b-93b2-2a833ea4d05b tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Flavor pref 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1829.248963] env[67899]: DEBUG nova.virt.hardware [None req-2401ba2f-a600-4d1b-93b2-2a833ea4d05b tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Image pref 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1829.249124] env[67899]: DEBUG nova.virt.hardware [None req-2401ba2f-a600-4d1b-93b2-2a833ea4d05b tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1829.249332] env[67899]: DEBUG nova.virt.hardware [None req-2401ba2f-a600-4d1b-93b2-2a833ea4d05b tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1829.249488] env[67899]: DEBUG nova.virt.hardware [None req-2401ba2f-a600-4d1b-93b2-2a833ea4d05b tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67899) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1829.249657] env[67899]: DEBUG nova.virt.hardware [None 
req-2401ba2f-a600-4d1b-93b2-2a833ea4d05b tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Got 1 possible topologies {{(pid=67899) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1829.249841] env[67899]: DEBUG nova.virt.hardware [None req-2401ba2f-a600-4d1b-93b2-2a833ea4d05b tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1829.250035] env[67899]: DEBUG nova.virt.hardware [None req-2401ba2f-a600-4d1b-93b2-2a833ea4d05b tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1829.250886] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-99cc80ae-82d5-4371-8bde-264d979255c2 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1829.258603] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-59ed8286-610a-431f-a2b6-ecd12e485ec1 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1829.523262] env[67899]: DEBUG nova.network.neutron [None req-2401ba2f-a600-4d1b-93b2-2a833ea4d05b tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: a9ef96da-fcfd-4fb5-bbb1-5178111a8a62] Successfully created port: 97e0b145-e48f-47b5-837a-264619d2b3f0 {{(pid=67899) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1830.116733] env[67899]: DEBUG nova.compute.manager [req-999cfc5e-bca5-4ac5-84db-573138ba3816 req-8b40eb8f-34ea-437a-88b1-a564ba77a402 service nova] [instance: a9ef96da-fcfd-4fb5-bbb1-5178111a8a62] Received event network-vif-plugged-97e0b145-e48f-47b5-837a-264619d2b3f0 {{(pid=67899) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1830.117024] env[67899]: DEBUG oslo_concurrency.lockutils [req-999cfc5e-bca5-4ac5-84db-573138ba3816 req-8b40eb8f-34ea-437a-88b1-a564ba77a402 service nova] Acquiring lock "a9ef96da-fcfd-4fb5-bbb1-5178111a8a62-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1830.117174] env[67899]: DEBUG oslo_concurrency.lockutils [req-999cfc5e-bca5-4ac5-84db-573138ba3816 req-8b40eb8f-34ea-437a-88b1-a564ba77a402 service nova] Lock "a9ef96da-fcfd-4fb5-bbb1-5178111a8a62-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1830.117343] env[67899]: DEBUG oslo_concurrency.lockutils [req-999cfc5e-bca5-4ac5-84db-573138ba3816 req-8b40eb8f-34ea-437a-88b1-a564ba77a402 service nova] Lock "a9ef96da-fcfd-4fb5-bbb1-5178111a8a62-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1830.117509] env[67899]: DEBUG nova.compute.manager 
[req-999cfc5e-bca5-4ac5-84db-573138ba3816 req-8b40eb8f-34ea-437a-88b1-a564ba77a402 service nova] [instance: a9ef96da-fcfd-4fb5-bbb1-5178111a8a62] No waiting events found dispatching network-vif-plugged-97e0b145-e48f-47b5-837a-264619d2b3f0 {{(pid=67899) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1830.117671] env[67899]: WARNING nova.compute.manager [req-999cfc5e-bca5-4ac5-84db-573138ba3816 req-8b40eb8f-34ea-437a-88b1-a564ba77a402 service nova] [instance: a9ef96da-fcfd-4fb5-bbb1-5178111a8a62] Received unexpected event network-vif-plugged-97e0b145-e48f-47b5-837a-264619d2b3f0 for instance with vm_state building and task_state spawning. [ 1830.193629] env[67899]: DEBUG nova.network.neutron [None req-2401ba2f-a600-4d1b-93b2-2a833ea4d05b tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: a9ef96da-fcfd-4fb5-bbb1-5178111a8a62] Successfully updated port: 97e0b145-e48f-47b5-837a-264619d2b3f0 {{(pid=67899) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1830.205558] env[67899]: DEBUG oslo_concurrency.lockutils [None req-2401ba2f-a600-4d1b-93b2-2a833ea4d05b tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Acquiring lock "refresh_cache-a9ef96da-fcfd-4fb5-bbb1-5178111a8a62" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1830.205709] env[67899]: DEBUG oslo_concurrency.lockutils [None req-2401ba2f-a600-4d1b-93b2-2a833ea4d05b tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Acquired lock "refresh_cache-a9ef96da-fcfd-4fb5-bbb1-5178111a8a62" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1830.205859] env[67899]: DEBUG nova.network.neutron [None req-2401ba2f-a600-4d1b-93b2-2a833ea4d05b tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: a9ef96da-fcfd-4fb5-bbb1-5178111a8a62] Building network info cache for instance {{(pid=67899) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1830.241915] env[67899]: DEBUG nova.network.neutron [None req-2401ba2f-a600-4d1b-93b2-2a833ea4d05b tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: a9ef96da-fcfd-4fb5-bbb1-5178111a8a62] Instance cache missing network info. 
{{(pid=67899) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1830.465557] env[67899]: DEBUG nova.network.neutron [None req-2401ba2f-a600-4d1b-93b2-2a833ea4d05b tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: a9ef96da-fcfd-4fb5-bbb1-5178111a8a62] Updating instance_info_cache with network_info: [{"id": "97e0b145-e48f-47b5-837a-264619d2b3f0", "address": "fa:16:3e:fc:58:37", "network": {"id": "8255316b-777c-4721-b67b-7f6fdf4bd351", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-2146531806-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "bdf895619b34412fb20488318e170d23", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "357d2811-e990-4985-9f9e-b158d10d3699", "external-id": "nsx-vlan-transportzone-641", "segmentation_id": 641, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap97e0b145-e4", "ovs_interfaceid": "97e0b145-e48f-47b5-837a-264619d2b3f0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1830.475525] env[67899]: DEBUG oslo_concurrency.lockutils [None req-2401ba2f-a600-4d1b-93b2-2a833ea4d05b tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Releasing lock "refresh_cache-a9ef96da-fcfd-4fb5-bbb1-5178111a8a62" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1830.475785] env[67899]: DEBUG nova.compute.manager [None req-2401ba2f-a600-4d1b-93b2-2a833ea4d05b tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: a9ef96da-fcfd-4fb5-bbb1-5178111a8a62] Instance network_info: |[{"id": "97e0b145-e48f-47b5-837a-264619d2b3f0", "address": "fa:16:3e:fc:58:37", "network": {"id": "8255316b-777c-4721-b67b-7f6fdf4bd351", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-2146531806-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "bdf895619b34412fb20488318e170d23", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "357d2811-e990-4985-9f9e-b158d10d3699", "external-id": "nsx-vlan-transportzone-641", "segmentation_id": 641, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap97e0b145-e4", "ovs_interfaceid": "97e0b145-e48f-47b5-837a-264619d2b3f0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67899) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1971}} [ 1830.476290] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-2401ba2f-a600-4d1b-93b2-2a833ea4d05b tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: a9ef96da-fcfd-4fb5-bbb1-5178111a8a62] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:fc:58:37', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '357d2811-e990-4985-9f9e-b158d10d3699', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '97e0b145-e48f-47b5-837a-264619d2b3f0', 'vif_model': 'vmxnet3'}] {{(pid=67899) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1830.484078] env[67899]: DEBUG oslo.service.loopingcall [None req-2401ba2f-a600-4d1b-93b2-2a833ea4d05b tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67899) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1830.484518] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: a9ef96da-fcfd-4fb5-bbb1-5178111a8a62] Creating VM on the ESX host {{(pid=67899) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1830.484754] env[67899]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-44d73d2c-7912-4e17-a6b1-0d8c5efe26a5 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1830.504725] env[67899]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1830.504725] env[67899]: value = "task-3468006" [ 1830.504725] env[67899]: _type = "Task" [ 1830.504725] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1830.511947] env[67899]: DEBUG oslo_vmware.api [-] Task: {'id': task-3468006, 'name': CreateVM_Task} progress is 0%. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1831.014901] env[67899]: DEBUG oslo_vmware.api [-] Task: {'id': task-3468006, 'name': CreateVM_Task} progress is 25%. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1831.516918] env[67899]: DEBUG oslo_vmware.api [-] Task: {'id': task-3468006, 'name': CreateVM_Task, 'duration_secs': 0.753469} completed successfully. 
{{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1831.517260] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: a9ef96da-fcfd-4fb5-bbb1-5178111a8a62] Created VM on the ESX host {{(pid=67899) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1831.517824] env[67899]: DEBUG oslo_concurrency.lockutils [None req-2401ba2f-a600-4d1b-93b2-2a833ea4d05b tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1831.517947] env[67899]: DEBUG oslo_concurrency.lockutils [None req-2401ba2f-a600-4d1b-93b2-2a833ea4d05b tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1831.518317] env[67899]: DEBUG oslo_concurrency.lockutils [None req-2401ba2f-a600-4d1b-93b2-2a833ea4d05b tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1831.518596] env[67899]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-7a98f5d9-491e-43fa-a060-42a17521c41c {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1831.523075] env[67899]: DEBUG oslo_vmware.api [None req-2401ba2f-a600-4d1b-93b2-2a833ea4d05b tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Waiting for the task: (returnval){ [ 1831.523075] env[67899]: value = "session[5298859c-589e-10f3-4f70-bf47a7ea371b]525e17f3-c935-8bce-f0d1-2d1365b13daa" [ 1831.523075] env[67899]: _type = "Task" [ 1831.523075] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1831.530763] env[67899]: DEBUG oslo_vmware.api [None req-2401ba2f-a600-4d1b-93b2-2a833ea4d05b tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Task: {'id': session[5298859c-589e-10f3-4f70-bf47a7ea371b]525e17f3-c935-8bce-f0d1-2d1365b13daa, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1831.996940] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1831.997251] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Starting heal instance info cache {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1831.997386] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Rebuilding the list of instances to heal {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1832.017734] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 03684169-e2c8-4cf5-8e79-b118725927f1] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1832.017885] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1832.018030] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: e179db1d-ee0c-4f47-a958-40dd69209d26] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1832.018860] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: addcc88a-6bb5-4a70-938e-49c0c79c8414] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1832.018860] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: a6544af8-879d-4c45-bee4-8551b861fc66] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1832.018860] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 9b4a7c14-84dc-4222-a758-3f8f10e23b7a] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1832.018860] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: e08f620d-63a0-45cb-99c6-d9d95c938b38] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1832.018860] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 77ac61b9-48cc-4ae8-81e7-273841f7b42d] Skipping network cache update for instance because it is Building. 
{{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1832.018860] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: a9ef96da-fcfd-4fb5-bbb1-5178111a8a62] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1832.018860] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Didn't find any instances for network info cache update. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1832.019328] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1832.033218] env[67899]: DEBUG oslo_concurrency.lockutils [None req-2401ba2f-a600-4d1b-93b2-2a833ea4d05b tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1832.033450] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-2401ba2f-a600-4d1b-93b2-2a833ea4d05b tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: a9ef96da-fcfd-4fb5-bbb1-5178111a8a62] Processing image c655a05a-4a40-4b3f-b609-3ba8116ad90f {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1832.033655] env[67899]: DEBUG oslo_concurrency.lockutils [None req-2401ba2f-a600-4d1b-93b2-2a833ea4d05b tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1832.144476] env[67899]: DEBUG nova.compute.manager [req-0a3f1a2b-934b-43d1-9c04-a4e52c52361b req-7d33cb1c-6ffd-4cc0-9791-e88b094ed895 service nova] [instance: a9ef96da-fcfd-4fb5-bbb1-5178111a8a62] Received event network-changed-97e0b145-e48f-47b5-837a-264619d2b3f0 {{(pid=67899) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1832.144678] env[67899]: DEBUG nova.compute.manager [req-0a3f1a2b-934b-43d1-9c04-a4e52c52361b req-7d33cb1c-6ffd-4cc0-9791-e88b094ed895 service nova] [instance: a9ef96da-fcfd-4fb5-bbb1-5178111a8a62] Refreshing instance network info cache due to event network-changed-97e0b145-e48f-47b5-837a-264619d2b3f0. 
{{(pid=67899) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1832.144844] env[67899]: DEBUG oslo_concurrency.lockutils [req-0a3f1a2b-934b-43d1-9c04-a4e52c52361b req-7d33cb1c-6ffd-4cc0-9791-e88b094ed895 service nova] Acquiring lock "refresh_cache-a9ef96da-fcfd-4fb5-bbb1-5178111a8a62" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1832.144986] env[67899]: DEBUG oslo_concurrency.lockutils [req-0a3f1a2b-934b-43d1-9c04-a4e52c52361b req-7d33cb1c-6ffd-4cc0-9791-e88b094ed895 service nova] Acquired lock "refresh_cache-a9ef96da-fcfd-4fb5-bbb1-5178111a8a62" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1832.145161] env[67899]: DEBUG nova.network.neutron [req-0a3f1a2b-934b-43d1-9c04-a4e52c52361b req-7d33cb1c-6ffd-4cc0-9791-e88b094ed895 service nova] [instance: a9ef96da-fcfd-4fb5-bbb1-5178111a8a62] Refreshing network info cache for port 97e0b145-e48f-47b5-837a-264619d2b3f0 {{(pid=67899) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1832.584713] env[67899]: DEBUG nova.network.neutron [req-0a3f1a2b-934b-43d1-9c04-a4e52c52361b req-7d33cb1c-6ffd-4cc0-9791-e88b094ed895 service nova] [instance: a9ef96da-fcfd-4fb5-bbb1-5178111a8a62] Updated VIF entry in instance network info cache for port 97e0b145-e48f-47b5-837a-264619d2b3f0. {{(pid=67899) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1832.585079] env[67899]: DEBUG nova.network.neutron [req-0a3f1a2b-934b-43d1-9c04-a4e52c52361b req-7d33cb1c-6ffd-4cc0-9791-e88b094ed895 service nova] [instance: a9ef96da-fcfd-4fb5-bbb1-5178111a8a62] Updating instance_info_cache with network_info: [{"id": "97e0b145-e48f-47b5-837a-264619d2b3f0", "address": "fa:16:3e:fc:58:37", "network": {"id": "8255316b-777c-4721-b67b-7f6fdf4bd351", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-2146531806-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "bdf895619b34412fb20488318e170d23", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "357d2811-e990-4985-9f9e-b158d10d3699", "external-id": "nsx-vlan-transportzone-641", "segmentation_id": 641, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap97e0b145-e4", "ovs_interfaceid": "97e0b145-e48f-47b5-837a-264619d2b3f0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1832.594228] env[67899]: DEBUG oslo_concurrency.lockutils [req-0a3f1a2b-934b-43d1-9c04-a4e52c52361b req-7d33cb1c-6ffd-4cc0-9791-e88b094ed895 service nova] Releasing lock "refresh_cache-a9ef96da-fcfd-4fb5-bbb1-5178111a8a62" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1832.996257] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67899) run_periodic_tasks 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1833.996411] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1833.996659] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1834.505197] env[67899]: DEBUG oslo_concurrency.lockutils [None req-027881f9-9f05-4771-b0a4-099e769baf40 tempest-ServersAaction247Test-759872082 tempest-ServersAaction247Test-759872082-project-member] Acquiring lock "c4fe8b3e-cee1-401b-a26f-907a8de95eba" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1834.505710] env[67899]: DEBUG oslo_concurrency.lockutils [None req-027881f9-9f05-4771-b0a4-099e769baf40 tempest-ServersAaction247Test-759872082 tempest-ServersAaction247Test-759872082-project-member] Lock "c4fe8b3e-cee1-401b-a26f-907a8de95eba" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1834.515889] env[67899]: DEBUG nova.compute.manager [None req-027881f9-9f05-4771-b0a4-099e769baf40 tempest-ServersAaction247Test-759872082 tempest-ServersAaction247Test-759872082-project-member] [instance: c4fe8b3e-cee1-401b-a26f-907a8de95eba] Starting instance... 
{{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1834.565501] env[67899]: DEBUG oslo_concurrency.lockutils [None req-027881f9-9f05-4771-b0a4-099e769baf40 tempest-ServersAaction247Test-759872082 tempest-ServersAaction247Test-759872082-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1834.565751] env[67899]: DEBUG oslo_concurrency.lockutils [None req-027881f9-9f05-4771-b0a4-099e769baf40 tempest-ServersAaction247Test-759872082 tempest-ServersAaction247Test-759872082-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1834.567215] env[67899]: INFO nova.compute.claims [None req-027881f9-9f05-4771-b0a4-099e769baf40 tempest-ServersAaction247Test-759872082 tempest-ServersAaction247Test-759872082-project-member] [instance: c4fe8b3e-cee1-401b-a26f-907a8de95eba] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1834.750782] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-23c77cae-4a52-42fd-b3b2-931a8cd61d24 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1834.758391] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-26fb4ef0-762a-4f4e-9105-35f11a5c9505 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1834.788536] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-66cb00d7-08ca-4b72-9702-ab51df08e9d5 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1834.795155] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4f1094b3-2217-4c52-88ef-43d9716a6270 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1834.807889] env[67899]: DEBUG nova.compute.provider_tree [None req-027881f9-9f05-4771-b0a4-099e769baf40 tempest-ServersAaction247Test-759872082 tempest-ServersAaction247Test-759872082-project-member] Inventory has not changed in ProviderTree for provider: fffa0b42-f65d-4394-a98c-0df038b9ed4b {{(pid=67899) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1834.817931] env[67899]: DEBUG nova.scheduler.client.report [None req-027881f9-9f05-4771-b0a4-099e769baf40 tempest-ServersAaction247Test-759872082 tempest-ServersAaction247Test-759872082-project-member] Inventory has not changed for provider fffa0b42-f65d-4394-a98c-0df038b9ed4b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67899) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1834.834701] env[67899]: DEBUG oslo_concurrency.lockutils [None 
req-027881f9-9f05-4771-b0a4-099e769baf40 tempest-ServersAaction247Test-759872082 tempest-ServersAaction247Test-759872082-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.269s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1834.835181] env[67899]: DEBUG nova.compute.manager [None req-027881f9-9f05-4771-b0a4-099e769baf40 tempest-ServersAaction247Test-759872082 tempest-ServersAaction247Test-759872082-project-member] [instance: c4fe8b3e-cee1-401b-a26f-907a8de95eba] Start building networks asynchronously for instance. {{(pid=67899) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1834.866789] env[67899]: DEBUG nova.compute.utils [None req-027881f9-9f05-4771-b0a4-099e769baf40 tempest-ServersAaction247Test-759872082 tempest-ServersAaction247Test-759872082-project-member] Using /dev/sd instead of None {{(pid=67899) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1834.867958] env[67899]: DEBUG nova.compute.manager [None req-027881f9-9f05-4771-b0a4-099e769baf40 tempest-ServersAaction247Test-759872082 tempest-ServersAaction247Test-759872082-project-member] [instance: c4fe8b3e-cee1-401b-a26f-907a8de95eba] Not allocating networking since 'none' was specified. {{(pid=67899) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 1834.876541] env[67899]: DEBUG nova.compute.manager [None req-027881f9-9f05-4771-b0a4-099e769baf40 tempest-ServersAaction247Test-759872082 tempest-ServersAaction247Test-759872082-project-member] [instance: c4fe8b3e-cee1-401b-a26f-907a8de95eba] Start building block device mappings for instance. {{(pid=67899) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1834.941623] env[67899]: DEBUG nova.compute.manager [None req-027881f9-9f05-4771-b0a4-099e769baf40 tempest-ServersAaction247Test-759872082 tempest-ServersAaction247Test-759872082-project-member] [instance: c4fe8b3e-cee1-401b-a26f-907a8de95eba] Start spawning the instance on the hypervisor. 
{{(pid=67899) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1834.966647] env[67899]: DEBUG nova.virt.hardware [None req-027881f9-9f05-4771-b0a4-099e769baf40 tempest-ServersAaction247Test-759872082 tempest-ServersAaction247Test-759872082-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T22:07:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T22:07:13Z,direct_url=,disk_format='vmdk',id=c655a05a-4a40-4b3f-b609-3ba8116ad90f,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c1aaa2970e964d7b86557399120d12c1',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T22:07:14Z,virtual_size=,visibility=), allow threads: False {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1834.966957] env[67899]: DEBUG nova.virt.hardware [None req-027881f9-9f05-4771-b0a4-099e769baf40 tempest-ServersAaction247Test-759872082 tempest-ServersAaction247Test-759872082-project-member] Flavor limits 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1834.967285] env[67899]: DEBUG nova.virt.hardware [None req-027881f9-9f05-4771-b0a4-099e769baf40 tempest-ServersAaction247Test-759872082 tempest-ServersAaction247Test-759872082-project-member] Image limits 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1834.967512] env[67899]: DEBUG nova.virt.hardware [None req-027881f9-9f05-4771-b0a4-099e769baf40 tempest-ServersAaction247Test-759872082 tempest-ServersAaction247Test-759872082-project-member] Flavor pref 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1834.967667] env[67899]: DEBUG nova.virt.hardware [None req-027881f9-9f05-4771-b0a4-099e769baf40 tempest-ServersAaction247Test-759872082 tempest-ServersAaction247Test-759872082-project-member] Image pref 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1834.967817] env[67899]: DEBUG nova.virt.hardware [None req-027881f9-9f05-4771-b0a4-099e769baf40 tempest-ServersAaction247Test-759872082 tempest-ServersAaction247Test-759872082-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1834.968037] env[67899]: DEBUG nova.virt.hardware [None req-027881f9-9f05-4771-b0a4-099e769baf40 tempest-ServersAaction247Test-759872082 tempest-ServersAaction247Test-759872082-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1834.968203] env[67899]: DEBUG nova.virt.hardware [None req-027881f9-9f05-4771-b0a4-099e769baf40 tempest-ServersAaction247Test-759872082 tempest-ServersAaction247Test-759872082-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67899) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1834.968369] env[67899]: DEBUG nova.virt.hardware [None 
req-027881f9-9f05-4771-b0a4-099e769baf40 tempest-ServersAaction247Test-759872082 tempest-ServersAaction247Test-759872082-project-member] Got 1 possible topologies {{(pid=67899) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1834.969027] env[67899]: DEBUG nova.virt.hardware [None req-027881f9-9f05-4771-b0a4-099e769baf40 tempest-ServersAaction247Test-759872082 tempest-ServersAaction247Test-759872082-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1834.969027] env[67899]: DEBUG nova.virt.hardware [None req-027881f9-9f05-4771-b0a4-099e769baf40 tempest-ServersAaction247Test-759872082 tempest-ServersAaction247Test-759872082-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1834.969541] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7fb21cca-b84a-44ac-9423-d986f755461c {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1834.977189] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e8f7d532-e742-4b70-99e3-0fda8f67d6ff {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1834.990257] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-027881f9-9f05-4771-b0a4-099e769baf40 tempest-ServersAaction247Test-759872082 tempest-ServersAaction247Test-759872082-project-member] [instance: c4fe8b3e-cee1-401b-a26f-907a8de95eba] Instance VIF info [] {{(pid=67899) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1834.995633] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [None req-027881f9-9f05-4771-b0a4-099e769baf40 tempest-ServersAaction247Test-759872082 tempest-ServersAaction247Test-759872082-project-member] Creating folder: Project (972e93cb34a746cf8878b6e7f3c4e14a). Parent ref: group-v692900. {{(pid=67899) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1834.996048] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1834.996196] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=67899) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1834.996359] env[67899]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-c0d9ad6d-ad2f-4620-98eb-59f3ea6843b2 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1835.005516] env[67899]: INFO nova.virt.vmwareapi.vm_util [None req-027881f9-9f05-4771-b0a4-099e769baf40 tempest-ServersAaction247Test-759872082 tempest-ServersAaction247Test-759872082-project-member] Created folder: Project (972e93cb34a746cf8878b6e7f3c4e14a) in parent group-v692900. [ 1835.005783] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [None req-027881f9-9f05-4771-b0a4-099e769baf40 tempest-ServersAaction247Test-759872082 tempest-ServersAaction247Test-759872082-project-member] Creating folder: Instances. 
Parent ref: group-v693006. {{(pid=67899) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1835.005879] env[67899]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-c0957abf-5d38-421d-af0a-058c026296a4 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1835.013778] env[67899]: INFO nova.virt.vmwareapi.vm_util [None req-027881f9-9f05-4771-b0a4-099e769baf40 tempest-ServersAaction247Test-759872082 tempest-ServersAaction247Test-759872082-project-member] Created folder: Instances in parent group-v693006. [ 1835.013997] env[67899]: DEBUG oslo.service.loopingcall [None req-027881f9-9f05-4771-b0a4-099e769baf40 tempest-ServersAaction247Test-759872082 tempest-ServersAaction247Test-759872082-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67899) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1835.014200] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: c4fe8b3e-cee1-401b-a26f-907a8de95eba] Creating VM on the ESX host {{(pid=67899) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1835.014369] env[67899]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-cfe45709-7633-4201-9be8-e176db6522c3 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1835.029822] env[67899]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1835.029822] env[67899]: value = "task-3468009" [ 1835.029822] env[67899]: _type = "Task" [ 1835.029822] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1835.036714] env[67899]: DEBUG oslo_vmware.api [-] Task: {'id': task-3468009, 'name': CreateVM_Task} progress is 0%. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1835.541701] env[67899]: DEBUG oslo_vmware.api [-] Task: {'id': task-3468009, 'name': CreateVM_Task, 'duration_secs': 0.262886} completed successfully. 
{{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1835.541908] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: c4fe8b3e-cee1-401b-a26f-907a8de95eba] Created VM on the ESX host {{(pid=67899) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1835.542316] env[67899]: DEBUG oslo_concurrency.lockutils [None req-027881f9-9f05-4771-b0a4-099e769baf40 tempest-ServersAaction247Test-759872082 tempest-ServersAaction247Test-759872082-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1835.542474] env[67899]: DEBUG oslo_concurrency.lockutils [None req-027881f9-9f05-4771-b0a4-099e769baf40 tempest-ServersAaction247Test-759872082 tempest-ServersAaction247Test-759872082-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1835.543187] env[67899]: DEBUG oslo_concurrency.lockutils [None req-027881f9-9f05-4771-b0a4-099e769baf40 tempest-ServersAaction247Test-759872082 tempest-ServersAaction247Test-759872082-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1835.543408] env[67899]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-fea4ab72-3aaa-4b3a-ac3a-e040744bcf6f {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1835.548093] env[67899]: DEBUG oslo_vmware.api [None req-027881f9-9f05-4771-b0a4-099e769baf40 tempest-ServersAaction247Test-759872082 tempest-ServersAaction247Test-759872082-project-member] Waiting for the task: (returnval){ [ 1835.548093] env[67899]: value = "session[5298859c-589e-10f3-4f70-bf47a7ea371b]52710ba0-b29e-6de0-3a9f-49afbc057dea" [ 1835.548093] env[67899]: _type = "Task" [ 1835.548093] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1835.556181] env[67899]: DEBUG oslo_vmware.api [None req-027881f9-9f05-4771-b0a4-099e769baf40 tempest-ServersAaction247Test-759872082 tempest-ServersAaction247Test-759872082-project-member] Task: {'id': session[5298859c-589e-10f3-4f70-bf47a7ea371b]52710ba0-b29e-6de0-3a9f-49afbc057dea, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1835.995886] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1835.996151] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager.update_available_resource {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1836.007877] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1836.007877] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1836.007877] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1836.008260] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67899) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1836.009119] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7c7c6f8b-9621-4dc0-9faf-fa0ba2121e99 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1836.017887] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-631eb535-6776-4edd-aae6-ba956f1750d6 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1836.031714] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5e13e990-24f0-401a-9e03-973a924c84ce {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1836.038017] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-09ff8817-db93-48d2-8c5d-c9dc3937fdd3 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1836.067337] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180910MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=67899) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1836.067502] env[67899]: DEBUG 
oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1836.067599] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1836.076888] env[67899]: DEBUG oslo_concurrency.lockutils [None req-027881f9-9f05-4771-b0a4-099e769baf40 tempest-ServersAaction247Test-759872082 tempest-ServersAaction247Test-759872082-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1836.077154] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-027881f9-9f05-4771-b0a4-099e769baf40 tempest-ServersAaction247Test-759872082 tempest-ServersAaction247Test-759872082-project-member] [instance: c4fe8b3e-cee1-401b-a26f-907a8de95eba] Processing image c655a05a-4a40-4b3f-b609-3ba8116ad90f {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1836.077366] env[67899]: DEBUG oslo_concurrency.lockutils [None req-027881f9-9f05-4771-b0a4-099e769baf40 tempest-ServersAaction247Test-759872082 tempest-ServersAaction247Test-759872082-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1836.136862] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 03684169-e2c8-4cf5-8e79-b118725927f1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1836.137063] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 3a077713-f7a2-4a61-bb17-987af6a52c4a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1836.137210] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance e179db1d-ee0c-4f47-a958-40dd69209d26 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1836.137333] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance addcc88a-6bb5-4a70-938e-49c0c79c8414 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1836.137452] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance a6544af8-879d-4c45-bee4-8551b861fc66 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1836.137567] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 9b4a7c14-84dc-4222-a758-3f8f10e23b7a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1836.137679] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance e08f620d-63a0-45cb-99c6-d9d95c938b38 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1836.137792] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 77ac61b9-48cc-4ae8-81e7-273841f7b42d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1836.137902] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance a9ef96da-fcfd-4fb5-bbb1-5178111a8a62 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1836.138023] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance c4fe8b3e-cee1-401b-a26f-907a8de95eba actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1836.138214] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67899) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1836.138351] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67899) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1836.256819] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-644e844d-6df3-4992-b500-8e71a44b6821 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1836.264301] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c7181336-0e78-4548-8b2e-ff6f50263fe4 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1836.295423] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b3e3be8d-2660-47f6-a89e-eb3d26f9d62b {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1836.303304] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-38b0da38-651e-4dd7-a347-479ac06edcbc {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1836.316389] env[67899]: DEBUG nova.compute.provider_tree [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Inventory has not changed in ProviderTree for provider: fffa0b42-f65d-4394-a98c-0df038b9ed4b {{(pid=67899) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1836.324313] env[67899]: DEBUG nova.scheduler.client.report [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Inventory has not changed for provider fffa0b42-f65d-4394-a98c-0df038b9ed4b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67899) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1836.338236] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67899) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1836.338426] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.271s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1840.425596] env[67899]: DEBUG oslo_concurrency.lockutils [None 
req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Acquiring lock "a993c6a9-140f-430d-a77e-98c2567bf7af" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1840.425877] env[67899]: DEBUG oslo_concurrency.lockutils [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Lock "a993c6a9-140f-430d-a77e-98c2567bf7af" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1840.451234] env[67899]: DEBUG oslo_concurrency.lockutils [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Acquiring lock "c17d88cf-69ba-43e9-a672-24503c65e9f2" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1840.451457] env[67899]: DEBUG oslo_concurrency.lockutils [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Lock "c17d88cf-69ba-43e9-a672-24503c65e9f2" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1844.286075] env[67899]: WARNING oslo_vmware.rw_handles [None req-4f3b59ba-f200-4682-b584-0c61035dfcc2 tempest-SecurityGroupsTestJSON-311274219 tempest-SecurityGroupsTestJSON-311274219-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1844.286075] env[67899]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1844.286075] env[67899]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1844.286075] env[67899]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1844.286075] env[67899]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1844.286075] env[67899]: ERROR oslo_vmware.rw_handles response.begin() [ 1844.286075] env[67899]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1844.286075] env[67899]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1844.286075] env[67899]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1844.286075] env[67899]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1844.286075] env[67899]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1844.286075] env[67899]: ERROR oslo_vmware.rw_handles [ 1844.286743] env[67899]: DEBUG nova.virt.vmwareapi.images [None req-4f3b59ba-f200-4682-b584-0c61035dfcc2 tempest-SecurityGroupsTestJSON-311274219 tempest-SecurityGroupsTestJSON-311274219-project-member] 
[instance: 03684169-e2c8-4cf5-8e79-b118725927f1] Downloaded image file data c655a05a-4a40-4b3f-b609-3ba8116ad90f to vmware_temp/275afc77-8e43-45f4-b1cd-31b808e54e50/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk on the data store datastore1 {{(pid=67899) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1844.288364] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-4f3b59ba-f200-4682-b584-0c61035dfcc2 tempest-SecurityGroupsTestJSON-311274219 tempest-SecurityGroupsTestJSON-311274219-project-member] [instance: 03684169-e2c8-4cf5-8e79-b118725927f1] Caching image {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1844.288609] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [None req-4f3b59ba-f200-4682-b584-0c61035dfcc2 tempest-SecurityGroupsTestJSON-311274219 tempest-SecurityGroupsTestJSON-311274219-project-member] Copying Virtual Disk [datastore1] vmware_temp/275afc77-8e43-45f4-b1cd-31b808e54e50/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk to [datastore1] vmware_temp/275afc77-8e43-45f4-b1cd-31b808e54e50/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk {{(pid=67899) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1844.288889] env[67899]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-6ce02a34-746c-4739-abcc-7e01599f8994 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1844.297983] env[67899]: DEBUG oslo_vmware.api [None req-4f3b59ba-f200-4682-b584-0c61035dfcc2 tempest-SecurityGroupsTestJSON-311274219 tempest-SecurityGroupsTestJSON-311274219-project-member] Waiting for the task: (returnval){ [ 1844.297983] env[67899]: value = "task-3468010" [ 1844.297983] env[67899]: _type = "Task" [ 1844.297983] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1844.305423] env[67899]: DEBUG oslo_vmware.api [None req-4f3b59ba-f200-4682-b584-0c61035dfcc2 tempest-SecurityGroupsTestJSON-311274219 tempest-SecurityGroupsTestJSON-311274219-project-member] Task: {'id': task-3468010, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1844.810096] env[67899]: DEBUG oslo_vmware.exceptions [None req-4f3b59ba-f200-4682-b584-0c61035dfcc2 tempest-SecurityGroupsTestJSON-311274219 tempest-SecurityGroupsTestJSON-311274219-project-member] Fault InvalidArgument not matched. 
{{(pid=67899) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1844.810400] env[67899]: DEBUG oslo_concurrency.lockutils [None req-4f3b59ba-f200-4682-b584-0c61035dfcc2 tempest-SecurityGroupsTestJSON-311274219 tempest-SecurityGroupsTestJSON-311274219-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1844.810936] env[67899]: ERROR nova.compute.manager [None req-4f3b59ba-f200-4682-b584-0c61035dfcc2 tempest-SecurityGroupsTestJSON-311274219 tempest-SecurityGroupsTestJSON-311274219-project-member] [instance: 03684169-e2c8-4cf5-8e79-b118725927f1] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1844.810936] env[67899]: Faults: ['InvalidArgument'] [ 1844.810936] env[67899]: ERROR nova.compute.manager [instance: 03684169-e2c8-4cf5-8e79-b118725927f1] Traceback (most recent call last): [ 1844.810936] env[67899]: ERROR nova.compute.manager [instance: 03684169-e2c8-4cf5-8e79-b118725927f1] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1844.810936] env[67899]: ERROR nova.compute.manager [instance: 03684169-e2c8-4cf5-8e79-b118725927f1] yield resources [ 1844.810936] env[67899]: ERROR nova.compute.manager [instance: 03684169-e2c8-4cf5-8e79-b118725927f1] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1844.810936] env[67899]: ERROR nova.compute.manager [instance: 03684169-e2c8-4cf5-8e79-b118725927f1] self.driver.spawn(context, instance, image_meta, [ 1844.810936] env[67899]: ERROR nova.compute.manager [instance: 03684169-e2c8-4cf5-8e79-b118725927f1] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1844.810936] env[67899]: ERROR nova.compute.manager [instance: 03684169-e2c8-4cf5-8e79-b118725927f1] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1844.810936] env[67899]: ERROR nova.compute.manager [instance: 03684169-e2c8-4cf5-8e79-b118725927f1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1844.810936] env[67899]: ERROR nova.compute.manager [instance: 03684169-e2c8-4cf5-8e79-b118725927f1] self._fetch_image_if_missing(context, vi) [ 1844.810936] env[67899]: ERROR nova.compute.manager [instance: 03684169-e2c8-4cf5-8e79-b118725927f1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1844.810936] env[67899]: ERROR nova.compute.manager [instance: 03684169-e2c8-4cf5-8e79-b118725927f1] image_cache(vi, tmp_image_ds_loc) [ 1844.810936] env[67899]: ERROR nova.compute.manager [instance: 03684169-e2c8-4cf5-8e79-b118725927f1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1844.810936] env[67899]: ERROR nova.compute.manager [instance: 03684169-e2c8-4cf5-8e79-b118725927f1] vm_util.copy_virtual_disk( [ 1844.810936] env[67899]: ERROR nova.compute.manager [instance: 03684169-e2c8-4cf5-8e79-b118725927f1] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1844.810936] env[67899]: ERROR nova.compute.manager [instance: 03684169-e2c8-4cf5-8e79-b118725927f1] session._wait_for_task(vmdk_copy_task) [ 1844.810936] env[67899]: ERROR nova.compute.manager [instance: 03684169-e2c8-4cf5-8e79-b118725927f1] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1844.810936] env[67899]: ERROR nova.compute.manager [instance: 03684169-e2c8-4cf5-8e79-b118725927f1] return self.wait_for_task(task_ref) [ 1844.810936] env[67899]: ERROR nova.compute.manager [instance: 03684169-e2c8-4cf5-8e79-b118725927f1] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1844.810936] env[67899]: ERROR nova.compute.manager [instance: 03684169-e2c8-4cf5-8e79-b118725927f1] return evt.wait() [ 1844.810936] env[67899]: ERROR nova.compute.manager [instance: 03684169-e2c8-4cf5-8e79-b118725927f1] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1844.810936] env[67899]: ERROR nova.compute.manager [instance: 03684169-e2c8-4cf5-8e79-b118725927f1] result = hub.switch() [ 1844.810936] env[67899]: ERROR nova.compute.manager [instance: 03684169-e2c8-4cf5-8e79-b118725927f1] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1844.810936] env[67899]: ERROR nova.compute.manager [instance: 03684169-e2c8-4cf5-8e79-b118725927f1] return self.greenlet.switch() [ 1844.810936] env[67899]: ERROR nova.compute.manager [instance: 03684169-e2c8-4cf5-8e79-b118725927f1] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1844.810936] env[67899]: ERROR nova.compute.manager [instance: 03684169-e2c8-4cf5-8e79-b118725927f1] self.f(*self.args, **self.kw) [ 1844.810936] env[67899]: ERROR nova.compute.manager [instance: 03684169-e2c8-4cf5-8e79-b118725927f1] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1844.810936] env[67899]: ERROR nova.compute.manager [instance: 03684169-e2c8-4cf5-8e79-b118725927f1] raise exceptions.translate_fault(task_info.error) [ 1844.810936] env[67899]: ERROR nova.compute.manager [instance: 03684169-e2c8-4cf5-8e79-b118725927f1] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1844.810936] env[67899]: ERROR nova.compute.manager [instance: 03684169-e2c8-4cf5-8e79-b118725927f1] Faults: ['InvalidArgument'] [ 1844.810936] env[67899]: ERROR nova.compute.manager [instance: 03684169-e2c8-4cf5-8e79-b118725927f1] [ 1844.812020] env[67899]: INFO nova.compute.manager [None req-4f3b59ba-f200-4682-b584-0c61035dfcc2 tempest-SecurityGroupsTestJSON-311274219 tempest-SecurityGroupsTestJSON-311274219-project-member] [instance: 03684169-e2c8-4cf5-8e79-b118725927f1] Terminating instance [ 1844.812868] env[67899]: DEBUG oslo_concurrency.lockutils [None req-0ad992bc-0784-48e6-aae6-156834e2439f tempest-ListImageFiltersTestJSON-1482543128 tempest-ListImageFiltersTestJSON-1482543128-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1844.813020] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-0ad992bc-0784-48e6-aae6-156834e2439f tempest-ListImageFiltersTestJSON-1482543128 tempest-ListImageFiltersTestJSON-1482543128-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1844.813606] env[67899]: DEBUG nova.compute.manager [None req-4f3b59ba-f200-4682-b584-0c61035dfcc2 tempest-SecurityGroupsTestJSON-311274219 
tempest-SecurityGroupsTestJSON-311274219-project-member] [instance: 03684169-e2c8-4cf5-8e79-b118725927f1] Start destroying the instance on the hypervisor. {{(pid=67899) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1844.813788] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-4f3b59ba-f200-4682-b584-0c61035dfcc2 tempest-SecurityGroupsTestJSON-311274219 tempest-SecurityGroupsTestJSON-311274219-project-member] [instance: 03684169-e2c8-4cf5-8e79-b118725927f1] Destroying instance {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1844.814019] env[67899]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-a41f0d17-26b8-455f-93fd-8a0d063f9c76 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1844.816204] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-db40b177-092c-4f56-9e9a-831d90f44507 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1844.823014] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-4f3b59ba-f200-4682-b584-0c61035dfcc2 tempest-SecurityGroupsTestJSON-311274219 tempest-SecurityGroupsTestJSON-311274219-project-member] [instance: 03684169-e2c8-4cf5-8e79-b118725927f1] Unregistering the VM {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1844.823232] env[67899]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-97443b28-01b7-49a2-9829-716d9b7a1a3e {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1844.825359] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-0ad992bc-0784-48e6-aae6-156834e2439f tempest-ListImageFiltersTestJSON-1482543128 tempest-ListImageFiltersTestJSON-1482543128-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1844.825528] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-0ad992bc-0784-48e6-aae6-156834e2439f tempest-ListImageFiltersTestJSON-1482543128 tempest-ListImageFiltersTestJSON-1482543128-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67899) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1844.826433] env[67899]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-4179e267-241c-410b-b108-e79d7b456aa7 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1844.830942] env[67899]: DEBUG oslo_vmware.api [None req-0ad992bc-0784-48e6-aae6-156834e2439f tempest-ListImageFiltersTestJSON-1482543128 tempest-ListImageFiltersTestJSON-1482543128-project-member] Waiting for the task: (returnval){ [ 1844.830942] env[67899]: value = "session[5298859c-589e-10f3-4f70-bf47a7ea371b]5234dc4d-a8df-813e-c69c-24e27bacdcf9" [ 1844.830942] env[67899]: _type = "Task" [ 1844.830942] env[67899]: } to complete. 
{{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1844.840665] env[67899]: DEBUG oslo_vmware.api [None req-0ad992bc-0784-48e6-aae6-156834e2439f tempest-ListImageFiltersTestJSON-1482543128 tempest-ListImageFiltersTestJSON-1482543128-project-member] Task: {'id': session[5298859c-589e-10f3-4f70-bf47a7ea371b]5234dc4d-a8df-813e-c69c-24e27bacdcf9, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1844.901733] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-4f3b59ba-f200-4682-b584-0c61035dfcc2 tempest-SecurityGroupsTestJSON-311274219 tempest-SecurityGroupsTestJSON-311274219-project-member] [instance: 03684169-e2c8-4cf5-8e79-b118725927f1] Unregistered the VM {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1844.902011] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-4f3b59ba-f200-4682-b584-0c61035dfcc2 tempest-SecurityGroupsTestJSON-311274219 tempest-SecurityGroupsTestJSON-311274219-project-member] [instance: 03684169-e2c8-4cf5-8e79-b118725927f1] Deleting contents of the VM from datastore datastore1 {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1844.902225] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-4f3b59ba-f200-4682-b584-0c61035dfcc2 tempest-SecurityGroupsTestJSON-311274219 tempest-SecurityGroupsTestJSON-311274219-project-member] Deleting the datastore file [datastore1] 03684169-e2c8-4cf5-8e79-b118725927f1 {{(pid=67899) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1844.902495] env[67899]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-81c59154-aba9-4aa3-b103-d51caf4ad424 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1844.909259] env[67899]: DEBUG oslo_vmware.api [None req-4f3b59ba-f200-4682-b584-0c61035dfcc2 tempest-SecurityGroupsTestJSON-311274219 tempest-SecurityGroupsTestJSON-311274219-project-member] Waiting for the task: (returnval){ [ 1844.909259] env[67899]: value = "task-3468012" [ 1844.909259] env[67899]: _type = "Task" [ 1844.909259] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1844.917015] env[67899]: DEBUG oslo_vmware.api [None req-4f3b59ba-f200-4682-b584-0c61035dfcc2 tempest-SecurityGroupsTestJSON-311274219 tempest-SecurityGroupsTestJSON-311274219-project-member] Task: {'id': task-3468012, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1845.341045] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-0ad992bc-0784-48e6-aae6-156834e2439f tempest-ListImageFiltersTestJSON-1482543128 tempest-ListImageFiltersTestJSON-1482543128-project-member] [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] Preparing fetch location {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1845.341402] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-0ad992bc-0784-48e6-aae6-156834e2439f tempest-ListImageFiltersTestJSON-1482543128 tempest-ListImageFiltersTestJSON-1482543128-project-member] Creating directory with path [datastore1] vmware_temp/778227d4-e20b-4969-9ff1-aeca34da15a7/c655a05a-4a40-4b3f-b609-3ba8116ad90f {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1845.341402] env[67899]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-33d61dd5-8d8f-4f48-87b9-cd14559178b1 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1845.353011] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-0ad992bc-0784-48e6-aae6-156834e2439f tempest-ListImageFiltersTestJSON-1482543128 tempest-ListImageFiltersTestJSON-1482543128-project-member] Created directory with path [datastore1] vmware_temp/778227d4-e20b-4969-9ff1-aeca34da15a7/c655a05a-4a40-4b3f-b609-3ba8116ad90f {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1845.353212] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-0ad992bc-0784-48e6-aae6-156834e2439f tempest-ListImageFiltersTestJSON-1482543128 tempest-ListImageFiltersTestJSON-1482543128-project-member] [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] Fetch image to [datastore1] vmware_temp/778227d4-e20b-4969-9ff1-aeca34da15a7/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1845.353368] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-0ad992bc-0784-48e6-aae6-156834e2439f tempest-ListImageFiltersTestJSON-1482543128 tempest-ListImageFiltersTestJSON-1482543128-project-member] [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] Downloading image file data c655a05a-4a40-4b3f-b609-3ba8116ad90f to [datastore1] vmware_temp/778227d4-e20b-4969-9ff1-aeca34da15a7/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk on the data store datastore1 {{(pid=67899) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1845.354077] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-df049ad7-b142-4011-a630-6fda7da40a89 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1845.360282] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-63b2ad45-47fe-405b-9d42-62879ed525d8 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1845.369201] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b001d579-97bc-4359-9b1d-1cbc128f3ef9 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1845.398804] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-cfa71ffb-d154-4002-ab31-475011561e3e {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1845.404747] env[67899]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-fdbbcd0b-9fcf-4c4a-9fcc-c2797623d2d4 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1845.417105] env[67899]: DEBUG oslo_vmware.api [None req-4f3b59ba-f200-4682-b584-0c61035dfcc2 tempest-SecurityGroupsTestJSON-311274219 tempest-SecurityGroupsTestJSON-311274219-project-member] Task: {'id': task-3468012, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.070238} completed successfully. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1845.417344] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-4f3b59ba-f200-4682-b584-0c61035dfcc2 tempest-SecurityGroupsTestJSON-311274219 tempest-SecurityGroupsTestJSON-311274219-project-member] Deleted the datastore file {{(pid=67899) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1845.417524] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-4f3b59ba-f200-4682-b584-0c61035dfcc2 tempest-SecurityGroupsTestJSON-311274219 tempest-SecurityGroupsTestJSON-311274219-project-member] [instance: 03684169-e2c8-4cf5-8e79-b118725927f1] Deleted contents of the VM from datastore datastore1 {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1845.417690] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-4f3b59ba-f200-4682-b584-0c61035dfcc2 tempest-SecurityGroupsTestJSON-311274219 tempest-SecurityGroupsTestJSON-311274219-project-member] [instance: 03684169-e2c8-4cf5-8e79-b118725927f1] Instance destroyed {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1845.417862] env[67899]: INFO nova.compute.manager [None req-4f3b59ba-f200-4682-b584-0c61035dfcc2 tempest-SecurityGroupsTestJSON-311274219 tempest-SecurityGroupsTestJSON-311274219-project-member] [instance: 03684169-e2c8-4cf5-8e79-b118725927f1] Took 0.60 seconds to destroy the instance on the hypervisor. 
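Note: the task-3468010/task-3468012 records above follow oslo.vmware's poll-until-done pattern: invoking CopyVirtualDisk_Task or DeleteDatastoreFile_Task returns a Task reference immediately, wait_for_task then blocks while _poll_task logs "progress is N%" until a terminal state, and a task that ends in error is translated into an exception, which is exactly how the InvalidArgument fault surfaced as VimFaultException in the traceback. A minimal sketch of that pattern, assuming a hypothetical fetch_task_info callable rather than oslo.vmware's real internals:

    import time

    class TaskFault(Exception):
        """Stand-in for oslo.vmware's fault classes; raised for tasks ending in 'error'."""

    def wait_for_task(task_id, fetch_task_info, poll_interval=0.5):
        # fetch_task_info(task_id) -> {'state': ..., 'progress': ..., 'error': ...}
        while True:
            info = fetch_task_info(task_id)
            if info["state"] == "success":
                return info  # upstream logs 'completed successfully' with duration_secs
            if info["state"] == "error":
                raise TaskFault(info["error"])  # cf. exceptions.translate_fault(task_info.error)
            time.sleep(poll_interval)  # each poll logs "progress is N%" upstream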
[ 1845.419924] env[67899]: DEBUG nova.compute.claims [None req-4f3b59ba-f200-4682-b584-0c61035dfcc2 tempest-SecurityGroupsTestJSON-311274219 tempest-SecurityGroupsTestJSON-311274219-project-member] [instance: 03684169-e2c8-4cf5-8e79-b118725927f1] Aborting claim: {{(pid=67899) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1845.420103] env[67899]: DEBUG oslo_concurrency.lockutils [None req-4f3b59ba-f200-4682-b584-0c61035dfcc2 tempest-SecurityGroupsTestJSON-311274219 tempest-SecurityGroupsTestJSON-311274219-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1845.420371] env[67899]: DEBUG oslo_concurrency.lockutils [None req-4f3b59ba-f200-4682-b584-0c61035dfcc2 tempest-SecurityGroupsTestJSON-311274219 tempest-SecurityGroupsTestJSON-311274219-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1845.429320] env[67899]: DEBUG nova.virt.vmwareapi.images [None req-0ad992bc-0784-48e6-aae6-156834e2439f tempest-ListImageFiltersTestJSON-1482543128 tempest-ListImageFiltersTestJSON-1482543128-project-member] [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] Downloading image file data c655a05a-4a40-4b3f-b609-3ba8116ad90f to the data store datastore1 {{(pid=67899) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1845.482710] env[67899]: DEBUG oslo_vmware.rw_handles [None req-0ad992bc-0784-48e6-aae6-156834e2439f tempest-ListImageFiltersTestJSON-1482543128 tempest-ListImageFiltersTestJSON-1482543128-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/778227d4-e20b-4969-9ff1-aeca34da15a7/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67899) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1845.541608] env[67899]: DEBUG oslo_vmware.rw_handles [None req-0ad992bc-0784-48e6-aae6-156834e2439f tempest-ListImageFiltersTestJSON-1482543128 tempest-ListImageFiltersTestJSON-1482543128-project-member] Completed reading data from the image iterator. {{(pid=67899) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1845.541797] env[67899]: DEBUG oslo_vmware.rw_handles [None req-0ad992bc-0784-48e6-aae6-156834e2439f tempest-ListImageFiltersTestJSON-1482543128 tempest-ListImageFiltersTestJSON-1482543128-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/778227d4-e20b-4969-9ff1-aeca34da15a7/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67899) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1845.666239] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-52e80bb7-6b9c-48a8-a093-b4b884f6dd3c {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1845.673692] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6dfeb8c7-b1f8-4e8b-aeb3-848bfbcb2d80 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1845.702027] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9104e543-b727-48c3-ab95-80b8451fc404 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1845.708506] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1ea37d31-bba3-4080-a804-e04f9ce0dad3 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1845.720901] env[67899]: DEBUG nova.compute.provider_tree [None req-4f3b59ba-f200-4682-b584-0c61035dfcc2 tempest-SecurityGroupsTestJSON-311274219 tempest-SecurityGroupsTestJSON-311274219-project-member] Inventory has not changed in ProviderTree for provider: fffa0b42-f65d-4394-a98c-0df038b9ed4b {{(pid=67899) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1845.729961] env[67899]: DEBUG nova.scheduler.client.report [None req-4f3b59ba-f200-4682-b584-0c61035dfcc2 tempest-SecurityGroupsTestJSON-311274219 tempest-SecurityGroupsTestJSON-311274219-project-member] Inventory has not changed for provider fffa0b42-f65d-4394-a98c-0df038b9ed4b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67899) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1845.743682] env[67899]: DEBUG oslo_concurrency.lockutils [None req-4f3b59ba-f200-4682-b584-0c61035dfcc2 tempest-SecurityGroupsTestJSON-311274219 tempest-SecurityGroupsTestJSON-311274219-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.323s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1845.744190] env[67899]: ERROR nova.compute.manager [None req-4f3b59ba-f200-4682-b584-0c61035dfcc2 tempest-SecurityGroupsTestJSON-311274219 tempest-SecurityGroupsTestJSON-311274219-project-member] [instance: 03684169-e2c8-4cf5-8e79-b118725927f1] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1845.744190] env[67899]: Faults: ['InvalidArgument'] [ 1845.744190] env[67899]: ERROR nova.compute.manager [instance: 03684169-e2c8-4cf5-8e79-b118725927f1] Traceback (most recent call last): [ 1845.744190] env[67899]: ERROR nova.compute.manager [instance: 03684169-e2c8-4cf5-8e79-b118725927f1] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1845.744190] env[67899]: ERROR 
nova.compute.manager [instance: 03684169-e2c8-4cf5-8e79-b118725927f1] self.driver.spawn(context, instance, image_meta, [ 1845.744190] env[67899]: ERROR nova.compute.manager [instance: 03684169-e2c8-4cf5-8e79-b118725927f1] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1845.744190] env[67899]: ERROR nova.compute.manager [instance: 03684169-e2c8-4cf5-8e79-b118725927f1] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1845.744190] env[67899]: ERROR nova.compute.manager [instance: 03684169-e2c8-4cf5-8e79-b118725927f1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1845.744190] env[67899]: ERROR nova.compute.manager [instance: 03684169-e2c8-4cf5-8e79-b118725927f1] self._fetch_image_if_missing(context, vi) [ 1845.744190] env[67899]: ERROR nova.compute.manager [instance: 03684169-e2c8-4cf5-8e79-b118725927f1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1845.744190] env[67899]: ERROR nova.compute.manager [instance: 03684169-e2c8-4cf5-8e79-b118725927f1] image_cache(vi, tmp_image_ds_loc) [ 1845.744190] env[67899]: ERROR nova.compute.manager [instance: 03684169-e2c8-4cf5-8e79-b118725927f1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1845.744190] env[67899]: ERROR nova.compute.manager [instance: 03684169-e2c8-4cf5-8e79-b118725927f1] vm_util.copy_virtual_disk( [ 1845.744190] env[67899]: ERROR nova.compute.manager [instance: 03684169-e2c8-4cf5-8e79-b118725927f1] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1845.744190] env[67899]: ERROR nova.compute.manager [instance: 03684169-e2c8-4cf5-8e79-b118725927f1] session._wait_for_task(vmdk_copy_task) [ 1845.744190] env[67899]: ERROR nova.compute.manager [instance: 03684169-e2c8-4cf5-8e79-b118725927f1] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1845.744190] env[67899]: ERROR nova.compute.manager [instance: 03684169-e2c8-4cf5-8e79-b118725927f1] return self.wait_for_task(task_ref) [ 1845.744190] env[67899]: ERROR nova.compute.manager [instance: 03684169-e2c8-4cf5-8e79-b118725927f1] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1845.744190] env[67899]: ERROR nova.compute.manager [instance: 03684169-e2c8-4cf5-8e79-b118725927f1] return evt.wait() [ 1845.744190] env[67899]: ERROR nova.compute.manager [instance: 03684169-e2c8-4cf5-8e79-b118725927f1] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1845.744190] env[67899]: ERROR nova.compute.manager [instance: 03684169-e2c8-4cf5-8e79-b118725927f1] result = hub.switch() [ 1845.744190] env[67899]: ERROR nova.compute.manager [instance: 03684169-e2c8-4cf5-8e79-b118725927f1] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1845.744190] env[67899]: ERROR nova.compute.manager [instance: 03684169-e2c8-4cf5-8e79-b118725927f1] return self.greenlet.switch() [ 1845.744190] env[67899]: ERROR nova.compute.manager [instance: 03684169-e2c8-4cf5-8e79-b118725927f1] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1845.744190] env[67899]: ERROR nova.compute.manager [instance: 03684169-e2c8-4cf5-8e79-b118725927f1] self.f(*self.args, **self.kw) [ 1845.744190] env[67899]: ERROR nova.compute.manager [instance: 03684169-e2c8-4cf5-8e79-b118725927f1] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1845.744190] env[67899]: ERROR nova.compute.manager [instance: 03684169-e2c8-4cf5-8e79-b118725927f1] raise exceptions.translate_fault(task_info.error) [ 1845.744190] env[67899]: ERROR nova.compute.manager [instance: 03684169-e2c8-4cf5-8e79-b118725927f1] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1845.744190] env[67899]: ERROR nova.compute.manager [instance: 03684169-e2c8-4cf5-8e79-b118725927f1] Faults: ['InvalidArgument'] [ 1845.744190] env[67899]: ERROR nova.compute.manager [instance: 03684169-e2c8-4cf5-8e79-b118725927f1] [ 1845.745034] env[67899]: DEBUG nova.compute.utils [None req-4f3b59ba-f200-4682-b584-0c61035dfcc2 tempest-SecurityGroupsTestJSON-311274219 tempest-SecurityGroupsTestJSON-311274219-project-member] [instance: 03684169-e2c8-4cf5-8e79-b118725927f1] VimFaultException {{(pid=67899) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1845.746502] env[67899]: DEBUG nova.compute.manager [None req-4f3b59ba-f200-4682-b584-0c61035dfcc2 tempest-SecurityGroupsTestJSON-311274219 tempest-SecurityGroupsTestJSON-311274219-project-member] [instance: 03684169-e2c8-4cf5-8e79-b118725927f1] Build of instance 03684169-e2c8-4cf5-8e79-b118725927f1 was re-scheduled: A specified parameter was not correct: fileType [ 1845.746502] env[67899]: Faults: ['InvalidArgument'] {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1845.746875] env[67899]: DEBUG nova.compute.manager [None req-4f3b59ba-f200-4682-b584-0c61035dfcc2 tempest-SecurityGroupsTestJSON-311274219 tempest-SecurityGroupsTestJSON-311274219-project-member] [instance: 03684169-e2c8-4cf5-8e79-b118725927f1] Unplugging VIFs for instance {{(pid=67899) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1845.747053] env[67899]: DEBUG nova.compute.manager [None req-4f3b59ba-f200-4682-b584-0c61035dfcc2 tempest-SecurityGroupsTestJSON-311274219 tempest-SecurityGroupsTestJSON-311274219-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67899) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1845.747231] env[67899]: DEBUG nova.compute.manager [None req-4f3b59ba-f200-4682-b584-0c61035dfcc2 tempest-SecurityGroupsTestJSON-311274219 tempest-SecurityGroupsTestJSON-311274219-project-member] [instance: 03684169-e2c8-4cf5-8e79-b118725927f1] Deallocating network for instance {{(pid=67899) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1845.747390] env[67899]: DEBUG nova.network.neutron [None req-4f3b59ba-f200-4682-b584-0c61035dfcc2 tempest-SecurityGroupsTestJSON-311274219 tempest-SecurityGroupsTestJSON-311274219-project-member] [instance: 03684169-e2c8-4cf5-8e79-b118725927f1] deallocate_for_instance() {{(pid=67899) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1846.056516] env[67899]: DEBUG nova.network.neutron [None req-4f3b59ba-f200-4682-b584-0c61035dfcc2 tempest-SecurityGroupsTestJSON-311274219 tempest-SecurityGroupsTestJSON-311274219-project-member] [instance: 03684169-e2c8-4cf5-8e79-b118725927f1] Updating instance_info_cache with network_info: [] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1846.067881] env[67899]: INFO nova.compute.manager [None req-4f3b59ba-f200-4682-b584-0c61035dfcc2 tempest-SecurityGroupsTestJSON-311274219 tempest-SecurityGroupsTestJSON-311274219-project-member] [instance: 03684169-e2c8-4cf5-8e79-b118725927f1] Took 0.32 seconds to deallocate network for instance. [ 1846.157072] env[67899]: INFO nova.scheduler.client.report [None req-4f3b59ba-f200-4682-b584-0c61035dfcc2 tempest-SecurityGroupsTestJSON-311274219 tempest-SecurityGroupsTestJSON-311274219-project-member] Deleted allocations for instance 03684169-e2c8-4cf5-8e79-b118725927f1 [ 1846.179316] env[67899]: DEBUG oslo_concurrency.lockutils [None req-4f3b59ba-f200-4682-b584-0c61035dfcc2 tempest-SecurityGroupsTestJSON-311274219 tempest-SecurityGroupsTestJSON-311274219-project-member] Lock "03684169-e2c8-4cf5-8e79-b118725927f1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 615.485s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1846.180466] env[67899]: DEBUG oslo_concurrency.lockutils [None req-8bf23f3f-a113-4465-a1a5-030e0b144f69 tempest-SecurityGroupsTestJSON-311274219 tempest-SecurityGroupsTestJSON-311274219-project-member] Lock "03684169-e2c8-4cf5-8e79-b118725927f1" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 418.944s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1846.180685] env[67899]: DEBUG oslo_concurrency.lockutils [None req-8bf23f3f-a113-4465-a1a5-030e0b144f69 tempest-SecurityGroupsTestJSON-311274219 tempest-SecurityGroupsTestJSON-311274219-project-member] Acquiring lock "03684169-e2c8-4cf5-8e79-b118725927f1-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1846.180893] env[67899]: DEBUG oslo_concurrency.lockutils [None req-8bf23f3f-a113-4465-a1a5-030e0b144f69 tempest-SecurityGroupsTestJSON-311274219 tempest-SecurityGroupsTestJSON-311274219-project-member] Lock "03684169-e2c8-4cf5-8e79-b118725927f1-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s
{{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1846.181061] env[67899]: DEBUG oslo_concurrency.lockutils [None req-8bf23f3f-a113-4465-a1a5-030e0b144f69 tempest-SecurityGroupsTestJSON-311274219 tempest-SecurityGroupsTestJSON-311274219-project-member] Lock "03684169-e2c8-4cf5-8e79-b118725927f1-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1846.183061] env[67899]: INFO nova.compute.manager [None req-8bf23f3f-a113-4465-a1a5-030e0b144f69 tempest-SecurityGroupsTestJSON-311274219 tempest-SecurityGroupsTestJSON-311274219-project-member] [instance: 03684169-e2c8-4cf5-8e79-b118725927f1] Terminating instance [ 1846.184727] env[67899]: DEBUG nova.compute.manager [None req-8bf23f3f-a113-4465-a1a5-030e0b144f69 tempest-SecurityGroupsTestJSON-311274219 tempest-SecurityGroupsTestJSON-311274219-project-member] [instance: 03684169-e2c8-4cf5-8e79-b118725927f1] Start destroying the instance on the hypervisor. {{(pid=67899) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1846.184916] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-8bf23f3f-a113-4465-a1a5-030e0b144f69 tempest-SecurityGroupsTestJSON-311274219 tempest-SecurityGroupsTestJSON-311274219-project-member] [instance: 03684169-e2c8-4cf5-8e79-b118725927f1] Destroying instance {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1846.185407] env[67899]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-78688eff-4cce-41f6-8741-c5d7c12e77f6 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1846.196122] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-27e963c8-2952-4094-803d-61bdba0da363 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1846.206959] env[67899]: DEBUG nova.compute.manager [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: a993c6a9-140f-430d-a77e-98c2567bf7af] Starting instance... {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1846.227613] env[67899]: WARNING nova.virt.vmwareapi.vmops [None req-8bf23f3f-a113-4465-a1a5-030e0b144f69 tempest-SecurityGroupsTestJSON-311274219 tempest-SecurityGroupsTestJSON-311274219-project-member] [instance: 03684169-e2c8-4cf5-8e79-b118725927f1] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 03684169-e2c8-4cf5-8e79-b118725927f1 could not be found. [ 1846.227829] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-8bf23f3f-a113-4465-a1a5-030e0b144f69 tempest-SecurityGroupsTestJSON-311274219 tempest-SecurityGroupsTestJSON-311274219-project-member] [instance: 03684169-e2c8-4cf5-8e79-b118725927f1] Instance destroyed {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1846.228015] env[67899]: INFO nova.compute.manager [None req-8bf23f3f-a113-4465-a1a5-030e0b144f69 tempest-SecurityGroupsTestJSON-311274219 tempest-SecurityGroupsTestJSON-311274219-project-member] [instance: 03684169-e2c8-4cf5-8e79-b118725927f1] Took 0.04 seconds to destroy the instance on the hypervisor.
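Note: the Acquiring/acquired/"released" triplets throughout (lockutils.py:404/409/423) are emitted by oslo.concurrency's synchronized wrapper; serializing terminate, claim and event handling per instance or per resource pool is what keeps the concurrent tempest builds above from interleaving. A minimal sketch of the primitive, using the real oslo_concurrency API but illustrative lock names:

    from oslo_concurrency import lockutils

    # Decorator form: the generated wrapper ("inner" in the records above)
    # logs each acquire and release around the call.
    @lockutils.synchronized('compute_resources')
    def update_resources():
        pass  # claims, aborts and the periodic resource audit all serialize here

    # Context-manager form, e.g. one lock per instance-events dictionary:
    with lockutils.lock('03684169-e2c8-4cf5-8e79-b118725927f1-events'):
        pass  # mutate the per-instance event list while holding the lock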
[ 1846.228269] env[67899]: DEBUG oslo.service.loopingcall [None req-8bf23f3f-a113-4465-a1a5-030e0b144f69 tempest-SecurityGroupsTestJSON-311274219 tempest-SecurityGroupsTestJSON-311274219-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=67899) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1846.228509] env[67899]: DEBUG nova.compute.manager [-] [instance: 03684169-e2c8-4cf5-8e79-b118725927f1] Deallocating network for instance {{(pid=67899) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1846.228606] env[67899]: DEBUG nova.network.neutron [-] [instance: 03684169-e2c8-4cf5-8e79-b118725927f1] deallocate_for_instance() {{(pid=67899) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1846.252411] env[67899]: DEBUG nova.network.neutron [-] [instance: 03684169-e2c8-4cf5-8e79-b118725927f1] Updating instance_info_cache with network_info: [] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1846.259094] env[67899]: DEBUG oslo_concurrency.lockutils [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1846.259343] env[67899]: DEBUG oslo_concurrency.lockutils [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1846.260749] env[67899]: INFO nova.compute.claims [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: a993c6a9-140f-430d-a77e-98c2567bf7af] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1846.263824] env[67899]: INFO nova.compute.manager [-] [instance: 03684169-e2c8-4cf5-8e79-b118725927f1] Took 0.04 seconds to deallocate network for instance. [ 1846.348813] env[67899]: DEBUG oslo_concurrency.lockutils [None req-8bf23f3f-a113-4465-a1a5-030e0b144f69 tempest-SecurityGroupsTestJSON-311274219 tempest-SecurityGroupsTestJSON-311274219-project-member] Lock "03684169-e2c8-4cf5-8e79-b118725927f1" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.168s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1846.349692] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "03684169-e2c8-4cf5-8e79-b118725927f1" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 70.339s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1846.350970] env[67899]: INFO nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 03684169-e2c8-4cf5-8e79-b118725927f1] During sync_power_state the instance has a pending task (deleting). Skip.
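Note: the "Claim successful on node ..." record and the earlier "Aborting claim" record bracket the resource-claim lifecycle: instance_claim reserves vcpus/ram/disk under the "compute_resources" lock before spawn, and abort_instance_claim returns them when the build fails, after which the placement allocations are deleted. A simplified stand-in (not nova.compute.claims.Claim itself, and with a hypothetical tracker interface) showing the shape of that contract:

    class Claim:
        """Reserve resources up front; give them back if the build fails."""

        def __init__(self, tracker, instance):
            self.tracker = tracker
            self.instance = instance
            tracker.claim(instance)  # hypothetical call; raises if the host lacks room

        def __enter__(self):
            return self

        def __exit__(self, exc_type, exc, tb):
            if exc_type is not None:
                self.tracker.abort(self.instance)  # cf. "Aborting claim" above
            return False  # propagate the failure so the build can be re-scheduled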
[ 1846.351189] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "03684169-e2c8-4cf5-8e79-b118725927f1" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.002s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1846.458552] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f5d8f279-962e-4633-9fe9-46177299e544 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1846.467767] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1d805a65-10f3-47ef-833e-38dda4001bca {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1846.500235] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8b86c403-093e-44e9-ad90-2bce8ec7664d {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1846.507894] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9fa5b65f-9f3c-4495-af61-01c14090d5a7 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1846.521735] env[67899]: DEBUG nova.compute.provider_tree [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Inventory has not changed in ProviderTree for provider: fffa0b42-f65d-4394-a98c-0df038b9ed4b {{(pid=67899) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1846.530720] env[67899]: DEBUG nova.scheduler.client.report [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Inventory has not changed for provider fffa0b42-f65d-4394-a98c-0df038b9ed4b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67899) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1846.544710] env[67899]: DEBUG oslo_concurrency.lockutils [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.285s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1846.545259] env[67899]: DEBUG nova.compute.manager [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: a993c6a9-140f-430d-a77e-98c2567bf7af] Start building networks asynchronously for instance.
{{(pid=67899) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1846.586727] env[67899]: DEBUG nova.compute.utils [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Using /dev/sd instead of None {{(pid=67899) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1846.589055] env[67899]: DEBUG nova.compute.manager [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: a993c6a9-140f-430d-a77e-98c2567bf7af] Allocating IP information in the background. {{(pid=67899) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1846.589292] env[67899]: DEBUG nova.network.neutron [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: a993c6a9-140f-430d-a77e-98c2567bf7af] allocate_for_instance() {{(pid=67899) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1846.600465] env[67899]: DEBUG nova.compute.manager [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: a993c6a9-140f-430d-a77e-98c2567bf7af] Start building block device mappings for instance. {{(pid=67899) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1846.677190] env[67899]: DEBUG nova.compute.manager [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: a993c6a9-140f-430d-a77e-98c2567bf7af] Start spawning the instance on the hypervisor. 
{{(pid=67899) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1846.680735] env[67899]: DEBUG nova.policy [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '27cd4ea8990b48be8c1f2455a264a858', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '288109a7b3bf4e3a9628184485e4679b', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67899) authorize /opt/stack/nova/nova/policy.py:203}} [ 1846.709792] env[67899]: DEBUG nova.virt.hardware [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T22:07:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T22:07:13Z,direct_url=,disk_format='vmdk',id=c655a05a-4a40-4b3f-b609-3ba8116ad90f,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c1aaa2970e964d7b86557399120d12c1',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T22:07:14Z,virtual_size=,visibility=), allow threads: False {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1846.710077] env[67899]: DEBUG nova.virt.hardware [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Flavor limits 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1846.710242] env[67899]: DEBUG nova.virt.hardware [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Image limits 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1846.710425] env[67899]: DEBUG nova.virt.hardware [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Flavor pref 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1846.710572] env[67899]: DEBUG nova.virt.hardware [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Image pref 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1846.710719] env[67899]: DEBUG nova.virt.hardware [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1846.710929] env[67899]: DEBUG 
nova.virt.hardware [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1846.711220] env[67899]: DEBUG nova.virt.hardware [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67899) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1846.711422] env[67899]: DEBUG nova.virt.hardware [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Got 1 possible topologies {{(pid=67899) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1846.711590] env[67899]: DEBUG nova.virt.hardware [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1846.711764] env[67899]: DEBUG nova.virt.hardware [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1846.712661] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b08c37c3-6405-4de6-a0c8-3e03144f5214 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1846.721356] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3f36bc1a-a0a0-4506-9ab6-c13a27d6f163 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1847.400204] env[67899]: DEBUG nova.network.neutron [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: a993c6a9-140f-430d-a77e-98c2567bf7af] Successfully created port: 990c95d6-44c0-454a-a5f4-b43c40c53dfd {{(pid=67899) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1848.171733] env[67899]: DEBUG nova.compute.manager [req-0dcbb892-f0c6-4d56-b8c0-4b5c54782a11 req-9b264903-e598-4d07-82cd-56bf640b9056 service nova] [instance: a993c6a9-140f-430d-a77e-98c2567bf7af] Received event network-vif-plugged-990c95d6-44c0-454a-a5f4-b43c40c53dfd {{(pid=67899) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1848.171994] env[67899]: DEBUG oslo_concurrency.lockutils [req-0dcbb892-f0c6-4d56-b8c0-4b5c54782a11 req-9b264903-e598-4d07-82cd-56bf640b9056 service nova] Acquiring lock "a993c6a9-140f-430d-a77e-98c2567bf7af-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1848.172223] env[67899]: DEBUG oslo_concurrency.lockutils [req-0dcbb892-f0c6-4d56-b8c0-4b5c54782a11
req-9b264903-e598-4d07-82cd-56bf640b9056 service nova] Lock "a993c6a9-140f-430d-a77e-98c2567bf7af-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1848.172393] env[67899]: DEBUG oslo_concurrency.lockutils [req-0dcbb892-f0c6-4d56-b8c0-4b5c54782a11 req-9b264903-e598-4d07-82cd-56bf640b9056 service nova] Lock "a993c6a9-140f-430d-a77e-98c2567bf7af-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1848.172562] env[67899]: DEBUG nova.compute.manager [req-0dcbb892-f0c6-4d56-b8c0-4b5c54782a11 req-9b264903-e598-4d07-82cd-56bf640b9056 service nova] [instance: a993c6a9-140f-430d-a77e-98c2567bf7af] No waiting events found dispatching network-vif-plugged-990c95d6-44c0-454a-a5f4-b43c40c53dfd {{(pid=67899) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1848.172781] env[67899]: WARNING nova.compute.manager [req-0dcbb892-f0c6-4d56-b8c0-4b5c54782a11 req-9b264903-e598-4d07-82cd-56bf640b9056 service nova] [instance: a993c6a9-140f-430d-a77e-98c2567bf7af] Received unexpected event network-vif-plugged-990c95d6-44c0-454a-a5f4-b43c40c53dfd for instance with vm_state building and task_state spawning. [ 1848.261291] env[67899]: DEBUG nova.network.neutron [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: a993c6a9-140f-430d-a77e-98c2567bf7af] Successfully updated port: 990c95d6-44c0-454a-a5f4-b43c40c53dfd {{(pid=67899) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1848.276293] env[67899]: DEBUG oslo_concurrency.lockutils [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Acquiring lock "refresh_cache-a993c6a9-140f-430d-a77e-98c2567bf7af" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1848.276293] env[67899]: DEBUG oslo_concurrency.lockutils [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Acquired lock "refresh_cache-a993c6a9-140f-430d-a77e-98c2567bf7af" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1848.276452] env[67899]: DEBUG nova.network.neutron [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: a993c6a9-140f-430d-a77e-98c2567bf7af] Building network info cache for instance {{(pid=67899) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1848.334794] env[67899]: DEBUG nova.network.neutron [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: a993c6a9-140f-430d-a77e-98c2567bf7af] Instance cache missing network info.
{{(pid=67899) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1848.503552] env[67899]: DEBUG nova.network.neutron [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: a993c6a9-140f-430d-a77e-98c2567bf7af] Updating instance_info_cache with network_info: [{"id": "990c95d6-44c0-454a-a5f4-b43c40c53dfd", "address": "fa:16:3e:66:af:90", "network": {"id": "26d51439-fee0-42d9-ac79-0e886ae3cf6e", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1360921386-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "288109a7b3bf4e3a9628184485e4679b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ac4015e0-e5e7-4b3f-8d8e-ef4501eea9aa", "external-id": "nsx-vlan-transportzone-132", "segmentation_id": 132, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap990c95d6-44", "ovs_interfaceid": "990c95d6-44c0-454a-a5f4-b43c40c53dfd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1848.514711] env[67899]: DEBUG oslo_concurrency.lockutils [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Releasing lock "refresh_cache-a993c6a9-140f-430d-a77e-98c2567bf7af" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1848.515067] env[67899]: DEBUG nova.compute.manager [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: a993c6a9-140f-430d-a77e-98c2567bf7af] Instance network_info: |[{"id": "990c95d6-44c0-454a-a5f4-b43c40c53dfd", "address": "fa:16:3e:66:af:90", "network": {"id": "26d51439-fee0-42d9-ac79-0e886ae3cf6e", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1360921386-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "288109a7b3bf4e3a9628184485e4679b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ac4015e0-e5e7-4b3f-8d8e-ef4501eea9aa", "external-id": "nsx-vlan-transportzone-132", "segmentation_id": 132, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap990c95d6-44", "ovs_interfaceid": "990c95d6-44c0-454a-a5f4-b43c40c53dfd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67899) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1971}} [ 1848.515419] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: a993c6a9-140f-430d-a77e-98c2567bf7af] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:66:af:90', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'ac4015e0-e5e7-4b3f-8d8e-ef4501eea9aa', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '990c95d6-44c0-454a-a5f4-b43c40c53dfd', 'vif_model': 'vmxnet3'}] {{(pid=67899) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1848.522919] env[67899]: DEBUG oslo.service.loopingcall [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67899) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1848.523416] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: a993c6a9-140f-430d-a77e-98c2567bf7af] Creating VM on the ESX host {{(pid=67899) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1848.523656] env[67899]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-5635fc9c-f8b8-4e43-a516-8920a490be38 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1848.544953] env[67899]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1848.544953] env[67899]: value = "task-3468013" [ 1848.544953] env[67899]: _type = "Task" [ 1848.544953] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1848.553419] env[67899]: DEBUG oslo_vmware.api [-] Task: {'id': task-3468013, 'name': CreateVM_Task} progress is 0%. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1849.055665] env[67899]: DEBUG oslo_vmware.api [-] Task: {'id': task-3468013, 'name': CreateVM_Task, 'duration_secs': 0.2872} completed successfully. 
{{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1849.055850] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: a993c6a9-140f-430d-a77e-98c2567bf7af] Created VM on the ESX host {{(pid=67899) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1849.056526] env[67899]: DEBUG oslo_concurrency.lockutils [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1849.056689] env[67899]: DEBUG oslo_concurrency.lockutils [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1849.057010] env[67899]: DEBUG oslo_concurrency.lockutils [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1849.057256] env[67899]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-cd230b47-e757-4210-8e9c-cc0ca9c98027 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1849.061804] env[67899]: DEBUG oslo_vmware.api [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Waiting for the task: (returnval){ [ 1849.061804] env[67899]: value = "session[5298859c-589e-10f3-4f70-bf47a7ea371b]52527cc1-9f46-b570-91dc-dac350e6e02c" [ 1849.061804] env[67899]: _type = "Task" [ 1849.061804] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1849.069447] env[67899]: DEBUG oslo_vmware.api [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Task: {'id': session[5298859c-589e-10f3-4f70-bf47a7ea371b]52527cc1-9f46-b570-91dc-dac350e6e02c, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1849.572721] env[67899]: DEBUG oslo_concurrency.lockutils [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1849.573164] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: a993c6a9-140f-430d-a77e-98c2567bf7af] Processing image c655a05a-4a40-4b3f-b609-3ba8116ad90f {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1849.573164] env[67899]: DEBUG oslo_concurrency.lockutils [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1850.204345] env[67899]: DEBUG nova.compute.manager [req-d674778b-a7c9-4fcc-bfd9-c4452a66e11c req-f2b62100-2642-4632-8d84-7a425bc6f4c0 service nova] [instance: a993c6a9-140f-430d-a77e-98c2567bf7af] Received event network-changed-990c95d6-44c0-454a-a5f4-b43c40c53dfd {{(pid=67899) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1850.204560] env[67899]: DEBUG nova.compute.manager [req-d674778b-a7c9-4fcc-bfd9-c4452a66e11c req-f2b62100-2642-4632-8d84-7a425bc6f4c0 service nova] [instance: a993c6a9-140f-430d-a77e-98c2567bf7af] Refreshing instance network info cache due to event network-changed-990c95d6-44c0-454a-a5f4-b43c40c53dfd. {{(pid=67899) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1850.204779] env[67899]: DEBUG oslo_concurrency.lockutils [req-d674778b-a7c9-4fcc-bfd9-c4452a66e11c req-f2b62100-2642-4632-8d84-7a425bc6f4c0 service nova] Acquiring lock "refresh_cache-a993c6a9-140f-430d-a77e-98c2567bf7af" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1850.204919] env[67899]: DEBUG oslo_concurrency.lockutils [req-d674778b-a7c9-4fcc-bfd9-c4452a66e11c req-f2b62100-2642-4632-8d84-7a425bc6f4c0 service nova] Acquired lock "refresh_cache-a993c6a9-140f-430d-a77e-98c2567bf7af" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1850.205278] env[67899]: DEBUG nova.network.neutron [req-d674778b-a7c9-4fcc-bfd9-c4452a66e11c req-f2b62100-2642-4632-8d84-7a425bc6f4c0 service nova] [instance: a993c6a9-140f-430d-a77e-98c2567bf7af] Refreshing network info cache for port 990c95d6-44c0-454a-a5f4-b43c40c53dfd {{(pid=67899) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1850.496665] env[67899]: DEBUG nova.network.neutron [req-d674778b-a7c9-4fcc-bfd9-c4452a66e11c req-f2b62100-2642-4632-8d84-7a425bc6f4c0 service nova] [instance: a993c6a9-140f-430d-a77e-98c2567bf7af] Updated VIF entry in instance network info cache for port 990c95d6-44c0-454a-a5f4-b43c40c53dfd. 
{{(pid=67899) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1850.496914] env[67899]: DEBUG nova.network.neutron [req-d674778b-a7c9-4fcc-bfd9-c4452a66e11c req-f2b62100-2642-4632-8d84-7a425bc6f4c0 service nova] [instance: a993c6a9-140f-430d-a77e-98c2567bf7af] Updating instance_info_cache with network_info: [{"id": "990c95d6-44c0-454a-a5f4-b43c40c53dfd", "address": "fa:16:3e:66:af:90", "network": {"id": "26d51439-fee0-42d9-ac79-0e886ae3cf6e", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1360921386-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "288109a7b3bf4e3a9628184485e4679b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ac4015e0-e5e7-4b3f-8d8e-ef4501eea9aa", "external-id": "nsx-vlan-transportzone-132", "segmentation_id": 132, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap990c95d6-44", "ovs_interfaceid": "990c95d6-44c0-454a-a5f4-b43c40c53dfd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1850.510736] env[67899]: DEBUG oslo_concurrency.lockutils [req-d674778b-a7c9-4fcc-bfd9-c4452a66e11c req-f2b62100-2642-4632-8d84-7a425bc6f4c0 service nova] Releasing lock "refresh_cache-a993c6a9-140f-430d-a77e-98c2567bf7af" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1852.833045] env[67899]: DEBUG oslo_concurrency.lockutils [None req-2895de4c-3d88-407d-8d74-0788e4b279f8 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] Acquiring lock "483824d1-4994-436a-ba16-12524684405c" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1852.833378] env[67899]: DEBUG oslo_concurrency.lockutils [None req-2895de4c-3d88-407d-8d74-0788e4b279f8 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] Lock "483824d1-4994-436a-ba16-12524684405c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1876.897301] env[67899]: DEBUG oslo_concurrency.lockutils [None req-dfc548e0-a8e9-4135-a435-e07a569b6f5c tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] Acquiring lock "77ac61b9-48cc-4ae8-81e7-273841f7b42d" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1888.335146] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
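The paired lockutils records above ('Acquiring lock ... by "..."', 'Lock ... acquired ... :: waited 0.000s', 'Lock ... "released" ... :: held 0.000s') come from oslo.concurrency's lock wrapper, which times how long the caller waited for the lock and how long it held it. The following is a minimal sketch of that acquire/wait/hold bookkeeping, simplified to a plain in-process threading.Lock (the real lockutils also supports external file locks and semaphores); logged_lock and the "hypothetical.caller" owner string are illustrative names, not the library's API:

import threading
import time
from contextlib import contextmanager

# One lock object per lock name, reused across calls.
_locks: dict = {}


@contextmanager
def logged_lock(name, owner):
    lock = _locks.setdefault(name, threading.Lock())
    print(f'Acquiring lock "{name}" by "{owner}"')
    waited_from = time.monotonic()
    lock.acquire()
    print(f'Lock "{name}" acquired by "{owner}" :: '
          f'waited {time.monotonic() - waited_from:.3f}s')
    held_from = time.monotonic()
    try:
        yield
    finally:
        lock.release()
        # The quoted "released" mirrors the actual message format in this log.
        print(f'Lock "{name}" "released" by "{owner}" :: '
              f'held {time.monotonic() - held_from:.3f}s')


# Usage shaped like the refresh_cache records above (lock name taken from the log):
with logged_lock("refresh_cache-a993c6a9-140f-430d-a77e-98c2567bf7af",
                 "hypothetical.caller"):
    pass  # rebuild the instance network info cache while holding the lock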
[ 1891.544460] env[67899]: WARNING oslo_vmware.rw_handles [None req-0ad992bc-0784-48e6-aae6-156834e2439f tempest-ListImageFiltersTestJSON-1482543128 tempest-ListImageFiltersTestJSON-1482543128-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1891.544460] env[67899]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1891.544460] env[67899]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1891.544460] env[67899]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1891.544460] env[67899]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1891.544460] env[67899]: ERROR oslo_vmware.rw_handles response.begin() [ 1891.544460] env[67899]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1891.544460] env[67899]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1891.544460] env[67899]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1891.544460] env[67899]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1891.544460] env[67899]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1891.544460] env[67899]: ERROR oslo_vmware.rw_handles [ 1891.545222] env[67899]: DEBUG nova.virt.vmwareapi.images [None req-0ad992bc-0784-48e6-aae6-156834e2439f tempest-ListImageFiltersTestJSON-1482543128 tempest-ListImageFiltersTestJSON-1482543128-project-member] [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] Downloaded image file data c655a05a-4a40-4b3f-b609-3ba8116ad90f to vmware_temp/778227d4-e20b-4969-9ff1-aeca34da15a7/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk on the data store datastore1 {{(pid=67899) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1891.547463] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-0ad992bc-0784-48e6-aae6-156834e2439f tempest-ListImageFiltersTestJSON-1482543128 tempest-ListImageFiltersTestJSON-1482543128-project-member] [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] Caching image {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1891.547732] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [None req-0ad992bc-0784-48e6-aae6-156834e2439f tempest-ListImageFiltersTestJSON-1482543128 tempest-ListImageFiltersTestJSON-1482543128-project-member] Copying Virtual Disk [datastore1] vmware_temp/778227d4-e20b-4969-9ff1-aeca34da15a7/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk to [datastore1] vmware_temp/778227d4-e20b-4969-9ff1-aeca34da15a7/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk {{(pid=67899) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1891.548086] env[67899]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-62737fcb-7433-48e2-a856-68c41f9ca14b {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1891.556254] env[67899]: DEBUG oslo_vmware.api [None req-0ad992bc-0784-48e6-aae6-156834e2439f tempest-ListImageFiltersTestJSON-1482543128
tempest-ListImageFiltersTestJSON-1482543128-project-member] Waiting for the task: (returnval){ [ 1891.556254] env[67899]: value = "task-3468014" [ 1891.556254] env[67899]: _type = "Task" [ 1891.556254] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1891.564311] env[67899]: DEBUG oslo_vmware.api [None req-0ad992bc-0784-48e6-aae6-156834e2439f tempest-ListImageFiltersTestJSON-1482543128 tempest-ListImageFiltersTestJSON-1482543128-project-member] Task: {'id': task-3468014, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1891.996476] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1892.066554] env[67899]: DEBUG oslo_vmware.exceptions [None req-0ad992bc-0784-48e6-aae6-156834e2439f tempest-ListImageFiltersTestJSON-1482543128 tempest-ListImageFiltersTestJSON-1482543128-project-member] Fault InvalidArgument not matched. {{(pid=67899) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1892.066829] env[67899]: DEBUG oslo_concurrency.lockutils [None req-0ad992bc-0784-48e6-aae6-156834e2439f tempest-ListImageFiltersTestJSON-1482543128 tempest-ListImageFiltersTestJSON-1482543128-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1892.067428] env[67899]: ERROR nova.compute.manager [None req-0ad992bc-0784-48e6-aae6-156834e2439f tempest-ListImageFiltersTestJSON-1482543128 tempest-ListImageFiltersTestJSON-1482543128-project-member] [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1892.067428] env[67899]: Faults: ['InvalidArgument'] [ 1892.067428] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] Traceback (most recent call last): [ 1892.067428] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1892.067428] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] yield resources [ 1892.067428] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1892.067428] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] self.driver.spawn(context, instance, image_meta, [ 1892.067428] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1892.067428] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1892.067428] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1892.067428] env[67899]: 
ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] self._fetch_image_if_missing(context, vi) [ 1892.067428] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1892.067428] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] image_cache(vi, tmp_image_ds_loc) [ 1892.067428] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1892.067428] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] vm_util.copy_virtual_disk( [ 1892.067428] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1892.067428] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] session._wait_for_task(vmdk_copy_task) [ 1892.067428] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1892.067428] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] return self.wait_for_task(task_ref) [ 1892.067428] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1892.067428] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] return evt.wait() [ 1892.067428] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1892.067428] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] result = hub.switch() [ 1892.067428] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1892.067428] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] return self.greenlet.switch() [ 1892.067428] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1892.067428] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] self.f(*self.args, **self.kw) [ 1892.067428] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1892.067428] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] raise exceptions.translate_fault(task_info.error) [ 1892.067428] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1892.067428] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] Faults: ['InvalidArgument'] [ 1892.067428] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] [ 1892.068793] 
env[67899]: INFO nova.compute.manager [None req-0ad992bc-0784-48e6-aae6-156834e2439f tempest-ListImageFiltersTestJSON-1482543128 tempest-ListImageFiltersTestJSON-1482543128-project-member] [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] Terminating instance [ 1892.069273] env[67899]: DEBUG oslo_concurrency.lockutils [None req-457d9d77-97fa-4b41-8f02-5a00e92d0ccc tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1892.069491] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-457d9d77-97fa-4b41-8f02-5a00e92d0ccc tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1892.069732] env[67899]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-f773b90d-cb5a-45cb-b47a-3d4d2d69c964 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1892.071925] env[67899]: DEBUG nova.compute.manager [None req-0ad992bc-0784-48e6-aae6-156834e2439f tempest-ListImageFiltersTestJSON-1482543128 tempest-ListImageFiltersTestJSON-1482543128-project-member] [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] Start destroying the instance on the hypervisor. {{(pid=67899) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1892.072131] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-0ad992bc-0784-48e6-aae6-156834e2439f tempest-ListImageFiltersTestJSON-1482543128 tempest-ListImageFiltersTestJSON-1482543128-project-member] [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] Destroying instance {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1892.072861] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-698de963-fb20-4590-975b-3822d3c2af4d {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1892.080342] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-0ad992bc-0784-48e6-aae6-156834e2439f tempest-ListImageFiltersTestJSON-1482543128 tempest-ListImageFiltersTestJSON-1482543128-project-member] [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] Unregistering the VM {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1892.080647] env[67899]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-86448d4a-8167-444f-aa76-56f749f437e9 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1892.082742] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-457d9d77-97fa-4b41-8f02-5a00e92d0ccc tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1892.082914] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-457d9d77-97fa-4b41-8f02-5a00e92d0ccc tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=67899) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1892.083901] env[67899]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-b06d2135-91a7-41b8-82b9-cedf59c08690 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1892.088919] env[67899]: DEBUG oslo_vmware.api [None req-457d9d77-97fa-4b41-8f02-5a00e92d0ccc tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Waiting for the task: (returnval){ [ 1892.088919] env[67899]: value = "session[5298859c-589e-10f3-4f70-bf47a7ea371b]52bbe448-ed22-25f4-4fd0-48ab50f9cd2a" [ 1892.088919] env[67899]: _type = "Task" [ 1892.088919] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1892.096227] env[67899]: DEBUG oslo_vmware.api [None req-457d9d77-97fa-4b41-8f02-5a00e92d0ccc tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Task: {'id': session[5298859c-589e-10f3-4f70-bf47a7ea371b]52bbe448-ed22-25f4-4fd0-48ab50f9cd2a, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1892.150234] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-0ad992bc-0784-48e6-aae6-156834e2439f tempest-ListImageFiltersTestJSON-1482543128 tempest-ListImageFiltersTestJSON-1482543128-project-member] [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] Unregistered the VM {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1892.150461] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-0ad992bc-0784-48e6-aae6-156834e2439f tempest-ListImageFiltersTestJSON-1482543128 tempest-ListImageFiltersTestJSON-1482543128-project-member] [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] Deleting contents of the VM from datastore datastore1 {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1892.150630] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-0ad992bc-0784-48e6-aae6-156834e2439f tempest-ListImageFiltersTestJSON-1482543128 tempest-ListImageFiltersTestJSON-1482543128-project-member] Deleting the datastore file [datastore1] 3a077713-f7a2-4a61-bb17-987af6a52c4a {{(pid=67899) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1892.150890] env[67899]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-c50cb31e-8620-4bb6-bbe6-c5ab23c57055 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1892.157304] env[67899]: DEBUG oslo_vmware.api [None req-0ad992bc-0784-48e6-aae6-156834e2439f tempest-ListImageFiltersTestJSON-1482543128 tempest-ListImageFiltersTestJSON-1482543128-project-member] Waiting for the task: (returnval){ [ 1892.157304] env[67899]: value = "task-3468016" [ 1892.157304] env[67899]: _type = "Task" [ 1892.157304] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1892.164928] env[67899]: DEBUG oslo_vmware.api [None req-0ad992bc-0784-48e6-aae6-156834e2439f tempest-ListImageFiltersTestJSON-1482543128 tempest-ListImageFiltersTestJSON-1482543128-project-member] Task: {'id': task-3468016, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1892.599307] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-457d9d77-97fa-4b41-8f02-5a00e92d0ccc tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: e179db1d-ee0c-4f47-a958-40dd69209d26] Preparing fetch location {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1892.599646] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-457d9d77-97fa-4b41-8f02-5a00e92d0ccc tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Creating directory with path [datastore1] vmware_temp/92d67c57-a947-4c1e-ac5d-194b4b822faf/c655a05a-4a40-4b3f-b609-3ba8116ad90f {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1892.599862] env[67899]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-dfffc374-0df3-4809-8ab0-73170d3411f3 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1892.610724] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-457d9d77-97fa-4b41-8f02-5a00e92d0ccc tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Created directory with path [datastore1] vmware_temp/92d67c57-a947-4c1e-ac5d-194b4b822faf/c655a05a-4a40-4b3f-b609-3ba8116ad90f {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1892.610901] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-457d9d77-97fa-4b41-8f02-5a00e92d0ccc tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: e179db1d-ee0c-4f47-a958-40dd69209d26] Fetch image to [datastore1] vmware_temp/92d67c57-a947-4c1e-ac5d-194b4b822faf/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1892.611083] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-457d9d77-97fa-4b41-8f02-5a00e92d0ccc tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: e179db1d-ee0c-4f47-a958-40dd69209d26] Downloading image file data c655a05a-4a40-4b3f-b609-3ba8116ad90f to [datastore1] vmware_temp/92d67c57-a947-4c1e-ac5d-194b4b822faf/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk on the data store datastore1 {{(pid=67899) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1892.611761] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c9602713-11ee-4e21-b6ce-ff5b57e8604e {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1892.618073] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-458aef85-cf35-4995-84ac-538bbb17162b {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1892.626711] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-486d48b5-b5ff-421a-8d84-b12f45c30d92 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1892.657311] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8f33b34d-aa8c-4743-8f1a-ed8878a428f3 {{(pid=67899) 
request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1892.669396] env[67899]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-9810a797-3bad-4e12-897a-88d43216ec42 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1892.670966] env[67899]: DEBUG oslo_vmware.api [None req-0ad992bc-0784-48e6-aae6-156834e2439f tempest-ListImageFiltersTestJSON-1482543128 tempest-ListImageFiltersTestJSON-1482543128-project-member] Task: {'id': task-3468016, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.079415} completed successfully. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1892.671205] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-0ad992bc-0784-48e6-aae6-156834e2439f tempest-ListImageFiltersTestJSON-1482543128 tempest-ListImageFiltersTestJSON-1482543128-project-member] Deleted the datastore file {{(pid=67899) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1892.671383] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-0ad992bc-0784-48e6-aae6-156834e2439f tempest-ListImageFiltersTestJSON-1482543128 tempest-ListImageFiltersTestJSON-1482543128-project-member] [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] Deleted contents of the VM from datastore datastore1 {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1892.671548] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-0ad992bc-0784-48e6-aae6-156834e2439f tempest-ListImageFiltersTestJSON-1482543128 tempest-ListImageFiltersTestJSON-1482543128-project-member] [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] Instance destroyed {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1892.671717] env[67899]: INFO nova.compute.manager [None req-0ad992bc-0784-48e6-aae6-156834e2439f tempest-ListImageFiltersTestJSON-1482543128 tempest-ListImageFiltersTestJSON-1482543128-project-member] [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] Took 0.60 seconds to destroy the instance on the hypervisor. 
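The recurring 'Waiting for the task: (returnval){ value = "task-..." }', 'progress is 0%.', and 'completed successfully' records above (CreateVM_Task, CopyVirtualDisk_Task, DeleteDatastoreFile_Task) come from oslo.vmware's wait_for_task, which drives a polling loop over the vCenter task until it reaches a terminal state and raises a translated fault on error; that is the path that surfaces the VimFaultException in the tracebacks in this section. Below is a minimal sketch of that polling pattern only; get_task_info and TaskFault are hypothetical stand-ins for the real session API and oslo_vmware.exceptions.VimFaultException, not oslo.vmware's implementation:

import time


class TaskFault(Exception):
    """Hypothetical stand-in for oslo_vmware.exceptions.VimFaultException."""


def wait_for_task(get_task_info, task_id, interval=0.5, timeout=60.0):
    # Poll the task until success or error, producing output shaped like the
    # "progress is N%." DEBUG records in this log.
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        info = get_task_info(task_id)  # assumed: .state/.name/.progress/.error
        if info.state == "success":
            return info  # e.g. CreateVM_Task "completed successfully"
        if info.state == "error":
            # Mirrors raise exceptions.translate_fault(task_info.error)
            # in the tracebacks above.
            raise TaskFault(info.error)
        print(f"Task: {{'id': {task_id!r}, 'name': {info.name!r}}} "
              f"progress is {info.progress}%.")
        time.sleep(interval)
    raise TimeoutError(f"task {task_id} still running after {timeout}s")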
[ 1892.673686] env[67899]: DEBUG nova.compute.claims [None req-0ad992bc-0784-48e6-aae6-156834e2439f tempest-ListImageFiltersTestJSON-1482543128 tempest-ListImageFiltersTestJSON-1482543128-project-member] [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] Aborting claim: {{(pid=67899) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1892.673862] env[67899]: DEBUG oslo_concurrency.lockutils [None req-0ad992bc-0784-48e6-aae6-156834e2439f tempest-ListImageFiltersTestJSON-1482543128 tempest-ListImageFiltersTestJSON-1482543128-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1892.674084] env[67899]: DEBUG oslo_concurrency.lockutils [None req-0ad992bc-0784-48e6-aae6-156834e2439f tempest-ListImageFiltersTestJSON-1482543128 tempest-ListImageFiltersTestJSON-1482543128-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1892.693602] env[67899]: DEBUG nova.virt.vmwareapi.images [None req-457d9d77-97fa-4b41-8f02-5a00e92d0ccc tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: e179db1d-ee0c-4f47-a958-40dd69209d26] Downloading image file data c655a05a-4a40-4b3f-b609-3ba8116ad90f to the data store datastore1 {{(pid=67899) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1892.744695] env[67899]: DEBUG oslo_vmware.rw_handles [None req-457d9d77-97fa-4b41-8f02-5a00e92d0ccc tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/92d67c57-a947-4c1e-ac5d-194b4b822faf/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67899) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1892.803129] env[67899]: DEBUG oslo_vmware.rw_handles [None req-457d9d77-97fa-4b41-8f02-5a00e92d0ccc tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Completed reading data from the image iterator. {{(pid=67899) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1892.803289] env[67899]: DEBUG oslo_vmware.rw_handles [None req-457d9d77-97fa-4b41-8f02-5a00e92d0ccc tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/92d67c57-a947-4c1e-ac5d-194b4b822faf/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67899) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1892.916614] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f41ba5a3-cb21-4c0f-80e9-8d21a077bab5 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1892.924120] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0e4fcee5-311a-48a1-8c22-7d9281d0a988 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1892.952918] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a38f842e-6870-4717-8c95-4fb81bbd4c9f {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1892.959769] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-85568943-262e-43e2-b681-7ed6246a6d6c {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1892.973514] env[67899]: DEBUG nova.compute.provider_tree [None req-0ad992bc-0784-48e6-aae6-156834e2439f tempest-ListImageFiltersTestJSON-1482543128 tempest-ListImageFiltersTestJSON-1482543128-project-member] Inventory has not changed in ProviderTree for provider: fffa0b42-f65d-4394-a98c-0df038b9ed4b {{(pid=67899) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1892.981813] env[67899]: DEBUG nova.scheduler.client.report [None req-0ad992bc-0784-48e6-aae6-156834e2439f tempest-ListImageFiltersTestJSON-1482543128 tempest-ListImageFiltersTestJSON-1482543128-project-member] Inventory has not changed for provider fffa0b42-f65d-4394-a98c-0df038b9ed4b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67899) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1892.995433] env[67899]: DEBUG oslo_concurrency.lockutils [None req-0ad992bc-0784-48e6-aae6-156834e2439f tempest-ListImageFiltersTestJSON-1482543128 tempest-ListImageFiltersTestJSON-1482543128-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.321s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1892.995933] env[67899]: ERROR nova.compute.manager [None req-0ad992bc-0784-48e6-aae6-156834e2439f tempest-ListImageFiltersTestJSON-1482543128 tempest-ListImageFiltersTestJSON-1482543128-project-member] [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1892.995933] env[67899]: Faults: ['InvalidArgument'] [ 1892.995933] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] Traceback (most recent call last): [ 1892.995933] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1892.995933] 
env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] self.driver.spawn(context, instance, image_meta, [ 1892.995933] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1892.995933] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1892.995933] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1892.995933] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] self._fetch_image_if_missing(context, vi) [ 1892.995933] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1892.995933] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] image_cache(vi, tmp_image_ds_loc) [ 1892.995933] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1892.995933] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] vm_util.copy_virtual_disk( [ 1892.995933] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1892.995933] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] session._wait_for_task(vmdk_copy_task) [ 1892.995933] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1892.995933] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] return self.wait_for_task(task_ref) [ 1892.995933] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1892.995933] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] return evt.wait() [ 1892.995933] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1892.995933] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] result = hub.switch() [ 1892.995933] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1892.995933] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] return self.greenlet.switch() [ 1892.995933] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1892.995933] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] self.f(*self.args, **self.kw) [ 1892.995933] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1892.995933] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] raise exceptions.translate_fault(task_info.error) [ 1892.995933] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1892.995933] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] Faults: ['InvalidArgument'] [ 1892.995933] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] [ 1892.996878] env[67899]: DEBUG nova.compute.utils [None req-0ad992bc-0784-48e6-aae6-156834e2439f tempest-ListImageFiltersTestJSON-1482543128 tempest-ListImageFiltersTestJSON-1482543128-project-member] [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] VimFaultException {{(pid=67899) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1892.997621] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1892.997777] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Starting heal instance info cache {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1892.997894] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Rebuilding the list of instances to heal {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1893.000047] env[67899]: DEBUG nova.compute.manager [None req-0ad992bc-0784-48e6-aae6-156834e2439f tempest-ListImageFiltersTestJSON-1482543128 tempest-ListImageFiltersTestJSON-1482543128-project-member] [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] Build of instance 3a077713-f7a2-4a61-bb17-987af6a52c4a was re-scheduled: A specified parameter was not correct: fileType [ 1893.000047] env[67899]: Faults: ['InvalidArgument'] {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1893.000047] env[67899]: DEBUG nova.compute.manager [None req-0ad992bc-0784-48e6-aae6-156834e2439f tempest-ListImageFiltersTestJSON-1482543128 tempest-ListImageFiltersTestJSON-1482543128-project-member] [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] Unplugging VIFs for instance {{(pid=67899) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1893.000047] env[67899]: DEBUG nova.compute.manager [None req-0ad992bc-0784-48e6-aae6-156834e2439f tempest-ListImageFiltersTestJSON-1482543128 tempest-ListImageFiltersTestJSON-1482543128-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67899) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1893.000047] env[67899]: DEBUG nova.compute.manager [None req-0ad992bc-0784-48e6-aae6-156834e2439f tempest-ListImageFiltersTestJSON-1482543128 tempest-ListImageFiltersTestJSON-1482543128-project-member] [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] Deallocating network for instance {{(pid=67899) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1893.000330] env[67899]: DEBUG nova.network.neutron [None req-0ad992bc-0784-48e6-aae6-156834e2439f tempest-ListImageFiltersTestJSON-1482543128 tempest-ListImageFiltersTestJSON-1482543128-project-member] [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] deallocate_for_instance() {{(pid=67899) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1893.015610] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: e179db1d-ee0c-4f47-a958-40dd69209d26] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1893.015765] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: addcc88a-6bb5-4a70-938e-49c0c79c8414] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1893.015927] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: a6544af8-879d-4c45-bee4-8551b861fc66] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1893.016025] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 9b4a7c14-84dc-4222-a758-3f8f10e23b7a] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1893.016148] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: e08f620d-63a0-45cb-99c6-d9d95c938b38] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1893.016267] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 77ac61b9-48cc-4ae8-81e7-273841f7b42d] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1893.016384] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: a9ef96da-fcfd-4fb5-bbb1-5178111a8a62] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1893.016549] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: c4fe8b3e-cee1-401b-a26f-907a8de95eba] Skipping network cache update for instance because it is Building. 
{{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1893.016663] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: a993c6a9-140f-430d-a77e-98c2567bf7af] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1893.016724] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Didn't find any instances for network info cache update. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1893.107632] env[67899]: DEBUG neutronclient.v2_0.client [None req-0ad992bc-0784-48e6-aae6-156834e2439f tempest-ListImageFiltersTestJSON-1482543128 tempest-ListImageFiltersTestJSON-1482543128-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=67899) _handle_fault_response /opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py:262}} [ 1893.108689] env[67899]: ERROR nova.compute.manager [None req-0ad992bc-0784-48e6-aae6-156834e2439f tempest-ListImageFiltersTestJSON-1482543128 tempest-ListImageFiltersTestJSON-1482543128-project-member] [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized. [ 1893.108689] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] Traceback (most recent call last): [ 1893.108689] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1893.108689] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] self.driver.spawn(context, instance, image_meta, [ 1893.108689] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1893.108689] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1893.108689] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1893.108689] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] self._fetch_image_if_missing(context, vi) [ 1893.108689] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1893.108689] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] image_cache(vi, tmp_image_ds_loc) [ 1893.108689] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1893.108689] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] vm_util.copy_virtual_disk( [ 1893.108689] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1893.108689] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] 
session._wait_for_task(vmdk_copy_task) [ 1893.108689] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1893.108689] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] return self.wait_for_task(task_ref) [ 1893.108689] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1893.108689] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] return evt.wait() [ 1893.108689] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1893.108689] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] result = hub.switch() [ 1893.108689] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1893.108689] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] return self.greenlet.switch() [ 1893.108689] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1893.108689] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] self.f(*self.args, **self.kw) [ 1893.108689] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1893.108689] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] raise exceptions.translate_fault(task_info.error) [ 1893.108689] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1893.108689] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] Faults: ['InvalidArgument'] [ 1893.108689] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] [ 1893.108689] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] During handling of the above exception, another exception occurred: [ 1893.108689] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] [ 1893.108689] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] Traceback (most recent call last): [ 1893.108689] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] File "/opt/stack/nova/nova/compute/manager.py", line 2430, in _do_build_and_run_instance [ 1893.108689] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] self._build_and_run_instance(context, instance, image, [ 1893.108689] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] File "/opt/stack/nova/nova/compute/manager.py", line 2722, in _build_and_run_instance [ 1893.108689] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] raise 
exception.RescheduledException( [ 1893.108689] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] nova.exception.RescheduledException: Build of instance 3a077713-f7a2-4a61-bb17-987af6a52c4a was re-scheduled: A specified parameter was not correct: fileType [ 1893.108689] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] Faults: ['InvalidArgument'] [ 1893.108689] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] [ 1893.109927] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] During handling of the above exception, another exception occurred: [ 1893.109927] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] [ 1893.109927] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] Traceback (most recent call last): [ 1893.109927] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1893.109927] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] ret = obj(*args, **kwargs) [ 1893.109927] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1893.109927] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] exception_handler_v20(status_code, error_body) [ 1893.109927] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1893.109927] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] raise client_exc(message=error_message, [ 1893.109927] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1893.109927] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] Neutron server returns request_ids: ['req-7e92b81b-fc18-4c8f-a4a5-91dc84fce2a9'] [ 1893.109927] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] [ 1893.109927] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] During handling of the above exception, another exception occurred: [ 1893.109927] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] [ 1893.109927] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] Traceback (most recent call last): [ 1893.109927] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] File "/opt/stack/nova/nova/compute/manager.py", line 3019, in _cleanup_allocated_networks [ 1893.109927] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] self._deallocate_network(context, instance, requested_networks) [ 1893.109927] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] File "/opt/stack/nova/nova/compute/manager.py", line 2265, in _deallocate_network [ 1893.109927] env[67899]: ERROR nova.compute.manager 
[instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] self.network_api.deallocate_for_instance( [ 1893.109927] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1893.109927] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] data = neutron.list_ports(**search_opts) [ 1893.109927] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1893.109927] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] ret = obj(*args, **kwargs) [ 1893.109927] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1893.109927] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] return self.list('ports', self.ports_path, retrieve_all, [ 1893.109927] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1893.109927] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] ret = obj(*args, **kwargs) [ 1893.109927] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1893.109927] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] for r in self._pagination(collection, path, **params): [ 1893.109927] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1893.109927] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] res = self.get(path, params=params) [ 1893.109927] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1893.109927] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] ret = obj(*args, **kwargs) [ 1893.109927] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1893.109927] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] return self.retry_request("GET", action, body=body, [ 1893.109927] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1893.109927] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] ret = obj(*args, **kwargs) [ 1893.109927] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1893.109927] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] return self.do_request(method, action, body=body, [ 1893.111179] env[67899]: ERROR nova.compute.manager [instance: 
3a077713-f7a2-4a61-bb17-987af6a52c4a] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1893.111179] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] ret = obj(*args, **kwargs) [ 1893.111179] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1893.111179] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] self._handle_fault_response(status_code, replybody, resp) [ 1893.111179] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper [ 1893.111179] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] raise exception.Unauthorized() [ 1893.111179] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] nova.exception.Unauthorized: Not authorized. [ 1893.111179] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] [ 1893.158075] env[67899]: INFO nova.scheduler.client.report [None req-0ad992bc-0784-48e6-aae6-156834e2439f tempest-ListImageFiltersTestJSON-1482543128 tempest-ListImageFiltersTestJSON-1482543128-project-member] Deleted allocations for instance 3a077713-f7a2-4a61-bb17-987af6a52c4a [ 1893.174351] env[67899]: DEBUG oslo_concurrency.lockutils [None req-0ad992bc-0784-48e6-aae6-156834e2439f tempest-ListImageFiltersTestJSON-1482543128 tempest-ListImageFiltersTestJSON-1482543128-project-member] Lock "3a077713-f7a2-4a61-bb17-987af6a52c4a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 633.226s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1893.175482] env[67899]: DEBUG oslo_concurrency.lockutils [None req-cae3bf71-8cf1-4c24-b78a-0aa3efce6f6a tempest-ListImageFiltersTestJSON-1482543128 tempest-ListImageFiltersTestJSON-1482543128-project-member] Lock "3a077713-f7a2-4a61-bb17-987af6a52c4a" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 437.655s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1893.175702] env[67899]: DEBUG oslo_concurrency.lockutils [None req-cae3bf71-8cf1-4c24-b78a-0aa3efce6f6a tempest-ListImageFiltersTestJSON-1482543128 tempest-ListImageFiltersTestJSON-1482543128-project-member] Acquiring lock "3a077713-f7a2-4a61-bb17-987af6a52c4a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1893.175901] env[67899]: DEBUG oslo_concurrency.lockutils [None req-cae3bf71-8cf1-4c24-b78a-0aa3efce6f6a tempest-ListImageFiltersTestJSON-1482543128 tempest-ListImageFiltersTestJSON-1482543128-project-member] Lock "3a077713-f7a2-4a61-bb17-987af6a52c4a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1893.176075] env[67899]: DEBUG oslo_concurrency.lockutils [None req-cae3bf71-8cf1-4c24-b78a-0aa3efce6f6a tempest-ListImageFiltersTestJSON-1482543128
tempest-ListImageFiltersTestJSON-1482543128-project-member] Lock "3a077713-f7a2-4a61-bb17-987af6a52c4a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1893.177899] env[67899]: INFO nova.compute.manager [None req-cae3bf71-8cf1-4c24-b78a-0aa3efce6f6a tempest-ListImageFiltersTestJSON-1482543128 tempest-ListImageFiltersTestJSON-1482543128-project-member] [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] Terminating instance [ 1893.179443] env[67899]: DEBUG nova.compute.manager [None req-cae3bf71-8cf1-4c24-b78a-0aa3efce6f6a tempest-ListImageFiltersTestJSON-1482543128 tempest-ListImageFiltersTestJSON-1482543128-project-member] [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] Start destroying the instance on the hypervisor. {{(pid=67899) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1893.179633] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-cae3bf71-8cf1-4c24-b78a-0aa3efce6f6a tempest-ListImageFiltersTestJSON-1482543128 tempest-ListImageFiltersTestJSON-1482543128-project-member] [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] Destroying instance {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1893.180112] env[67899]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-f557fd66-ffe9-4163-99f3-d0033967d096 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1893.189764] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-656d5bec-76de-43e3-899d-533933300c4e {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1893.201090] env[67899]: DEBUG nova.compute.manager [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: c17d88cf-69ba-43e9-a672-24503c65e9f2] Starting instance... {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1893.221604] env[67899]: WARNING nova.virt.vmwareapi.vmops [None req-cae3bf71-8cf1-4c24-b78a-0aa3efce6f6a tempest-ListImageFiltersTestJSON-1482543128 tempest-ListImageFiltersTestJSON-1482543128-project-member] [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 3a077713-f7a2-4a61-bb17-987af6a52c4a could not be found. [ 1893.221852] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-cae3bf71-8cf1-4c24-b78a-0aa3efce6f6a tempest-ListImageFiltersTestJSON-1482543128 tempest-ListImageFiltersTestJSON-1482543128-project-member] [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] Instance destroyed {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1893.221984] env[67899]: INFO nova.compute.manager [None req-cae3bf71-8cf1-4c24-b78a-0aa3efce6f6a tempest-ListImageFiltersTestJSON-1482543128 tempest-ListImageFiltersTestJSON-1482543128-project-member] [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] Took 0.04 seconds to destroy the instance on the hypervisor.
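Editor's note on the traceback above: the repeated "wrapper" frames at nova/network/neutron.py line 196 are a decorator that translates neutronclient errors into Nova exceptions, raising nova.exception.Unauthorized when the user's token is rejected (line 204 above) and, as seen further down in this log, NeutronAdminCredentialConfigurationInvalid when Nova's own admin credentials are rejected. A minimal sketch of that translation pattern follows; the is_admin flag is a hypothetical stand-in for however the real client wrapper tracks which credentials it was built with, so this is illustrative, not Nova's actual code.

    import functools

    from neutronclient.common import exceptions as neutron_client_exc

    from nova import exception


    def translate_neutron_unauthorized(func, is_admin=False):
        """Re-raise neutronclient 401s as the matching Nova exception."""
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            try:
                return func(*args, **kwargs)
            except neutron_client_exc.Unauthorized:
                if is_admin:
                    # Nova's admin credentials from nova.conf were rejected:
                    # this is an operator configuration problem.
                    raise exception.NeutronAdminCredentialConfigurationInvalid()
                # The end user's own token was rejected.
                raise exception.Unauthorized()
        return wrapper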
[ 1893.222253] env[67899]: DEBUG oslo.service.loopingcall [None req-cae3bf71-8cf1-4c24-b78a-0aa3efce6f6a tempest-ListImageFiltersTestJSON-1482543128 tempest-ListImageFiltersTestJSON-1482543128-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=67899) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1893.222494] env[67899]: DEBUG nova.compute.manager [-] [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] Deallocating network for instance {{(pid=67899) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1893.222592] env[67899]: DEBUG nova.network.neutron [-] [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] deallocate_for_instance() {{(pid=67899) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1893.250093] env[67899]: DEBUG oslo_concurrency.lockutils [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1893.250364] env[67899]: DEBUG oslo_concurrency.lockutils [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1893.251856] env[67899]: INFO nova.compute.claims [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: c17d88cf-69ba-43e9-a672-24503c65e9f2] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1893.333566] env[67899]: DEBUG neutronclient.v2_0.client [-] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=67899) _handle_fault_response /opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py:262}} [ 1893.333772] env[67899]: ERROR nova.network.neutron [-] Neutron client was not able to generate a valid admin token, please verify Neutron admin credential located in nova.conf: neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1893.334224] env[67899]: ERROR oslo.service.loopingcall [-] Dynamic interval looping call 'oslo_service.loopingcall.RetryDecorator.__call__.<locals>._func' failed: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception.
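Editor's note: the "please verify Neutron admin credential located in nova.conf" error above points at the [neutron] section of the compute node's nova.conf. As a rough illustration only, with placeholder values and the standard keystoneauth option names Nova registers for that section, the credentials it is complaining about look like this; a 401 here usually means the password, username, or domain/project scoping no longer matches what Keystone expects.

    [neutron]
    auth_type = password
    auth_url = http://controller:5000/v3
    username = nova
    password = SERVICE_PASSWORD
    project_name = service
    user_domain_name = Default
    project_domain_name = Default
    region_name = RegionOne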
[ 1893.334224] env[67899]: ERROR oslo.service.loopingcall Traceback (most recent call last): [ 1893.334224] env[67899]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1893.334224] env[67899]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1893.334224] env[67899]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1893.334224] env[67899]: ERROR oslo.service.loopingcall exception_handler_v20(status_code, error_body) [ 1893.334224] env[67899]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1893.334224] env[67899]: ERROR oslo.service.loopingcall raise client_exc(message=error_message, [ 1893.334224] env[67899]: ERROR oslo.service.loopingcall neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1893.334224] env[67899]: ERROR oslo.service.loopingcall Neutron server returns request_ids: ['req-6d1dbb54-5d80-43f1-be1b-6db0abdf8dea'] [ 1893.334224] env[67899]: ERROR oslo.service.loopingcall [ 1893.334224] env[67899]: ERROR oslo.service.loopingcall During handling of the above exception, another exception occurred: [ 1893.334224] env[67899]: ERROR oslo.service.loopingcall [ 1893.334224] env[67899]: ERROR oslo.service.loopingcall Traceback (most recent call last): [ 1893.334224] env[67899]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 1893.334224] env[67899]: ERROR oslo.service.loopingcall result = func(*self.args, **self.kw) [ 1893.334224] env[67899]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 1893.334224] env[67899]: ERROR oslo.service.loopingcall result = f(*args, **kwargs) [ 1893.334224] env[67899]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/compute/manager.py", line 3045, in _deallocate_network_with_retries [ 1893.334224] env[67899]: ERROR oslo.service.loopingcall self._deallocate_network( [ 1893.334224] env[67899]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/compute/manager.py", line 2265, in _deallocate_network [ 1893.334224] env[67899]: ERROR oslo.service.loopingcall self.network_api.deallocate_for_instance( [ 1893.334224] env[67899]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1893.334224] env[67899]: ERROR oslo.service.loopingcall data = neutron.list_ports(**search_opts) [ 1893.334224] env[67899]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1893.334224] env[67899]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1893.334224] env[67899]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1893.334224] env[67899]: ERROR oslo.service.loopingcall return self.list('ports', self.ports_path, retrieve_all, [ 1893.334224] env[67899]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1893.334224] env[67899]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1893.334224] env[67899]: ERROR 
oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1893.334224] env[67899]: ERROR oslo.service.loopingcall for r in self._pagination(collection, path, **params): [ 1893.334224] env[67899]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1893.334224] env[67899]: ERROR oslo.service.loopingcall res = self.get(path, params=params) [ 1893.334224] env[67899]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1893.334224] env[67899]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1893.334224] env[67899]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1893.334224] env[67899]: ERROR oslo.service.loopingcall return self.retry_request("GET", action, body=body, [ 1893.334224] env[67899]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1893.334224] env[67899]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1893.334224] env[67899]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1893.334224] env[67899]: ERROR oslo.service.loopingcall return self.do_request(method, action, body=body, [ 1893.334224] env[67899]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1893.334224] env[67899]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1893.334224] env[67899]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1893.334224] env[67899]: ERROR oslo.service.loopingcall self._handle_fault_response(status_code, replybody, resp) [ 1893.334224] env[67899]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 1893.334224] env[67899]: ERROR oslo.service.loopingcall raise exception.NeutronAdminCredentialConfigurationInvalid() [ 1893.334224] env[67899]: ERROR oslo.service.loopingcall nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1893.334224] env[67899]: ERROR oslo.service.loopingcall [ 1893.335682] env[67899]: ERROR nova.compute.manager [None req-cae3bf71-8cf1-4c24-b78a-0aa3efce6f6a tempest-ListImageFiltersTestJSON-1482543128 tempest-ListImageFiltersTestJSON-1482543128-project-member] [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] Failed to deallocate network for instance. Error: Networking client is experiencing an unauthorized exception.: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1893.366931] env[67899]: ERROR nova.compute.manager [None req-cae3bf71-8cf1-4c24-b78a-0aa3efce6f6a tempest-ListImageFiltersTestJSON-1482543128 tempest-ListImageFiltersTestJSON-1482543128-project-member] [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] Setting instance vm_state to ERROR: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
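Editor's note: the failed "dynamic interval looping call" above is oslo_service.loopingcall.RetryDecorator at work. The decorated deallocation function is retried with an increasing back-off when it raises one of the configured exception types, while anything outside that tuple (like the NeutronAdminCredentialConfigurationInvalid here) propagates on the first attempt, which is why the log shows a single failure rather than retries. A hedged sketch with illustrative parameters, not Nova's actual configuration:

    from oslo_service import loopingcall

    # Retry only ConnectionError, up to 3 times, with an increasing sleep
    # capped at 30 seconds; any other exception escapes immediately.
    @loopingcall.RetryDecorator(max_retry_count=3, inc_sleep_time=2,
                                max_sleep_time=30,
                                exceptions=(ConnectionError,))
    def _deallocate_network_with_retries():
        # placeholder body; the real method calls back into
        # ComputeManager._deallocate_network()
        pass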
[ 1893.366931] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] Traceback (most recent call last): [ 1893.366931] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1893.366931] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] ret = obj(*args, **kwargs) [ 1893.366931] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1893.366931] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] exception_handler_v20(status_code, error_body) [ 1893.366931] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1893.366931] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] raise client_exc(message=error_message, [ 1893.366931] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1893.366931] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] Neutron server returns request_ids: ['req-6d1dbb54-5d80-43f1-be1b-6db0abdf8dea'] [ 1893.366931] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] [ 1893.366931] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] During handling of the above exception, another exception occurred: [ 1893.366931] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] [ 1893.366931] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] Traceback (most recent call last): [ 1893.366931] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] File "/opt/stack/nova/nova/compute/manager.py", line 3315, in do_terminate_instance [ 1893.366931] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] self._delete_instance(context, instance, bdms) [ 1893.366931] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] File "/opt/stack/nova/nova/compute/manager.py", line 3250, in _delete_instance [ 1893.366931] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] self._shutdown_instance(context, instance, bdms) [ 1893.366931] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] File "/opt/stack/nova/nova/compute/manager.py", line 3144, in _shutdown_instance [ 1893.366931] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] self._try_deallocate_network(context, instance, requested_networks) [ 1893.366931] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] File "/opt/stack/nova/nova/compute/manager.py", line 3058, in _try_deallocate_network [ 1893.366931] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] with excutils.save_and_reraise_exception(): [ 1893.366931] env[67899]: ERROR 
nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1893.366931] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] self.force_reraise() [ 1893.366931] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1893.366931] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] raise self.value [ 1893.366931] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] File "/opt/stack/nova/nova/compute/manager.py", line 3056, in _try_deallocate_network [ 1893.366931] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] _deallocate_network_with_retries() [ 1893.366931] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 436, in func [ 1893.366931] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] return evt.wait() [ 1893.366931] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1893.366931] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] result = hub.switch() [ 1893.366931] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1893.366931] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] return self.greenlet.switch() [ 1893.366931] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 1893.366931] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] result = func(*self.args, **self.kw) [ 1893.366931] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 1893.368247] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] result = f(*args, **kwargs) [ 1893.368247] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] File "/opt/stack/nova/nova/compute/manager.py", line 3045, in _deallocate_network_with_retries [ 1893.368247] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] self._deallocate_network( [ 1893.368247] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] File "/opt/stack/nova/nova/compute/manager.py", line 2265, in _deallocate_network [ 1893.368247] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] self.network_api.deallocate_for_instance( [ 1893.368247] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1893.368247] env[67899]: ERROR nova.compute.manager [instance: 
3a077713-f7a2-4a61-bb17-987af6a52c4a] data = neutron.list_ports(**search_opts) [ 1893.368247] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1893.368247] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] ret = obj(*args, **kwargs) [ 1893.368247] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1893.368247] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] return self.list('ports', self.ports_path, retrieve_all, [ 1893.368247] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1893.368247] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] ret = obj(*args, **kwargs) [ 1893.368247] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1893.368247] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] for r in self._pagination(collection, path, **params): [ 1893.368247] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1893.368247] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] res = self.get(path, params=params) [ 1893.368247] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1893.368247] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] ret = obj(*args, **kwargs) [ 1893.368247] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1893.368247] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] return self.retry_request("GET", action, body=body, [ 1893.368247] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1893.368247] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] ret = obj(*args, **kwargs) [ 1893.368247] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1893.368247] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] return self.do_request(method, action, body=body, [ 1893.368247] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1893.368247] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] ret = obj(*args, **kwargs) [ 1893.368247] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1893.368247] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] self._handle_fault_response(status_code, replybody, resp) [ 1893.368247] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 1893.368247] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] raise exception.NeutronAdminCredentialConfigurationInvalid() [ 1893.368247] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1893.368247] env[67899]: ERROR nova.compute.manager [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] [ 1893.396927] env[67899]: DEBUG oslo_concurrency.lockutils [None req-cae3bf71-8cf1-4c24-b78a-0aa3efce6f6a tempest-ListImageFiltersTestJSON-1482543128 tempest-ListImageFiltersTestJSON-1482543128-project-member] Lock "3a077713-f7a2-4a61-bb17-987af6a52c4a" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.221s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1893.398343] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "3a077713-f7a2-4a61-bb17-987af6a52c4a" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 117.388s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1893.398539] env[67899]: INFO nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] During sync_power_state the instance has a pending task (deleting). Skip. [ 1893.398711] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "3a077713-f7a2-4a61-bb17-987af6a52c4a" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1893.436639] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0045be62-f7af-459d-8af3-a01a4490edeb {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1893.444174] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c6c02b27-ed71-44f4-b735-4c26b81c7838 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1893.449557] env[67899]: INFO nova.compute.manager [None req-cae3bf71-8cf1-4c24-b78a-0aa3efce6f6a tempest-ListImageFiltersTestJSON-1482543128 tempest-ListImageFiltersTestJSON-1482543128-project-member] [instance: 3a077713-f7a2-4a61-bb17-987af6a52c4a] Successfully reverted task state from None on failure for instance. 
[ 1893.453284] env[67899]: ERROR oslo_messaging.rpc.server [None req-cae3bf71-8cf1-4c24-b78a-0aa3efce6f6a tempest-ListImageFiltersTestJSON-1482543128 tempest-ListImageFiltersTestJSON-1482543128-project-member] Exception during message handling: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1893.453284] env[67899]: ERROR oslo_messaging.rpc.server Traceback (most recent call last): [ 1893.453284] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1893.453284] env[67899]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1893.453284] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1893.453284] env[67899]: ERROR oslo_messaging.rpc.server exception_handler_v20(status_code, error_body) [ 1893.453284] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1893.453284] env[67899]: ERROR oslo_messaging.rpc.server raise client_exc(message=error_message, [ 1893.453284] env[67899]: ERROR oslo_messaging.rpc.server neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1893.453284] env[67899]: ERROR oslo_messaging.rpc.server Neutron server returns request_ids: ['req-6d1dbb54-5d80-43f1-be1b-6db0abdf8dea'] [ 1893.453284] env[67899]: ERROR oslo_messaging.rpc.server [ 1893.453284] env[67899]: ERROR oslo_messaging.rpc.server During handling of the above exception, another exception occurred: [ 1893.453284] env[67899]: ERROR oslo_messaging.rpc.server [ 1893.453284] env[67899]: ERROR oslo_messaging.rpc.server Traceback (most recent call last): [ 1893.453284] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/server.py", line 165, in _process_incoming [ 1893.453284] env[67899]: ERROR oslo_messaging.rpc.server res = self.dispatcher.dispatch(message) [ 1893.453284] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/dispatcher.py", line 309, in dispatch [ 1893.453284] env[67899]: ERROR oslo_messaging.rpc.server return self._do_dispatch(endpoint, method, ctxt, args) [ 1893.453284] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/dispatcher.py", line 229, in _do_dispatch [ 1893.453284] env[67899]: ERROR oslo_messaging.rpc.server result = func(ctxt, **new_args) [ 1893.453284] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/exception_wrapper.py", line 65, in wrapped [ 1893.453284] env[67899]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1893.453284] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1893.453284] env[67899]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1893.453284] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1893.453284] env[67899]: ERROR oslo_messaging.rpc.server raise self.value [ 1893.453284] env[67899]: ERROR oslo_messaging.rpc.server File 
"/opt/stack/nova/nova/exception_wrapper.py", line 63, in wrapped [ 1893.453284] env[67899]: ERROR oslo_messaging.rpc.server return f(self, context, *args, **kw) [ 1893.453284] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 166, in decorated_function [ 1893.453284] env[67899]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1893.453284] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1893.453284] env[67899]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1893.453284] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1893.453284] env[67899]: ERROR oslo_messaging.rpc.server raise self.value [ 1893.453284] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 157, in decorated_function [ 1893.453284] env[67899]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) [ 1893.453284] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/utils.py", line 1439, in decorated_function [ 1893.453284] env[67899]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) [ 1893.453284] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 213, in decorated_function [ 1893.453284] env[67899]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1893.453284] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1893.453284] env[67899]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1893.453284] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1893.453284] env[67899]: ERROR oslo_messaging.rpc.server raise self.value [ 1893.453284] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 203, in decorated_function [ 1893.453284] env[67899]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) [ 1893.453284] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3327, in terminate_instance [ 1893.453284] env[67899]: ERROR oslo_messaging.rpc.server do_terminate_instance(instance, bdms) [ 1893.453284] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 1893.453284] env[67899]: ERROR oslo_messaging.rpc.server return f(*args, **kwargs) [ 1893.453284] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3322, in do_terminate_instance [ 1893.453284] env[67899]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1893.455025] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1893.455025] env[67899]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1893.455025] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1893.455025] env[67899]: ERROR oslo_messaging.rpc.server 
raise self.value [ 1893.455025] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3315, in do_terminate_instance [ 1893.455025] env[67899]: ERROR oslo_messaging.rpc.server self._delete_instance(context, instance, bdms) [ 1893.455025] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3250, in _delete_instance [ 1893.455025] env[67899]: ERROR oslo_messaging.rpc.server self._shutdown_instance(context, instance, bdms) [ 1893.455025] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3144, in _shutdown_instance [ 1893.455025] env[67899]: ERROR oslo_messaging.rpc.server self._try_deallocate_network(context, instance, requested_networks) [ 1893.455025] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3058, in _try_deallocate_network [ 1893.455025] env[67899]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1893.455025] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1893.455025] env[67899]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1893.455025] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1893.455025] env[67899]: ERROR oslo_messaging.rpc.server raise self.value [ 1893.455025] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3056, in _try_deallocate_network [ 1893.455025] env[67899]: ERROR oslo_messaging.rpc.server _deallocate_network_with_retries() [ 1893.455025] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 436, in func [ 1893.455025] env[67899]: ERROR oslo_messaging.rpc.server return evt.wait() [ 1893.455025] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1893.455025] env[67899]: ERROR oslo_messaging.rpc.server result = hub.switch() [ 1893.455025] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1893.455025] env[67899]: ERROR oslo_messaging.rpc.server return self.greenlet.switch() [ 1893.455025] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 1893.455025] env[67899]: ERROR oslo_messaging.rpc.server result = func(*self.args, **self.kw) [ 1893.455025] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 1893.455025] env[67899]: ERROR oslo_messaging.rpc.server result = f(*args, **kwargs) [ 1893.455025] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3045, in _deallocate_network_with_retries [ 1893.455025] env[67899]: ERROR oslo_messaging.rpc.server self._deallocate_network( [ 1893.455025] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 2265, in _deallocate_network [ 1893.455025] env[67899]: ERROR oslo_messaging.rpc.server self.network_api.deallocate_for_instance( [ 1893.455025] env[67899]: ERROR oslo_messaging.rpc.server File 
"/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1893.455025] env[67899]: ERROR oslo_messaging.rpc.server data = neutron.list_ports(**search_opts) [ 1893.455025] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1893.455025] env[67899]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1893.455025] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1893.455025] env[67899]: ERROR oslo_messaging.rpc.server return self.list('ports', self.ports_path, retrieve_all, [ 1893.455025] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1893.455025] env[67899]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1893.455025] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1893.455025] env[67899]: ERROR oslo_messaging.rpc.server for r in self._pagination(collection, path, **params): [ 1893.455025] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1893.455025] env[67899]: ERROR oslo_messaging.rpc.server res = self.get(path, params=params) [ 1893.455025] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1893.455025] env[67899]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1893.455025] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1893.455025] env[67899]: ERROR oslo_messaging.rpc.server return self.retry_request("GET", action, body=body, [ 1893.455025] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1893.455025] env[67899]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1893.455025] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1893.455025] env[67899]: ERROR oslo_messaging.rpc.server return self.do_request(method, action, body=body, [ 1893.455025] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1893.455025] env[67899]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1893.457121] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1893.457121] env[67899]: ERROR oslo_messaging.rpc.server self._handle_fault_response(status_code, replybody, resp) [ 1893.457121] env[67899]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 1893.457121] env[67899]: ERROR oslo_messaging.rpc.server raise exception.NeutronAdminCredentialConfigurationInvalid() [ 1893.457121] env[67899]: ERROR oslo_messaging.rpc.server nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
[ 1893.457121] env[67899]: ERROR oslo_messaging.rpc.server [ 1893.478183] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e399cf20-1918-42a5-add2-940683e8eafc {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1893.485591] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1475bf1a-64d3-4283-bb76-e20344aebbed {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1893.498902] env[67899]: DEBUG nova.compute.provider_tree [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Inventory has not changed in ProviderTree for provider: fffa0b42-f65d-4394-a98c-0df038b9ed4b {{(pid=67899) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1893.507623] env[67899]: DEBUG nova.scheduler.client.report [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Inventory has not changed for provider fffa0b42-f65d-4394-a98c-0df038b9ed4b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67899) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1893.520957] env[67899]: DEBUG oslo_concurrency.lockutils [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.271s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1893.521441] env[67899]: DEBUG nova.compute.manager [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: c17d88cf-69ba-43e9-a672-24503c65e9f2] Start building networks asynchronously for instance. {{(pid=67899) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1893.555047] env[67899]: DEBUG nova.compute.utils [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Using /dev/sd instead of None {{(pid=67899) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1893.558026] env[67899]: DEBUG nova.compute.manager [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: c17d88cf-69ba-43e9-a672-24503c65e9f2] Allocating IP information in the background. 
{{(pid=67899) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1893.558026] env[67899]: DEBUG nova.network.neutron [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: c17d88cf-69ba-43e9-a672-24503c65e9f2] allocate_for_instance() {{(pid=67899) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1893.565403] env[67899]: DEBUG nova.compute.manager [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: c17d88cf-69ba-43e9-a672-24503c65e9f2] Start building block device mappings for instance. {{(pid=67899) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1893.622421] env[67899]: DEBUG nova.policy [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '27cd4ea8990b48be8c1f2455a264a858', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '288109a7b3bf4e3a9628184485e4679b', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67899) authorize /opt/stack/nova/nova/policy.py:203}} [ 1893.635384] env[67899]: DEBUG nova.compute.manager [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: c17d88cf-69ba-43e9-a672-24503c65e9f2] Start spawning the instance on the hypervisor. 
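The policy failure above is the expected outcome for a plain tenant token: the credentials carry only the member and reader roles, while the rule for network:attach_external_network normally requires admin. A toy stand-in for that evaluation, assuming the stock admin-only default (deployments may override it in policy.yaml, so treat this as an illustration, not Nova's actual rule engine):

```python
# Credentials trimmed from the policy-check entry above.
creds = {"roles": ["member", "reader"], "is_admin": False}

def can_attach_external_network(credentials):
    # Simplified equivalent of an admin-only ("rule:context_is_admin") check.
    return credentials.get("is_admin", False) or "admin" in credentials.get("roles", [])

print(can_attach_external_network(creds))  # False: the port is created on
# the tenant network instead, as the following entries show.
```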
{{(pid=67899) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1893.658856] env[67899]: DEBUG nova.virt.hardware [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T22:07:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T22:07:13Z,direct_url=,disk_format='vmdk',id=c655a05a-4a40-4b3f-b609-3ba8116ad90f,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c1aaa2970e964d7b86557399120d12c1',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T22:07:14Z,virtual_size=,visibility=), allow threads: False {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1893.659119] env[67899]: DEBUG nova.virt.hardware [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Flavor limits 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1893.659303] env[67899]: DEBUG nova.virt.hardware [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Image limits 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1893.659490] env[67899]: DEBUG nova.virt.hardware [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Flavor pref 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1893.659637] env[67899]: DEBUG nova.virt.hardware [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Image pref 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1893.659783] env[67899]: DEBUG nova.virt.hardware [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1893.659989] env[67899]: DEBUG nova.virt.hardware [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1893.660199] env[67899]: DEBUG nova.virt.hardware [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67899) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1893.660329] env[67899]: DEBUG 
nova.virt.hardware [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Got 1 possible topologies {{(pid=67899) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1893.660491] env[67899]: DEBUG nova.virt.hardware [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1893.660708] env[67899]: DEBUG nova.virt.hardware [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1893.661601] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5dc0c382-01b2-4e96-96d5-41e1e6ca0dfb {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1893.670194] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d6b684e4-3cf2-42e2-9e0f-fa1e8611da87 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1893.941028] env[67899]: DEBUG nova.network.neutron [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: c17d88cf-69ba-43e9-a672-24503c65e9f2] Successfully created port: 31157ce7-4316-41ff-ae1a-a364e362242d {{(pid=67899) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1894.497457] env[67899]: DEBUG nova.compute.manager [req-bfdad136-d2a6-4de9-9dce-8278e8415207 req-62a90000-d95e-4209-90da-5590ee095f52 service nova] [instance: c17d88cf-69ba-43e9-a672-24503c65e9f2] Received event network-vif-plugged-31157ce7-4316-41ff-ae1a-a364e362242d {{(pid=67899) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1894.497632] env[67899]: DEBUG oslo_concurrency.lockutils [req-bfdad136-d2a6-4de9-9dce-8278e8415207 req-62a90000-d95e-4209-90da-5590ee095f52 service nova] Acquiring lock "c17d88cf-69ba-43e9-a672-24503c65e9f2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1894.497837] env[67899]: DEBUG oslo_concurrency.lockutils [req-bfdad136-d2a6-4de9-9dce-8278e8415207 req-62a90000-d95e-4209-90da-5590ee095f52 service nova] Lock "c17d88cf-69ba-43e9-a672-24503c65e9f2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1894.497998] env[67899]: DEBUG oslo_concurrency.lockutils [req-bfdad136-d2a6-4de9-9dce-8278e8415207 req-62a90000-d95e-4209-90da-5590ee095f52 service nova] Lock "c17d88cf-69ba-43e9-a672-24503c65e9f2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1894.498172] env[67899]: DEBUG 
nova.compute.manager [req-bfdad136-d2a6-4de9-9dce-8278e8415207 req-62a90000-d95e-4209-90da-5590ee095f52 service nova] [instance: c17d88cf-69ba-43e9-a672-24503c65e9f2] No waiting events found dispatching network-vif-plugged-31157ce7-4316-41ff-ae1a-a364e362242d {{(pid=67899) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1894.498334] env[67899]: WARNING nova.compute.manager [req-bfdad136-d2a6-4de9-9dce-8278e8415207 req-62a90000-d95e-4209-90da-5590ee095f52 service nova] [instance: c17d88cf-69ba-43e9-a672-24503c65e9f2] Received unexpected event network-vif-plugged-31157ce7-4316-41ff-ae1a-a364e362242d for instance with vm_state building and task_state spawning. [ 1894.550832] env[67899]: DEBUG nova.network.neutron [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: c17d88cf-69ba-43e9-a672-24503c65e9f2] Successfully updated port: 31157ce7-4316-41ff-ae1a-a364e362242d {{(pid=67899) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1894.564268] env[67899]: DEBUG oslo_concurrency.lockutils [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Acquiring lock "refresh_cache-c17d88cf-69ba-43e9-a672-24503c65e9f2" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1894.564441] env[67899]: DEBUG oslo_concurrency.lockutils [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Acquired lock "refresh_cache-c17d88cf-69ba-43e9-a672-24503c65e9f2" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1894.564635] env[67899]: DEBUG nova.network.neutron [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: c17d88cf-69ba-43e9-a672-24503c65e9f2] Building network info cache for instance {{(pid=67899) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1894.797168] env[67899]: DEBUG nova.network.neutron [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: c17d88cf-69ba-43e9-a672-24503c65e9f2] Instance cache missing network info. 
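The WARNING above is the external-event handshake racing ahead of the build: neutron reported network-vif-plugged before the compute manager registered a waiter for it, so pop_instance_event finds nothing and the event is dropped as unexpected. A simplified model of that waiter registry, using plain threading primitives rather than Nova's eventlet machinery:

```python
import threading

_waiters = {}                 # (instance_uuid, event_name) -> threading.Event
_lock = threading.Lock()

def prepare_for_event(uuid, name):
    # Compute registers a waiter *before* the operation that triggers it.
    with _lock:
        _waiters[(uuid, name)] = threading.Event()

def external_event(uuid, name):
    # The API delivers the event; no registered waiter means "unexpected".
    with _lock:
        waiter = _waiters.pop((uuid, name), None)
    if waiter is None:
        print(f"Received unexpected event {name} for instance {uuid}")
    else:
        waiter.set()

# The event in the log arrived while the instance was still building, so
# no waiter existed yet:
external_event("c17d88cf-69ba-43e9-a672-24503c65e9f2",
               "network-vif-plugged-31157ce7-4316-41ff-ae1a-a364e362242d")
```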
{{(pid=67899) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1894.989407] env[67899]: DEBUG nova.network.neutron [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: c17d88cf-69ba-43e9-a672-24503c65e9f2] Updating instance_info_cache with network_info: [{"id": "31157ce7-4316-41ff-ae1a-a364e362242d", "address": "fa:16:3e:79:22:b6", "network": {"id": "26d51439-fee0-42d9-ac79-0e886ae3cf6e", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1360921386-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "288109a7b3bf4e3a9628184485e4679b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ac4015e0-e5e7-4b3f-8d8e-ef4501eea9aa", "external-id": "nsx-vlan-transportzone-132", "segmentation_id": 132, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap31157ce7-43", "ovs_interfaceid": "31157ce7-4316-41ff-ae1a-a364e362242d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1894.995628] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1894.995823] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1894.995978] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1895.000564] env[67899]: DEBUG oslo_concurrency.lockutils [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Releasing lock "refresh_cache-c17d88cf-69ba-43e9-a672-24503c65e9f2" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1895.000842] env[67899]: DEBUG nova.compute.manager [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: c17d88cf-69ba-43e9-a672-24503c65e9f2] Instance network_info: |[{"id": "31157ce7-4316-41ff-ae1a-a364e362242d", "address": "fa:16:3e:79:22:b6", "network": {"id": "26d51439-fee0-42d9-ac79-0e886ae3cf6e", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1360921386-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", 
"version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "288109a7b3bf4e3a9628184485e4679b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ac4015e0-e5e7-4b3f-8d8e-ef4501eea9aa", "external-id": "nsx-vlan-transportzone-132", "segmentation_id": 132, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap31157ce7-43", "ovs_interfaceid": "31157ce7-4316-41ff-ae1a-a364e362242d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67899) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1895.001249] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: c17d88cf-69ba-43e9-a672-24503c65e9f2] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:79:22:b6', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'ac4015e0-e5e7-4b3f-8d8e-ef4501eea9aa', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '31157ce7-4316-41ff-ae1a-a364e362242d', 'vif_model': 'vmxnet3'}] {{(pid=67899) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1895.008901] env[67899]: DEBUG oslo.service.loopingcall [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67899) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1895.009376] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: c17d88cf-69ba-43e9-a672-24503c65e9f2] Creating VM on the ESX host {{(pid=67899) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1895.009602] env[67899]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-55a3ab14-2223-47de-bcf8-91cc0fa0f6cc {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1895.030127] env[67899]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1895.030127] env[67899]: value = "task-3468017" [ 1895.030127] env[67899]: _type = "Task" [ 1895.030127] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1895.037713] env[67899]: DEBUG oslo_vmware.api [-] Task: {'id': task-3468017, 'name': CreateVM_Task} progress is 0%. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1895.541022] env[67899]: DEBUG oslo_vmware.api [-] Task: {'id': task-3468017, 'name': CreateVM_Task, 'duration_secs': 0.296751} completed successfully. 
{{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1895.541022] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: c17d88cf-69ba-43e9-a672-24503c65e9f2] Created VM on the ESX host {{(pid=67899) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1895.547300] env[67899]: DEBUG oslo_concurrency.lockutils [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1895.547513] env[67899]: DEBUG oslo_concurrency.lockutils [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1895.547874] env[67899]: DEBUG oslo_concurrency.lockutils [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1895.548140] env[67899]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-28bb3126-99bb-426b-8601-b42e38be6420 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1895.553171] env[67899]: DEBUG oslo_vmware.api [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Waiting for the task: (returnval){ [ 1895.553171] env[67899]: value = "session[5298859c-589e-10f3-4f70-bf47a7ea371b]520b59f7-1589-d6e2-2dde-3780d635197f" [ 1895.553171] env[67899]: _type = "Task" [ 1895.553171] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1895.560964] env[67899]: DEBUG oslo_vmware.api [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Task: {'id': session[5298859c-589e-10f3-4f70-bf47a7ea371b]520b59f7-1589-d6e2-2dde-3780d635197f, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1895.997088] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1895.997379] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] CONF.reclaim_instance_interval <= 0, skipping... 
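The lock names above serialize work on one cached image: every build of the same image contends on "[datastore1] devstack-image-cache_base/<image-id>", so the VMDK is only fetched once per datastore. A per-key lock sketch with plain threading; the real mechanism is oslo.concurrency's lockutils with an external semaphore:

```python
import threading
from contextlib import contextmanager

_locks = {}
_guard = threading.Lock()

@contextmanager
def image_lock(datastore, image_id):
    key = f"[{datastore}] devstack-image-cache_base/{image_id}"
    with _guard:                        # one Lock object per cache key
        lock = _locks.setdefault(key, threading.Lock())
    print(f'Acquiring lock "{key}"')
    with lock:
        yield key
    print(f'Releasing lock "{key}"')

with image_lock("datastore1", "c655a05a-4a40-4b3f-b609-3ba8116ad90f"):
    pass  # the fetch-if-missing / SearchDatastore work happens here
```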
{{(pid=67899) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1896.063675] env[67899]: DEBUG oslo_concurrency.lockutils [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1896.063892] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: c17d88cf-69ba-43e9-a672-24503c65e9f2] Processing image c655a05a-4a40-4b3f-b609-3ba8116ad90f {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1896.064125] env[67899]: DEBUG oslo_concurrency.lockutils [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1896.524438] env[67899]: DEBUG nova.compute.manager [req-1dac2c17-e971-4d31-a937-31fbe1a358b0 req-ff3bc6b5-1eef-4920-a67c-01d10f2336ec service nova] [instance: c17d88cf-69ba-43e9-a672-24503c65e9f2] Received event network-changed-31157ce7-4316-41ff-ae1a-a364e362242d {{(pid=67899) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1896.524754] env[67899]: DEBUG nova.compute.manager [req-1dac2c17-e971-4d31-a937-31fbe1a358b0 req-ff3bc6b5-1eef-4920-a67c-01d10f2336ec service nova] [instance: c17d88cf-69ba-43e9-a672-24503c65e9f2] Refreshing instance network info cache due to event network-changed-31157ce7-4316-41ff-ae1a-a364e362242d. {{(pid=67899) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1896.525101] env[67899]: DEBUG oslo_concurrency.lockutils [req-1dac2c17-e971-4d31-a937-31fbe1a358b0 req-ff3bc6b5-1eef-4920-a67c-01d10f2336ec service nova] Acquiring lock "refresh_cache-c17d88cf-69ba-43e9-a672-24503c65e9f2" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1896.525346] env[67899]: DEBUG oslo_concurrency.lockutils [req-1dac2c17-e971-4d31-a937-31fbe1a358b0 req-ff3bc6b5-1eef-4920-a67c-01d10f2336ec service nova] Acquired lock "refresh_cache-c17d88cf-69ba-43e9-a672-24503c65e9f2" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1896.525619] env[67899]: DEBUG nova.network.neutron [req-1dac2c17-e971-4d31-a937-31fbe1a358b0 req-ff3bc6b5-1eef-4920-a67c-01d10f2336ec service nova] [instance: c17d88cf-69ba-43e9-a672-24503c65e9f2] Refreshing network info cache for port 31157ce7-4316-41ff-ae1a-a364e362242d {{(pid=67899) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1896.838604] env[67899]: DEBUG nova.network.neutron [req-1dac2c17-e971-4d31-a937-31fbe1a358b0 req-ff3bc6b5-1eef-4920-a67c-01d10f2336ec service nova] [instance: c17d88cf-69ba-43e9-a672-24503c65e9f2] Updated VIF entry in instance network info cache for port 31157ce7-4316-41ff-ae1a-a364e362242d. 
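The refresh above is the standard reaction to a network-changed event: serialize on the per-instance refresh_cache lock, rebuild network_info from neutron, and write it back to the instance info cache. A dict-backed toy of that flow; fetch() stands in for the neutron round-trip and is not a Nova API:

```python
import threading

_cache, _locks = {}, {}

def refresh_cache(uuid, port_id, fetch):
    # Toy per-instance lock; not thread-safe setup, fine for a sketch.
    lock = _locks.setdefault(f"refresh_cache-{uuid}", threading.Lock())
    with lock:
        _cache[uuid] = fetch(port_id)   # rebuild network_info from neutron
        print(f"Updated VIF entry in instance network info cache "
              f"for port {port_id}")

refresh_cache("c17d88cf-69ba-43e9-a672-24503c65e9f2",
              "31157ce7-4316-41ff-ae1a-a364e362242d",
              lambda pid: [{"id": pid, "active": True}])
```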
{{(pid=67899) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1896.838963] env[67899]: DEBUG nova.network.neutron [req-1dac2c17-e971-4d31-a937-31fbe1a358b0 req-ff3bc6b5-1eef-4920-a67c-01d10f2336ec service nova] [instance: c17d88cf-69ba-43e9-a672-24503c65e9f2] Updating instance_info_cache with network_info: [{"id": "31157ce7-4316-41ff-ae1a-a364e362242d", "address": "fa:16:3e:79:22:b6", "network": {"id": "26d51439-fee0-42d9-ac79-0e886ae3cf6e", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1360921386-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "288109a7b3bf4e3a9628184485e4679b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ac4015e0-e5e7-4b3f-8d8e-ef4501eea9aa", "external-id": "nsx-vlan-transportzone-132", "segmentation_id": 132, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap31157ce7-43", "ovs_interfaceid": "31157ce7-4316-41ff-ae1a-a364e362242d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1896.849689] env[67899]: DEBUG oslo_concurrency.lockutils [req-1dac2c17-e971-4d31-a937-31fbe1a358b0 req-ff3bc6b5-1eef-4920-a67c-01d10f2336ec service nova] Releasing lock "refresh_cache-c17d88cf-69ba-43e9-a672-24503c65e9f2" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1896.996505] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager.update_available_resource {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1897.008114] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1897.008421] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1897.008522] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1897.008663] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67899) update_available_resource 
/opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1897.010200] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5ce84b47-5f5d-400e-a228-8a6cbb0fdcf8 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1897.019461] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ef609a8e-d1e3-4386-bb58-ba297233fcc9 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1897.035992] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8f75b682-2313-486d-b593-bfc66fea2c80 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1897.043494] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9813b08d-70fe-48fa-9188-3eea3d19185c {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1897.073840] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180923MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=67899) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1897.074036] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1897.074146] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1897.150399] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance e179db1d-ee0c-4f47-a958-40dd69209d26 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1897.150590] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance addcc88a-6bb5-4a70-938e-49c0c79c8414 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1897.150721] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance a6544af8-879d-4c45-bee4-8551b861fc66 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1897.150844] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 9b4a7c14-84dc-4222-a758-3f8f10e23b7a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1897.151061] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance e08f620d-63a0-45cb-99c6-d9d95c938b38 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1897.151108] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 77ac61b9-48cc-4ae8-81e7-273841f7b42d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1897.151229] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance a9ef96da-fcfd-4fb5-bbb1-5178111a8a62 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1897.151344] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance c4fe8b3e-cee1-401b-a26f-907a8de95eba actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1897.151456] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance a993c6a9-140f-430d-a77e-98c2567bf7af actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1897.151569] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance c17d88cf-69ba-43e9-a672-24503c65e9f2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1897.164889] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 483824d1-4994-436a-ba16-12524684405c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
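The audit entries above list ten actively managed instances, all m1.nano, plus one scheduled-but-not-started allocation that is skipped. The final resource view reported just below follows from simple arithmetic over those flavors and the reserved memory in the inventory:

```python
# Reserved memory comes from the MEMORY_MB inventory; each m1.nano uses
# 128 MB / 1 vCPU / 1 GB root disk, and ten instances are tracked here.
reserved_mb = 512
instances = 10
flavor_mb, flavor_vcpus, flavor_disk_gb = 128, 1, 1

print(reserved_mb + instances * flavor_mb)   # 1792 -> used_ram=1792MB
print(instances * flavor_vcpus)              # 10   -> used_vcpus=10
print(instances * flavor_disk_gb)            # 10   -> used_disk=10GB
```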
{{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1897.165132] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67899) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1897.165282] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67899) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1897.330014] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ddbbbac0-87df-481e-866f-63dec7a8fd33 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1897.338606] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5b375dbc-6ffa-419e-a1da-2a4cdaa01cd0 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1897.373425] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c07b9c70-56e5-4e96-8304-3536908d0333 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1897.381487] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-56f98211-6c93-4c90-8642-f363d21aa697 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1897.395538] env[67899]: DEBUG nova.compute.provider_tree [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Inventory has not changed in ProviderTree for provider: fffa0b42-f65d-4394-a98c-0df038b9ed4b {{(pid=67899) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1897.404763] env[67899]: DEBUG nova.scheduler.client.report [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Inventory has not changed for provider fffa0b42-f65d-4394-a98c-0df038b9ed4b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67899) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1897.418236] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67899) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1897.418423] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.344s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1898.419041] env[67899]: DEBUG oslo_service.periodic_task [None 
req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1901.991856] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1939.323066] env[67899]: WARNING oslo_vmware.rw_handles [None req-457d9d77-97fa-4b41-8f02-5a00e92d0ccc tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response
[ 1939.323066] env[67899]: ERROR oslo_vmware.rw_handles Traceback (most recent call last):
[ 1939.323066] env[67899]: ERROR oslo_vmware.rw_handles   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close
[ 1939.323066] env[67899]: ERROR oslo_vmware.rw_handles     self._conn.getresponse()
[ 1939.323066] env[67899]: ERROR oslo_vmware.rw_handles   File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse
[ 1939.323066] env[67899]: ERROR oslo_vmware.rw_handles     response.begin()
[ 1939.323066] env[67899]: ERROR oslo_vmware.rw_handles   File "/usr/lib/python3.10/http/client.py", line 318, in begin
[ 1939.323066] env[67899]: ERROR oslo_vmware.rw_handles     version, status, reason = self._read_status()
[ 1939.323066] env[67899]: ERROR oslo_vmware.rw_handles   File "/usr/lib/python3.10/http/client.py", line 287, in _read_status
[ 1939.323066] env[67899]: ERROR oslo_vmware.rw_handles     raise RemoteDisconnected("Remote end closed connection without"
[ 1939.323066] env[67899]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response
[ 1939.323066] env[67899]: ERROR oslo_vmware.rw_handles
[ 1939.323066] env[67899]: DEBUG nova.virt.vmwareapi.images [None req-457d9d77-97fa-4b41-8f02-5a00e92d0ccc tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: e179db1d-ee0c-4f47-a958-40dd69209d26] Downloaded image file data c655a05a-4a40-4b3f-b609-3ba8116ad90f to vmware_temp/92d67c57-a947-4c1e-ac5d-194b4b822faf/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk on the data store datastore1 {{(pid=67899) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}}
[ 1939.324868] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-457d9d77-97fa-4b41-8f02-5a00e92d0ccc tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: e179db1d-ee0c-4f47-a958-40dd69209d26] Caching image {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}}
[ 1939.325113] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [None req-457d9d77-97fa-4b41-8f02-5a00e92d0ccc tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Copying Virtual Disk [datastore1] vmware_temp/92d67c57-a947-4c1e-ac5d-194b4b822faf/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk to [datastore1] vmware_temp/92d67c57-a947-4c1e-ac5d-194b4b822faf/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk {{(pid=67899) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}}
[ 1939.325415] env[67899]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-1920a6f6-cbe6-4adc-8228-66fefe5cca99 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1939.333480] env[67899]: DEBUG oslo_vmware.api [None req-457d9d77-97fa-4b41-8f02-5a00e92d0ccc tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Waiting for the task: (returnval){
[ 1939.333480] env[67899]: value = "task-3468018"
[ 1939.333480] env[67899]: _type = "Task"
[ 1939.333480] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1939.341214] env[67899]: DEBUG oslo_vmware.api [None req-457d9d77-97fa-4b41-8f02-5a00e92d0ccc tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Task: {'id': task-3468018, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1939.843482] env[67899]: DEBUG oslo_vmware.exceptions [None req-457d9d77-97fa-4b41-8f02-5a00e92d0ccc tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Fault InvalidArgument not matched. {{(pid=67899) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}}
[ 1939.843781] env[67899]: DEBUG oslo_concurrency.lockutils [None req-457d9d77-97fa-4b41-8f02-5a00e92d0ccc tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1939.844371] env[67899]: ERROR nova.compute.manager [None req-457d9d77-97fa-4b41-8f02-5a00e92d0ccc tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: e179db1d-ee0c-4f47-a958-40dd69209d26] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1939.844371] env[67899]: Faults: ['InvalidArgument']
[ 1939.844371] env[67899]: ERROR nova.compute.manager [instance: e179db1d-ee0c-4f47-a958-40dd69209d26] Traceback (most recent call last):
[ 1939.844371] env[67899]: ERROR nova.compute.manager [instance: e179db1d-ee0c-4f47-a958-40dd69209d26]   File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources
[ 1939.844371] env[67899]: ERROR nova.compute.manager [instance: e179db1d-ee0c-4f47-a958-40dd69209d26]     yield resources
[ 1939.844371] env[67899]: ERROR nova.compute.manager [instance: e179db1d-ee0c-4f47-a958-40dd69209d26]   File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 1939.844371] env[67899]: ERROR nova.compute.manager [instance: e179db1d-ee0c-4f47-a958-40dd69209d26]     self.driver.spawn(context, instance, image_meta,
[ 1939.844371] env[67899]: ERROR nova.compute.manager [instance: e179db1d-ee0c-4f47-a958-40dd69209d26]   File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 1939.844371] env[67899]: ERROR nova.compute.manager [instance: e179db1d-ee0c-4f47-a958-40dd69209d26]     self._vmops.spawn(context, instance, image_meta, injected_files,
[ 1939.844371] env[67899]: ERROR nova.compute.manager [instance: e179db1d-ee0c-4f47-a958-40dd69209d26]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 1939.844371] env[67899]: ERROR nova.compute.manager [instance: e179db1d-ee0c-4f47-a958-40dd69209d26]     self._fetch_image_if_missing(context, vi)
[ 1939.844371] env[67899]: ERROR nova.compute.manager [instance: e179db1d-ee0c-4f47-a958-40dd69209d26]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 1939.844371] env[67899]: ERROR nova.compute.manager [instance: e179db1d-ee0c-4f47-a958-40dd69209d26]     image_cache(vi, tmp_image_ds_loc)
[ 1939.844371] env[67899]: ERROR nova.compute.manager [instance: e179db1d-ee0c-4f47-a958-40dd69209d26]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 1939.844371] env[67899]: ERROR nova.compute.manager [instance: e179db1d-ee0c-4f47-a958-40dd69209d26]     vm_util.copy_virtual_disk(
[ 1939.844371] env[67899]: ERROR nova.compute.manager [instance: e179db1d-ee0c-4f47-a958-40dd69209d26]   File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 1939.844371] env[67899]: ERROR nova.compute.manager [instance: e179db1d-ee0c-4f47-a958-40dd69209d26]     session._wait_for_task(vmdk_copy_task)
[ 1939.844371] env[67899]: ERROR nova.compute.manager [instance: e179db1d-ee0c-4f47-a958-40dd69209d26]   File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 1939.844371] env[67899]: ERROR nova.compute.manager [instance: e179db1d-ee0c-4f47-a958-40dd69209d26]     return self.wait_for_task(task_ref)
[ 1939.844371] env[67899]: ERROR nova.compute.manager [instance: e179db1d-ee0c-4f47-a958-40dd69209d26]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 1939.844371] env[67899]: ERROR nova.compute.manager [instance: e179db1d-ee0c-4f47-a958-40dd69209d26]     return evt.wait()
[ 1939.844371] env[67899]: ERROR nova.compute.manager [instance: e179db1d-ee0c-4f47-a958-40dd69209d26]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 1939.844371] env[67899]: ERROR nova.compute.manager [instance: e179db1d-ee0c-4f47-a958-40dd69209d26]     result = hub.switch()
[ 1939.844371] env[67899]: ERROR nova.compute.manager [instance: e179db1d-ee0c-4f47-a958-40dd69209d26]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 1939.844371] env[67899]: ERROR nova.compute.manager [instance: e179db1d-ee0c-4f47-a958-40dd69209d26]     return self.greenlet.switch()
[ 1939.844371] env[67899]: ERROR nova.compute.manager [instance: e179db1d-ee0c-4f47-a958-40dd69209d26]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 1939.844371] env[67899]: ERROR nova.compute.manager [instance: e179db1d-ee0c-4f47-a958-40dd69209d26]     self.f(*self.args, **self.kw)
[ 1939.844371] env[67899]: ERROR nova.compute.manager [instance: e179db1d-ee0c-4f47-a958-40dd69209d26]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 1939.844371] env[67899]: ERROR nova.compute.manager [instance: e179db1d-ee0c-4f47-a958-40dd69209d26]     raise exceptions.translate_fault(task_info.error)
[ 1939.844371] env[67899]: ERROR nova.compute.manager [instance: e179db1d-ee0c-4f47-a958-40dd69209d26] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1939.844371] env[67899]: ERROR nova.compute.manager [instance: e179db1d-ee0c-4f47-a958-40dd69209d26] Faults: ['InvalidArgument']
[ 1939.844371] env[67899]: ERROR nova.compute.manager [instance: 
e179db1d-ee0c-4f47-a958-40dd69209d26] [ 1939.845463] env[67899]: INFO nova.compute.manager [None req-457d9d77-97fa-4b41-8f02-5a00e92d0ccc tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: e179db1d-ee0c-4f47-a958-40dd69209d26] Terminating instance [ 1939.847211] env[67899]: DEBUG oslo_concurrency.lockutils [None req-0034688e-73b1-4561-afa0-7f613b171593 tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1939.847423] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-0034688e-73b1-4561-afa0-7f613b171593 tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1939.848054] env[67899]: DEBUG nova.compute.manager [None req-457d9d77-97fa-4b41-8f02-5a00e92d0ccc tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: e179db1d-ee0c-4f47-a958-40dd69209d26] Start destroying the instance on the hypervisor. {{(pid=67899) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1939.848249] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-457d9d77-97fa-4b41-8f02-5a00e92d0ccc tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: e179db1d-ee0c-4f47-a958-40dd69209d26] Destroying instance {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1939.848477] env[67899]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-57ad8033-b200-46be-96c5-cb7c2871723d {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1939.850763] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ab585a75-779a-4d0c-803a-7363f5b8b150 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1939.857557] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-457d9d77-97fa-4b41-8f02-5a00e92d0ccc tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: e179db1d-ee0c-4f47-a958-40dd69209d26] Unregistering the VM {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1939.857784] env[67899]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-e0ab25fd-8e9c-464b-9729-fbbcaa8afe1f {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1939.859980] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-0034688e-73b1-4561-afa0-7f613b171593 tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1939.860167] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-0034688e-73b1-4561-afa0-7f613b171593 tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] Folder 
[datastore1] devstack-image-cache_base created. {{(pid=67899) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1939.861153] env[67899]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-e6bf3bad-0390-4627-8f8d-32586fda0b77 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1939.865818] env[67899]: DEBUG oslo_vmware.api [None req-0034688e-73b1-4561-afa0-7f613b171593 tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] Waiting for the task: (returnval){ [ 1939.865818] env[67899]: value = "session[5298859c-589e-10f3-4f70-bf47a7ea371b]5230fc03-a128-211a-2663-7f89e122988a" [ 1939.865818] env[67899]: _type = "Task" [ 1939.865818] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1939.876140] env[67899]: DEBUG oslo_vmware.api [None req-0034688e-73b1-4561-afa0-7f613b171593 tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] Task: {'id': session[5298859c-589e-10f3-4f70-bf47a7ea371b]5230fc03-a128-211a-2663-7f89e122988a, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1939.930371] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-457d9d77-97fa-4b41-8f02-5a00e92d0ccc tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: e179db1d-ee0c-4f47-a958-40dd69209d26] Unregistered the VM {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1939.930601] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-457d9d77-97fa-4b41-8f02-5a00e92d0ccc tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: e179db1d-ee0c-4f47-a958-40dd69209d26] Deleting contents of the VM from datastore datastore1 {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1939.930761] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-457d9d77-97fa-4b41-8f02-5a00e92d0ccc tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Deleting the datastore file [datastore1] e179db1d-ee0c-4f47-a958-40dd69209d26 {{(pid=67899) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1939.931055] env[67899]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-b1dc869e-772c-4516-9967-efccfe08976f {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1939.937869] env[67899]: DEBUG oslo_vmware.api [None req-457d9d77-97fa-4b41-8f02-5a00e92d0ccc tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Waiting for the task: (returnval){ [ 1939.937869] env[67899]: value = "task-3468020" [ 1939.937869] env[67899]: _type = "Task" [ 1939.937869] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1939.945237] env[67899]: DEBUG oslo_vmware.api [None req-457d9d77-97fa-4b41-8f02-5a00e92d0ccc tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Task: {'id': task-3468020, 'name': DeleteDatastoreFile_Task} progress is 0%. 
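The teardown above runs in a fixed order: unregister the VM from vCenter, delete the instance directory from the datastore (the DeleteDatastoreFile_Task being polled here), then declare the instance destroyed. A schematic of that sequence with print stand-ins for the two vSphere calls (VirtualMachine.UnregisterVM and FileManager.DeleteDatastoreFile_Task); only the ordering is taken from the log:

```python
def unregister_vm(uuid):
    print(f"Unregistered the VM {uuid}")

def delete_datastore_file(path):
    print(f"Deleting the datastore file {path}")

def destroy(uuid, datastore="datastore1"):
    unregister_vm(uuid)                             # remove from inventory
    delete_datastore_file(f"[{datastore}] {uuid}")  # then delete its files
    print("Instance destroyed")

destroy("e179db1d-ee0c-4f47-a958-40dd69209d26")
```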
{{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1940.376210] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-0034688e-73b1-4561-afa0-7f613b171593 tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] [instance: addcc88a-6bb5-4a70-938e-49c0c79c8414] Preparing fetch location {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1940.376526] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-0034688e-73b1-4561-afa0-7f613b171593 tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] Creating directory with path [datastore1] vmware_temp/c98b53fd-5ef3-45ef-b413-16f2210ec889/c655a05a-4a40-4b3f-b609-3ba8116ad90f {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1940.376695] env[67899]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-d05d7f7d-0cae-48c2-9707-6249657c0561 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1940.389148] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-0034688e-73b1-4561-afa0-7f613b171593 tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] Created directory with path [datastore1] vmware_temp/c98b53fd-5ef3-45ef-b413-16f2210ec889/c655a05a-4a40-4b3f-b609-3ba8116ad90f {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1940.389341] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-0034688e-73b1-4561-afa0-7f613b171593 tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] [instance: addcc88a-6bb5-4a70-938e-49c0c79c8414] Fetch image to [datastore1] vmware_temp/c98b53fd-5ef3-45ef-b413-16f2210ec889/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1940.389512] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-0034688e-73b1-4561-afa0-7f613b171593 tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] [instance: addcc88a-6bb5-4a70-938e-49c0c79c8414] Downloading image file data c655a05a-4a40-4b3f-b609-3ba8116ad90f to [datastore1] vmware_temp/c98b53fd-5ef3-45ef-b413-16f2210ec889/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk on the data store datastore1 {{(pid=67899) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1940.390235] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2ee71eda-0bef-458c-af51-a68878103fc3 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1940.396588] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-32af829b-e9a8-4b27-b613-64c822fe095d {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1940.405252] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-14483a22-6330-49f6-bec7-57a7c20ac05d {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1940.435361] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-a5a29f71-3b26-4aeb-9ceb-65a97ca0f39a {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1940.443329] env[67899]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-5b7dafdf-5569-411b-ae87-abd5394d6cac {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1940.447465] env[67899]: DEBUG oslo_vmware.api [None req-457d9d77-97fa-4b41-8f02-5a00e92d0ccc tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Task: {'id': task-3468020, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.068112} completed successfully. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1940.447976] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-457d9d77-97fa-4b41-8f02-5a00e92d0ccc tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Deleted the datastore file {{(pid=67899) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1940.448177] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-457d9d77-97fa-4b41-8f02-5a00e92d0ccc tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: e179db1d-ee0c-4f47-a958-40dd69209d26] Deleted contents of the VM from datastore datastore1 {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1940.448348] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-457d9d77-97fa-4b41-8f02-5a00e92d0ccc tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: e179db1d-ee0c-4f47-a958-40dd69209d26] Instance destroyed {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1940.448517] env[67899]: INFO nova.compute.manager [None req-457d9d77-97fa-4b41-8f02-5a00e92d0ccc tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: e179db1d-ee0c-4f47-a958-40dd69209d26] Took 0.60 seconds to destroy the instance on the hypervisor. 
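
The "Task: {...} progress is 0%" lines above, followed by "completed successfully", are oslo.vmware polling a vCenter task object at a fixed interval until it reaches a terminal state (wait_for_task / _poll_task in the paths logged). Below is a minimal sketch of that poll-until-done pattern, using only the standard library; get_task_info() and TaskFailed are hypothetical stand-ins for the PropertyCollector read and fault translation the real _poll_task performs, not oslo.vmware's API.

    import time

    class TaskFailed(Exception):
        """Raised when the polled task ends in an error state (sketch)."""

    def wait_for_task(get_task_info, poll_interval=0.5, timeout=300.0):
        # Poll until the task reaches a terminal state, the same
        # fixed-interval idea oslo.vmware uses for vCenter tasks.
        deadline = time.monotonic() + timeout
        while time.monotonic() < deadline:
            info = get_task_info()  # e.g. {'state': 'running', 'progress': 0}
            if info['state'] == 'success':
                return info
            if info['state'] == 'error':
                raise TaskFailed(info.get('error', 'unknown fault'))
            time.sleep(poll_interval)
        raise TimeoutError('task did not reach a terminal state in time')

    # Usage with a fake task that succeeds on the third poll:
    states = iter([{'state': 'running', 'progress': 0},
                   {'state': 'running', 'progress': 50},
                   {'state': 'success', 'progress': 100}])
    print(wait_for_task(lambda: next(states), poll_interval=0.01))

The real loop is driven from an eventlet-based looping call and translates vCenter faults into exceptions such as the VimFaultException that appears later in this log.
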
[ 1940.450544] env[67899]: DEBUG nova.compute.claims [None req-457d9d77-97fa-4b41-8f02-5a00e92d0ccc tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: e179db1d-ee0c-4f47-a958-40dd69209d26] Aborting claim: {{(pid=67899) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1940.450697] env[67899]: DEBUG oslo_concurrency.lockutils [None req-457d9d77-97fa-4b41-8f02-5a00e92d0ccc tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1940.450912] env[67899]: DEBUG oslo_concurrency.lockutils [None req-457d9d77-97fa-4b41-8f02-5a00e92d0ccc tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1940.467920] env[67899]: DEBUG nova.virt.vmwareapi.images [None req-0034688e-73b1-4561-afa0-7f613b171593 tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] [instance: addcc88a-6bb5-4a70-938e-49c0c79c8414] Downloading image file data c655a05a-4a40-4b3f-b609-3ba8116ad90f to the data store datastore1 {{(pid=67899) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1940.615694] env[67899]: DEBUG oslo_vmware.rw_handles [None req-0034688e-73b1-4561-afa0-7f613b171593 tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/c98b53fd-5ef3-45ef-b413-16f2210ec889/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67899) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1940.674207] env[67899]: DEBUG oslo_vmware.rw_handles [None req-0034688e-73b1-4561-afa0-7f613b171593 tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] Completed reading data from the image iterator. {{(pid=67899) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1940.674393] env[67899]: DEBUG oslo_vmware.rw_handles [None req-0034688e-73b1-4561-afa0-7f613b171593 tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/c98b53fd-5ef3-45ef-b413-16f2210ec889/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67899) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1940.682029] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-951f20be-e39b-416b-b878-ebd6e53468ec {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1940.689599] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-45ffcba3-84ef-4328-b772-682c7dbe3227 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1940.719738] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-46ce7d10-165d-4059-9c6b-865a08106103 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1940.726156] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1fc4a145-7193-4b6e-a534-259d9ff49c84 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1940.738505] env[67899]: DEBUG nova.compute.provider_tree [None req-457d9d77-97fa-4b41-8f02-5a00e92d0ccc tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Inventory has not changed in ProviderTree for provider: fffa0b42-f65d-4394-a98c-0df038b9ed4b {{(pid=67899) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1940.746867] env[67899]: DEBUG nova.scheduler.client.report [None req-457d9d77-97fa-4b41-8f02-5a00e92d0ccc tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Inventory has not changed for provider fffa0b42-f65d-4394-a98c-0df038b9ed4b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67899) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1940.759860] env[67899]: DEBUG oslo_concurrency.lockutils [None req-457d9d77-97fa-4b41-8f02-5a00e92d0ccc tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.309s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1940.760380] env[67899]: ERROR nova.compute.manager [None req-457d9d77-97fa-4b41-8f02-5a00e92d0ccc tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: e179db1d-ee0c-4f47-a958-40dd69209d26] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1940.760380] env[67899]: Faults: ['InvalidArgument'] [ 1940.760380] env[67899]: ERROR nova.compute.manager [instance: e179db1d-ee0c-4f47-a958-40dd69209d26] Traceback (most recent call last): [ 1940.760380] env[67899]: ERROR nova.compute.manager [instance: e179db1d-ee0c-4f47-a958-40dd69209d26] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1940.760380] env[67899]: ERROR 
nova.compute.manager [instance: e179db1d-ee0c-4f47-a958-40dd69209d26] self.driver.spawn(context, instance, image_meta, [ 1940.760380] env[67899]: ERROR nova.compute.manager [instance: e179db1d-ee0c-4f47-a958-40dd69209d26] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1940.760380] env[67899]: ERROR nova.compute.manager [instance: e179db1d-ee0c-4f47-a958-40dd69209d26] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1940.760380] env[67899]: ERROR nova.compute.manager [instance: e179db1d-ee0c-4f47-a958-40dd69209d26] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1940.760380] env[67899]: ERROR nova.compute.manager [instance: e179db1d-ee0c-4f47-a958-40dd69209d26] self._fetch_image_if_missing(context, vi) [ 1940.760380] env[67899]: ERROR nova.compute.manager [instance: e179db1d-ee0c-4f47-a958-40dd69209d26] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1940.760380] env[67899]: ERROR nova.compute.manager [instance: e179db1d-ee0c-4f47-a958-40dd69209d26] image_cache(vi, tmp_image_ds_loc) [ 1940.760380] env[67899]: ERROR nova.compute.manager [instance: e179db1d-ee0c-4f47-a958-40dd69209d26] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1940.760380] env[67899]: ERROR nova.compute.manager [instance: e179db1d-ee0c-4f47-a958-40dd69209d26] vm_util.copy_virtual_disk( [ 1940.760380] env[67899]: ERROR nova.compute.manager [instance: e179db1d-ee0c-4f47-a958-40dd69209d26] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1940.760380] env[67899]: ERROR nova.compute.manager [instance: e179db1d-ee0c-4f47-a958-40dd69209d26] session._wait_for_task(vmdk_copy_task) [ 1940.760380] env[67899]: ERROR nova.compute.manager [instance: e179db1d-ee0c-4f47-a958-40dd69209d26] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1940.760380] env[67899]: ERROR nova.compute.manager [instance: e179db1d-ee0c-4f47-a958-40dd69209d26] return self.wait_for_task(task_ref) [ 1940.760380] env[67899]: ERROR nova.compute.manager [instance: e179db1d-ee0c-4f47-a958-40dd69209d26] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1940.760380] env[67899]: ERROR nova.compute.manager [instance: e179db1d-ee0c-4f47-a958-40dd69209d26] return evt.wait() [ 1940.760380] env[67899]: ERROR nova.compute.manager [instance: e179db1d-ee0c-4f47-a958-40dd69209d26] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1940.760380] env[67899]: ERROR nova.compute.manager [instance: e179db1d-ee0c-4f47-a958-40dd69209d26] result = hub.switch() [ 1940.760380] env[67899]: ERROR nova.compute.manager [instance: e179db1d-ee0c-4f47-a958-40dd69209d26] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1940.760380] env[67899]: ERROR nova.compute.manager [instance: e179db1d-ee0c-4f47-a958-40dd69209d26] return self.greenlet.switch() [ 1940.760380] env[67899]: ERROR nova.compute.manager [instance: e179db1d-ee0c-4f47-a958-40dd69209d26] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1940.760380] env[67899]: ERROR nova.compute.manager [instance: e179db1d-ee0c-4f47-a958-40dd69209d26] self.f(*self.args, **self.kw) [ 1940.760380] env[67899]: ERROR nova.compute.manager [instance: e179db1d-ee0c-4f47-a958-40dd69209d26] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1940.760380] env[67899]: ERROR nova.compute.manager [instance: e179db1d-ee0c-4f47-a958-40dd69209d26] raise exceptions.translate_fault(task_info.error) [ 1940.760380] env[67899]: ERROR nova.compute.manager [instance: e179db1d-ee0c-4f47-a958-40dd69209d26] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1940.760380] env[67899]: ERROR nova.compute.manager [instance: e179db1d-ee0c-4f47-a958-40dd69209d26] Faults: ['InvalidArgument'] [ 1940.760380] env[67899]: ERROR nova.compute.manager [instance: e179db1d-ee0c-4f47-a958-40dd69209d26] [ 1940.761405] env[67899]: DEBUG nova.compute.utils [None req-457d9d77-97fa-4b41-8f02-5a00e92d0ccc tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: e179db1d-ee0c-4f47-a958-40dd69209d26] VimFaultException {{(pid=67899) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1940.762376] env[67899]: DEBUG nova.compute.manager [None req-457d9d77-97fa-4b41-8f02-5a00e92d0ccc tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: e179db1d-ee0c-4f47-a958-40dd69209d26] Build of instance e179db1d-ee0c-4f47-a958-40dd69209d26 was re-scheduled: A specified parameter was not correct: fileType [ 1940.762376] env[67899]: Faults: ['InvalidArgument'] {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1940.762768] env[67899]: DEBUG nova.compute.manager [None req-457d9d77-97fa-4b41-8f02-5a00e92d0ccc tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: e179db1d-ee0c-4f47-a958-40dd69209d26] Unplugging VIFs for instance {{(pid=67899) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1940.762942] env[67899]: DEBUG nova.compute.manager [None req-457d9d77-97fa-4b41-8f02-5a00e92d0ccc tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67899) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1940.763130] env[67899]: DEBUG nova.compute.manager [None req-457d9d77-97fa-4b41-8f02-5a00e92d0ccc tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: e179db1d-ee0c-4f47-a958-40dd69209d26] Deallocating network for instance {{(pid=67899) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1940.763291] env[67899]: DEBUG nova.network.neutron [None req-457d9d77-97fa-4b41-8f02-5a00e92d0ccc tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: e179db1d-ee0c-4f47-a958-40dd69209d26] deallocate_for_instance() {{(pid=67899) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1941.050665] env[67899]: DEBUG nova.network.neutron [None req-457d9d77-97fa-4b41-8f02-5a00e92d0ccc tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: e179db1d-ee0c-4f47-a958-40dd69209d26] Updating instance_info_cache with network_info: [] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1941.068081] env[67899]: INFO nova.compute.manager [None req-457d9d77-97fa-4b41-8f02-5a00e92d0ccc tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: e179db1d-ee0c-4f47-a958-40dd69209d26] Took 0.30 seconds to deallocate network for instance. [ 1941.164582] env[67899]: INFO nova.scheduler.client.report [None req-457d9d77-97fa-4b41-8f02-5a00e92d0ccc tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Deleted allocations for instance e179db1d-ee0c-4f47-a958-40dd69209d26 [ 1941.184512] env[67899]: DEBUG oslo_concurrency.lockutils [None req-457d9d77-97fa-4b41-8f02-5a00e92d0ccc tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Lock "e179db1d-ee0c-4f47-a958-40dd69209d26" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 504.746s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1941.185682] env[67899]: DEBUG oslo_concurrency.lockutils [None req-eee1c1c4-6225-4c50-80b1-0adb0bc6bd79 tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Lock "e179db1d-ee0c-4f47-a958-40dd69209d26" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 308.616s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1941.185909] env[67899]: DEBUG oslo_concurrency.lockutils [None req-eee1c1c4-6225-4c50-80b1-0adb0bc6bd79 tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Acquiring lock "e179db1d-ee0c-4f47-a958-40dd69209d26-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1941.186136] env[67899]: DEBUG oslo_concurrency.lockutils [None req-eee1c1c4-6225-4c50-80b1-0adb0bc6bd79 tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Lock "e179db1d-ee0c-4f47-a958-40dd69209d26-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=67899) inner
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1941.186304] env[67899]: DEBUG oslo_concurrency.lockutils [None req-eee1c1c4-6225-4c50-80b1-0adb0bc6bd79 tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Lock "e179db1d-ee0c-4f47-a958-40dd69209d26-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1941.188961] env[67899]: INFO nova.compute.manager [None req-eee1c1c4-6225-4c50-80b1-0adb0bc6bd79 tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: e179db1d-ee0c-4f47-a958-40dd69209d26] Terminating instance [ 1941.190642] env[67899]: DEBUG nova.compute.manager [None req-eee1c1c4-6225-4c50-80b1-0adb0bc6bd79 tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: e179db1d-ee0c-4f47-a958-40dd69209d26] Start destroying the instance on the hypervisor. {{(pid=67899) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1941.190850] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-eee1c1c4-6225-4c50-80b1-0adb0bc6bd79 tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: e179db1d-ee0c-4f47-a958-40dd69209d26] Destroying instance {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1941.191663] env[67899]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-611e9cf4-b177-4764-a6f7-71ea6075c56f {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1941.197989] env[67899]: DEBUG nova.compute.manager [None req-2895de4c-3d88-407d-8d74-0788e4b279f8 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] [instance: 483824d1-4994-436a-ba16-12524684405c] Starting instance... {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1941.203243] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-df546bb3-acb5-46b1-bad1-dce6c5c361fb {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1941.232619] env[67899]: WARNING nova.virt.vmwareapi.vmops [None req-eee1c1c4-6225-4c50-80b1-0adb0bc6bd79 tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: e179db1d-ee0c-4f47-a958-40dd69209d26] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance e179db1d-ee0c-4f47-a958-40dd69209d26 could not be found. [ 1941.232855] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-eee1c1c4-6225-4c50-80b1-0adb0bc6bd79 tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: e179db1d-ee0c-4f47-a958-40dd69209d26] Instance destroyed {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1941.233052] env[67899]: INFO nova.compute.manager [None req-eee1c1c4-6225-4c50-80b1-0adb0bc6bd79 tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: e179db1d-ee0c-4f47-a958-40dd69209d26] Took 0.04 seconds to destroy the instance on the hypervisor.
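
Every oslo_concurrency.lockutils record above names the lock, the holder's qualname, and how long it waited or held. Two lock names do the serialization here: the bare instance UUID, held across the whole teardown, and "<uuid>-events", guarding the pending external-event table. A minimal sketch of that nesting, assuming only that oslo.concurrency is installed; the function bodies are illustrative placeholders, not nova's code.

    from oslo_concurrency import lockutils

    INSTANCE_UUID = 'e179db1d-ee0c-4f47-a958-40dd69209d26'

    def do_terminate_instance():
        # Hold the per-instance lock for the whole teardown; lockutils
        # emits the "acquired :: waited" / "released :: held" lines above.
        with lockutils.lock(INSTANCE_UUID):
            # The "<uuid>-events" lock briefly guards the pending
            # external-event table while it is cleared (_clear_events).
            with lockutils.lock(INSTANCE_UUID + '-events'):
                pass  # drop queued network-vif-* events (placeholder)
            # ... destroy on the hypervisor, then deallocate the network.

    do_terminate_instance()

Because the delayed terminate and the earlier build path take the same UUID-named lock, the 308.616s wait logged above is simply the terminate request queuing behind the build that eventually failed and released the lock after 504.746s.
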
[ 1941.233300] env[67899]: DEBUG oslo.service.loopingcall [None req-eee1c1c4-6225-4c50-80b1-0adb0bc6bd79 tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=67899) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1941.234076] env[67899]: DEBUG nova.compute.manager [-] [instance: e179db1d-ee0c-4f47-a958-40dd69209d26] Deallocating network for instance {{(pid=67899) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1941.234185] env[67899]: DEBUG nova.network.neutron [-] [instance: e179db1d-ee0c-4f47-a958-40dd69209d26] deallocate_for_instance() {{(pid=67899) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1941.249629] env[67899]: DEBUG oslo_concurrency.lockutils [None req-2895de4c-3d88-407d-8d74-0788e4b279f8 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1941.249861] env[67899]: DEBUG oslo_concurrency.lockutils [None req-2895de4c-3d88-407d-8d74-0788e4b279f8 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1941.251385] env[67899]: INFO nova.compute.claims [None req-2895de4c-3d88-407d-8d74-0788e4b279f8 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] [instance: 483824d1-4994-436a-ba16-12524684405c] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1941.263568] env[67899]: DEBUG nova.network.neutron [-] [instance: e179db1d-ee0c-4f47-a958-40dd69209d26] Updating instance_info_cache with network_info: [] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1941.281614] env[67899]: INFO nova.compute.manager [-] [instance: e179db1d-ee0c-4f47-a958-40dd69209d26] Took 0.05 seconds to deallocate network for instance. [ 1941.370362] env[67899]: DEBUG oslo_concurrency.lockutils [None req-eee1c1c4-6225-4c50-80b1-0adb0bc6bd79 tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Lock "e179db1d-ee0c-4f47-a958-40dd69209d26" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.185s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1941.371250] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "e179db1d-ee0c-4f47-a958-40dd69209d26" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 165.361s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1941.371437] env[67899]: INFO nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: e179db1d-ee0c-4f47-a958-40dd69209d26] During sync_power_state the instance has a pending task (deleting). Skip.
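
The inventory dict that placement keeps reporting as unchanged for provider fffa0b42-f65d-4394-a98c-0df038b9ed4b determines how much headroom claims like the one above have: usable capacity per resource class is (total - reserved) * allocation_ratio, with max_unit capping any single instance. A quick check of the logged numbers; the values are copied from the log records, and the helper below is plain arithmetic, not placement's code.

    # Inventory values copied from the log records above and below.
    INVENTORY = {
        'VCPU':      {'total': 48,     'reserved': 0,   'allocation_ratio': 4.0},
        'MEMORY_MB': {'total': 196590, 'reserved': 512, 'allocation_ratio': 1.0},
        'DISK_GB':   {'total': 400,    'reserved': 0,   'allocation_ratio': 1.0},
    }

    def capacity(inventory):
        # Placement-style usable capacity per resource class:
        # (total - reserved) * allocation_ratio.
        return {rc: int((v['total'] - v['reserved']) * v['allocation_ratio'])
                for rc, v in inventory.items()}

    print(capacity(INVENTORY))
    # -> {'VCPU': 192, 'MEMORY_MB': 196078, 'DISK_GB': 400}

So the 1-vCPU, 128 MB claim made above for instance 483824d1-4994-436a-ba16-12524684405c fits comfortably, which is why the claim succeeds without waiting.
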
[ 1941.371606] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "e179db1d-ee0c-4f47-a958-40dd69209d26" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1941.428157] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d554913b-5c81-4381-bfca-c0d3737c64f3 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1941.435549] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d7d22c6a-5148-41e6-89bb-7671e178843d {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1941.464430] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-34529b7f-fe2e-41d9-9346-06674d26265d {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1941.470878] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7e2c873d-0ca3-4632-b48d-e23047b3337f {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1941.483379] env[67899]: DEBUG nova.compute.provider_tree [None req-2895de4c-3d88-407d-8d74-0788e4b279f8 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] Inventory has not changed in ProviderTree for provider: fffa0b42-f65d-4394-a98c-0df038b9ed4b {{(pid=67899) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1941.494576] env[67899]: DEBUG nova.scheduler.client.report [None req-2895de4c-3d88-407d-8d74-0788e4b279f8 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] Inventory has not changed for provider fffa0b42-f65d-4394-a98c-0df038b9ed4b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67899) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1941.507814] env[67899]: DEBUG oslo_concurrency.lockutils [None req-2895de4c-3d88-407d-8d74-0788e4b279f8 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.258s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1941.508283] env[67899]: DEBUG nova.compute.manager [None req-2895de4c-3d88-407d-8d74-0788e4b279f8 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] [instance: 483824d1-4994-436a-ba16-12524684405c] Start building networks asynchronously for instance.
{{(pid=67899) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1941.539390] env[67899]: DEBUG nova.compute.utils [None req-2895de4c-3d88-407d-8d74-0788e4b279f8 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] Using /dev/sd instead of None {{(pid=67899) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1941.540777] env[67899]: DEBUG nova.compute.manager [None req-2895de4c-3d88-407d-8d74-0788e4b279f8 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] [instance: 483824d1-4994-436a-ba16-12524684405c] Allocating IP information in the background. {{(pid=67899) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1941.540946] env[67899]: DEBUG nova.network.neutron [None req-2895de4c-3d88-407d-8d74-0788e4b279f8 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] [instance: 483824d1-4994-436a-ba16-12524684405c] allocate_for_instance() {{(pid=67899) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1941.550631] env[67899]: DEBUG nova.compute.manager [None req-2895de4c-3d88-407d-8d74-0788e4b279f8 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] [instance: 483824d1-4994-436a-ba16-12524684405c] Start building block device mappings for instance. {{(pid=67899) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1941.619527] env[67899]: DEBUG nova.compute.manager [None req-2895de4c-3d88-407d-8d74-0788e4b279f8 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] [instance: 483824d1-4994-436a-ba16-12524684405c] Start spawning the instance on the hypervisor. 
{{(pid=67899) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1941.639431] env[67899]: DEBUG nova.policy [None req-2895de4c-3d88-407d-8d74-0788e4b279f8 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '3ae302ed41614521a1a97b4c607a9eee', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'a918aafa0191456bba21e2a0fda8d3c9', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67899) authorize /opt/stack/nova/nova/policy.py:203}} [ 1941.647088] env[67899]: DEBUG nova.virt.hardware [None req-2895de4c-3d88-407d-8d74-0788e4b279f8 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T22:07:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T22:07:13Z,direct_url=<?>,disk_format='vmdk',id=c655a05a-4a40-4b3f-b609-3ba8116ad90f,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c1aaa2970e964d7b86557399120d12c1',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2025-04-16T22:07:14Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1941.647329] env[67899]: DEBUG nova.virt.hardware [None req-2895de4c-3d88-407d-8d74-0788e4b279f8 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] Flavor limits 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1941.647485] env[67899]: DEBUG nova.virt.hardware [None req-2895de4c-3d88-407d-8d74-0788e4b279f8 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] Image limits 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1941.647667] env[67899]: DEBUG nova.virt.hardware [None req-2895de4c-3d88-407d-8d74-0788e4b279f8 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] Flavor pref 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1941.647866] env[67899]: DEBUG nova.virt.hardware [None req-2895de4c-3d88-407d-8d74-0788e4b279f8 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] Image pref 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1941.648030] env[67899]: DEBUG nova.virt.hardware [None req-2895de4c-3d88-407d-8d74-0788e4b279f8 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1941.648239]
env[67899]: DEBUG nova.virt.hardware [None req-2895de4c-3d88-407d-8d74-0788e4b279f8 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1941.648397] env[67899]: DEBUG nova.virt.hardware [None req-2895de4c-3d88-407d-8d74-0788e4b279f8 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67899) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1941.648574] env[67899]: DEBUG nova.virt.hardware [None req-2895de4c-3d88-407d-8d74-0788e4b279f8 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] Got 1 possible topologies {{(pid=67899) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1941.648736] env[67899]: DEBUG nova.virt.hardware [None req-2895de4c-3d88-407d-8d74-0788e4b279f8 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1941.648906] env[67899]: DEBUG nova.virt.hardware [None req-2895de4c-3d88-407d-8d74-0788e4b279f8 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1941.649788] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e909cfb6-347f-4cc6-825c-c7e94c1744f8 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1941.657513] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7bf54334-69ed-4821-83ee-05d6af10e26f {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1941.971799] env[67899]: DEBUG nova.network.neutron [None req-2895de4c-3d88-407d-8d74-0788e4b279f8 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] [instance: 483824d1-4994-436a-ba16-12524684405c] Successfully created port: d5b83abf-0121-4461-88af-fd98e5f2225d {{(pid=67899) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1942.640729] env[67899]: DEBUG nova.network.neutron [None req-2895de4c-3d88-407d-8d74-0788e4b279f8 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] [instance: 483824d1-4994-436a-ba16-12524684405c] Successfully updated port: d5b83abf-0121-4461-88af-fd98e5f2225d {{(pid=67899) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1942.652186] env[67899]: DEBUG oslo_concurrency.lockutils [None req-2895de4c-3d88-407d-8d74-0788e4b279f8 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] Acquiring lock "refresh_cache-483824d1-4994-436a-ba16-12524684405c" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1942.652493] env[67899]: DEBUG oslo_concurrency.lockutils 
[None req-2895de4c-3d88-407d-8d74-0788e4b279f8 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] Acquired lock "refresh_cache-483824d1-4994-436a-ba16-12524684405c" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1942.652586] env[67899]: DEBUG nova.network.neutron [None req-2895de4c-3d88-407d-8d74-0788e4b279f8 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] [instance: 483824d1-4994-436a-ba16-12524684405c] Building network info cache for instance {{(pid=67899) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1942.876186] env[67899]: DEBUG nova.network.neutron [None req-2895de4c-3d88-407d-8d74-0788e4b279f8 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] [instance: 483824d1-4994-436a-ba16-12524684405c] Instance cache missing network info. {{(pid=67899) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1943.053403] env[67899]: DEBUG nova.network.neutron [None req-2895de4c-3d88-407d-8d74-0788e4b279f8 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] [instance: 483824d1-4994-436a-ba16-12524684405c] Updating instance_info_cache with network_info: [{"id": "d5b83abf-0121-4461-88af-fd98e5f2225d", "address": "fa:16:3e:c7:8d:ec", "network": {"id": "6b50a822-7305-45f0-bf1e-da3ad38b5edb", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1869812463-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "a918aafa0191456bba21e2a0fda8d3c9", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "c9f208df-1fb5-4403-9796-7fd19e4bfb85", "external-id": "cl2-zone-400", "segmentation_id": 400, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd5b83abf-01", "ovs_interfaceid": "d5b83abf-0121-4461-88af-fd98e5f2225d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1943.064303] env[67899]: DEBUG oslo_concurrency.lockutils [None req-2895de4c-3d88-407d-8d74-0788e4b279f8 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] Releasing lock "refresh_cache-483824d1-4994-436a-ba16-12524684405c" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1943.064560] env[67899]: DEBUG nova.compute.manager [None req-2895de4c-3d88-407d-8d74-0788e4b279f8 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] [instance: 483824d1-4994-436a-ba16-12524684405c] Instance network_info: |[{"id": "d5b83abf-0121-4461-88af-fd98e5f2225d", "address": "fa:16:3e:c7:8d:ec", "network": {"id": "6b50a822-7305-45f0-bf1e-da3ad38b5edb", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1869812463-network", "subnets": [{"cidr": 
"192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "a918aafa0191456bba21e2a0fda8d3c9", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "c9f208df-1fb5-4403-9796-7fd19e4bfb85", "external-id": "cl2-zone-400", "segmentation_id": 400, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd5b83abf-01", "ovs_interfaceid": "d5b83abf-0121-4461-88af-fd98e5f2225d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67899) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1943.064938] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-2895de4c-3d88-407d-8d74-0788e4b279f8 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] [instance: 483824d1-4994-436a-ba16-12524684405c] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:c7:8d:ec', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'c9f208df-1fb5-4403-9796-7fd19e4bfb85', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'd5b83abf-0121-4461-88af-fd98e5f2225d', 'vif_model': 'vmxnet3'}] {{(pid=67899) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1943.072612] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [None req-2895de4c-3d88-407d-8d74-0788e4b279f8 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] Creating folder: Project (a918aafa0191456bba21e2a0fda8d3c9). Parent ref: group-v692900. {{(pid=67899) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1943.072995] env[67899]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-e34bec66-f072-43f5-9d39-96afae1334a3 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1943.085465] env[67899]: INFO nova.virt.vmwareapi.vm_util [None req-2895de4c-3d88-407d-8d74-0788e4b279f8 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] Created folder: Project (a918aafa0191456bba21e2a0fda8d3c9) in parent group-v692900. [ 1943.085665] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [None req-2895de4c-3d88-407d-8d74-0788e4b279f8 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] Creating folder: Instances. Parent ref: group-v693011. 
{{(pid=67899) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1943.085887] env[67899]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-43b04871-7ee9-4991-be54-e163aa9dc536 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1943.089332] env[67899]: DEBUG nova.compute.manager [req-42b09749-0f48-4980-bd96-952bfb485303 req-e8edc659-9135-4921-9f02-43d651e6f838 service nova] [instance: 483824d1-4994-436a-ba16-12524684405c] Received event network-vif-plugged-d5b83abf-0121-4461-88af-fd98e5f2225d {{(pid=67899) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1943.089528] env[67899]: DEBUG oslo_concurrency.lockutils [req-42b09749-0f48-4980-bd96-952bfb485303 req-e8edc659-9135-4921-9f02-43d651e6f838 service nova] Acquiring lock "483824d1-4994-436a-ba16-12524684405c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1943.089748] env[67899]: DEBUG oslo_concurrency.lockutils [req-42b09749-0f48-4980-bd96-952bfb485303 req-e8edc659-9135-4921-9f02-43d651e6f838 service nova] Lock "483824d1-4994-436a-ba16-12524684405c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1943.089936] env[67899]: DEBUG oslo_concurrency.lockutils [req-42b09749-0f48-4980-bd96-952bfb485303 req-e8edc659-9135-4921-9f02-43d651e6f838 service nova] Lock "483824d1-4994-436a-ba16-12524684405c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1943.090117] env[67899]: DEBUG nova.compute.manager [req-42b09749-0f48-4980-bd96-952bfb485303 req-e8edc659-9135-4921-9f02-43d651e6f838 service nova] [instance: 483824d1-4994-436a-ba16-12524684405c] No waiting events found dispatching network-vif-plugged-d5b83abf-0121-4461-88af-fd98e5f2225d {{(pid=67899) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1943.090280] env[67899]: WARNING nova.compute.manager [req-42b09749-0f48-4980-bd96-952bfb485303 req-e8edc659-9135-4921-9f02-43d651e6f838 service nova] [instance: 483824d1-4994-436a-ba16-12524684405c] Received unexpected event network-vif-plugged-d5b83abf-0121-4461-88af-fd98e5f2225d for instance with vm_state building and task_state spawning. [ 1943.090433] env[67899]: DEBUG nova.compute.manager [req-42b09749-0f48-4980-bd96-952bfb485303 req-e8edc659-9135-4921-9f02-43d651e6f838 service nova] [instance: 483824d1-4994-436a-ba16-12524684405c] Received event network-changed-d5b83abf-0121-4461-88af-fd98e5f2225d {{(pid=67899) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1943.090648] env[67899]: DEBUG nova.compute.manager [req-42b09749-0f48-4980-bd96-952bfb485303 req-e8edc659-9135-4921-9f02-43d651e6f838 service nova] [instance: 483824d1-4994-436a-ba16-12524684405c] Refreshing instance network info cache due to event network-changed-d5b83abf-0121-4461-88af-fd98e5f2225d.
{{(pid=67899) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1943.090848] env[67899]: DEBUG oslo_concurrency.lockutils [req-42b09749-0f48-4980-bd96-952bfb485303 req-e8edc659-9135-4921-9f02-43d651e6f838 service nova] Acquiring lock "refresh_cache-483824d1-4994-436a-ba16-12524684405c" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1943.091020] env[67899]: DEBUG oslo_concurrency.lockutils [req-42b09749-0f48-4980-bd96-952bfb485303 req-e8edc659-9135-4921-9f02-43d651e6f838 service nova] Acquired lock "refresh_cache-483824d1-4994-436a-ba16-12524684405c" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1943.092242] env[67899]: DEBUG nova.network.neutron [req-42b09749-0f48-4980-bd96-952bfb485303 req-e8edc659-9135-4921-9f02-43d651e6f838 service nova] [instance: 483824d1-4994-436a-ba16-12524684405c] Refreshing network info cache for port d5b83abf-0121-4461-88af-fd98e5f2225d {{(pid=67899) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1943.102300] env[67899]: INFO nova.virt.vmwareapi.vm_util [None req-2895de4c-3d88-407d-8d74-0788e4b279f8 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] Created folder: Instances in parent group-v693011. [ 1943.102527] env[67899]: DEBUG oslo.service.loopingcall [None req-2895de4c-3d88-407d-8d74-0788e4b279f8 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67899) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1943.103048] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 483824d1-4994-436a-ba16-12524684405c] Creating VM on the ESX host {{(pid=67899) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1943.103258] env[67899]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-b055b396-5897-456a-84b4-f98678fe6dad {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1943.122051] env[67899]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1943.122051] env[67899]: value = "task-3468023" [ 1943.122051] env[67899]: _type = "Task" [ 1943.122051] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1943.129306] env[67899]: DEBUG oslo_vmware.api [-] Task: {'id': task-3468023, 'name': CreateVM_Task} progress is 0%. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1943.477600] env[67899]: DEBUG nova.network.neutron [req-42b09749-0f48-4980-bd96-952bfb485303 req-e8edc659-9135-4921-9f02-43d651e6f838 service nova] [instance: 483824d1-4994-436a-ba16-12524684405c] Updated VIF entry in instance network info cache for port d5b83abf-0121-4461-88af-fd98e5f2225d. 
{{(pid=67899) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1943.477972] env[67899]: DEBUG nova.network.neutron [req-42b09749-0f48-4980-bd96-952bfb485303 req-e8edc659-9135-4921-9f02-43d651e6f838 service nova] [instance: 483824d1-4994-436a-ba16-12524684405c] Updating instance_info_cache with network_info: [{"id": "d5b83abf-0121-4461-88af-fd98e5f2225d", "address": "fa:16:3e:c7:8d:ec", "network": {"id": "6b50a822-7305-45f0-bf1e-da3ad38b5edb", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1869812463-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "a918aafa0191456bba21e2a0fda8d3c9", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "c9f208df-1fb5-4403-9796-7fd19e4bfb85", "external-id": "cl2-zone-400", "segmentation_id": 400, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd5b83abf-01", "ovs_interfaceid": "d5b83abf-0121-4461-88af-fd98e5f2225d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1943.487434] env[67899]: DEBUG oslo_concurrency.lockutils [req-42b09749-0f48-4980-bd96-952bfb485303 req-e8edc659-9135-4921-9f02-43d651e6f838 service nova] Releasing lock "refresh_cache-483824d1-4994-436a-ba16-12524684405c" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1943.632222] env[67899]: DEBUG oslo_vmware.api [-] Task: {'id': task-3468023, 'name': CreateVM_Task, 'duration_secs': 0.292817} completed successfully. 
{{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1943.632398] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 483824d1-4994-436a-ba16-12524684405c] Created VM on the ESX host {{(pid=67899) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1943.633110] env[67899]: DEBUG oslo_concurrency.lockutils [None req-2895de4c-3d88-407d-8d74-0788e4b279f8 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1943.633282] env[67899]: DEBUG oslo_concurrency.lockutils [None req-2895de4c-3d88-407d-8d74-0788e4b279f8 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1943.633605] env[67899]: DEBUG oslo_concurrency.lockutils [None req-2895de4c-3d88-407d-8d74-0788e4b279f8 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1943.633872] env[67899]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-8d436d53-e35f-4b21-9614-eb9ba106e99d {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1943.638555] env[67899]: DEBUG oslo_vmware.api [None req-2895de4c-3d88-407d-8d74-0788e4b279f8 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] Waiting for the task: (returnval){ [ 1943.638555] env[67899]: value = "session[5298859c-589e-10f3-4f70-bf47a7ea371b]52116708-830c-d508-f882-8e6406f672b1" [ 1943.638555] env[67899]: _type = "Task" [ 1943.638555] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1943.645960] env[67899]: DEBUG oslo_vmware.api [None req-2895de4c-3d88-407d-8d74-0788e4b279f8 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] Task: {'id': session[5298859c-589e-10f3-4f70-bf47a7ea371b]52116708-830c-d508-f882-8e6406f672b1, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1944.149887] env[67899]: DEBUG oslo_concurrency.lockutils [None req-2895de4c-3d88-407d-8d74-0788e4b279f8 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1944.150151] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-2895de4c-3d88-407d-8d74-0788e4b279f8 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] [instance: 483824d1-4994-436a-ba16-12524684405c] Processing image c655a05a-4a40-4b3f-b609-3ba8116ad90f {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1944.150364] env[67899]: DEBUG oslo_concurrency.lockutils [None req-2895de4c-3d88-407d-8d74-0788e4b279f8 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1949.014101] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1951.996480] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1953.997057] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1953.997414] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Starting heal instance info cache {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1953.997414] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Rebuilding the list of instances to heal {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1954.018865] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: addcc88a-6bb5-4a70-938e-49c0c79c8414] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1954.019064] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: a6544af8-879d-4c45-bee4-8551b861fc66] Skipping network cache update for instance because it is Building. 
{{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1954.019206] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 9b4a7c14-84dc-4222-a758-3f8f10e23b7a] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1954.019335] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: e08f620d-63a0-45cb-99c6-d9d95c938b38] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1954.019456] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 77ac61b9-48cc-4ae8-81e7-273841f7b42d] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1954.019573] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: a9ef96da-fcfd-4fb5-bbb1-5178111a8a62] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1954.019688] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: c4fe8b3e-cee1-401b-a26f-907a8de95eba] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1954.019802] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: a993c6a9-140f-430d-a77e-98c2567bf7af] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1954.019916] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: c17d88cf-69ba-43e9-a672-24503c65e9f2] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1954.020037] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 483824d1-4994-436a-ba16-12524684405c] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1954.020158] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Didn't find any instances for network info cache update. 
{{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1955.996387] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1955.996740] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1956.996878] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1956.997333] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager.update_available_resource {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1957.010316] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1957.010519] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1957.010927] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1957.010927] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67899) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1957.011915] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dd5ffec3-0550-4b9d-a7c4-f2d66fa3d2e0 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1957.020479] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4ced9b96-921c-49b1-9b21-4785fa74e00f {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1957.035141] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5ddd5a94-b10b-4e20-b410-df9f9cbdcc6a {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1957.041813] env[67899]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-16f6cb8a-9312-459c-bc98-fa91aa36b2a9 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1957.071965] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180928MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=67899) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1957.072174] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1957.072328] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1957.147502] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance addcc88a-6bb5-4a70-938e-49c0c79c8414 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1957.147668] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance a6544af8-879d-4c45-bee4-8551b861fc66 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1957.147788] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 9b4a7c14-84dc-4222-a758-3f8f10e23b7a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1957.147899] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance e08f620d-63a0-45cb-99c6-d9d95c938b38 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1957.148031] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 77ac61b9-48cc-4ae8-81e7-273841f7b42d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1957.148153] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance a9ef96da-fcfd-4fb5-bbb1-5178111a8a62 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1957.148268] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance c4fe8b3e-cee1-401b-a26f-907a8de95eba actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1957.148385] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance a993c6a9-140f-430d-a77e-98c2567bf7af actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1957.148499] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance c17d88cf-69ba-43e9-a672-24503c65e9f2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1957.148611] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 483824d1-4994-436a-ba16-12524684405c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1957.148809] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67899) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1957.148959] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67899) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1957.283940] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7e833f63-6d1c-48e1-ab56-5deb818f6809 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1957.290508] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e9aa851a-814b-4504-ab62-8667ab879b1a {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1957.322639] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-280900c2-a647-46fa-bef2-50890f30ee5a {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1957.330356] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-93639064-7022-4196-8f62-3c9dbb589baf {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1957.343455] env[67899]: DEBUG nova.compute.provider_tree [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Inventory has not changed in ProviderTree for provider: fffa0b42-f65d-4394-a98c-0df038b9ed4b {{(pid=67899) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1957.352305] env[67899]: DEBUG nova.scheduler.client.report [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Inventory has not changed for provider fffa0b42-f65d-4394-a98c-0df038b9ed4b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67899) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1957.368891] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67899) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1957.369105] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.297s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1958.368837] env[67899]: DEBUG oslo_service.periodic_task [None 
req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1958.369126] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1958.369268] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=67899) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1988.486043] env[67899]: WARNING oslo_vmware.rw_handles [None req-0034688e-73b1-4561-afa0-7f613b171593 tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1988.486043] env[67899]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1988.486043] env[67899]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1988.486043] env[67899]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1988.486043] env[67899]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1988.486043] env[67899]: ERROR oslo_vmware.rw_handles response.begin() [ 1988.486043] env[67899]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1988.486043] env[67899]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1988.486043] env[67899]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1988.486043] env[67899]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1988.486043] env[67899]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1988.486043] env[67899]: ERROR oslo_vmware.rw_handles [ 1988.486880] env[67899]: DEBUG nova.virt.vmwareapi.images [None req-0034688e-73b1-4561-afa0-7f613b171593 tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] [instance: addcc88a-6bb5-4a70-938e-49c0c79c8414] Downloaded image file data c655a05a-4a40-4b3f-b609-3ba8116ad90f to vmware_temp/c98b53fd-5ef3-45ef-b413-16f2210ec889/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk on the data store datastore1 {{(pid=67899) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1988.488322] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-0034688e-73b1-4561-afa0-7f613b171593 tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] [instance: addcc88a-6bb5-4a70-938e-49c0c79c8414] Caching image {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1988.488556] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [None req-0034688e-73b1-4561-afa0-7f613b171593 tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] Copying Virtual Disk [datastore1] vmware_temp/c98b53fd-5ef3-45ef-b413-16f2210ec889/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk to 
[datastore1] vmware_temp/c98b53fd-5ef3-45ef-b413-16f2210ec889/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk {{(pid=67899) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1988.488835] env[67899]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-90d4ed62-4fb5-40d2-93e2-c7a1317421cc {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1988.496802] env[67899]: DEBUG oslo_vmware.api [None req-0034688e-73b1-4561-afa0-7f613b171593 tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] Waiting for the task: (returnval){ [ 1988.496802] env[67899]: value = "task-3468024" [ 1988.496802] env[67899]: _type = "Task" [ 1988.496802] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1988.504425] env[67899]: DEBUG oslo_vmware.api [None req-0034688e-73b1-4561-afa0-7f613b171593 tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] Task: {'id': task-3468024, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1989.007864] env[67899]: DEBUG oslo_vmware.exceptions [None req-0034688e-73b1-4561-afa0-7f613b171593 tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] Fault InvalidArgument not matched. {{(pid=67899) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1989.008190] env[67899]: DEBUG oslo_concurrency.lockutils [None req-0034688e-73b1-4561-afa0-7f613b171593 tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1989.008866] env[67899]: ERROR nova.compute.manager [None req-0034688e-73b1-4561-afa0-7f613b171593 tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] [instance: addcc88a-6bb5-4a70-938e-49c0c79c8414] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1989.008866] env[67899]: Faults: ['InvalidArgument'] [ 1989.008866] env[67899]: ERROR nova.compute.manager [instance: addcc88a-6bb5-4a70-938e-49c0c79c8414] Traceback (most recent call last): [ 1989.008866] env[67899]: ERROR nova.compute.manager [instance: addcc88a-6bb5-4a70-938e-49c0c79c8414] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1989.008866] env[67899]: ERROR nova.compute.manager [instance: addcc88a-6bb5-4a70-938e-49c0c79c8414] yield resources [ 1989.008866] env[67899]: ERROR nova.compute.manager [instance: addcc88a-6bb5-4a70-938e-49c0c79c8414] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1989.008866] env[67899]: ERROR nova.compute.manager [instance: addcc88a-6bb5-4a70-938e-49c0c79c8414] self.driver.spawn(context, instance, image_meta, [ 1989.008866] env[67899]: ERROR nova.compute.manager [instance: addcc88a-6bb5-4a70-938e-49c0c79c8414] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1989.008866] 
env[67899]: ERROR nova.compute.manager [instance: addcc88a-6bb5-4a70-938e-49c0c79c8414] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1989.008866] env[67899]: ERROR nova.compute.manager [instance: addcc88a-6bb5-4a70-938e-49c0c79c8414] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1989.008866] env[67899]: ERROR nova.compute.manager [instance: addcc88a-6bb5-4a70-938e-49c0c79c8414] self._fetch_image_if_missing(context, vi) [ 1989.008866] env[67899]: ERROR nova.compute.manager [instance: addcc88a-6bb5-4a70-938e-49c0c79c8414] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1989.008866] env[67899]: ERROR nova.compute.manager [instance: addcc88a-6bb5-4a70-938e-49c0c79c8414] image_cache(vi, tmp_image_ds_loc) [ 1989.008866] env[67899]: ERROR nova.compute.manager [instance: addcc88a-6bb5-4a70-938e-49c0c79c8414] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1989.008866] env[67899]: ERROR nova.compute.manager [instance: addcc88a-6bb5-4a70-938e-49c0c79c8414] vm_util.copy_virtual_disk( [ 1989.008866] env[67899]: ERROR nova.compute.manager [instance: addcc88a-6bb5-4a70-938e-49c0c79c8414] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1989.008866] env[67899]: ERROR nova.compute.manager [instance: addcc88a-6bb5-4a70-938e-49c0c79c8414] session._wait_for_task(vmdk_copy_task) [ 1989.008866] env[67899]: ERROR nova.compute.manager [instance: addcc88a-6bb5-4a70-938e-49c0c79c8414] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1989.008866] env[67899]: ERROR nova.compute.manager [instance: addcc88a-6bb5-4a70-938e-49c0c79c8414] return self.wait_for_task(task_ref) [ 1989.008866] env[67899]: ERROR nova.compute.manager [instance: addcc88a-6bb5-4a70-938e-49c0c79c8414] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1989.008866] env[67899]: ERROR nova.compute.manager [instance: addcc88a-6bb5-4a70-938e-49c0c79c8414] return evt.wait() [ 1989.008866] env[67899]: ERROR nova.compute.manager [instance: addcc88a-6bb5-4a70-938e-49c0c79c8414] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1989.008866] env[67899]: ERROR nova.compute.manager [instance: addcc88a-6bb5-4a70-938e-49c0c79c8414] result = hub.switch() [ 1989.008866] env[67899]: ERROR nova.compute.manager [instance: addcc88a-6bb5-4a70-938e-49c0c79c8414] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1989.008866] env[67899]: ERROR nova.compute.manager [instance: addcc88a-6bb5-4a70-938e-49c0c79c8414] return self.greenlet.switch() [ 1989.008866] env[67899]: ERROR nova.compute.manager [instance: addcc88a-6bb5-4a70-938e-49c0c79c8414] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1989.008866] env[67899]: ERROR nova.compute.manager [instance: addcc88a-6bb5-4a70-938e-49c0c79c8414] self.f(*self.args, **self.kw) [ 1989.008866] env[67899]: ERROR nova.compute.manager [instance: addcc88a-6bb5-4a70-938e-49c0c79c8414] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1989.008866] env[67899]: ERROR nova.compute.manager [instance: addcc88a-6bb5-4a70-938e-49c0c79c8414] raise exceptions.translate_fault(task_info.error) [ 1989.008866] env[67899]: ERROR nova.compute.manager [instance: 
addcc88a-6bb5-4a70-938e-49c0c79c8414] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1989.008866] env[67899]: ERROR nova.compute.manager [instance: addcc88a-6bb5-4a70-938e-49c0c79c8414] Faults: ['InvalidArgument'] [ 1989.008866] env[67899]: ERROR nova.compute.manager [instance: addcc88a-6bb5-4a70-938e-49c0c79c8414] [ 1989.009935] env[67899]: INFO nova.compute.manager [None req-0034688e-73b1-4561-afa0-7f613b171593 tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] [instance: addcc88a-6bb5-4a70-938e-49c0c79c8414] Terminating instance [ 1989.010809] env[67899]: DEBUG oslo_concurrency.lockutils [None req-a9f9c866-17da-47e5-9277-2051f1ca187a tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1989.011062] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-a9f9c866-17da-47e5-9277-2051f1ca187a tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1989.011367] env[67899]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-ba6b72c3-cd16-476a-81ad-162dcdeb7a1e {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1989.013540] env[67899]: DEBUG nova.compute.manager [None req-0034688e-73b1-4561-afa0-7f613b171593 tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] [instance: addcc88a-6bb5-4a70-938e-49c0c79c8414] Start destroying the instance on the hypervisor. 
{{(pid=67899) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1989.013730] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-0034688e-73b1-4561-afa0-7f613b171593 tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] [instance: addcc88a-6bb5-4a70-938e-49c0c79c8414] Destroying instance {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1989.014474] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-edc08aa4-e06a-42c8-afe4-977377001ec2 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1989.021401] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-0034688e-73b1-4561-afa0-7f613b171593 tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] [instance: addcc88a-6bb5-4a70-938e-49c0c79c8414] Unregistering the VM {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1989.021519] env[67899]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-02cd2fee-7d50-4de1-b952-168fd2c3ba60 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1989.023610] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-a9f9c866-17da-47e5-9277-2051f1ca187a tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1989.023792] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-a9f9c866-17da-47e5-9277-2051f1ca187a tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67899) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1989.024776] env[67899]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-bb1febfe-4fa8-40a9-9756-fa7e0e39b38e {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1989.029297] env[67899]: DEBUG oslo_vmware.api [None req-a9f9c866-17da-47e5-9277-2051f1ca187a tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Waiting for the task: (returnval){ [ 1989.029297] env[67899]: value = "session[5298859c-589e-10f3-4f70-bf47a7ea371b]52786d68-4e82-e144-0adb-56f644fb7940" [ 1989.029297] env[67899]: _type = "Task" [ 1989.029297] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1989.036364] env[67899]: DEBUG oslo_vmware.api [None req-a9f9c866-17da-47e5-9277-2051f1ca187a tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Task: {'id': session[5298859c-589e-10f3-4f70-bf47a7ea371b]52786d68-4e82-e144-0adb-56f644fb7940, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1989.096240] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-0034688e-73b1-4561-afa0-7f613b171593 tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] [instance: addcc88a-6bb5-4a70-938e-49c0c79c8414] Unregistered the VM {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1989.096533] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-0034688e-73b1-4561-afa0-7f613b171593 tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] [instance: addcc88a-6bb5-4a70-938e-49c0c79c8414] Deleting contents of the VM from datastore datastore1 {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1989.096628] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-0034688e-73b1-4561-afa0-7f613b171593 tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] Deleting the datastore file [datastore1] addcc88a-6bb5-4a70-938e-49c0c79c8414 {{(pid=67899) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1989.096896] env[67899]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-5ca402a0-e92a-4316-9533-0e1d1073c2ee {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1989.103907] env[67899]: DEBUG oslo_vmware.api [None req-0034688e-73b1-4561-afa0-7f613b171593 tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] Waiting for the task: (returnval){ [ 1989.103907] env[67899]: value = "task-3468026" [ 1989.103907] env[67899]: _type = "Task" [ 1989.103907] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1989.112740] env[67899]: DEBUG oslo_vmware.api [None req-0034688e-73b1-4561-afa0-7f613b171593 tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] Task: {'id': task-3468026, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1989.540476] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-a9f9c866-17da-47e5-9277-2051f1ca187a tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: a6544af8-879d-4c45-bee4-8551b861fc66] Preparing fetch location {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1989.540892] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-a9f9c866-17da-47e5-9277-2051f1ca187a tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Creating directory with path [datastore1] vmware_temp/ca5addf5-5419-414d-ae06-bbe4882ca215/c655a05a-4a40-4b3f-b609-3ba8116ad90f {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1989.540993] env[67899]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-b565c1a8-eb32-4053-9321-674854d60f22 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1989.551832] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-a9f9c866-17da-47e5-9277-2051f1ca187a tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Created directory with path [datastore1] vmware_temp/ca5addf5-5419-414d-ae06-bbe4882ca215/c655a05a-4a40-4b3f-b609-3ba8116ad90f {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1989.552019] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-a9f9c866-17da-47e5-9277-2051f1ca187a tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: a6544af8-879d-4c45-bee4-8551b861fc66] Fetch image to [datastore1] vmware_temp/ca5addf5-5419-414d-ae06-bbe4882ca215/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1989.552199] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-a9f9c866-17da-47e5-9277-2051f1ca187a tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: a6544af8-879d-4c45-bee4-8551b861fc66] Downloading image file data c655a05a-4a40-4b3f-b609-3ba8116ad90f to [datastore1] vmware_temp/ca5addf5-5419-414d-ae06-bbe4882ca215/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk on the data store datastore1 {{(pid=67899) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1989.552897] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-67a2f305-a403-4ed9-a7d6-776681057ac1 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1989.559257] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c826f43b-1412-4cab-871d-ec2cc8b05332 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1989.568132] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0f7ab841-54c8-4155-85f7-7246fd007c9b {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1989.597650] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-03439f9a-8a70-4b5d-ae32-b3ff625fdfa7 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1989.602949] env[67899]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-a16ac72c-21ee-4ff0-a4e6-33d19aff93aa {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1989.611502] env[67899]: DEBUG oslo_vmware.api [None req-0034688e-73b1-4561-afa0-7f613b171593 tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] Task: {'id': task-3468026, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.074596} completed successfully. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1989.611727] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-0034688e-73b1-4561-afa0-7f613b171593 tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] Deleted the datastore file {{(pid=67899) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1989.611902] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-0034688e-73b1-4561-afa0-7f613b171593 tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] [instance: addcc88a-6bb5-4a70-938e-49c0c79c8414] Deleted contents of the VM from datastore datastore1 {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1989.612105] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-0034688e-73b1-4561-afa0-7f613b171593 tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] [instance: addcc88a-6bb5-4a70-938e-49c0c79c8414] Instance destroyed {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1989.612307] env[67899]: INFO nova.compute.manager [None req-0034688e-73b1-4561-afa0-7f613b171593 tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] [instance: addcc88a-6bb5-4a70-938e-49c0c79c8414] Took 0.60 seconds to destroy the instance on the hypervisor. 
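The task polling visible throughout this stretch (Waiting for the task ... progress is 0% ... completed successfully, or a raised VimFaultException as in the CopyVirtualDisk_Task failure above) follows the oslo.vmware pattern: wait_for_task() repeatedly reads the task's info, returns on success, and on error raises an exception translated from the VIM fault; the "Fault InvalidArgument not matched" DEBUG line shows the translator falling back to the generic VimFaultException. A minimal sketch of that loop, under the assumption that get_task_info() returns a dict shaped like {'state': 'running'|'success'|'error', 'result': ..., 'error': {'fault': ..., 'message': ...}} (the names get_task_info, translate_fault, and the dict layout stand in for the real oslo_vmware internals):

    import time

    class VimFaultException(Exception):
        """Stand-in for oslo_vmware.exceptions.VimFaultException."""
        def __init__(self, fault_list, message):
            super().__init__(message)
            self.fault_list = fault_list

    def translate_fault(error):
        # Simplified: the real translator maps known fault names to
        # specific exception classes and, when the fault is "not matched"
        # (here: 'InvalidArgument'), falls back to VimFaultException.
        return VimFaultException([error['fault']], error['message'])

    def wait_for_task(get_task_info, poll_interval=0.5):
        """Poll a vCenter task until it finishes, oslo.vmware-style."""
        while True:
            info = get_task_info()
            if info['state'] == 'success':
                return info.get('result')
            if info['state'] == 'error':
                # The CopyVirtualDisk_Task above ends on this path with
                # "A specified parameter was not correct: fileType".
                raise translate_fault(info['error'])
            # Corresponds to the repeated "progress is 0%" DEBUG lines.
            time.sleep(poll_interval)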
[ 1989.614413] env[67899]: DEBUG nova.compute.claims [None req-0034688e-73b1-4561-afa0-7f613b171593 tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] [instance: addcc88a-6bb5-4a70-938e-49c0c79c8414] Aborting claim: {{(pid=67899) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1989.614667] env[67899]: DEBUG oslo_concurrency.lockutils [None req-0034688e-73b1-4561-afa0-7f613b171593 tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1989.614784] env[67899]: DEBUG oslo_concurrency.lockutils [None req-0034688e-73b1-4561-afa0-7f613b171593 tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1989.627027] env[67899]: DEBUG nova.virt.vmwareapi.images [None req-a9f9c866-17da-47e5-9277-2051f1ca187a tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: a6544af8-879d-4c45-bee4-8551b861fc66] Downloading image file data c655a05a-4a40-4b3f-b609-3ba8116ad90f to the data store datastore1 {{(pid=67899) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1989.675398] env[67899]: DEBUG oslo_vmware.rw_handles [None req-a9f9c866-17da-47e5-9277-2051f1ca187a tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/ca5addf5-5419-414d-ae06-bbe4882ca215/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67899) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1989.735176] env[67899]: DEBUG oslo_vmware.rw_handles [None req-a9f9c866-17da-47e5-9277-2051f1ca187a tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Completed reading data from the image iterator. {{(pid=67899) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1989.735366] env[67899]: DEBUG oslo_vmware.rw_handles [None req-a9f9c866-17da-47e5-9277-2051f1ca187a tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/ca5addf5-5419-414d-ae06-bbe4882ca215/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67899) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1989.823141] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-900a7fb1-834b-43d0-acb3-f6c9cf398310 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1989.830522] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1c07a118-ab63-459f-91ac-cb0991210772 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1989.861230] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5ab88481-40c3-40de-a6d2-6d2a3b6b5f15 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1989.867973] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-08b6b953-dc77-45da-a507-cad59dd901ad {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1989.880755] env[67899]: DEBUG nova.compute.provider_tree [None req-0034688e-73b1-4561-afa0-7f613b171593 tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] Inventory has not changed in ProviderTree for provider: fffa0b42-f65d-4394-a98c-0df038b9ed4b {{(pid=67899) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1989.889261] env[67899]: DEBUG nova.scheduler.client.report [None req-0034688e-73b1-4561-afa0-7f613b171593 tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] Inventory has not changed for provider fffa0b42-f65d-4394-a98c-0df038b9ed4b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67899) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1989.904688] env[67899]: DEBUG oslo_concurrency.lockutils [None req-0034688e-73b1-4561-afa0-7f613b171593 tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.290s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1989.905244] env[67899]: ERROR nova.compute.manager [None req-0034688e-73b1-4561-afa0-7f613b171593 tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] [instance: addcc88a-6bb5-4a70-938e-49c0c79c8414] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1989.905244] env[67899]: Faults: ['InvalidArgument'] [ 1989.905244] env[67899]: ERROR nova.compute.manager [instance: addcc88a-6bb5-4a70-938e-49c0c79c8414] Traceback (most recent call last): [ 1989.905244] env[67899]: ERROR nova.compute.manager [instance: addcc88a-6bb5-4a70-938e-49c0c79c8414] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1989.905244] 
env[67899]: ERROR nova.compute.manager [instance: addcc88a-6bb5-4a70-938e-49c0c79c8414] self.driver.spawn(context, instance, image_meta, [ 1989.905244] env[67899]: ERROR nova.compute.manager [instance: addcc88a-6bb5-4a70-938e-49c0c79c8414] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1989.905244] env[67899]: ERROR nova.compute.manager [instance: addcc88a-6bb5-4a70-938e-49c0c79c8414] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1989.905244] env[67899]: ERROR nova.compute.manager [instance: addcc88a-6bb5-4a70-938e-49c0c79c8414] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1989.905244] env[67899]: ERROR nova.compute.manager [instance: addcc88a-6bb5-4a70-938e-49c0c79c8414] self._fetch_image_if_missing(context, vi) [ 1989.905244] env[67899]: ERROR nova.compute.manager [instance: addcc88a-6bb5-4a70-938e-49c0c79c8414] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1989.905244] env[67899]: ERROR nova.compute.manager [instance: addcc88a-6bb5-4a70-938e-49c0c79c8414] image_cache(vi, tmp_image_ds_loc) [ 1989.905244] env[67899]: ERROR nova.compute.manager [instance: addcc88a-6bb5-4a70-938e-49c0c79c8414] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1989.905244] env[67899]: ERROR nova.compute.manager [instance: addcc88a-6bb5-4a70-938e-49c0c79c8414] vm_util.copy_virtual_disk( [ 1989.905244] env[67899]: ERROR nova.compute.manager [instance: addcc88a-6bb5-4a70-938e-49c0c79c8414] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1989.905244] env[67899]: ERROR nova.compute.manager [instance: addcc88a-6bb5-4a70-938e-49c0c79c8414] session._wait_for_task(vmdk_copy_task) [ 1989.905244] env[67899]: ERROR nova.compute.manager [instance: addcc88a-6bb5-4a70-938e-49c0c79c8414] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1989.905244] env[67899]: ERROR nova.compute.manager [instance: addcc88a-6bb5-4a70-938e-49c0c79c8414] return self.wait_for_task(task_ref) [ 1989.905244] env[67899]: ERROR nova.compute.manager [instance: addcc88a-6bb5-4a70-938e-49c0c79c8414] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1989.905244] env[67899]: ERROR nova.compute.manager [instance: addcc88a-6bb5-4a70-938e-49c0c79c8414] return evt.wait() [ 1989.905244] env[67899]: ERROR nova.compute.manager [instance: addcc88a-6bb5-4a70-938e-49c0c79c8414] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1989.905244] env[67899]: ERROR nova.compute.manager [instance: addcc88a-6bb5-4a70-938e-49c0c79c8414] result = hub.switch() [ 1989.905244] env[67899]: ERROR nova.compute.manager [instance: addcc88a-6bb5-4a70-938e-49c0c79c8414] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1989.905244] env[67899]: ERROR nova.compute.manager [instance: addcc88a-6bb5-4a70-938e-49c0c79c8414] return self.greenlet.switch() [ 1989.905244] env[67899]: ERROR nova.compute.manager [instance: addcc88a-6bb5-4a70-938e-49c0c79c8414] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1989.905244] env[67899]: ERROR nova.compute.manager [instance: addcc88a-6bb5-4a70-938e-49c0c79c8414] self.f(*self.args, **self.kw) [ 1989.905244] env[67899]: ERROR nova.compute.manager [instance: addcc88a-6bb5-4a70-938e-49c0c79c8414] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1989.905244] env[67899]: ERROR nova.compute.manager [instance: addcc88a-6bb5-4a70-938e-49c0c79c8414] raise exceptions.translate_fault(task_info.error) [ 1989.905244] env[67899]: ERROR nova.compute.manager [instance: addcc88a-6bb5-4a70-938e-49c0c79c8414] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1989.905244] env[67899]: ERROR nova.compute.manager [instance: addcc88a-6bb5-4a70-938e-49c0c79c8414] Faults: ['InvalidArgument'] [ 1989.905244] env[67899]: ERROR nova.compute.manager [instance: addcc88a-6bb5-4a70-938e-49c0c79c8414] [ 1989.906263] env[67899]: DEBUG nova.compute.utils [None req-0034688e-73b1-4561-afa0-7f613b171593 tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] [instance: addcc88a-6bb5-4a70-938e-49c0c79c8414] VimFaultException {{(pid=67899) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1989.907282] env[67899]: DEBUG nova.compute.manager [None req-0034688e-73b1-4561-afa0-7f613b171593 tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] [instance: addcc88a-6bb5-4a70-938e-49c0c79c8414] Build of instance addcc88a-6bb5-4a70-938e-49c0c79c8414 was re-scheduled: A specified parameter was not correct: fileType [ 1989.907282] env[67899]: Faults: ['InvalidArgument'] {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1989.907643] env[67899]: DEBUG nova.compute.manager [None req-0034688e-73b1-4561-afa0-7f613b171593 tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] [instance: addcc88a-6bb5-4a70-938e-49c0c79c8414] Unplugging VIFs for instance {{(pid=67899) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1989.907811] env[67899]: DEBUG nova.compute.manager [None req-0034688e-73b1-4561-afa0-7f613b171593 tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67899) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1989.907980] env[67899]: DEBUG nova.compute.manager [None req-0034688e-73b1-4561-afa0-7f613b171593 tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] [instance: addcc88a-6bb5-4a70-938e-49c0c79c8414] Deallocating network for instance {{(pid=67899) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1989.908163] env[67899]: DEBUG nova.network.neutron [None req-0034688e-73b1-4561-afa0-7f613b171593 tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] [instance: addcc88a-6bb5-4a70-938e-49c0c79c8414] deallocate_for_instance() {{(pid=67899) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1990.277375] env[67899]: DEBUG nova.network.neutron [None req-0034688e-73b1-4561-afa0-7f613b171593 tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] [instance: addcc88a-6bb5-4a70-938e-49c0c79c8414] Updating instance_info_cache with network_info: [] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1990.292812] env[67899]: INFO nova.compute.manager [None req-0034688e-73b1-4561-afa0-7f613b171593 tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] [instance: addcc88a-6bb5-4a70-938e-49c0c79c8414] Took 0.38 seconds to deallocate network for instance. [ 1990.382172] env[67899]: INFO nova.scheduler.client.report [None req-0034688e-73b1-4561-afa0-7f613b171593 tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] Deleted allocations for instance addcc88a-6bb5-4a70-938e-49c0c79c8414 [ 1990.404697] env[67899]: DEBUG oslo_concurrency.lockutils [None req-0034688e-73b1-4561-afa0-7f613b171593 tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] Lock "addcc88a-6bb5-4a70-938e-49c0c79c8414" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 549.338s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1990.404977] env[67899]: DEBUG oslo_concurrency.lockutils [None req-6aa5208c-9ac9-45f2-9084-015d9cab363a tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] Lock "addcc88a-6bb5-4a70-938e-49c0c79c8414" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 354.185s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1990.405223] env[67899]: DEBUG oslo_concurrency.lockutils [None req-6aa5208c-9ac9-45f2-9084-015d9cab363a tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] Acquiring lock "addcc88a-6bb5-4a70-938e-49c0c79c8414-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1990.405450] env[67899]: DEBUG oslo_concurrency.lockutils [None req-6aa5208c-9ac9-45f2-9084-015d9cab363a tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] Lock "addcc88a-6bb5-4a70-938e-49c0c79c8414-events" acquired by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1990.405630] env[67899]: DEBUG oslo_concurrency.lockutils [None req-6aa5208c-9ac9-45f2-9084-015d9cab363a tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] Lock "addcc88a-6bb5-4a70-938e-49c0c79c8414-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1990.407746] env[67899]: INFO nova.compute.manager [None req-6aa5208c-9ac9-45f2-9084-015d9cab363a tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] [instance: addcc88a-6bb5-4a70-938e-49c0c79c8414] Terminating instance [ 1990.409900] env[67899]: DEBUG nova.compute.manager [None req-6aa5208c-9ac9-45f2-9084-015d9cab363a tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] [instance: addcc88a-6bb5-4a70-938e-49c0c79c8414] Start destroying the instance on the hypervisor. {{(pid=67899) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1990.410112] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-6aa5208c-9ac9-45f2-9084-015d9cab363a tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] [instance: addcc88a-6bb5-4a70-938e-49c0c79c8414] Destroying instance {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1990.410657] env[67899]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-5a3dc540-9e86-45a4-bc80-7f689c0ae809 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1990.419781] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6e52cdef-7625-4494-ad00-4de9b37b1891 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1990.448599] env[67899]: WARNING nova.virt.vmwareapi.vmops [None req-6aa5208c-9ac9-45f2-9084-015d9cab363a tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] [instance: addcc88a-6bb5-4a70-938e-49c0c79c8414] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance addcc88a-6bb5-4a70-938e-49c0c79c8414 could not be found. [ 1990.448814] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-6aa5208c-9ac9-45f2-9084-015d9cab363a tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] [instance: addcc88a-6bb5-4a70-938e-49c0c79c8414] Instance destroyed {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1990.448995] env[67899]: INFO nova.compute.manager [None req-6aa5208c-9ac9-45f2-9084-015d9cab363a tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] [instance: addcc88a-6bb5-4a70-938e-49c0c79c8414] Took 0.04 seconds to destroy the instance on the hypervisor. 
[ 1990.449559] env[67899]: DEBUG oslo.service.loopingcall [None req-6aa5208c-9ac9-45f2-9084-015d9cab363a tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=67899) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1990.449800] env[67899]: DEBUG nova.compute.manager [-] [instance: addcc88a-6bb5-4a70-938e-49c0c79c8414] Deallocating network for instance {{(pid=67899) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1990.449906] env[67899]: DEBUG nova.network.neutron [-] [instance: addcc88a-6bb5-4a70-938e-49c0c79c8414] deallocate_for_instance() {{(pid=67899) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1990.482065] env[67899]: DEBUG nova.network.neutron [-] [instance: addcc88a-6bb5-4a70-938e-49c0c79c8414] Updating instance_info_cache with network_info: [] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1990.490036] env[67899]: INFO nova.compute.manager [-] [instance: addcc88a-6bb5-4a70-938e-49c0c79c8414] Took 0.04 seconds to deallocate network for instance. [ 1990.573945] env[67899]: DEBUG oslo_concurrency.lockutils [None req-6aa5208c-9ac9-45f2-9084-015d9cab363a tempest-ServerDiskConfigTestJSON-1424063644 tempest-ServerDiskConfigTestJSON-1424063644-project-member] Lock "addcc88a-6bb5-4a70-938e-49c0c79c8414" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.169s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1990.574842] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "addcc88a-6bb5-4a70-938e-49c0c79c8414" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 214.564s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1990.575089] env[67899]: INFO nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: addcc88a-6bb5-4a70-938e-49c0c79c8414] During sync_power_state the instance has a pending task (deleting). Skip. 
[ 1990.575291] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "addcc88a-6bb5-4a70-938e-49c0c79c8414" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.001s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2010.993894] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2013.998594] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2015.996969] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2015.997301] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Starting heal instance info cache {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 2015.997301] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Rebuilding the list of instances to heal {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 2016.023929] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: a6544af8-879d-4c45-bee4-8551b861fc66] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2016.023929] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 9b4a7c14-84dc-4222-a758-3f8f10e23b7a] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2016.024277] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: e08f620d-63a0-45cb-99c6-d9d95c938b38] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2016.024277] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 77ac61b9-48cc-4ae8-81e7-273841f7b42d] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2016.024277] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: a9ef96da-fcfd-4fb5-bbb1-5178111a8a62] Skipping network cache update for instance because it is Building. 
{{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2016.024479] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: c4fe8b3e-cee1-401b-a26f-907a8de95eba] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2016.024654] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: a993c6a9-140f-430d-a77e-98c2567bf7af] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2016.024654] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: c17d88cf-69ba-43e9-a672-24503c65e9f2] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2016.024754] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 483824d1-4994-436a-ba16-12524684405c] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2016.024983] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Didn't find any instances for network info cache update. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 2016.025469] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2016.025648] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2016.996244] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2016.996511] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager.update_available_resource {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2017.009382] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2017.009715] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2017.009777] env[67899]: DEBUG oslo_concurrency.lockutils [None 
req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2017.009914] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67899) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 2017.015165] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9b933887-0acf-431d-b4e8-acd0f8dc09f7 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2017.024666] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3f510b62-b3ef-48c2-b14d-07a813f0cf33 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2017.039762] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-183f9b27-2cd9-4410-90da-383c6aab4fb8 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2017.047672] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-28124f40-0c47-4c49-a202-8d1d9decf2ef {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2017.082359] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180936MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=67899) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 2017.082799] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2017.083151] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2017.172664] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance a6544af8-879d-4c45-bee4-8551b861fc66 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2017.172664] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 9b4a7c14-84dc-4222-a758-3f8f10e23b7a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2017.172664] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance e08f620d-63a0-45cb-99c6-d9d95c938b38 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2017.172664] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 77ac61b9-48cc-4ae8-81e7-273841f7b42d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2017.172664] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance a9ef96da-fcfd-4fb5-bbb1-5178111a8a62 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2017.172664] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance c4fe8b3e-cee1-401b-a26f-907a8de95eba actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2017.172664] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance a993c6a9-140f-430d-a77e-98c2567bf7af actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2017.172664] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance c17d88cf-69ba-43e9-a672-24503c65e9f2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2017.172664] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 483824d1-4994-436a-ba16-12524684405c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2017.185882] env[67899]: INFO nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance cc1164c7-82bb-4d80-89ad-e9ba5658d9c8 has allocations against this compute host but is not found in the database. 
[ 2017.186370] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Total usable vcpus: 48, total allocated vcpus: 9 {{(pid=67899) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 2017.186638] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1664MB phys_disk=200GB used_disk=9GB total_vcpus=48 used_vcpus=9 pci_stats=[] {{(pid=67899) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 2017.252699] env[67899]: DEBUG oslo_concurrency.lockutils [None req-c0b846cb-0beb-469c-8d24-30f38e3ed0f9 tempest-AttachVolumeTestJSON-1730857508 tempest-AttachVolumeTestJSON-1730857508-project-member] Acquiring lock "cc1164c7-82bb-4d80-89ad-e9ba5658d9c8" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2017.253107] env[67899]: DEBUG oslo_concurrency.lockutils [None req-c0b846cb-0beb-469c-8d24-30f38e3ed0f9 tempest-AttachVolumeTestJSON-1730857508 tempest-AttachVolumeTestJSON-1730857508-project-member] Lock "cc1164c7-82bb-4d80-89ad-e9ba5658d9c8" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2017.268632] env[67899]: DEBUG nova.compute.manager [None req-c0b846cb-0beb-469c-8d24-30f38e3ed0f9 tempest-AttachVolumeTestJSON-1730857508 tempest-AttachVolumeTestJSON-1730857508-project-member] [instance: cc1164c7-82bb-4d80-89ad-e9ba5658d9c8] Starting instance... 
{{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 2017.335473] env[67899]: DEBUG oslo_concurrency.lockutils [None req-c0b846cb-0beb-469c-8d24-30f38e3ed0f9 tempest-AttachVolumeTestJSON-1730857508 tempest-AttachVolumeTestJSON-1730857508-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2017.350102] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dcd2fa33-4ac9-47e1-97a3-15df226604d5 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2017.358640] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b4e2bab4-bae5-45aa-b3e5-15bb004bce8a {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2017.389500] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5c1ca81a-d072-4165-a481-ca329acffdc8 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2017.397651] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0e4fdc83-aa21-4835-acee-7a33264afb31 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2017.412642] env[67899]: DEBUG nova.compute.provider_tree [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Inventory has not changed in ProviderTree for provider: fffa0b42-f65d-4394-a98c-0df038b9ed4b {{(pid=67899) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2017.421543] env[67899]: DEBUG nova.scheduler.client.report [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Inventory has not changed for provider fffa0b42-f65d-4394-a98c-0df038b9ed4b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67899) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2017.436052] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67899) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 2017.436254] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.353s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2017.436520] env[67899]: DEBUG oslo_concurrency.lockutils [None req-c0b846cb-0beb-469c-8d24-30f38e3ed0f9 tempest-AttachVolumeTestJSON-1730857508 tempest-AttachVolumeTestJSON-1730857508-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.101s {{(pid=67899) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2017.438056] env[67899]: INFO nova.compute.claims [None req-c0b846cb-0beb-469c-8d24-30f38e3ed0f9 tempest-AttachVolumeTestJSON-1730857508 tempest-AttachVolumeTestJSON-1730857508-project-member] [instance: cc1164c7-82bb-4d80-89ad-e9ba5658d9c8] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 2017.618207] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-292432c0-64f6-4b62-9844-78a9c0c90c02 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2017.627350] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6a39e2f1-9921-46d7-82ae-1be830ff22f9 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2017.659496] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-79e12f11-8dfe-4edb-96e4-c11f35bb41a2 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2017.666943] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1deba8fe-3792-444c-99db-41d26fc9adca {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2017.680195] env[67899]: DEBUG nova.compute.provider_tree [None req-c0b846cb-0beb-469c-8d24-30f38e3ed0f9 tempest-AttachVolumeTestJSON-1730857508 tempest-AttachVolumeTestJSON-1730857508-project-member] Inventory has not changed in ProviderTree for provider: fffa0b42-f65d-4394-a98c-0df038b9ed4b {{(pid=67899) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2017.689153] env[67899]: DEBUG nova.scheduler.client.report [None req-c0b846cb-0beb-469c-8d24-30f38e3ed0f9 tempest-AttachVolumeTestJSON-1730857508 tempest-AttachVolumeTestJSON-1730857508-project-member] Inventory has not changed for provider fffa0b42-f65d-4394-a98c-0df038b9ed4b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67899) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2017.706156] env[67899]: DEBUG oslo_concurrency.lockutils [None req-c0b846cb-0beb-469c-8d24-30f38e3ed0f9 tempest-AttachVolumeTestJSON-1730857508 tempest-AttachVolumeTestJSON-1730857508-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.269s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2017.706687] env[67899]: DEBUG nova.compute.manager [None req-c0b846cb-0beb-469c-8d24-30f38e3ed0f9 tempest-AttachVolumeTestJSON-1730857508 tempest-AttachVolumeTestJSON-1730857508-project-member] [instance: cc1164c7-82bb-4d80-89ad-e9ba5658d9c8] Start building networks asynchronously for instance. 
{{(pid=67899) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 2017.741371] env[67899]: DEBUG nova.compute.utils [None req-c0b846cb-0beb-469c-8d24-30f38e3ed0f9 tempest-AttachVolumeTestJSON-1730857508 tempest-AttachVolumeTestJSON-1730857508-project-member] Using /dev/sd instead of None {{(pid=67899) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 2017.742861] env[67899]: DEBUG nova.compute.manager [None req-c0b846cb-0beb-469c-8d24-30f38e3ed0f9 tempest-AttachVolumeTestJSON-1730857508 tempest-AttachVolumeTestJSON-1730857508-project-member] [instance: cc1164c7-82bb-4d80-89ad-e9ba5658d9c8] Allocating IP information in the background. {{(pid=67899) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 2017.743043] env[67899]: DEBUG nova.network.neutron [None req-c0b846cb-0beb-469c-8d24-30f38e3ed0f9 tempest-AttachVolumeTestJSON-1730857508 tempest-AttachVolumeTestJSON-1730857508-project-member] [instance: cc1164c7-82bb-4d80-89ad-e9ba5658d9c8] allocate_for_instance() {{(pid=67899) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 2017.752801] env[67899]: DEBUG nova.compute.manager [None req-c0b846cb-0beb-469c-8d24-30f38e3ed0f9 tempest-AttachVolumeTestJSON-1730857508 tempest-AttachVolumeTestJSON-1730857508-project-member] [instance: cc1164c7-82bb-4d80-89ad-e9ba5658d9c8] Start building block device mappings for instance. {{(pid=67899) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 2017.809365] env[67899]: DEBUG nova.policy [None req-c0b846cb-0beb-469c-8d24-30f38e3ed0f9 tempest-AttachVolumeTestJSON-1730857508 tempest-AttachVolumeTestJSON-1730857508-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'b4ef04cb3bbd4881a51534fe50b18a95', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '3b1d299af0314e7e87698f444649de1c', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67899) authorize /opt/stack/nova/nova/policy.py:203}} [ 2017.818614] env[67899]: DEBUG nova.compute.manager [None req-c0b846cb-0beb-469c-8d24-30f38e3ed0f9 tempest-AttachVolumeTestJSON-1730857508 tempest-AttachVolumeTestJSON-1730857508-project-member] [instance: cc1164c7-82bb-4d80-89ad-e9ba5658d9c8] Start spawning the instance on the hypervisor. 
{{(pid=67899) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 2017.845575] env[67899]: DEBUG nova.virt.hardware [None req-c0b846cb-0beb-469c-8d24-30f38e3ed0f9 tempest-AttachVolumeTestJSON-1730857508 tempest-AttachVolumeTestJSON-1730857508-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T22:07:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T22:07:13Z,direct_url=,disk_format='vmdk',id=c655a05a-4a40-4b3f-b609-3ba8116ad90f,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c1aaa2970e964d7b86557399120d12c1',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T22:07:14Z,virtual_size=,visibility=), allow threads: False {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 2017.845817] env[67899]: DEBUG nova.virt.hardware [None req-c0b846cb-0beb-469c-8d24-30f38e3ed0f9 tempest-AttachVolumeTestJSON-1730857508 tempest-AttachVolumeTestJSON-1730857508-project-member] Flavor limits 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 2017.845948] env[67899]: DEBUG nova.virt.hardware [None req-c0b846cb-0beb-469c-8d24-30f38e3ed0f9 tempest-AttachVolumeTestJSON-1730857508 tempest-AttachVolumeTestJSON-1730857508-project-member] Image limits 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 2017.846141] env[67899]: DEBUG nova.virt.hardware [None req-c0b846cb-0beb-469c-8d24-30f38e3ed0f9 tempest-AttachVolumeTestJSON-1730857508 tempest-AttachVolumeTestJSON-1730857508-project-member] Flavor pref 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 2017.846287] env[67899]: DEBUG nova.virt.hardware [None req-c0b846cb-0beb-469c-8d24-30f38e3ed0f9 tempest-AttachVolumeTestJSON-1730857508 tempest-AttachVolumeTestJSON-1730857508-project-member] Image pref 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 2017.846434] env[67899]: DEBUG nova.virt.hardware [None req-c0b846cb-0beb-469c-8d24-30f38e3ed0f9 tempest-AttachVolumeTestJSON-1730857508 tempest-AttachVolumeTestJSON-1730857508-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 2017.846642] env[67899]: DEBUG nova.virt.hardware [None req-c0b846cb-0beb-469c-8d24-30f38e3ed0f9 tempest-AttachVolumeTestJSON-1730857508 tempest-AttachVolumeTestJSON-1730857508-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 2017.846797] env[67899]: DEBUG nova.virt.hardware [None req-c0b846cb-0beb-469c-8d24-30f38e3ed0f9 tempest-AttachVolumeTestJSON-1730857508 tempest-AttachVolumeTestJSON-1730857508-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67899) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 2017.846972] env[67899]: DEBUG nova.virt.hardware [None 
req-c0b846cb-0beb-469c-8d24-30f38e3ed0f9 tempest-AttachVolumeTestJSON-1730857508 tempest-AttachVolumeTestJSON-1730857508-project-member] Got 1 possible topologies {{(pid=67899) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 2017.847153] env[67899]: DEBUG nova.virt.hardware [None req-c0b846cb-0beb-469c-8d24-30f38e3ed0f9 tempest-AttachVolumeTestJSON-1730857508 tempest-AttachVolumeTestJSON-1730857508-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 2017.847325] env[67899]: DEBUG nova.virt.hardware [None req-c0b846cb-0beb-469c-8d24-30f38e3ed0f9 tempest-AttachVolumeTestJSON-1730857508 tempest-AttachVolumeTestJSON-1730857508-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 2017.848497] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9bb1c787-8c1b-481b-8110-3ce805a0a6c6 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2017.856744] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-86b29398-c879-4c6c-aced-2d08b3360907 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2018.160370] env[67899]: DEBUG nova.network.neutron [None req-c0b846cb-0beb-469c-8d24-30f38e3ed0f9 tempest-AttachVolumeTestJSON-1730857508 tempest-AttachVolumeTestJSON-1730857508-project-member] [instance: cc1164c7-82bb-4d80-89ad-e9ba5658d9c8] Successfully created port: 4b596ad5-5bfe-4496-9193-8cf5350f5e5d {{(pid=67899) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 2018.770497] env[67899]: DEBUG nova.compute.manager [req-11fd5817-3f18-4ba9-882b-3b893a119fa7 req-8aaf2ce0-4937-4928-bd77-894ff6afb229 service nova] [instance: cc1164c7-82bb-4d80-89ad-e9ba5658d9c8] Received event network-vif-plugged-4b596ad5-5bfe-4496-9193-8cf5350f5e5d {{(pid=67899) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 2018.770720] env[67899]: DEBUG oslo_concurrency.lockutils [req-11fd5817-3f18-4ba9-882b-3b893a119fa7 req-8aaf2ce0-4937-4928-bd77-894ff6afb229 service nova] Acquiring lock "cc1164c7-82bb-4d80-89ad-e9ba5658d9c8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2018.770930] env[67899]: DEBUG oslo_concurrency.lockutils [req-11fd5817-3f18-4ba9-882b-3b893a119fa7 req-8aaf2ce0-4937-4928-bd77-894ff6afb229 service nova] Lock "cc1164c7-82bb-4d80-89ad-e9ba5658d9c8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2018.771128] env[67899]: DEBUG oslo_concurrency.lockutils [req-11fd5817-3f18-4ba9-882b-3b893a119fa7 req-8aaf2ce0-4937-4928-bd77-894ff6afb229 service nova] Lock "cc1164c7-82bb-4d80-89ad-e9ba5658d9c8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2018.771329] env[67899]: DEBUG nova.compute.manager 
[req-11fd5817-3f18-4ba9-882b-3b893a119fa7 req-8aaf2ce0-4937-4928-bd77-894ff6afb229 service nova] [instance: cc1164c7-82bb-4d80-89ad-e9ba5658d9c8] No waiting events found dispatching network-vif-plugged-4b596ad5-5bfe-4496-9193-8cf5350f5e5d {{(pid=67899) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 2018.771501] env[67899]: WARNING nova.compute.manager [req-11fd5817-3f18-4ba9-882b-3b893a119fa7 req-8aaf2ce0-4937-4928-bd77-894ff6afb229 service nova] [instance: cc1164c7-82bb-4d80-89ad-e9ba5658d9c8] Received unexpected event network-vif-plugged-4b596ad5-5bfe-4496-9193-8cf5350f5e5d for instance with vm_state building and task_state spawning. [ 2018.853807] env[67899]: DEBUG nova.network.neutron [None req-c0b846cb-0beb-469c-8d24-30f38e3ed0f9 tempest-AttachVolumeTestJSON-1730857508 tempest-AttachVolumeTestJSON-1730857508-project-member] [instance: cc1164c7-82bb-4d80-89ad-e9ba5658d9c8] Successfully updated port: 4b596ad5-5bfe-4496-9193-8cf5350f5e5d {{(pid=67899) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 2018.864431] env[67899]: DEBUG oslo_concurrency.lockutils [None req-c0b846cb-0beb-469c-8d24-30f38e3ed0f9 tempest-AttachVolumeTestJSON-1730857508 tempest-AttachVolumeTestJSON-1730857508-project-member] Acquiring lock "refresh_cache-cc1164c7-82bb-4d80-89ad-e9ba5658d9c8" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2018.864577] env[67899]: DEBUG oslo_concurrency.lockutils [None req-c0b846cb-0beb-469c-8d24-30f38e3ed0f9 tempest-AttachVolumeTestJSON-1730857508 tempest-AttachVolumeTestJSON-1730857508-project-member] Acquired lock "refresh_cache-cc1164c7-82bb-4d80-89ad-e9ba5658d9c8" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2018.864832] env[67899]: DEBUG nova.network.neutron [None req-c0b846cb-0beb-469c-8d24-30f38e3ed0f9 tempest-AttachVolumeTestJSON-1730857508 tempest-AttachVolumeTestJSON-1730857508-project-member] [instance: cc1164c7-82bb-4d80-89ad-e9ba5658d9c8] Building network info cache for instance {{(pid=67899) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 2018.901764] env[67899]: DEBUG nova.network.neutron [None req-c0b846cb-0beb-469c-8d24-30f38e3ed0f9 tempest-AttachVolumeTestJSON-1730857508 tempest-AttachVolumeTestJSON-1730857508-project-member] [instance: cc1164c7-82bb-4d80-89ad-e9ba5658d9c8] Instance cache missing network info. 
{{(pid=67899) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 2019.070843] env[67899]: DEBUG nova.network.neutron [None req-c0b846cb-0beb-469c-8d24-30f38e3ed0f9 tempest-AttachVolumeTestJSON-1730857508 tempest-AttachVolumeTestJSON-1730857508-project-member] [instance: cc1164c7-82bb-4d80-89ad-e9ba5658d9c8] Updating instance_info_cache with network_info: [{"id": "4b596ad5-5bfe-4496-9193-8cf5350f5e5d", "address": "fa:16:3e:0c:89:8b", "network": {"id": "98c45d1b-ff40-4055-be6f-b92512acc582", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-1056955687-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "3b1d299af0314e7e87698f444649de1c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ae18b41f-e73c-44f1-83dd-467c080944f4", "external-id": "nsx-vlan-transportzone-653", "segmentation_id": 653, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap4b596ad5-5b", "ovs_interfaceid": "4b596ad5-5bfe-4496-9193-8cf5350f5e5d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2019.080913] env[67899]: DEBUG oslo_concurrency.lockutils [None req-c0b846cb-0beb-469c-8d24-30f38e3ed0f9 tempest-AttachVolumeTestJSON-1730857508 tempest-AttachVolumeTestJSON-1730857508-project-member] Releasing lock "refresh_cache-cc1164c7-82bb-4d80-89ad-e9ba5658d9c8" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2019.081199] env[67899]: DEBUG nova.compute.manager [None req-c0b846cb-0beb-469c-8d24-30f38e3ed0f9 tempest-AttachVolumeTestJSON-1730857508 tempest-AttachVolumeTestJSON-1730857508-project-member] [instance: cc1164c7-82bb-4d80-89ad-e9ba5658d9c8] Instance network_info: |[{"id": "4b596ad5-5bfe-4496-9193-8cf5350f5e5d", "address": "fa:16:3e:0c:89:8b", "network": {"id": "98c45d1b-ff40-4055-be6f-b92512acc582", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-1056955687-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "3b1d299af0314e7e87698f444649de1c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ae18b41f-e73c-44f1-83dd-467c080944f4", "external-id": "nsx-vlan-transportzone-653", "segmentation_id": 653, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap4b596ad5-5b", "ovs_interfaceid": "4b596ad5-5bfe-4496-9193-8cf5350f5e5d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67899) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1971}} [ 2019.081591] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-c0b846cb-0beb-469c-8d24-30f38e3ed0f9 tempest-AttachVolumeTestJSON-1730857508 tempest-AttachVolumeTestJSON-1730857508-project-member] [instance: cc1164c7-82bb-4d80-89ad-e9ba5658d9c8] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:0c:89:8b', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'ae18b41f-e73c-44f1-83dd-467c080944f4', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '4b596ad5-5bfe-4496-9193-8cf5350f5e5d', 'vif_model': 'vmxnet3'}] {{(pid=67899) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 2019.089077] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [None req-c0b846cb-0beb-469c-8d24-30f38e3ed0f9 tempest-AttachVolumeTestJSON-1730857508 tempest-AttachVolumeTestJSON-1730857508-project-member] Creating folder: Project (3b1d299af0314e7e87698f444649de1c). Parent ref: group-v692900. {{(pid=67899) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 2019.089551] env[67899]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-97b3565c-05dc-4494-a79a-52df3ed1d86a {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2019.101899] env[67899]: INFO nova.virt.vmwareapi.vm_util [None req-c0b846cb-0beb-469c-8d24-30f38e3ed0f9 tempest-AttachVolumeTestJSON-1730857508 tempest-AttachVolumeTestJSON-1730857508-project-member] Created folder: Project (3b1d299af0314e7e87698f444649de1c) in parent group-v692900. [ 2019.102052] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [None req-c0b846cb-0beb-469c-8d24-30f38e3ed0f9 tempest-AttachVolumeTestJSON-1730857508 tempest-AttachVolumeTestJSON-1730857508-project-member] Creating folder: Instances. Parent ref: group-v693014. {{(pid=67899) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 2019.102267] env[67899]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-a74009a2-954e-46bb-9a7a-383845fa0d10 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2019.111039] env[67899]: INFO nova.virt.vmwareapi.vm_util [None req-c0b846cb-0beb-469c-8d24-30f38e3ed0f9 tempest-AttachVolumeTestJSON-1730857508 tempest-AttachVolumeTestJSON-1730857508-project-member] Created folder: Instances in parent group-v693014. [ 2019.111260] env[67899]: DEBUG oslo.service.loopingcall [None req-c0b846cb-0beb-469c-8d24-30f38e3ed0f9 tempest-AttachVolumeTestJSON-1730857508 tempest-AttachVolumeTestJSON-1730857508-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67899) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2019.111435] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: cc1164c7-82bb-4d80-89ad-e9ba5658d9c8] Creating VM on the ESX host {{(pid=67899) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 2019.111614] env[67899]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-3e923ab0-d9ac-4ffc-8526-a195d92e222b {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2019.130096] env[67899]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 2019.130096] env[67899]: value = "task-3468029" [ 2019.130096] env[67899]: _type = "Task" [ 2019.130096] env[67899]: } to complete. 
{{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2019.137062] env[67899]: DEBUG oslo_vmware.api [-] Task: {'id': task-3468029, 'name': CreateVM_Task} progress is 0%. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2019.440053] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2019.640138] env[67899]: DEBUG oslo_vmware.api [-] Task: {'id': task-3468029, 'name': CreateVM_Task, 'duration_secs': 0.285814} completed successfully. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2019.640319] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: cc1164c7-82bb-4d80-89ad-e9ba5658d9c8] Created VM on the ESX host {{(pid=67899) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 2019.640972] env[67899]: DEBUG oslo_concurrency.lockutils [None req-c0b846cb-0beb-469c-8d24-30f38e3ed0f9 tempest-AttachVolumeTestJSON-1730857508 tempest-AttachVolumeTestJSON-1730857508-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2019.641155] env[67899]: DEBUG oslo_concurrency.lockutils [None req-c0b846cb-0beb-469c-8d24-30f38e3ed0f9 tempest-AttachVolumeTestJSON-1730857508 tempest-AttachVolumeTestJSON-1730857508-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2019.641472] env[67899]: DEBUG oslo_concurrency.lockutils [None req-c0b846cb-0beb-469c-8d24-30f38e3ed0f9 tempest-AttachVolumeTestJSON-1730857508 tempest-AttachVolumeTestJSON-1730857508-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 2019.641715] env[67899]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-148e6bcc-2e22-43b4-a498-76b03389f81c {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2019.645845] env[67899]: DEBUG oslo_vmware.api [None req-c0b846cb-0beb-469c-8d24-30f38e3ed0f9 tempest-AttachVolumeTestJSON-1730857508 tempest-AttachVolumeTestJSON-1730857508-project-member] Waiting for the task: (returnval){ [ 2019.645845] env[67899]: value = "session[5298859c-589e-10f3-4f70-bf47a7ea371b]52d6c5d3-3a37-5f0c-da63-a6532ac289f8" [ 2019.645845] env[67899]: _type = "Task" [ 2019.645845] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2019.652870] env[67899]: DEBUG oslo_vmware.api [None req-c0b846cb-0beb-469c-8d24-30f38e3ed0f9 tempest-AttachVolumeTestJSON-1730857508 tempest-AttachVolumeTestJSON-1730857508-project-member] Task: {'id': session[5298859c-589e-10f3-4f70-bf47a7ea371b]52d6c5d3-3a37-5f0c-da63-a6532ac289f8, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2019.996540] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2019.996732] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=67899) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 2020.157944] env[67899]: DEBUG oslo_concurrency.lockutils [None req-c0b846cb-0beb-469c-8d24-30f38e3ed0f9 tempest-AttachVolumeTestJSON-1730857508 tempest-AttachVolumeTestJSON-1730857508-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2020.158215] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-c0b846cb-0beb-469c-8d24-30f38e3ed0f9 tempest-AttachVolumeTestJSON-1730857508 tempest-AttachVolumeTestJSON-1730857508-project-member] [instance: cc1164c7-82bb-4d80-89ad-e9ba5658d9c8] Processing image c655a05a-4a40-4b3f-b609-3ba8116ad90f {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 2020.158426] env[67899]: DEBUG oslo_concurrency.lockutils [None req-c0b846cb-0beb-469c-8d24-30f38e3ed0f9 tempest-AttachVolumeTestJSON-1730857508 tempest-AttachVolumeTestJSON-1730857508-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2020.794535] env[67899]: DEBUG nova.compute.manager [req-aa13f065-256c-430f-8308-af4edd39a3bc req-08c03246-1a8d-4a29-95ea-e273e08235c2 service nova] [instance: cc1164c7-82bb-4d80-89ad-e9ba5658d9c8] Received event network-changed-4b596ad5-5bfe-4496-9193-8cf5350f5e5d {{(pid=67899) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 2020.794771] env[67899]: DEBUG nova.compute.manager [req-aa13f065-256c-430f-8308-af4edd39a3bc req-08c03246-1a8d-4a29-95ea-e273e08235c2 service nova] [instance: cc1164c7-82bb-4d80-89ad-e9ba5658d9c8] Refreshing instance network info cache due to event network-changed-4b596ad5-5bfe-4496-9193-8cf5350f5e5d. 
{{(pid=67899) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 2020.794965] env[67899]: DEBUG oslo_concurrency.lockutils [req-aa13f065-256c-430f-8308-af4edd39a3bc req-08c03246-1a8d-4a29-95ea-e273e08235c2 service nova] Acquiring lock "refresh_cache-cc1164c7-82bb-4d80-89ad-e9ba5658d9c8" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2020.795132] env[67899]: DEBUG oslo_concurrency.lockutils [req-aa13f065-256c-430f-8308-af4edd39a3bc req-08c03246-1a8d-4a29-95ea-e273e08235c2 service nova] Acquired lock "refresh_cache-cc1164c7-82bb-4d80-89ad-e9ba5658d9c8" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2020.795294] env[67899]: DEBUG nova.network.neutron [req-aa13f065-256c-430f-8308-af4edd39a3bc req-08c03246-1a8d-4a29-95ea-e273e08235c2 service nova] [instance: cc1164c7-82bb-4d80-89ad-e9ba5658d9c8] Refreshing network info cache for port 4b596ad5-5bfe-4496-9193-8cf5350f5e5d {{(pid=67899) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 2021.023904] env[67899]: DEBUG nova.network.neutron [req-aa13f065-256c-430f-8308-af4edd39a3bc req-08c03246-1a8d-4a29-95ea-e273e08235c2 service nova] [instance: cc1164c7-82bb-4d80-89ad-e9ba5658d9c8] Updated VIF entry in instance network info cache for port 4b596ad5-5bfe-4496-9193-8cf5350f5e5d. {{(pid=67899) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 2021.024275] env[67899]: DEBUG nova.network.neutron [req-aa13f065-256c-430f-8308-af4edd39a3bc req-08c03246-1a8d-4a29-95ea-e273e08235c2 service nova] [instance: cc1164c7-82bb-4d80-89ad-e9ba5658d9c8] Updating instance_info_cache with network_info: [{"id": "4b596ad5-5bfe-4496-9193-8cf5350f5e5d", "address": "fa:16:3e:0c:89:8b", "network": {"id": "98c45d1b-ff40-4055-be6f-b92512acc582", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-1056955687-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "3b1d299af0314e7e87698f444649de1c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ae18b41f-e73c-44f1-83dd-467c080944f4", "external-id": "nsx-vlan-transportzone-653", "segmentation_id": 653, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap4b596ad5-5b", "ovs_interfaceid": "4b596ad5-5bfe-4496-9193-8cf5350f5e5d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2021.032992] env[67899]: DEBUG oslo_concurrency.lockutils [req-aa13f065-256c-430f-8308-af4edd39a3bc req-08c03246-1a8d-4a29-95ea-e273e08235c2 service nova] Releasing lock "refresh_cache-cc1164c7-82bb-4d80-89ad-e9ba5658d9c8" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2022.991965] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=67899) 
run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2025.293188] env[67899]: DEBUG oslo_concurrency.lockutils [None req-fddee4da-03cd-43a9-a297-fa411b297eed tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Acquiring lock "a9ef96da-fcfd-4fb5-bbb1-5178111a8a62" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2030.976697] env[67899]: DEBUG oslo_concurrency.lockutils [None req-0c16dcf4-0905-4c8f-a290-d4a8b41ceb6a tempest-ServersAaction247Test-759872082 tempest-ServersAaction247Test-759872082-project-member] Acquiring lock "c4fe8b3e-cee1-401b-a26f-907a8de95eba" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2036.365557] env[67899]: DEBUG oslo_concurrency.lockutils [None req-52768baa-1054-42de-be24-1048e581ca75 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Acquiring lock "c17d88cf-69ba-43e9-a672-24503c65e9f2" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2036.427083] env[67899]: DEBUG oslo_concurrency.lockutils [None req-e15bdb85-7f62-4363-b088-4abbe3ce4afd tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Acquiring lock "a993c6a9-140f-430d-a77e-98c2567bf7af" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2037.342503] env[67899]: WARNING oslo_vmware.rw_handles [None req-a9f9c866-17da-47e5-9277-2051f1ca187a tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2037.342503] env[67899]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2037.342503] env[67899]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2037.342503] env[67899]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2037.342503] env[67899]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2037.342503] env[67899]: ERROR oslo_vmware.rw_handles response.begin() [ 2037.342503] env[67899]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2037.342503] env[67899]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2037.342503] env[67899]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2037.342503] env[67899]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2037.342503] env[67899]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2037.342503] env[67899]: ERROR oslo_vmware.rw_handles [ 2037.343974] env[67899]: DEBUG nova.virt.vmwareapi.images [None req-a9f9c866-17da-47e5-9277-2051f1ca187a 
tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: a6544af8-879d-4c45-bee4-8551b861fc66] Downloaded image file data c655a05a-4a40-4b3f-b609-3ba8116ad90f to vmware_temp/ca5addf5-5419-414d-ae06-bbe4882ca215/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk on the data store datastore1 {{(pid=67899) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2037.345514] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-a9f9c866-17da-47e5-9277-2051f1ca187a tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: a6544af8-879d-4c45-bee4-8551b861fc66] Caching image {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2037.345514] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [None req-a9f9c866-17da-47e5-9277-2051f1ca187a tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Copying Virtual Disk [datastore1] vmware_temp/ca5addf5-5419-414d-ae06-bbe4882ca215/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk to [datastore1] vmware_temp/ca5addf5-5419-414d-ae06-bbe4882ca215/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk {{(pid=67899) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2037.345514] env[67899]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-624a338c-8ed4-47b1-a3d2-475217866a58 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2037.354266] env[67899]: DEBUG oslo_vmware.api [None req-a9f9c866-17da-47e5-9277-2051f1ca187a tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Waiting for the task: (returnval){ [ 2037.354266] env[67899]: value = "task-3468030" [ 2037.354266] env[67899]: _type = "Task" [ 2037.354266] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2037.362591] env[67899]: DEBUG oslo_vmware.api [None req-a9f9c866-17da-47e5-9277-2051f1ca187a tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Task: {'id': task-3468030, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2037.864694] env[67899]: DEBUG oslo_vmware.exceptions [None req-a9f9c866-17da-47e5-9277-2051f1ca187a tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Fault InvalidArgument not matched. 
{{(pid=67899) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2037.865112] env[67899]: DEBUG oslo_concurrency.lockutils [None req-a9f9c866-17da-47e5-9277-2051f1ca187a tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2037.865572] env[67899]: ERROR nova.compute.manager [None req-a9f9c866-17da-47e5-9277-2051f1ca187a tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: a6544af8-879d-4c45-bee4-8551b861fc66] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2037.865572] env[67899]: Faults: ['InvalidArgument'] [ 2037.865572] env[67899]: ERROR nova.compute.manager [instance: a6544af8-879d-4c45-bee4-8551b861fc66] Traceback (most recent call last): [ 2037.865572] env[67899]: ERROR nova.compute.manager [instance: a6544af8-879d-4c45-bee4-8551b861fc66] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 2037.865572] env[67899]: ERROR nova.compute.manager [instance: a6544af8-879d-4c45-bee4-8551b861fc66] yield resources [ 2037.865572] env[67899]: ERROR nova.compute.manager [instance: a6544af8-879d-4c45-bee4-8551b861fc66] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2037.865572] env[67899]: ERROR nova.compute.manager [instance: a6544af8-879d-4c45-bee4-8551b861fc66] self.driver.spawn(context, instance, image_meta, [ 2037.865572] env[67899]: ERROR nova.compute.manager [instance: a6544af8-879d-4c45-bee4-8551b861fc66] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2037.865572] env[67899]: ERROR nova.compute.manager [instance: a6544af8-879d-4c45-bee4-8551b861fc66] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2037.865572] env[67899]: ERROR nova.compute.manager [instance: a6544af8-879d-4c45-bee4-8551b861fc66] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2037.865572] env[67899]: ERROR nova.compute.manager [instance: a6544af8-879d-4c45-bee4-8551b861fc66] self._fetch_image_if_missing(context, vi) [ 2037.865572] env[67899]: ERROR nova.compute.manager [instance: a6544af8-879d-4c45-bee4-8551b861fc66] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2037.865572] env[67899]: ERROR nova.compute.manager [instance: a6544af8-879d-4c45-bee4-8551b861fc66] image_cache(vi, tmp_image_ds_loc) [ 2037.865572] env[67899]: ERROR nova.compute.manager [instance: a6544af8-879d-4c45-bee4-8551b861fc66] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2037.865572] env[67899]: ERROR nova.compute.manager [instance: a6544af8-879d-4c45-bee4-8551b861fc66] vm_util.copy_virtual_disk( [ 2037.865572] env[67899]: ERROR nova.compute.manager [instance: a6544af8-879d-4c45-bee4-8551b861fc66] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2037.865572] env[67899]: ERROR nova.compute.manager [instance: a6544af8-879d-4c45-bee4-8551b861fc66] session._wait_for_task(vmdk_copy_task) [ 2037.865572] env[67899]: ERROR nova.compute.manager [instance: a6544af8-879d-4c45-bee4-8551b861fc66] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2037.865572] env[67899]: ERROR nova.compute.manager [instance: a6544af8-879d-4c45-bee4-8551b861fc66] return self.wait_for_task(task_ref) [ 2037.865572] env[67899]: ERROR nova.compute.manager [instance: a6544af8-879d-4c45-bee4-8551b861fc66] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2037.865572] env[67899]: ERROR nova.compute.manager [instance: a6544af8-879d-4c45-bee4-8551b861fc66] return evt.wait() [ 2037.865572] env[67899]: ERROR nova.compute.manager [instance: a6544af8-879d-4c45-bee4-8551b861fc66] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2037.865572] env[67899]: ERROR nova.compute.manager [instance: a6544af8-879d-4c45-bee4-8551b861fc66] result = hub.switch() [ 2037.865572] env[67899]: ERROR nova.compute.manager [instance: a6544af8-879d-4c45-bee4-8551b861fc66] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2037.865572] env[67899]: ERROR nova.compute.manager [instance: a6544af8-879d-4c45-bee4-8551b861fc66] return self.greenlet.switch() [ 2037.865572] env[67899]: ERROR nova.compute.manager [instance: a6544af8-879d-4c45-bee4-8551b861fc66] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2037.865572] env[67899]: ERROR nova.compute.manager [instance: a6544af8-879d-4c45-bee4-8551b861fc66] self.f(*self.args, **self.kw) [ 2037.865572] env[67899]: ERROR nova.compute.manager [instance: a6544af8-879d-4c45-bee4-8551b861fc66] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2037.865572] env[67899]: ERROR nova.compute.manager [instance: a6544af8-879d-4c45-bee4-8551b861fc66] raise exceptions.translate_fault(task_info.error) [ 2037.865572] env[67899]: ERROR nova.compute.manager [instance: a6544af8-879d-4c45-bee4-8551b861fc66] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2037.865572] env[67899]: ERROR nova.compute.manager [instance: a6544af8-879d-4c45-bee4-8551b861fc66] Faults: ['InvalidArgument'] [ 2037.865572] env[67899]: ERROR nova.compute.manager [instance: a6544af8-879d-4c45-bee4-8551b861fc66] [ 2037.866687] env[67899]: INFO nova.compute.manager [None req-a9f9c866-17da-47e5-9277-2051f1ca187a tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: a6544af8-879d-4c45-bee4-8551b861fc66] Terminating instance [ 2037.867388] env[67899]: DEBUG oslo_concurrency.lockutils [None req-bb2aacf0-6d67-4744-ac3b-ad8f8e88f68e tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2037.867599] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-bb2aacf0-6d67-4744-ac3b-ad8f8e88f68e tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2037.867855] env[67899]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-85a939ed-7e4a-44e8-a3ff-3634a864cae1 {{(pid=67899) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2037.871271] env[67899]: DEBUG nova.compute.manager [None req-a9f9c866-17da-47e5-9277-2051f1ca187a tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: a6544af8-879d-4c45-bee4-8551b861fc66] Start destroying the instance on the hypervisor. {{(pid=67899) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2037.871465] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-a9f9c866-17da-47e5-9277-2051f1ca187a tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: a6544af8-879d-4c45-bee4-8551b861fc66] Destroying instance {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2037.872180] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0c7fb7e5-a04c-4a6c-ab54-e5c600c0149a {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2037.875554] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-bb2aacf0-6d67-4744-ac3b-ad8f8e88f68e tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2037.875729] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-bb2aacf0-6d67-4744-ac3b-ad8f8e88f68e tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67899) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2037.876676] env[67899]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-fd41d40e-6772-4b4c-9cf0-580ef6b39b4f {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2037.880416] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-a9f9c866-17da-47e5-9277-2051f1ca187a tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: a6544af8-879d-4c45-bee4-8551b861fc66] Unregistering the VM {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2037.880880] env[67899]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-b8064002-bf20-4711-8808-15568815725f {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2037.883071] env[67899]: DEBUG oslo_vmware.api [None req-bb2aacf0-6d67-4744-ac3b-ad8f8e88f68e tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Waiting for the task: (returnval){ [ 2037.883071] env[67899]: value = "session[5298859c-589e-10f3-4f70-bf47a7ea371b]521d1a71-6d2d-d527-ccf8-3a12cda7b5d6" [ 2037.883071] env[67899]: _type = "Task" [ 2037.883071] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2037.890181] env[67899]: DEBUG oslo_vmware.api [None req-bb2aacf0-6d67-4744-ac3b-ad8f8e88f68e tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Task: {'id': session[5298859c-589e-10f3-4f70-bf47a7ea371b]521d1a71-6d2d-d527-ccf8-3a12cda7b5d6, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2037.986385] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-a9f9c866-17da-47e5-9277-2051f1ca187a tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: a6544af8-879d-4c45-bee4-8551b861fc66] Unregistered the VM {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2037.986610] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-a9f9c866-17da-47e5-9277-2051f1ca187a tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: a6544af8-879d-4c45-bee4-8551b861fc66] Deleting contents of the VM from datastore datastore1 {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2037.986856] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-a9f9c866-17da-47e5-9277-2051f1ca187a tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Deleting the datastore file [datastore1] a6544af8-879d-4c45-bee4-8551b861fc66 {{(pid=67899) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2037.987136] env[67899]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-7364cd83-0817-4be8-a5fc-0414c9ecda3c {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2037.993442] env[67899]: DEBUG oslo_vmware.api [None req-a9f9c866-17da-47e5-9277-2051f1ca187a tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Waiting for the task: (returnval){ [ 2037.993442] env[67899]: value = "task-3468032" [ 2037.993442] env[67899]: _type = "Task" [ 2037.993442] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2038.000846] env[67899]: DEBUG oslo_vmware.api [None req-a9f9c866-17da-47e5-9277-2051f1ca187a tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Task: {'id': task-3468032, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2038.393717] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-bb2aacf0-6d67-4744-ac3b-ad8f8e88f68e tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 9b4a7c14-84dc-4222-a758-3f8f10e23b7a] Preparing fetch location {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2038.393967] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-bb2aacf0-6d67-4744-ac3b-ad8f8e88f68e tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Creating directory with path [datastore1] vmware_temp/6756a114-3472-4ad7-ad21-4a1e7dbb0d40/c655a05a-4a40-4b3f-b609-3ba8116ad90f {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2038.394218] env[67899]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-3e5a51b3-966c-421e-a761-301b657a98cc {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2038.409198] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-bb2aacf0-6d67-4744-ac3b-ad8f8e88f68e tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Created directory with path [datastore1] vmware_temp/6756a114-3472-4ad7-ad21-4a1e7dbb0d40/c655a05a-4a40-4b3f-b609-3ba8116ad90f {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2038.409391] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-bb2aacf0-6d67-4744-ac3b-ad8f8e88f68e tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 9b4a7c14-84dc-4222-a758-3f8f10e23b7a] Fetch image to [datastore1] vmware_temp/6756a114-3472-4ad7-ad21-4a1e7dbb0d40/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2038.409592] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-bb2aacf0-6d67-4744-ac3b-ad8f8e88f68e tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 9b4a7c14-84dc-4222-a758-3f8f10e23b7a] Downloading image file data c655a05a-4a40-4b3f-b609-3ba8116ad90f to [datastore1] vmware_temp/6756a114-3472-4ad7-ad21-4a1e7dbb0d40/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk on the data store datastore1 {{(pid=67899) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2038.410309] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-441f117c-1bcd-45e0-a42e-cb18fcd808b6 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2038.416886] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a324c69d-a339-471d-918b-f95d355aee6a {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2038.425746] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5816c5d1-455b-47a4-bb1a-35d1b24ab467 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2038.458187] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a92aa385-6170-4a3e-9581-46d41ae5a1fd {{(pid=67899) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2038.463981] env[67899]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-eac5ed03-ba69-44c8-98fb-d0bb122f536a {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2038.492714] env[67899]: DEBUG nova.virt.vmwareapi.images [None req-bb2aacf0-6d67-4744-ac3b-ad8f8e88f68e tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 9b4a7c14-84dc-4222-a758-3f8f10e23b7a] Downloading image file data c655a05a-4a40-4b3f-b609-3ba8116ad90f to the data store datastore1 {{(pid=67899) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2038.505016] env[67899]: DEBUG oslo_vmware.api [None req-a9f9c866-17da-47e5-9277-2051f1ca187a tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Task: {'id': task-3468032, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.06921} completed successfully. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2038.506446] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-a9f9c866-17da-47e5-9277-2051f1ca187a tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Deleted the datastore file {{(pid=67899) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2038.506446] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-a9f9c866-17da-47e5-9277-2051f1ca187a tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: a6544af8-879d-4c45-bee4-8551b861fc66] Deleted contents of the VM from datastore datastore1 {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2038.506446] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-a9f9c866-17da-47e5-9277-2051f1ca187a tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: a6544af8-879d-4c45-bee4-8551b861fc66] Instance destroyed {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2038.506446] env[67899]: INFO nova.compute.manager [None req-a9f9c866-17da-47e5-9277-2051f1ca187a tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: a6544af8-879d-4c45-bee4-8551b861fc66] Took 0.63 seconds to destroy the instance on the hypervisor. 
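The destroy sequence above follows oslo.vmware's standard invoke-and-wait pattern: a *_Task method is invoked through the session, and wait_for_task() polls the returned Task object (the _poll_task lines) until it completes or errors. A minimal sketch of that pattern, assuming an already-established VMwareAPISession; the host, credentials, datastore path, and dc_ref are placeholders, not values from this log:

    from oslo_vmware import api

    # Placeholder connection details; api_retry_count/task_poll_interval
    # mirror the retry/polling behaviour visible in the log.
    session = api.VMwareAPISession('vc.example.test', 'user', 'password',
                                   api_retry_count=10, task_poll_interval=0.5)

    file_manager = session.vim.service_content.fileManager
    # DeleteDatastoreFile_Task returns a Task managed-object reference;
    # the file is only gone once the task reports success.
    task = session.invoke_api(session.vim, 'DeleteDatastoreFile_Task',
                              file_manager,
                              name='[datastore1] instance-dir',  # placeholder path
                              datacenter=dc_ref)  # dc_ref: Datacenter ref, obtained elsewhere
    session.wait_for_task(task)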
[ 2038.510202] env[67899]: DEBUG nova.compute.claims [None req-a9f9c866-17da-47e5-9277-2051f1ca187a tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: a6544af8-879d-4c45-bee4-8551b861fc66] Aborting claim: {{(pid=67899) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 2038.510491] env[67899]: DEBUG oslo_concurrency.lockutils [None req-a9f9c866-17da-47e5-9277-2051f1ca187a tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2038.510739] env[67899]: DEBUG oslo_concurrency.lockutils [None req-a9f9c866-17da-47e5-9277-2051f1ca187a tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2038.552325] env[67899]: DEBUG oslo_vmware.rw_handles [None req-bb2aacf0-6d67-4744-ac3b-ad8f8e88f68e tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/6756a114-3472-4ad7-ad21-4a1e7dbb0d40/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67899) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2038.614265] env[67899]: DEBUG oslo_vmware.rw_handles [None req-bb2aacf0-6d67-4744-ac3b-ad8f8e88f68e tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Completed reading data from the image iterator. {{(pid=67899) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2038.614456] env[67899]: DEBUG oslo_vmware.rw_handles [None req-bb2aacf0-6d67-4744-ac3b-ad8f8e88f68e tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/6756a114-3472-4ad7-ad21-4a1e7dbb0d40/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67899) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2038.743597] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-786abb7a-ef76-4e9d-b18e-a2f23006e116 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2038.751709] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f983cc65-43d4-408e-957f-e44d37a87592 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2038.780821] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c7dd23ea-2eb6-407d-88f0-046913434280 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2038.788017] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f8cb6720-da14-4181-9f27-d23ed0258c2a {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2038.800649] env[67899]: DEBUG nova.compute.provider_tree [None req-a9f9c866-17da-47e5-9277-2051f1ca187a tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Inventory has not changed in ProviderTree for provider: fffa0b42-f65d-4394-a98c-0df038b9ed4b {{(pid=67899) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2038.809343] env[67899]: DEBUG nova.scheduler.client.report [None req-a9f9c866-17da-47e5-9277-2051f1ca187a tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Inventory has not changed for provider fffa0b42-f65d-4394-a98c-0df038b9ed4b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67899) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2038.823729] env[67899]: DEBUG oslo_concurrency.lockutils [None req-a9f9c866-17da-47e5-9277-2051f1ca187a tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.313s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2038.824263] env[67899]: ERROR nova.compute.manager [None req-a9f9c866-17da-47e5-9277-2051f1ca187a tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: a6544af8-879d-4c45-bee4-8551b861fc66] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2038.824263] env[67899]: Faults: ['InvalidArgument'] [ 2038.824263] env[67899]: ERROR nova.compute.manager [instance: a6544af8-879d-4c45-bee4-8551b861fc66] Traceback (most recent call last): [ 2038.824263] env[67899]: ERROR nova.compute.manager [instance: a6544af8-879d-4c45-bee4-8551b861fc66] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2038.824263] env[67899]: 
ERROR nova.compute.manager [instance: a6544af8-879d-4c45-bee4-8551b861fc66] self.driver.spawn(context, instance, image_meta, [ 2038.824263] env[67899]: ERROR nova.compute.manager [instance: a6544af8-879d-4c45-bee4-8551b861fc66] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2038.824263] env[67899]: ERROR nova.compute.manager [instance: a6544af8-879d-4c45-bee4-8551b861fc66] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2038.824263] env[67899]: ERROR nova.compute.manager [instance: a6544af8-879d-4c45-bee4-8551b861fc66] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2038.824263] env[67899]: ERROR nova.compute.manager [instance: a6544af8-879d-4c45-bee4-8551b861fc66] self._fetch_image_if_missing(context, vi) [ 2038.824263] env[67899]: ERROR nova.compute.manager [instance: a6544af8-879d-4c45-bee4-8551b861fc66] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2038.824263] env[67899]: ERROR nova.compute.manager [instance: a6544af8-879d-4c45-bee4-8551b861fc66] image_cache(vi, tmp_image_ds_loc) [ 2038.824263] env[67899]: ERROR nova.compute.manager [instance: a6544af8-879d-4c45-bee4-8551b861fc66] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2038.824263] env[67899]: ERROR nova.compute.manager [instance: a6544af8-879d-4c45-bee4-8551b861fc66] vm_util.copy_virtual_disk( [ 2038.824263] env[67899]: ERROR nova.compute.manager [instance: a6544af8-879d-4c45-bee4-8551b861fc66] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2038.824263] env[67899]: ERROR nova.compute.manager [instance: a6544af8-879d-4c45-bee4-8551b861fc66] session._wait_for_task(vmdk_copy_task) [ 2038.824263] env[67899]: ERROR nova.compute.manager [instance: a6544af8-879d-4c45-bee4-8551b861fc66] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2038.824263] env[67899]: ERROR nova.compute.manager [instance: a6544af8-879d-4c45-bee4-8551b861fc66] return self.wait_for_task(task_ref) [ 2038.824263] env[67899]: ERROR nova.compute.manager [instance: a6544af8-879d-4c45-bee4-8551b861fc66] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2038.824263] env[67899]: ERROR nova.compute.manager [instance: a6544af8-879d-4c45-bee4-8551b861fc66] return evt.wait() [ 2038.824263] env[67899]: ERROR nova.compute.manager [instance: a6544af8-879d-4c45-bee4-8551b861fc66] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2038.824263] env[67899]: ERROR nova.compute.manager [instance: a6544af8-879d-4c45-bee4-8551b861fc66] result = hub.switch() [ 2038.824263] env[67899]: ERROR nova.compute.manager [instance: a6544af8-879d-4c45-bee4-8551b861fc66] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2038.824263] env[67899]: ERROR nova.compute.manager [instance: a6544af8-879d-4c45-bee4-8551b861fc66] return self.greenlet.switch() [ 2038.824263] env[67899]: ERROR nova.compute.manager [instance: a6544af8-879d-4c45-bee4-8551b861fc66] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2038.824263] env[67899]: ERROR nova.compute.manager [instance: a6544af8-879d-4c45-bee4-8551b861fc66] self.f(*self.args, **self.kw) [ 2038.824263] env[67899]: ERROR nova.compute.manager [instance: a6544af8-879d-4c45-bee4-8551b861fc66] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2038.824263] env[67899]: ERROR nova.compute.manager [instance: a6544af8-879d-4c45-bee4-8551b861fc66] raise exceptions.translate_fault(task_info.error) [ 2038.824263] env[67899]: ERROR nova.compute.manager [instance: a6544af8-879d-4c45-bee4-8551b861fc66] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2038.824263] env[67899]: ERROR nova.compute.manager [instance: a6544af8-879d-4c45-bee4-8551b861fc66] Faults: ['InvalidArgument'] [ 2038.824263] env[67899]: ERROR nova.compute.manager [instance: a6544af8-879d-4c45-bee4-8551b861fc66] [ 2038.824987] env[67899]: DEBUG nova.compute.utils [None req-a9f9c866-17da-47e5-9277-2051f1ca187a tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: a6544af8-879d-4c45-bee4-8551b861fc66] VimFaultException {{(pid=67899) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2038.826355] env[67899]: DEBUG nova.compute.manager [None req-a9f9c866-17da-47e5-9277-2051f1ca187a tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: a6544af8-879d-4c45-bee4-8551b861fc66] Build of instance a6544af8-879d-4c45-bee4-8551b861fc66 was re-scheduled: A specified parameter was not correct: fileType [ 2038.826355] env[67899]: Faults: ['InvalidArgument'] {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 2038.826732] env[67899]: DEBUG nova.compute.manager [None req-a9f9c866-17da-47e5-9277-2051f1ca187a tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: a6544af8-879d-4c45-bee4-8551b861fc66] Unplugging VIFs for instance {{(pid=67899) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 2038.826903] env[67899]: DEBUG nova.compute.manager [None req-a9f9c866-17da-47e5-9277-2051f1ca187a tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67899) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 2038.827085] env[67899]: DEBUG nova.compute.manager [None req-a9f9c866-17da-47e5-9277-2051f1ca187a tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: a6544af8-879d-4c45-bee4-8551b861fc66] Deallocating network for instance {{(pid=67899) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2038.827251] env[67899]: DEBUG nova.network.neutron [None req-a9f9c866-17da-47e5-9277-2051f1ca187a tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: a6544af8-879d-4c45-bee4-8551b861fc66] deallocate_for_instance() {{(pid=67899) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2039.084242] env[67899]: DEBUG nova.network.neutron [None req-a9f9c866-17da-47e5-9277-2051f1ca187a tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: a6544af8-879d-4c45-bee4-8551b861fc66] Updating instance_info_cache with network_info: [] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2039.099261] env[67899]: INFO nova.compute.manager [None req-a9f9c866-17da-47e5-9277-2051f1ca187a tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: a6544af8-879d-4c45-bee4-8551b861fc66] Took 0.27 seconds to deallocate network for instance. [ 2039.198058] env[67899]: INFO nova.scheduler.client.report [None req-a9f9c866-17da-47e5-9277-2051f1ca187a tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Deleted allocations for instance a6544af8-879d-4c45-bee4-8551b861fc66 [ 2039.226895] env[67899]: DEBUG oslo_concurrency.lockutils [None req-a9f9c866-17da-47e5-9277-2051f1ca187a tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Lock "a6544af8-879d-4c45-bee4-8551b861fc66" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 592.436s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2039.227067] env[67899]: DEBUG oslo_concurrency.lockutils [None req-17faa3a7-1a7c-45ab-ae27-d83f65917918 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Lock "a6544af8-879d-4c45-bee4-8551b861fc66" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 395.980s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2039.227254] env[67899]: DEBUG oslo_concurrency.lockutils [None req-17faa3a7-1a7c-45ab-ae27-d83f65917918 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Acquiring lock "a6544af8-879d-4c45-bee4-8551b861fc66-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2039.228191] env[67899]: DEBUG oslo_concurrency.lockutils [None req-17faa3a7-1a7c-45ab-ae27-d83f65917918 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Lock "a6544af8-879d-4c45-bee4-8551b861fc66-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: 
waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2039.228191] env[67899]: DEBUG oslo_concurrency.lockutils [None req-17faa3a7-1a7c-45ab-ae27-d83f65917918 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Lock "a6544af8-879d-4c45-bee4-8551b861fc66-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2039.229874] env[67899]: INFO nova.compute.manager [None req-17faa3a7-1a7c-45ab-ae27-d83f65917918 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: a6544af8-879d-4c45-bee4-8551b861fc66] Terminating instance [ 2039.233533] env[67899]: DEBUG nova.compute.manager [None req-17faa3a7-1a7c-45ab-ae27-d83f65917918 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: a6544af8-879d-4c45-bee4-8551b861fc66] Start destroying the instance on the hypervisor. {{(pid=67899) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2039.233533] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-17faa3a7-1a7c-45ab-ae27-d83f65917918 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: a6544af8-879d-4c45-bee4-8551b861fc66] Destroying instance {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2039.233533] env[67899]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-0a536e15-2570-4b09-907d-ae2af066277e {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2039.241547] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c7e520b0-93cf-457c-a89f-e8293ef1c743 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2039.271299] env[67899]: WARNING nova.virt.vmwareapi.vmops [None req-17faa3a7-1a7c-45ab-ae27-d83f65917918 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: a6544af8-879d-4c45-bee4-8551b861fc66] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance a6544af8-879d-4c45-bee4-8551b861fc66 could not be found. [ 2039.271623] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-17faa3a7-1a7c-45ab-ae27-d83f65917918 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: a6544af8-879d-4c45-bee4-8551b861fc66] Instance destroyed {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2039.271754] env[67899]: INFO nova.compute.manager [None req-17faa3a7-1a7c-45ab-ae27-d83f65917918 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: a6544af8-879d-4c45-bee4-8551b861fc66] Took 0.04 seconds to destroy the instance on the hypervisor. [ 2039.271991] env[67899]: DEBUG oslo.service.loopingcall [None req-17faa3a7-1a7c-45ab-ae27-d83f65917918 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=67899) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2039.272241] env[67899]: DEBUG nova.compute.manager [-] [instance: a6544af8-879d-4c45-bee4-8551b861fc66] Deallocating network for instance {{(pid=67899) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2039.272338] env[67899]: DEBUG nova.network.neutron [-] [instance: a6544af8-879d-4c45-bee4-8551b861fc66] deallocate_for_instance() {{(pid=67899) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2039.297326] env[67899]: DEBUG nova.network.neutron [-] [instance: a6544af8-879d-4c45-bee4-8551b861fc66] Updating instance_info_cache with network_info: [] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2039.306932] env[67899]: INFO nova.compute.manager [-] [instance: a6544af8-879d-4c45-bee4-8551b861fc66] Took 0.03 seconds to deallocate network for instance. [ 2039.414279] env[67899]: DEBUG oslo_concurrency.lockutils [None req-17faa3a7-1a7c-45ab-ae27-d83f65917918 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Lock "a6544af8-879d-4c45-bee4-8551b861fc66" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.187s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2039.415115] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "a6544af8-879d-4c45-bee4-8551b861fc66" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 263.404s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2039.415301] env[67899]: INFO nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: a6544af8-879d-4c45-bee4-8551b861fc66] During sync_power_state the instance has a pending task (deleting). Skip. 
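The Acquiring/acquired/released triplets throughout this log come from oslo.concurrency's lockutils wrapper, which times how long a caller waited for a named semaphore and how long it was held (the lockutils.py:404/409/423 lines). A minimal sketch of the idiom, with an illustrative lock name and function rather than Nova's actual code:

    from oslo_concurrency import lockutils

    @lockutils.synchronized('instance-uuid-lock')  # hypothetical lock name
    def do_terminate_instance():
        # Runs with the named lock held; the decorator's inner wrapper emits
        # the 'Acquiring lock ...' / 'acquired ... waited Ns' /
        # '"released" ... held Ns' DEBUG lines seen above.
        pass

    do_terminate_instance()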
[ 2039.415473] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "a6544af8-879d-4c45-bee4-8551b861fc66" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2048.635989] env[67899]: DEBUG oslo_concurrency.lockutils [None req-7747f032-14d3-46a5-8fda-4a45b3283385 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] Acquiring lock "483824d1-4994-436a-ba16-12524684405c" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2061.996537] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2073.000597] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2073.359623] env[67899]: DEBUG oslo_concurrency.lockutils [None req-9f31b163-09ed-4bf3-8e2e-26dc9db0ca2f tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] Acquiring lock "fc98bb10-8fe8-4203-80b8-9885b2c302c1" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2073.359859] env[67899]: DEBUG oslo_concurrency.lockutils [None req-9f31b163-09ed-4bf3-8e2e-26dc9db0ca2f tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] Lock "fc98bb10-8fe8-4203-80b8-9885b2c302c1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2073.371166] env[67899]: DEBUG nova.compute.manager [None req-9f31b163-09ed-4bf3-8e2e-26dc9db0ca2f tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] [instance: fc98bb10-8fe8-4203-80b8-9885b2c302c1] Starting instance... 
{{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 2073.419629] env[67899]: DEBUG oslo_concurrency.lockutils [None req-9f31b163-09ed-4bf3-8e2e-26dc9db0ca2f tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2073.419884] env[67899]: DEBUG oslo_concurrency.lockutils [None req-9f31b163-09ed-4bf3-8e2e-26dc9db0ca2f tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2073.421275] env[67899]: INFO nova.compute.claims [None req-9f31b163-09ed-4bf3-8e2e-26dc9db0ca2f tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] [instance: fc98bb10-8fe8-4203-80b8-9885b2c302c1] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 2073.578513] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1501719c-cadf-4550-872f-eab903b09ca1 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2073.586029] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4b252037-66fa-41c9-bbcd-a6eb0f8aa2ae {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2073.616831] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-370eee2b-2905-4d60-9449-dbb57d37b46f {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2073.624988] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-13dfa4f7-d2f6-45c7-aaa0-65f7a4fa282e {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2073.639680] env[67899]: DEBUG nova.compute.provider_tree [None req-9f31b163-09ed-4bf3-8e2e-26dc9db0ca2f tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] Inventory has not changed in ProviderTree for provider: fffa0b42-f65d-4394-a98c-0df038b9ed4b {{(pid=67899) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2073.648103] env[67899]: DEBUG nova.scheduler.client.report [None req-9f31b163-09ed-4bf3-8e2e-26dc9db0ca2f tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] Inventory has not changed for provider fffa0b42-f65d-4394-a98c-0df038b9ed4b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67899) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2073.661997] env[67899]: DEBUG oslo_concurrency.lockutils [None req-9f31b163-09ed-4bf3-8e2e-26dc9db0ca2f tempest-ServersTestJSON-400587867 
tempest-ServersTestJSON-400587867-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.242s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2073.662455] env[67899]: DEBUG nova.compute.manager [None req-9f31b163-09ed-4bf3-8e2e-26dc9db0ca2f tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] [instance: fc98bb10-8fe8-4203-80b8-9885b2c302c1] Start building networks asynchronously for instance. {{(pid=67899) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 2073.693199] env[67899]: DEBUG nova.compute.utils [None req-9f31b163-09ed-4bf3-8e2e-26dc9db0ca2f tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] Using /dev/sd instead of None {{(pid=67899) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 2073.694700] env[67899]: DEBUG nova.compute.manager [None req-9f31b163-09ed-4bf3-8e2e-26dc9db0ca2f tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] [instance: fc98bb10-8fe8-4203-80b8-9885b2c302c1] Allocating IP information in the background. {{(pid=67899) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 2073.694879] env[67899]: DEBUG nova.network.neutron [None req-9f31b163-09ed-4bf3-8e2e-26dc9db0ca2f tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] [instance: fc98bb10-8fe8-4203-80b8-9885b2c302c1] allocate_for_instance() {{(pid=67899) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 2073.702960] env[67899]: DEBUG nova.compute.manager [None req-9f31b163-09ed-4bf3-8e2e-26dc9db0ca2f tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] [instance: fc98bb10-8fe8-4203-80b8-9885b2c302c1] Start building block device mappings for instance. {{(pid=67899) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 2073.756461] env[67899]: DEBUG nova.policy [None req-9f31b163-09ed-4bf3-8e2e-26dc9db0ca2f tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '061d2e2c56824c0886656625babbf20f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '93f5a8c99daa4c85bd8edffb5c6dd338', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67899) authorize /opt/stack/nova/nova/policy.py:203}} [ 2073.763249] env[67899]: DEBUG nova.compute.manager [None req-9f31b163-09ed-4bf3-8e2e-26dc9db0ca2f tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] [instance: fc98bb10-8fe8-4203-80b8-9885b2c302c1] Start spawning the instance on the hypervisor. 
{{(pid=67899) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 2073.788103] env[67899]: DEBUG nova.virt.hardware [None req-9f31b163-09ed-4bf3-8e2e-26dc9db0ca2f tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T22:07:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T22:07:13Z,direct_url=,disk_format='vmdk',id=c655a05a-4a40-4b3f-b609-3ba8116ad90f,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c1aaa2970e964d7b86557399120d12c1',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T22:07:14Z,virtual_size=,visibility=), allow threads: False {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 2073.788354] env[67899]: DEBUG nova.virt.hardware [None req-9f31b163-09ed-4bf3-8e2e-26dc9db0ca2f tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] Flavor limits 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 2073.788517] env[67899]: DEBUG nova.virt.hardware [None req-9f31b163-09ed-4bf3-8e2e-26dc9db0ca2f tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] Image limits 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 2073.788695] env[67899]: DEBUG nova.virt.hardware [None req-9f31b163-09ed-4bf3-8e2e-26dc9db0ca2f tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] Flavor pref 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 2073.788835] env[67899]: DEBUG nova.virt.hardware [None req-9f31b163-09ed-4bf3-8e2e-26dc9db0ca2f tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] Image pref 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 2073.788980] env[67899]: DEBUG nova.virt.hardware [None req-9f31b163-09ed-4bf3-8e2e-26dc9db0ca2f tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 2073.789196] env[67899]: DEBUG nova.virt.hardware [None req-9f31b163-09ed-4bf3-8e2e-26dc9db0ca2f tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 2073.789350] env[67899]: DEBUG nova.virt.hardware [None req-9f31b163-09ed-4bf3-8e2e-26dc9db0ca2f tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67899) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 2073.789523] env[67899]: DEBUG nova.virt.hardware [None req-9f31b163-09ed-4bf3-8e2e-26dc9db0ca2f tempest-ServersTestJSON-400587867 
tempest-ServersTestJSON-400587867-project-member] Got 1 possible topologies {{(pid=67899) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 2073.789677] env[67899]: DEBUG nova.virt.hardware [None req-9f31b163-09ed-4bf3-8e2e-26dc9db0ca2f tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 2073.789843] env[67899]: DEBUG nova.virt.hardware [None req-9f31b163-09ed-4bf3-8e2e-26dc9db0ca2f tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 2073.790694] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4af864b9-62bd-4686-962f-f9fedda2e555 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2073.798457] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e4d80148-9205-40e2-ba5e-cd0d010fd2c3 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2073.996637] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2074.052531] env[67899]: DEBUG nova.network.neutron [None req-9f31b163-09ed-4bf3-8e2e-26dc9db0ca2f tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] [instance: fc98bb10-8fe8-4203-80b8-9885b2c302c1] Successfully created port: 7782f1ae-c7d5-424e-8c9f-4872c7adc440 {{(pid=67899) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 2074.616043] env[67899]: DEBUG nova.compute.manager [req-38a8df45-8c6e-4a36-b56b-d0bba51cf78f req-e2619400-0697-43f0-a442-c1a18f30243d service nova] [instance: fc98bb10-8fe8-4203-80b8-9885b2c302c1] Received event network-vif-plugged-7782f1ae-c7d5-424e-8c9f-4872c7adc440 {{(pid=67899) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 2074.616298] env[67899]: DEBUG oslo_concurrency.lockutils [req-38a8df45-8c6e-4a36-b56b-d0bba51cf78f req-e2619400-0697-43f0-a442-c1a18f30243d service nova] Acquiring lock "fc98bb10-8fe8-4203-80b8-9885b2c302c1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2074.616531] env[67899]: DEBUG oslo_concurrency.lockutils [req-38a8df45-8c6e-4a36-b56b-d0bba51cf78f req-e2619400-0697-43f0-a442-c1a18f30243d service nova] Lock "fc98bb10-8fe8-4203-80b8-9885b2c302c1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2074.616732] env[67899]: DEBUG oslo_concurrency.lockutils [req-38a8df45-8c6e-4a36-b56b-d0bba51cf78f req-e2619400-0697-43f0-a442-c1a18f30243d service nova] Lock "fc98bb10-8fe8-4203-80b8-9885b2c302c1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67899) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2074.616922] env[67899]: DEBUG nova.compute.manager [req-38a8df45-8c6e-4a36-b56b-d0bba51cf78f req-e2619400-0697-43f0-a442-c1a18f30243d service nova] [instance: fc98bb10-8fe8-4203-80b8-9885b2c302c1] No waiting events found dispatching network-vif-plugged-7782f1ae-c7d5-424e-8c9f-4872c7adc440 {{(pid=67899) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 2074.617102] env[67899]: WARNING nova.compute.manager [req-38a8df45-8c6e-4a36-b56b-d0bba51cf78f req-e2619400-0697-43f0-a442-c1a18f30243d service nova] [instance: fc98bb10-8fe8-4203-80b8-9885b2c302c1] Received unexpected event network-vif-plugged-7782f1ae-c7d5-424e-8c9f-4872c7adc440 for instance with vm_state building and task_state spawning. [ 2074.697331] env[67899]: DEBUG nova.network.neutron [None req-9f31b163-09ed-4bf3-8e2e-26dc9db0ca2f tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] [instance: fc98bb10-8fe8-4203-80b8-9885b2c302c1] Successfully updated port: 7782f1ae-c7d5-424e-8c9f-4872c7adc440 {{(pid=67899) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 2074.706754] env[67899]: DEBUG oslo_concurrency.lockutils [None req-9f31b163-09ed-4bf3-8e2e-26dc9db0ca2f tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] Acquiring lock "refresh_cache-fc98bb10-8fe8-4203-80b8-9885b2c302c1" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2074.706916] env[67899]: DEBUG oslo_concurrency.lockutils [None req-9f31b163-09ed-4bf3-8e2e-26dc9db0ca2f tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] Acquired lock "refresh_cache-fc98bb10-8fe8-4203-80b8-9885b2c302c1" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2074.707107] env[67899]: DEBUG nova.network.neutron [None req-9f31b163-09ed-4bf3-8e2e-26dc9db0ca2f tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] [instance: fc98bb10-8fe8-4203-80b8-9885b2c302c1] Building network info cache for instance {{(pid=67899) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 2074.750083] env[67899]: DEBUG nova.network.neutron [None req-9f31b163-09ed-4bf3-8e2e-26dc9db0ca2f tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] [instance: fc98bb10-8fe8-4203-80b8-9885b2c302c1] Instance cache missing network info. 
{{(pid=67899) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 2074.902269] env[67899]: DEBUG nova.network.neutron [None req-9f31b163-09ed-4bf3-8e2e-26dc9db0ca2f tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] [instance: fc98bb10-8fe8-4203-80b8-9885b2c302c1] Updating instance_info_cache with network_info: [{"id": "7782f1ae-c7d5-424e-8c9f-4872c7adc440", "address": "fa:16:3e:86:92:56", "network": {"id": "9f5d5406-2587-426e-93b4-3d172c8ac117", "bridge": "br-int", "label": "tempest-ServersTestJSON-457264438-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "93f5a8c99daa4c85bd8edffb5c6dd338", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "abcf0d10-3f3f-45dc-923e-1c78766e2dad", "external-id": "nsx-vlan-transportzone-405", "segmentation_id": 405, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap7782f1ae-c7", "ovs_interfaceid": "7782f1ae-c7d5-424e-8c9f-4872c7adc440", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2074.915109] env[67899]: DEBUG oslo_concurrency.lockutils [None req-9f31b163-09ed-4bf3-8e2e-26dc9db0ca2f tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] Releasing lock "refresh_cache-fc98bb10-8fe8-4203-80b8-9885b2c302c1" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2074.915391] env[67899]: DEBUG nova.compute.manager [None req-9f31b163-09ed-4bf3-8e2e-26dc9db0ca2f tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] [instance: fc98bb10-8fe8-4203-80b8-9885b2c302c1] Instance network_info: |[{"id": "7782f1ae-c7d5-424e-8c9f-4872c7adc440", "address": "fa:16:3e:86:92:56", "network": {"id": "9f5d5406-2587-426e-93b4-3d172c8ac117", "bridge": "br-int", "label": "tempest-ServersTestJSON-457264438-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "93f5a8c99daa4c85bd8edffb5c6dd338", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "abcf0d10-3f3f-45dc-923e-1c78766e2dad", "external-id": "nsx-vlan-transportzone-405", "segmentation_id": 405, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap7782f1ae-c7", "ovs_interfaceid": "7782f1ae-c7d5-424e-8c9f-4872c7adc440", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67899) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 2074.915799] env[67899]: DEBUG 
nova.virt.vmwareapi.vmops [None req-9f31b163-09ed-4bf3-8e2e-26dc9db0ca2f tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] [instance: fc98bb10-8fe8-4203-80b8-9885b2c302c1] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:86:92:56', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'abcf0d10-3f3f-45dc-923e-1c78766e2dad', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '7782f1ae-c7d5-424e-8c9f-4872c7adc440', 'vif_model': 'vmxnet3'}] {{(pid=67899) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 2074.923156] env[67899]: DEBUG oslo.service.loopingcall [None req-9f31b163-09ed-4bf3-8e2e-26dc9db0ca2f tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67899) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2074.923617] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: fc98bb10-8fe8-4203-80b8-9885b2c302c1] Creating VM on the ESX host {{(pid=67899) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 2074.923861] env[67899]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-1fef00d9-7c2a-4a1a-ba4c-ceb7bcefd1b5 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2074.943740] env[67899]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 2074.943740] env[67899]: value = "task-3468033" [ 2074.943740] env[67899]: _type = "Task" [ 2074.943740] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2074.950824] env[67899]: DEBUG oslo_vmware.api [-] Task: {'id': task-3468033, 'name': CreateVM_Task} progress is 0%. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2075.454860] env[67899]: DEBUG oslo_vmware.api [-] Task: {'id': task-3468033, 'name': CreateVM_Task, 'duration_secs': 0.301679} completed successfully. 
{{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2075.454860] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: fc98bb10-8fe8-4203-80b8-9885b2c302c1] Created VM on the ESX host {{(pid=67899) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 2075.454860] env[67899]: DEBUG oslo_concurrency.lockutils [None req-9f31b163-09ed-4bf3-8e2e-26dc9db0ca2f tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2075.455235] env[67899]: DEBUG oslo_concurrency.lockutils [None req-9f31b163-09ed-4bf3-8e2e-26dc9db0ca2f tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2075.455351] env[67899]: DEBUG oslo_concurrency.lockutils [None req-9f31b163-09ed-4bf3-8e2e-26dc9db0ca2f tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 2075.455803] env[67899]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-eb98e719-22f5-4735-bcb2-db5b00d40ad0 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2075.460214] env[67899]: DEBUG oslo_vmware.api [None req-9f31b163-09ed-4bf3-8e2e-26dc9db0ca2f tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] Waiting for the task: (returnval){ [ 2075.460214] env[67899]: value = "session[5298859c-589e-10f3-4f70-bf47a7ea371b]5273156b-f0d1-2650-389a-ef6cb03f7d8a" [ 2075.460214] env[67899]: _type = "Task" [ 2075.460214] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2075.468814] env[67899]: DEBUG oslo_vmware.api [None req-9f31b163-09ed-4bf3-8e2e-26dc9db0ca2f tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] Task: {'id': session[5298859c-589e-10f3-4f70-bf47a7ea371b]5273156b-f0d1-2650-389a-ef6cb03f7d8a, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2075.970599] env[67899]: DEBUG oslo_concurrency.lockutils [None req-9f31b163-09ed-4bf3-8e2e-26dc9db0ca2f tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2075.970837] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-9f31b163-09ed-4bf3-8e2e-26dc9db0ca2f tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] [instance: fc98bb10-8fe8-4203-80b8-9885b2c302c1] Processing image c655a05a-4a40-4b3f-b609-3ba8116ad90f {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 2075.971061] env[67899]: DEBUG oslo_concurrency.lockutils [None req-9f31b163-09ed-4bf3-8e2e-26dc9db0ca2f tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2075.996599] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2076.650418] env[67899]: DEBUG nova.compute.manager [req-77661f26-4d35-48d6-bfe0-c24dfece4d12 req-521f21c7-a70f-4524-a2d2-2b21fb516e2e service nova] [instance: fc98bb10-8fe8-4203-80b8-9885b2c302c1] Received event network-changed-7782f1ae-c7d5-424e-8c9f-4872c7adc440 {{(pid=67899) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 2076.650418] env[67899]: DEBUG nova.compute.manager [req-77661f26-4d35-48d6-bfe0-c24dfece4d12 req-521f21c7-a70f-4524-a2d2-2b21fb516e2e service nova] [instance: fc98bb10-8fe8-4203-80b8-9885b2c302c1] Refreshing instance network info cache due to event network-changed-7782f1ae-c7d5-424e-8c9f-4872c7adc440. 
{{(pid=67899) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 2076.650418] env[67899]: DEBUG oslo_concurrency.lockutils [req-77661f26-4d35-48d6-bfe0-c24dfece4d12 req-521f21c7-a70f-4524-a2d2-2b21fb516e2e service nova] Acquiring lock "refresh_cache-fc98bb10-8fe8-4203-80b8-9885b2c302c1" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2076.650418] env[67899]: DEBUG oslo_concurrency.lockutils [req-77661f26-4d35-48d6-bfe0-c24dfece4d12 req-521f21c7-a70f-4524-a2d2-2b21fb516e2e service nova] Acquired lock "refresh_cache-fc98bb10-8fe8-4203-80b8-9885b2c302c1" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2076.650418] env[67899]: DEBUG nova.network.neutron [req-77661f26-4d35-48d6-bfe0-c24dfece4d12 req-521f21c7-a70f-4524-a2d2-2b21fb516e2e service nova] [instance: fc98bb10-8fe8-4203-80b8-9885b2c302c1] Refreshing network info cache for port 7782f1ae-c7d5-424e-8c9f-4872c7adc440 {{(pid=67899) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 2076.914892] env[67899]: DEBUG nova.network.neutron [req-77661f26-4d35-48d6-bfe0-c24dfece4d12 req-521f21c7-a70f-4524-a2d2-2b21fb516e2e service nova] [instance: fc98bb10-8fe8-4203-80b8-9885b2c302c1] Updated VIF entry in instance network info cache for port 7782f1ae-c7d5-424e-8c9f-4872c7adc440. {{(pid=67899) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 2076.915245] env[67899]: DEBUG nova.network.neutron [req-77661f26-4d35-48d6-bfe0-c24dfece4d12 req-521f21c7-a70f-4524-a2d2-2b21fb516e2e service nova] [instance: fc98bb10-8fe8-4203-80b8-9885b2c302c1] Updating instance_info_cache with network_info: [{"id": "7782f1ae-c7d5-424e-8c9f-4872c7adc440", "address": "fa:16:3e:86:92:56", "network": {"id": "9f5d5406-2587-426e-93b4-3d172c8ac117", "bridge": "br-int", "label": "tempest-ServersTestJSON-457264438-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "93f5a8c99daa4c85bd8edffb5c6dd338", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "abcf0d10-3f3f-45dc-923e-1c78766e2dad", "external-id": "nsx-vlan-transportzone-405", "segmentation_id": 405, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap7782f1ae-c7", "ovs_interfaceid": "7782f1ae-c7d5-424e-8c9f-4872c7adc440", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2076.927010] env[67899]: DEBUG oslo_concurrency.lockutils [req-77661f26-4d35-48d6-bfe0-c24dfece4d12 req-521f21c7-a70f-4524-a2d2-2b21fb516e2e service nova] Releasing lock "refresh_cache-fc98bb10-8fe8-4203-80b8-9885b2c302c1" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2077.996444] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67899) run_periodic_tasks 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2077.996789] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Starting heal instance info cache {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 2077.996789] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Rebuilding the list of instances to heal {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 2078.017859] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 9b4a7c14-84dc-4222-a758-3f8f10e23b7a] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2078.017992] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: e08f620d-63a0-45cb-99c6-d9d95c938b38] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2078.018141] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 77ac61b9-48cc-4ae8-81e7-273841f7b42d] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2078.018269] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: a9ef96da-fcfd-4fb5-bbb1-5178111a8a62] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2078.018391] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: c4fe8b3e-cee1-401b-a26f-907a8de95eba] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2078.018514] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: a993c6a9-140f-430d-a77e-98c2567bf7af] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2078.018632] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: c17d88cf-69ba-43e9-a672-24503c65e9f2] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2078.018754] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 483824d1-4994-436a-ba16-12524684405c] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2078.018871] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: cc1164c7-82bb-4d80-89ad-e9ba5658d9c8] Skipping network cache update for instance because it is Building. 
{{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2078.018988] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: fc98bb10-8fe8-4203-80b8-9885b2c302c1] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2078.019118] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Didn't find any instances for network info cache update. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 2078.019569] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2078.019753] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager.update_available_resource {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2078.030235] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2078.030438] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2078.030592] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2078.030739] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67899) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 2078.031816] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-24f02f20-3f19-4a75-92d0-5d3149a88872 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2078.040114] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-99e24134-142f-49fe-8470-d30940d0f829 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2078.053586] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cd0c5227-857b-420f-a66f-738daa4415d0 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2078.059580] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-c57f2541-93ce-489d-9690-f593b80328a5 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2078.089441] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180928MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=67899) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 2078.089441] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2078.089619] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2078.192269] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 9b4a7c14-84dc-4222-a758-3f8f10e23b7a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2078.192441] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance e08f620d-63a0-45cb-99c6-d9d95c938b38 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2078.192570] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 77ac61b9-48cc-4ae8-81e7-273841f7b42d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2078.192693] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance a9ef96da-fcfd-4fb5-bbb1-5178111a8a62 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2078.192815] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance c4fe8b3e-cee1-401b-a26f-907a8de95eba actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2078.192934] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance a993c6a9-140f-430d-a77e-98c2567bf7af actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2078.193064] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance c17d88cf-69ba-43e9-a672-24503c65e9f2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2078.193181] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 483824d1-4994-436a-ba16-12524684405c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2078.193296] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance cc1164c7-82bb-4d80-89ad-e9ba5658d9c8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2078.193409] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance fc98bb10-8fe8-4203-80b8-9885b2c302c1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2078.193627] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67899) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 2078.193795] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67899) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 2078.209557] env[67899]: DEBUG nova.scheduler.client.report [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Refreshing inventories for resource provider fffa0b42-f65d-4394-a98c-0df038b9ed4b {{(pid=67899) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} [ 2078.222910] env[67899]: DEBUG nova.scheduler.client.report [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Updating ProviderTree inventory for provider fffa0b42-f65d-4394-a98c-0df038b9ed4b from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67899) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} [ 2078.222910] env[67899]: DEBUG nova.compute.provider_tree [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Updating inventory in ProviderTree for provider fffa0b42-f65d-4394-a98c-0df038b9ed4b with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67899) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 2078.232779] env[67899]: DEBUG nova.scheduler.client.report [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Refreshing aggregate associations for resource provider fffa0b42-f65d-4394-a98c-0df038b9ed4b, aggregates: None {{(pid=67899) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} [ 2078.249824] env[67899]: DEBUG nova.scheduler.client.report [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Refreshing trait associations for resource provider fffa0b42-f65d-4394-a98c-0df038b9ed4b, traits: COMPUTE_NODE,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SAME_HOST_COLD_MIGRATE,COMPUTE_IMAGE_TYPE_VMDK,COMPUTE_IMAGE_TYPE_ISO {{(pid=67899) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} [ 2078.366161] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-77af8b13-1f61-437c-ae02-e60bd29fe394 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2078.373929] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-e0a059f7-d92b-424f-89be-cc2fd23f313d {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2078.402705] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d7f44de0-adae-441f-b7c3-426397d46999 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2078.409812] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e60ffc0b-1923-4f8f-9238-0853fe319dbd {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2078.422743] env[67899]: DEBUG nova.compute.provider_tree [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Inventory has not changed in ProviderTree for provider: fffa0b42-f65d-4394-a98c-0df038b9ed4b {{(pid=67899) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2078.431401] env[67899]: DEBUG nova.scheduler.client.report [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Inventory has not changed for provider fffa0b42-f65d-4394-a98c-0df038b9ed4b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67899) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2078.445626] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67899) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 2078.445807] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.356s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2079.422864] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2079.423206] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2080.996100] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2080.996380] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=67899) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 2080.996519] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2080.996647] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Cleaning up deleted instances {{(pid=67899) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11198}} [ 2081.005765] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] There are 0 instances to clean {{(pid=67899) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11207}} [ 2087.361246] env[67899]: WARNING oslo_vmware.rw_handles [None req-bb2aacf0-6d67-4744-ac3b-ad8f8e88f68e tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2087.361246] env[67899]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2087.361246] env[67899]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2087.361246] env[67899]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2087.361246] env[67899]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2087.361246] env[67899]: ERROR oslo_vmware.rw_handles response.begin() [ 2087.361246] env[67899]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2087.361246] env[67899]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2087.361246] env[67899]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2087.361246] env[67899]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2087.361246] env[67899]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2087.361246] env[67899]: ERROR oslo_vmware.rw_handles [ 2087.361992] env[67899]: DEBUG nova.virt.vmwareapi.images [None req-bb2aacf0-6d67-4744-ac3b-ad8f8e88f68e tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 9b4a7c14-84dc-4222-a758-3f8f10e23b7a] Downloaded image file data c655a05a-4a40-4b3f-b609-3ba8116ad90f to vmware_temp/6756a114-3472-4ad7-ad21-4a1e7dbb0d40/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk on the data store datastore1 {{(pid=67899) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2087.363904] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-bb2aacf0-6d67-4744-ac3b-ad8f8e88f68e tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 9b4a7c14-84dc-4222-a758-3f8f10e23b7a] Caching image {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2087.364187] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [None req-bb2aacf0-6d67-4744-ac3b-ad8f8e88f68e tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Copying Virtual Disk [datastore1] vmware_temp/6756a114-3472-4ad7-ad21-4a1e7dbb0d40/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk to [datastore1] 
vmware_temp/6756a114-3472-4ad7-ad21-4a1e7dbb0d40/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk {{(pid=67899) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2087.366175] env[67899]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-85d8fbd0-443b-49e0-85f6-2bf2a8224233 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2087.374432] env[67899]: DEBUG oslo_vmware.api [None req-bb2aacf0-6d67-4744-ac3b-ad8f8e88f68e tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Waiting for the task: (returnval){ [ 2087.374432] env[67899]: value = "task-3468034" [ 2087.374432] env[67899]: _type = "Task" [ 2087.374432] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2087.381846] env[67899]: DEBUG oslo_vmware.api [None req-bb2aacf0-6d67-4744-ac3b-ad8f8e88f68e tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Task: {'id': task-3468034, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2087.884895] env[67899]: DEBUG oslo_vmware.exceptions [None req-bb2aacf0-6d67-4744-ac3b-ad8f8e88f68e tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Fault InvalidArgument not matched. {{(pid=67899) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2087.884895] env[67899]: DEBUG oslo_concurrency.lockutils [None req-bb2aacf0-6d67-4744-ac3b-ad8f8e88f68e tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2087.885453] env[67899]: ERROR nova.compute.manager [None req-bb2aacf0-6d67-4744-ac3b-ad8f8e88f68e tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 9b4a7c14-84dc-4222-a758-3f8f10e23b7a] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2087.885453] env[67899]: Faults: ['InvalidArgument'] [ 2087.885453] env[67899]: ERROR nova.compute.manager [instance: 9b4a7c14-84dc-4222-a758-3f8f10e23b7a] Traceback (most recent call last): [ 2087.885453] env[67899]: ERROR nova.compute.manager [instance: 9b4a7c14-84dc-4222-a758-3f8f10e23b7a] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 2087.885453] env[67899]: ERROR nova.compute.manager [instance: 9b4a7c14-84dc-4222-a758-3f8f10e23b7a] yield resources [ 2087.885453] env[67899]: ERROR nova.compute.manager [instance: 9b4a7c14-84dc-4222-a758-3f8f10e23b7a] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2087.885453] env[67899]: ERROR nova.compute.manager [instance: 9b4a7c14-84dc-4222-a758-3f8f10e23b7a] self.driver.spawn(context, instance, image_meta, [ 2087.885453] env[67899]: ERROR nova.compute.manager [instance: 9b4a7c14-84dc-4222-a758-3f8f10e23b7a] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2087.885453] env[67899]: ERROR nova.compute.manager [instance: 9b4a7c14-84dc-4222-a758-3f8f10e23b7a] self._vmops.spawn(context, instance, 
image_meta, injected_files, [ 2087.885453] env[67899]: ERROR nova.compute.manager [instance: 9b4a7c14-84dc-4222-a758-3f8f10e23b7a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2087.885453] env[67899]: ERROR nova.compute.manager [instance: 9b4a7c14-84dc-4222-a758-3f8f10e23b7a] self._fetch_image_if_missing(context, vi) [ 2087.885453] env[67899]: ERROR nova.compute.manager [instance: 9b4a7c14-84dc-4222-a758-3f8f10e23b7a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2087.885453] env[67899]: ERROR nova.compute.manager [instance: 9b4a7c14-84dc-4222-a758-3f8f10e23b7a] image_cache(vi, tmp_image_ds_loc) [ 2087.885453] env[67899]: ERROR nova.compute.manager [instance: 9b4a7c14-84dc-4222-a758-3f8f10e23b7a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2087.885453] env[67899]: ERROR nova.compute.manager [instance: 9b4a7c14-84dc-4222-a758-3f8f10e23b7a] vm_util.copy_virtual_disk( [ 2087.885453] env[67899]: ERROR nova.compute.manager [instance: 9b4a7c14-84dc-4222-a758-3f8f10e23b7a] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2087.885453] env[67899]: ERROR nova.compute.manager [instance: 9b4a7c14-84dc-4222-a758-3f8f10e23b7a] session._wait_for_task(vmdk_copy_task) [ 2087.885453] env[67899]: ERROR nova.compute.manager [instance: 9b4a7c14-84dc-4222-a758-3f8f10e23b7a] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2087.885453] env[67899]: ERROR nova.compute.manager [instance: 9b4a7c14-84dc-4222-a758-3f8f10e23b7a] return self.wait_for_task(task_ref) [ 2087.885453] env[67899]: ERROR nova.compute.manager [instance: 9b4a7c14-84dc-4222-a758-3f8f10e23b7a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2087.885453] env[67899]: ERROR nova.compute.manager [instance: 9b4a7c14-84dc-4222-a758-3f8f10e23b7a] return evt.wait() [ 2087.885453] env[67899]: ERROR nova.compute.manager [instance: 9b4a7c14-84dc-4222-a758-3f8f10e23b7a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2087.885453] env[67899]: ERROR nova.compute.manager [instance: 9b4a7c14-84dc-4222-a758-3f8f10e23b7a] result = hub.switch() [ 2087.885453] env[67899]: ERROR nova.compute.manager [instance: 9b4a7c14-84dc-4222-a758-3f8f10e23b7a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2087.885453] env[67899]: ERROR nova.compute.manager [instance: 9b4a7c14-84dc-4222-a758-3f8f10e23b7a] return self.greenlet.switch() [ 2087.885453] env[67899]: ERROR nova.compute.manager [instance: 9b4a7c14-84dc-4222-a758-3f8f10e23b7a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2087.885453] env[67899]: ERROR nova.compute.manager [instance: 9b4a7c14-84dc-4222-a758-3f8f10e23b7a] self.f(*self.args, **self.kw) [ 2087.885453] env[67899]: ERROR nova.compute.manager [instance: 9b4a7c14-84dc-4222-a758-3f8f10e23b7a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2087.885453] env[67899]: ERROR nova.compute.manager [instance: 9b4a7c14-84dc-4222-a758-3f8f10e23b7a] raise exceptions.translate_fault(task_info.error) [ 2087.885453] env[67899]: ERROR nova.compute.manager [instance: 9b4a7c14-84dc-4222-a758-3f8f10e23b7a] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2087.885453] 
env[67899]: ERROR nova.compute.manager [instance: 9b4a7c14-84dc-4222-a758-3f8f10e23b7a] Faults: ['InvalidArgument'] [ 2087.885453] env[67899]: ERROR nova.compute.manager [instance: 9b4a7c14-84dc-4222-a758-3f8f10e23b7a] [ 2087.886421] env[67899]: INFO nova.compute.manager [None req-bb2aacf0-6d67-4744-ac3b-ad8f8e88f68e tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 9b4a7c14-84dc-4222-a758-3f8f10e23b7a] Terminating instance [ 2087.887268] env[67899]: DEBUG oslo_concurrency.lockutils [None req-be6d1b16-9fec-4200-8c0e-8930ce011ab6 tempest-ServerMetadataTestJSON-29351844 tempest-ServerMetadataTestJSON-29351844-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2087.887465] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-be6d1b16-9fec-4200-8c0e-8930ce011ab6 tempest-ServerMetadataTestJSON-29351844 tempest-ServerMetadataTestJSON-29351844-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2087.887691] env[67899]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-d6f50421-2caf-4ece-bb7e-8eefc0f48490 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2087.889915] env[67899]: DEBUG nova.compute.manager [None req-bb2aacf0-6d67-4744-ac3b-ad8f8e88f68e tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 9b4a7c14-84dc-4222-a758-3f8f10e23b7a] Start destroying the instance on the hypervisor. 
{{(pid=67899) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2087.890107] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-bb2aacf0-6d67-4744-ac3b-ad8f8e88f68e tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 9b4a7c14-84dc-4222-a758-3f8f10e23b7a] Destroying instance {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2087.890792] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-df404dde-c6a2-4e0b-8e1d-428ea2957953 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2087.897298] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-bb2aacf0-6d67-4744-ac3b-ad8f8e88f68e tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 9b4a7c14-84dc-4222-a758-3f8f10e23b7a] Unregistering the VM {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2087.897510] env[67899]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-cd935d71-a29b-4fd4-87e8-59f84b3fd1db {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2087.899631] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-be6d1b16-9fec-4200-8c0e-8930ce011ab6 tempest-ServerMetadataTestJSON-29351844 tempest-ServerMetadataTestJSON-29351844-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2087.899804] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-be6d1b16-9fec-4200-8c0e-8930ce011ab6 tempest-ServerMetadataTestJSON-29351844 tempest-ServerMetadataTestJSON-29351844-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67899) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2087.900724] env[67899]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-3f082774-b6d4-4a52-a8bf-db8c902effe9 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2087.905586] env[67899]: DEBUG oslo_vmware.api [None req-be6d1b16-9fec-4200-8c0e-8930ce011ab6 tempest-ServerMetadataTestJSON-29351844 tempest-ServerMetadataTestJSON-29351844-project-member] Waiting for the task: (returnval){ [ 2087.905586] env[67899]: value = "session[5298859c-589e-10f3-4f70-bf47a7ea371b]5294818b-c055-c76e-25c6-222bc0a4265a" [ 2087.905586] env[67899]: _type = "Task" [ 2087.905586] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2087.912306] env[67899]: DEBUG oslo_vmware.api [None req-be6d1b16-9fec-4200-8c0e-8930ce011ab6 tempest-ServerMetadataTestJSON-29351844 tempest-ServerMetadataTestJSON-29351844-project-member] Task: {'id': session[5298859c-589e-10f3-4f70-bf47a7ea371b]5294818b-c055-c76e-25c6-222bc0a4265a, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2087.966675] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-bb2aacf0-6d67-4744-ac3b-ad8f8e88f68e tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 9b4a7c14-84dc-4222-a758-3f8f10e23b7a] Unregistered the VM {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2087.966914] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-bb2aacf0-6d67-4744-ac3b-ad8f8e88f68e tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 9b4a7c14-84dc-4222-a758-3f8f10e23b7a] Deleting contents of the VM from datastore datastore1 {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2087.967098] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-bb2aacf0-6d67-4744-ac3b-ad8f8e88f68e tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Deleting the datastore file [datastore1] 9b4a7c14-84dc-4222-a758-3f8f10e23b7a {{(pid=67899) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2087.967379] env[67899]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-d99bb02b-3373-49f3-ba31-f47e0dd68c5a {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2087.974014] env[67899]: DEBUG oslo_vmware.api [None req-bb2aacf0-6d67-4744-ac3b-ad8f8e88f68e tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Waiting for the task: (returnval){ [ 2087.974014] env[67899]: value = "task-3468036" [ 2087.974014] env[67899]: _type = "Task" [ 2087.974014] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2087.981869] env[67899]: DEBUG oslo_vmware.api [None req-bb2aacf0-6d67-4744-ac3b-ad8f8e88f68e tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Task: {'id': task-3468036, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2088.415671] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-be6d1b16-9fec-4200-8c0e-8930ce011ab6 tempest-ServerMetadataTestJSON-29351844 tempest-ServerMetadataTestJSON-29351844-project-member] [instance: e08f620d-63a0-45cb-99c6-d9d95c938b38] Preparing fetch location {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2088.415933] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-be6d1b16-9fec-4200-8c0e-8930ce011ab6 tempest-ServerMetadataTestJSON-29351844 tempest-ServerMetadataTestJSON-29351844-project-member] Creating directory with path [datastore1] vmware_temp/74371260-2b45-48c0-9993-d96fc4046f31/c655a05a-4a40-4b3f-b609-3ba8116ad90f {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2088.416190] env[67899]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-7db005e9-2acd-4cbc-97ef-2744ed135510 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2088.426652] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-be6d1b16-9fec-4200-8c0e-8930ce011ab6 tempest-ServerMetadataTestJSON-29351844 tempest-ServerMetadataTestJSON-29351844-project-member] Created directory with path [datastore1] vmware_temp/74371260-2b45-48c0-9993-d96fc4046f31/c655a05a-4a40-4b3f-b609-3ba8116ad90f {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2088.426787] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-be6d1b16-9fec-4200-8c0e-8930ce011ab6 tempest-ServerMetadataTestJSON-29351844 tempest-ServerMetadataTestJSON-29351844-project-member] [instance: e08f620d-63a0-45cb-99c6-d9d95c938b38] Fetch image to [datastore1] vmware_temp/74371260-2b45-48c0-9993-d96fc4046f31/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2088.426954] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-be6d1b16-9fec-4200-8c0e-8930ce011ab6 tempest-ServerMetadataTestJSON-29351844 tempest-ServerMetadataTestJSON-29351844-project-member] [instance: e08f620d-63a0-45cb-99c6-d9d95c938b38] Downloading image file data c655a05a-4a40-4b3f-b609-3ba8116ad90f to [datastore1] vmware_temp/74371260-2b45-48c0-9993-d96fc4046f31/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk on the data store datastore1 {{(pid=67899) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2088.427671] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1fe2da68-88e2-4bb9-876f-1ca5bb61f79a {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2088.434100] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-24bc2ea7-438b-4d4f-8483-38690299d558 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2088.443083] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0653f5dc-03d9-4068-b674-c47c445e497d {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2088.473665] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ef6aad0f-b797-41b4-b93f-03cc3d6f1174 {{(pid=67899) 
request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2088.483530] env[67899]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-e5ab5344-da96-4644-94b2-0894db1f87c7 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2088.485156] env[67899]: DEBUG oslo_vmware.api [None req-bb2aacf0-6d67-4744-ac3b-ad8f8e88f68e tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Task: {'id': task-3468036, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.076042} completed successfully. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2088.485387] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-bb2aacf0-6d67-4744-ac3b-ad8f8e88f68e tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Deleted the datastore file {{(pid=67899) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2088.485560] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-bb2aacf0-6d67-4744-ac3b-ad8f8e88f68e tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 9b4a7c14-84dc-4222-a758-3f8f10e23b7a] Deleted contents of the VM from datastore datastore1 {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2088.485724] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-bb2aacf0-6d67-4744-ac3b-ad8f8e88f68e tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 9b4a7c14-84dc-4222-a758-3f8f10e23b7a] Instance destroyed {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2088.485895] env[67899]: INFO nova.compute.manager [None req-bb2aacf0-6d67-4744-ac3b-ad8f8e88f68e tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 9b4a7c14-84dc-4222-a758-3f8f10e23b7a] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 2088.488282] env[67899]: DEBUG nova.compute.claims [None req-bb2aacf0-6d67-4744-ac3b-ad8f8e88f68e tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 9b4a7c14-84dc-4222-a758-3f8f10e23b7a] Aborting claim: {{(pid=67899) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 2088.488449] env[67899]: DEBUG oslo_concurrency.lockutils [None req-bb2aacf0-6d67-4744-ac3b-ad8f8e88f68e tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2088.488655] env[67899]: DEBUG oslo_concurrency.lockutils [None req-bb2aacf0-6d67-4744-ac3b-ad8f8e88f68e tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2088.504826] env[67899]: DEBUG nova.virt.vmwareapi.images [None req-be6d1b16-9fec-4200-8c0e-8930ce011ab6 tempest-ServerMetadataTestJSON-29351844 tempest-ServerMetadataTestJSON-29351844-project-member] [instance: e08f620d-63a0-45cb-99c6-d9d95c938b38] Downloading image file data c655a05a-4a40-4b3f-b609-3ba8116ad90f to the data store datastore1 {{(pid=67899) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2088.560273] env[67899]: DEBUG oslo_vmware.rw_handles [None req-be6d1b16-9fec-4200-8c0e-8930ce011ab6 tempest-ServerMetadataTestJSON-29351844 tempest-ServerMetadataTestJSON-29351844-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/74371260-2b45-48c0-9993-d96fc4046f31/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67899) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2088.621151] env[67899]: DEBUG oslo_vmware.rw_handles [None req-be6d1b16-9fec-4200-8c0e-8930ce011ab6 tempest-ServerMetadataTestJSON-29351844 tempest-ServerMetadataTestJSON-29351844-project-member] Completed reading data from the image iterator. {{(pid=67899) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2088.621435] env[67899]: DEBUG oslo_vmware.rw_handles [None req-be6d1b16-9fec-4200-8c0e-8930ce011ab6 tempest-ServerMetadataTestJSON-29351844 tempest-ServerMetadataTestJSON-29351844-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/74371260-2b45-48c0-9993-d96fc4046f31/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67899) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2088.704830] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a0df2794-4327-4472-a60d-c530d3df65de {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2088.712309] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-60fe5965-f1a4-46a2-b2eb-0514f0849a5d {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2088.741479] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b2e88374-3eca-4472-8eba-be5eb7e12a08 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2088.748274] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dc2000d6-edb0-4fdc-bbb6-8218aba84da1 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2088.760853] env[67899]: DEBUG nova.compute.provider_tree [None req-bb2aacf0-6d67-4744-ac3b-ad8f8e88f68e tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Inventory has not changed in ProviderTree for provider: fffa0b42-f65d-4394-a98c-0df038b9ed4b {{(pid=67899) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2088.768845] env[67899]: DEBUG nova.scheduler.client.report [None req-bb2aacf0-6d67-4744-ac3b-ad8f8e88f68e tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Inventory has not changed for provider fffa0b42-f65d-4394-a98c-0df038b9ed4b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67899) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2088.782857] env[67899]: DEBUG oslo_concurrency.lockutils [None req-bb2aacf0-6d67-4744-ac3b-ad8f8e88f68e tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.294s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2088.783367] env[67899]: ERROR nova.compute.manager [None req-bb2aacf0-6d67-4744-ac3b-ad8f8e88f68e tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 9b4a7c14-84dc-4222-a758-3f8f10e23b7a] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2088.783367] env[67899]: Faults: ['InvalidArgument'] [ 2088.783367] env[67899]: ERROR nova.compute.manager [instance: 9b4a7c14-84dc-4222-a758-3f8f10e23b7a] Traceback (most recent call last): [ 2088.783367] env[67899]: ERROR nova.compute.manager [instance: 9b4a7c14-84dc-4222-a758-3f8f10e23b7a] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2088.783367] env[67899]: ERROR nova.compute.manager [instance: 9b4a7c14-84dc-4222-a758-3f8f10e23b7a] 
self.driver.spawn(context, instance, image_meta, [ 2088.783367] env[67899]: ERROR nova.compute.manager [instance: 9b4a7c14-84dc-4222-a758-3f8f10e23b7a] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2088.783367] env[67899]: ERROR nova.compute.manager [instance: 9b4a7c14-84dc-4222-a758-3f8f10e23b7a] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2088.783367] env[67899]: ERROR nova.compute.manager [instance: 9b4a7c14-84dc-4222-a758-3f8f10e23b7a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2088.783367] env[67899]: ERROR nova.compute.manager [instance: 9b4a7c14-84dc-4222-a758-3f8f10e23b7a] self._fetch_image_if_missing(context, vi) [ 2088.783367] env[67899]: ERROR nova.compute.manager [instance: 9b4a7c14-84dc-4222-a758-3f8f10e23b7a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2088.783367] env[67899]: ERROR nova.compute.manager [instance: 9b4a7c14-84dc-4222-a758-3f8f10e23b7a] image_cache(vi, tmp_image_ds_loc) [ 2088.783367] env[67899]: ERROR nova.compute.manager [instance: 9b4a7c14-84dc-4222-a758-3f8f10e23b7a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2088.783367] env[67899]: ERROR nova.compute.manager [instance: 9b4a7c14-84dc-4222-a758-3f8f10e23b7a] vm_util.copy_virtual_disk( [ 2088.783367] env[67899]: ERROR nova.compute.manager [instance: 9b4a7c14-84dc-4222-a758-3f8f10e23b7a] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2088.783367] env[67899]: ERROR nova.compute.manager [instance: 9b4a7c14-84dc-4222-a758-3f8f10e23b7a] session._wait_for_task(vmdk_copy_task) [ 2088.783367] env[67899]: ERROR nova.compute.manager [instance: 9b4a7c14-84dc-4222-a758-3f8f10e23b7a] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2088.783367] env[67899]: ERROR nova.compute.manager [instance: 9b4a7c14-84dc-4222-a758-3f8f10e23b7a] return self.wait_for_task(task_ref) [ 2088.783367] env[67899]: ERROR nova.compute.manager [instance: 9b4a7c14-84dc-4222-a758-3f8f10e23b7a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2088.783367] env[67899]: ERROR nova.compute.manager [instance: 9b4a7c14-84dc-4222-a758-3f8f10e23b7a] return evt.wait() [ 2088.783367] env[67899]: ERROR nova.compute.manager [instance: 9b4a7c14-84dc-4222-a758-3f8f10e23b7a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2088.783367] env[67899]: ERROR nova.compute.manager [instance: 9b4a7c14-84dc-4222-a758-3f8f10e23b7a] result = hub.switch() [ 2088.783367] env[67899]: ERROR nova.compute.manager [instance: 9b4a7c14-84dc-4222-a758-3f8f10e23b7a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2088.783367] env[67899]: ERROR nova.compute.manager [instance: 9b4a7c14-84dc-4222-a758-3f8f10e23b7a] return self.greenlet.switch() [ 2088.783367] env[67899]: ERROR nova.compute.manager [instance: 9b4a7c14-84dc-4222-a758-3f8f10e23b7a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2088.783367] env[67899]: ERROR nova.compute.manager [instance: 9b4a7c14-84dc-4222-a758-3f8f10e23b7a] self.f(*self.args, **self.kw) [ 2088.783367] env[67899]: ERROR nova.compute.manager [instance: 9b4a7c14-84dc-4222-a758-3f8f10e23b7a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task 
[ 2088.783367] env[67899]: ERROR nova.compute.manager [instance: 9b4a7c14-84dc-4222-a758-3f8f10e23b7a] raise exceptions.translate_fault(task_info.error) [ 2088.783367] env[67899]: ERROR nova.compute.manager [instance: 9b4a7c14-84dc-4222-a758-3f8f10e23b7a] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2088.783367] env[67899]: ERROR nova.compute.manager [instance: 9b4a7c14-84dc-4222-a758-3f8f10e23b7a] Faults: ['InvalidArgument'] [ 2088.783367] env[67899]: ERROR nova.compute.manager [instance: 9b4a7c14-84dc-4222-a758-3f8f10e23b7a] [ 2088.784130] env[67899]: DEBUG nova.compute.utils [None req-bb2aacf0-6d67-4744-ac3b-ad8f8e88f68e tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 9b4a7c14-84dc-4222-a758-3f8f10e23b7a] VimFaultException {{(pid=67899) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2088.785385] env[67899]: DEBUG nova.compute.manager [None req-bb2aacf0-6d67-4744-ac3b-ad8f8e88f68e tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 9b4a7c14-84dc-4222-a758-3f8f10e23b7a] Build of instance 9b4a7c14-84dc-4222-a758-3f8f10e23b7a was re-scheduled: A specified parameter was not correct: fileType [ 2088.785385] env[67899]: Faults: ['InvalidArgument'] {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 2088.785755] env[67899]: DEBUG nova.compute.manager [None req-bb2aacf0-6d67-4744-ac3b-ad8f8e88f68e tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 9b4a7c14-84dc-4222-a758-3f8f10e23b7a] Unplugging VIFs for instance {{(pid=67899) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 2088.785923] env[67899]: DEBUG nova.compute.manager [None req-bb2aacf0-6d67-4744-ac3b-ad8f8e88f68e tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=67899) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 2088.786103] env[67899]: DEBUG nova.compute.manager [None req-bb2aacf0-6d67-4744-ac3b-ad8f8e88f68e tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 9b4a7c14-84dc-4222-a758-3f8f10e23b7a] Deallocating network for instance {{(pid=67899) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2088.786265] env[67899]: DEBUG nova.network.neutron [None req-bb2aacf0-6d67-4744-ac3b-ad8f8e88f68e tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 9b4a7c14-84dc-4222-a758-3f8f10e23b7a] deallocate_for_instance() {{(pid=67899) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2089.088556] env[67899]: DEBUG nova.network.neutron [None req-bb2aacf0-6d67-4744-ac3b-ad8f8e88f68e tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 9b4a7c14-84dc-4222-a758-3f8f10e23b7a] Updating instance_info_cache with network_info: [] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2089.104385] env[67899]: INFO nova.compute.manager [None req-bb2aacf0-6d67-4744-ac3b-ad8f8e88f68e tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 9b4a7c14-84dc-4222-a758-3f8f10e23b7a] Took 0.32 seconds to deallocate network for instance. 
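The traceback above is the recurring failure in this run: while caching the sparse image, VirtualDiskManager.CopyVirtualDisk_Task fails with the vCenter fault InvalidArgument ("A specified parameter was not correct: fileType"), the fault is re-raised through wait_for_task as a VimFaultException, and the compute manager aborts the resource claim and re-schedules the build. A minimal sketch, assuming oslo.vmware is installed, of how a caller can branch on the fault name the exception carries; the classify_copy_fault helper is illustrative, not nova's code:

    from oslo_vmware import exceptions as vexc

    def classify_copy_fault(exc):
        # exc.fault_list carries the vCenter fault names, e.g. ['InvalidArgument']
        # as shown in the log; exc's message is the human-readable vCenter error.
        if 'InvalidArgument' in (exc.fault_list or []):
            return 'reschedule'  # mirrors nova: abort the claim, re-schedule the build
        raise exc

    try:
        raise vexc.VimFaultException(['InvalidArgument'],
                                     'A specified parameter was not correct: fileType')
    except vexc.VimFaultException as exc:
        print(classify_copy_fault(exc))  # -> reschedule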
[ 2089.195500] env[67899]: INFO nova.scheduler.client.report [None req-bb2aacf0-6d67-4744-ac3b-ad8f8e88f68e tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Deleted allocations for instance 9b4a7c14-84dc-4222-a758-3f8f10e23b7a [ 2089.216639] env[67899]: DEBUG oslo_concurrency.lockutils [None req-bb2aacf0-6d67-4744-ac3b-ad8f8e88f68e tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Lock "9b4a7c14-84dc-4222-a758-3f8f10e23b7a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 634.555s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2089.217329] env[67899]: DEBUG oslo_concurrency.lockutils [None req-d7480f5c-af4c-48d1-b988-79e9468de4ba tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Lock "9b4a7c14-84dc-4222-a758-3f8f10e23b7a" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 438.568s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2089.217564] env[67899]: DEBUG oslo_concurrency.lockutils [None req-d7480f5c-af4c-48d1-b988-79e9468de4ba tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Acquiring lock "9b4a7c14-84dc-4222-a758-3f8f10e23b7a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2089.217774] env[67899]: DEBUG oslo_concurrency.lockutils [None req-d7480f5c-af4c-48d1-b988-79e9468de4ba tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Lock "9b4a7c14-84dc-4222-a758-3f8f10e23b7a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2089.217946] env[67899]: DEBUG oslo_concurrency.lockutils [None req-d7480f5c-af4c-48d1-b988-79e9468de4ba tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Lock "9b4a7c14-84dc-4222-a758-3f8f10e23b7a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2089.220168] env[67899]: INFO nova.compute.manager [None req-d7480f5c-af4c-48d1-b988-79e9468de4ba tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 9b4a7c14-84dc-4222-a758-3f8f10e23b7a] Terminating instance [ 2089.222427] env[67899]: DEBUG oslo_concurrency.lockutils [None req-d7480f5c-af4c-48d1-b988-79e9468de4ba tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Acquiring lock "refresh_cache-9b4a7c14-84dc-4222-a758-3f8f10e23b7a" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2089.222749] env[67899]: DEBUG oslo_concurrency.lockutils [None req-d7480f5c-af4c-48d1-b988-79e9468de4ba tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Acquired lock "refresh_cache-9b4a7c14-84dc-4222-a758-3f8f10e23b7a" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2089.222749] env[67899]: DEBUG nova.network.neutron 
[None req-d7480f5c-af4c-48d1-b988-79e9468de4ba tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 9b4a7c14-84dc-4222-a758-3f8f10e23b7a] Building network info cache for instance {{(pid=67899) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 2089.252172] env[67899]: DEBUG nova.network.neutron [None req-d7480f5c-af4c-48d1-b988-79e9468de4ba tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 9b4a7c14-84dc-4222-a758-3f8f10e23b7a] Instance cache missing network info. {{(pid=67899) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 2089.352632] env[67899]: DEBUG nova.network.neutron [None req-d7480f5c-af4c-48d1-b988-79e9468de4ba tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 9b4a7c14-84dc-4222-a758-3f8f10e23b7a] Updating instance_info_cache with network_info: [] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2089.361196] env[67899]: DEBUG oslo_concurrency.lockutils [None req-d7480f5c-af4c-48d1-b988-79e9468de4ba tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Releasing lock "refresh_cache-9b4a7c14-84dc-4222-a758-3f8f10e23b7a" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2089.361568] env[67899]: DEBUG nova.compute.manager [None req-d7480f5c-af4c-48d1-b988-79e9468de4ba tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 9b4a7c14-84dc-4222-a758-3f8f10e23b7a] Start destroying the instance on the hypervisor. {{(pid=67899) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2089.361832] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-d7480f5c-af4c-48d1-b988-79e9468de4ba tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 9b4a7c14-84dc-4222-a758-3f8f10e23b7a] Destroying instance {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2089.362371] env[67899]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-414f6c15-5a51-4b23-9dac-09f6cfea7c6e {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2089.371368] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f7ea2bcb-49af-4043-8093-b6a8776eb35d {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2089.398825] env[67899]: WARNING nova.virt.vmwareapi.vmops [None req-d7480f5c-af4c-48d1-b988-79e9468de4ba tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 9b4a7c14-84dc-4222-a758-3f8f10e23b7a] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 9b4a7c14-84dc-4222-a758-3f8f10e23b7a could not be found. 
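Worth noting in the destroy path above: the backing VM for 9b4a7c14 was already gone by the time terminate_instance ran, so vmops logs InstanceNotFound only as a warning, still reports "Instance destroyed", and termination proceeds to network deallocation. A rough sketch of that tolerant-destroy shape, with hypothetical helper names rather than nova's exact code:

    def destroy_on_hypervisor(find_vm, unregister_vm):
        vm_ref = find_vm()  # corresponds to SearchIndex.FindAllByUuid in the log
        if vm_ref is None:
            # "Instance does not exist on backend" -> warn and treat as destroyed
            return 'already-gone'
        unregister_vm(vm_ref)
        return 'destroyed'

    print(destroy_on_hypervisor(lambda: None, lambda ref: None))  # -> already-gone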
[ 2089.399021] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-d7480f5c-af4c-48d1-b988-79e9468de4ba tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 9b4a7c14-84dc-4222-a758-3f8f10e23b7a] Instance destroyed {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2089.399197] env[67899]: INFO nova.compute.manager [None req-d7480f5c-af4c-48d1-b988-79e9468de4ba tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] [instance: 9b4a7c14-84dc-4222-a758-3f8f10e23b7a] Took 0.04 seconds to destroy the instance on the hypervisor. [ 2089.399431] env[67899]: DEBUG oslo.service.loopingcall [None req-d7480f5c-af4c-48d1-b988-79e9468de4ba tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=67899) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2089.399649] env[67899]: DEBUG nova.compute.manager [-] [instance: 9b4a7c14-84dc-4222-a758-3f8f10e23b7a] Deallocating network for instance {{(pid=67899) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2089.399748] env[67899]: DEBUG nova.network.neutron [-] [instance: 9b4a7c14-84dc-4222-a758-3f8f10e23b7a] deallocate_for_instance() {{(pid=67899) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2089.415830] env[67899]: DEBUG nova.network.neutron [-] [instance: 9b4a7c14-84dc-4222-a758-3f8f10e23b7a] Instance cache missing network info. {{(pid=67899) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 2089.422609] env[67899]: DEBUG nova.network.neutron [-] [instance: 9b4a7c14-84dc-4222-a758-3f8f10e23b7a] Updating instance_info_cache with network_info: [] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2089.430335] env[67899]: INFO nova.compute.manager [-] [instance: 9b4a7c14-84dc-4222-a758-3f8f10e23b7a] Took 0.03 seconds to deallocate network for instance. [ 2089.511391] env[67899]: DEBUG oslo_concurrency.lockutils [None req-d7480f5c-af4c-48d1-b988-79e9468de4ba tempest-ImagesTestJSON-610999726 tempest-ImagesTestJSON-610999726-project-member] Lock "9b4a7c14-84dc-4222-a758-3f8f10e23b7a" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.294s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2089.512341] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "9b4a7c14-84dc-4222-a758-3f8f10e23b7a" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 313.501s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2089.512603] env[67899]: INFO nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 9b4a7c14-84dc-4222-a758-3f8f10e23b7a] During sync_power_state the instance has a pending task (deleting). Skip. 
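The Acquiring/acquired/released triplets throughout this section come from oslo.concurrency's named-lock helpers, which serialize work per instance UUID (with a "-events" suffix for the instance event dict) and log how long each caller waited and held the lock. A minimal sketch assuming oslo.concurrency is installed; the lock name mirrors the log:

    from oslo_concurrency import lockutils

    @lockutils.synchronized('9b4a7c14-84dc-4222-a758-3f8f10e23b7a-events')
    def clear_events():
        # Runs with the named semaphore held; lockutils emits the same
        # "acquired ... waited" / "released ... held" DEBUG lines seen above.
        pass

    clear_events()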
[ 2089.512802] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "9b4a7c14-84dc-4222-a758-3f8f10e23b7a" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.001s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2091.997066] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2091.997264] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Cleaning up deleted instances with incomplete migration {{(pid=67899) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11236}} [ 2133.001475] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2134.996864] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2136.797109] env[67899]: WARNING oslo_vmware.rw_handles [None req-be6d1b16-9fec-4200-8c0e-8930ce011ab6 tempest-ServerMetadataTestJSON-29351844 tempest-ServerMetadataTestJSON-29351844-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2136.797109] env[67899]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2136.797109] env[67899]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2136.797109] env[67899]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2136.797109] env[67899]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2136.797109] env[67899]: ERROR oslo_vmware.rw_handles response.begin() [ 2136.797109] env[67899]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2136.797109] env[67899]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2136.797109] env[67899]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2136.797109] env[67899]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2136.797109] env[67899]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2136.797109] env[67899]: ERROR oslo_vmware.rw_handles [ 2136.797109] env[67899]: DEBUG nova.virt.vmwareapi.images [None req-be6d1b16-9fec-4200-8c0e-8930ce011ab6 tempest-ServerMetadataTestJSON-29351844 tempest-ServerMetadataTestJSON-29351844-project-member] [instance: e08f620d-63a0-45cb-99c6-d9d95c938b38] Downloaded image file data c655a05a-4a40-4b3f-b609-3ba8116ad90f to vmware_temp/74371260-2b45-48c0-9993-d96fc4046f31/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk on the data store datastore1 {{(pid=67899) fetch_image 
/opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2136.799453] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-be6d1b16-9fec-4200-8c0e-8930ce011ab6 tempest-ServerMetadataTestJSON-29351844 tempest-ServerMetadataTestJSON-29351844-project-member] [instance: e08f620d-63a0-45cb-99c6-d9d95c938b38] Caching image {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2136.799774] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [None req-be6d1b16-9fec-4200-8c0e-8930ce011ab6 tempest-ServerMetadataTestJSON-29351844 tempest-ServerMetadataTestJSON-29351844-project-member] Copying Virtual Disk [datastore1] vmware_temp/74371260-2b45-48c0-9993-d96fc4046f31/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk to [datastore1] vmware_temp/74371260-2b45-48c0-9993-d96fc4046f31/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk {{(pid=67899) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2136.800155] env[67899]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-98f3b237-5073-44f1-bfe4-80cf6b594788 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2136.808889] env[67899]: DEBUG oslo_vmware.api [None req-be6d1b16-9fec-4200-8c0e-8930ce011ab6 tempest-ServerMetadataTestJSON-29351844 tempest-ServerMetadataTestJSON-29351844-project-member] Waiting for the task: (returnval){ [ 2136.808889] env[67899]: value = "task-3468037" [ 2136.808889] env[67899]: _type = "Task" [ 2136.808889] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2136.818240] env[67899]: DEBUG oslo_vmware.api [None req-be6d1b16-9fec-4200-8c0e-8930ce011ab6 tempest-ServerMetadataTestJSON-29351844 tempest-ServerMetadataTestJSON-29351844-project-member] Task: {'id': task-3468037, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2136.996351] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2137.319797] env[67899]: DEBUG oslo_vmware.exceptions [None req-be6d1b16-9fec-4200-8c0e-8930ce011ab6 tempest-ServerMetadataTestJSON-29351844 tempest-ServerMetadataTestJSON-29351844-project-member] Fault InvalidArgument not matched. 
{{(pid=67899) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2137.320126] env[67899]: DEBUG oslo_concurrency.lockutils [None req-be6d1b16-9fec-4200-8c0e-8930ce011ab6 tempest-ServerMetadataTestJSON-29351844 tempest-ServerMetadataTestJSON-29351844-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2137.320700] env[67899]: ERROR nova.compute.manager [None req-be6d1b16-9fec-4200-8c0e-8930ce011ab6 tempest-ServerMetadataTestJSON-29351844 tempest-ServerMetadataTestJSON-29351844-project-member] [instance: e08f620d-63a0-45cb-99c6-d9d95c938b38] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2137.320700] env[67899]: Faults: ['InvalidArgument'] [ 2137.320700] env[67899]: ERROR nova.compute.manager [instance: e08f620d-63a0-45cb-99c6-d9d95c938b38] Traceback (most recent call last): [ 2137.320700] env[67899]: ERROR nova.compute.manager [instance: e08f620d-63a0-45cb-99c6-d9d95c938b38] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 2137.320700] env[67899]: ERROR nova.compute.manager [instance: e08f620d-63a0-45cb-99c6-d9d95c938b38] yield resources [ 2137.320700] env[67899]: ERROR nova.compute.manager [instance: e08f620d-63a0-45cb-99c6-d9d95c938b38] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2137.320700] env[67899]: ERROR nova.compute.manager [instance: e08f620d-63a0-45cb-99c6-d9d95c938b38] self.driver.spawn(context, instance, image_meta, [ 2137.320700] env[67899]: ERROR nova.compute.manager [instance: e08f620d-63a0-45cb-99c6-d9d95c938b38] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2137.320700] env[67899]: ERROR nova.compute.manager [instance: e08f620d-63a0-45cb-99c6-d9d95c938b38] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2137.320700] env[67899]: ERROR nova.compute.manager [instance: e08f620d-63a0-45cb-99c6-d9d95c938b38] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2137.320700] env[67899]: ERROR nova.compute.manager [instance: e08f620d-63a0-45cb-99c6-d9d95c938b38] self._fetch_image_if_missing(context, vi) [ 2137.320700] env[67899]: ERROR nova.compute.manager [instance: e08f620d-63a0-45cb-99c6-d9d95c938b38] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2137.320700] env[67899]: ERROR nova.compute.manager [instance: e08f620d-63a0-45cb-99c6-d9d95c938b38] image_cache(vi, tmp_image_ds_loc) [ 2137.320700] env[67899]: ERROR nova.compute.manager [instance: e08f620d-63a0-45cb-99c6-d9d95c938b38] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2137.320700] env[67899]: ERROR nova.compute.manager [instance: e08f620d-63a0-45cb-99c6-d9d95c938b38] vm_util.copy_virtual_disk( [ 2137.320700] env[67899]: ERROR nova.compute.manager [instance: e08f620d-63a0-45cb-99c6-d9d95c938b38] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2137.320700] env[67899]: ERROR nova.compute.manager [instance: e08f620d-63a0-45cb-99c6-d9d95c938b38] session._wait_for_task(vmdk_copy_task) [ 2137.320700] env[67899]: ERROR nova.compute.manager [instance: e08f620d-63a0-45cb-99c6-d9d95c938b38] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", 
line 157, in _wait_for_task [ 2137.320700] env[67899]: ERROR nova.compute.manager [instance: e08f620d-63a0-45cb-99c6-d9d95c938b38] return self.wait_for_task(task_ref) [ 2137.320700] env[67899]: ERROR nova.compute.manager [instance: e08f620d-63a0-45cb-99c6-d9d95c938b38] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2137.320700] env[67899]: ERROR nova.compute.manager [instance: e08f620d-63a0-45cb-99c6-d9d95c938b38] return evt.wait() [ 2137.320700] env[67899]: ERROR nova.compute.manager [instance: e08f620d-63a0-45cb-99c6-d9d95c938b38] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2137.320700] env[67899]: ERROR nova.compute.manager [instance: e08f620d-63a0-45cb-99c6-d9d95c938b38] result = hub.switch() [ 2137.320700] env[67899]: ERROR nova.compute.manager [instance: e08f620d-63a0-45cb-99c6-d9d95c938b38] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2137.320700] env[67899]: ERROR nova.compute.manager [instance: e08f620d-63a0-45cb-99c6-d9d95c938b38] return self.greenlet.switch() [ 2137.320700] env[67899]: ERROR nova.compute.manager [instance: e08f620d-63a0-45cb-99c6-d9d95c938b38] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2137.320700] env[67899]: ERROR nova.compute.manager [instance: e08f620d-63a0-45cb-99c6-d9d95c938b38] self.f(*self.args, **self.kw) [ 2137.320700] env[67899]: ERROR nova.compute.manager [instance: e08f620d-63a0-45cb-99c6-d9d95c938b38] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2137.320700] env[67899]: ERROR nova.compute.manager [instance: e08f620d-63a0-45cb-99c6-d9d95c938b38] raise exceptions.translate_fault(task_info.error) [ 2137.320700] env[67899]: ERROR nova.compute.manager [instance: e08f620d-63a0-45cb-99c6-d9d95c938b38] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2137.320700] env[67899]: ERROR nova.compute.manager [instance: e08f620d-63a0-45cb-99c6-d9d95c938b38] Faults: ['InvalidArgument'] [ 2137.320700] env[67899]: ERROR nova.compute.manager [instance: e08f620d-63a0-45cb-99c6-d9d95c938b38] [ 2137.321761] env[67899]: INFO nova.compute.manager [None req-be6d1b16-9fec-4200-8c0e-8930ce011ab6 tempest-ServerMetadataTestJSON-29351844 tempest-ServerMetadataTestJSON-29351844-project-member] [instance: e08f620d-63a0-45cb-99c6-d9d95c938b38] Terminating instance [ 2137.322609] env[67899]: DEBUG oslo_concurrency.lockutils [None req-a0855fe1-c9c8-4cb7-b7b5-b5ff6b519a81 tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2137.322821] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-a0855fe1-c9c8-4cb7-b7b5-b5ff6b519a81 tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2137.323091] env[67899]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-008b7474-07f5-4c27-9b58-3327633215bd {{(pid=67899) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2137.326079] env[67899]: DEBUG nova.compute.manager [None req-be6d1b16-9fec-4200-8c0e-8930ce011ab6 tempest-ServerMetadataTestJSON-29351844 tempest-ServerMetadataTestJSON-29351844-project-member] [instance: e08f620d-63a0-45cb-99c6-d9d95c938b38] Start destroying the instance on the hypervisor. {{(pid=67899) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2137.326187] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-be6d1b16-9fec-4200-8c0e-8930ce011ab6 tempest-ServerMetadataTestJSON-29351844 tempest-ServerMetadataTestJSON-29351844-project-member] [instance: e08f620d-63a0-45cb-99c6-d9d95c938b38] Destroying instance {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2137.327030] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-84add57d-277d-477b-9efa-5d39feabebd2 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2137.331052] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-a0855fe1-c9c8-4cb7-b7b5-b5ff6b519a81 tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2137.331227] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-a0855fe1-c9c8-4cb7-b7b5-b5ff6b519a81 tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67899) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2137.332278] env[67899]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-5976e35c-eda8-454e-ba36-7e0d1be16f56 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2137.337261] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-be6d1b16-9fec-4200-8c0e-8930ce011ab6 tempest-ServerMetadataTestJSON-29351844 tempest-ServerMetadataTestJSON-29351844-project-member] [instance: e08f620d-63a0-45cb-99c6-d9d95c938b38] Unregistering the VM {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2137.337261] env[67899]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-0b7fa672-47a9-4414-99be-5fbb9d8e7b08 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2137.339885] env[67899]: DEBUG oslo_vmware.api [None req-a0855fe1-c9c8-4cb7-b7b5-b5ff6b519a81 tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] Waiting for the task: (returnval){ [ 2137.339885] env[67899]: value = "session[5298859c-589e-10f3-4f70-bf47a7ea371b]522f9c19-be8b-2fed-490d-5358c7c567d7" [ 2137.339885] env[67899]: _type = "Task" [ 2137.339885] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2137.347922] env[67899]: DEBUG oslo_vmware.api [None req-a0855fe1-c9c8-4cb7-b7b5-b5ff6b519a81 tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] Task: {'id': session[5298859c-589e-10f3-4f70-bf47a7ea371b]522f9c19-be8b-2fed-490d-5358c7c567d7, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2137.409628] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-be6d1b16-9fec-4200-8c0e-8930ce011ab6 tempest-ServerMetadataTestJSON-29351844 tempest-ServerMetadataTestJSON-29351844-project-member] [instance: e08f620d-63a0-45cb-99c6-d9d95c938b38] Unregistered the VM {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2137.409936] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-be6d1b16-9fec-4200-8c0e-8930ce011ab6 tempest-ServerMetadataTestJSON-29351844 tempest-ServerMetadataTestJSON-29351844-project-member] [instance: e08f620d-63a0-45cb-99c6-d9d95c938b38] Deleting contents of the VM from datastore datastore1 {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2137.410204] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-be6d1b16-9fec-4200-8c0e-8930ce011ab6 tempest-ServerMetadataTestJSON-29351844 tempest-ServerMetadataTestJSON-29351844-project-member] Deleting the datastore file [datastore1] e08f620d-63a0-45cb-99c6-d9d95c938b38 {{(pid=67899) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2137.410564] env[67899]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-1989edac-83c2-4166-b882-07d78099153b {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2137.417857] env[67899]: DEBUG oslo_vmware.api [None req-be6d1b16-9fec-4200-8c0e-8930ce011ab6 tempest-ServerMetadataTestJSON-29351844 tempest-ServerMetadataTestJSON-29351844-project-member] Waiting for the task: (returnval){ [ 2137.417857] env[67899]: value = "task-3468039" [ 2137.417857] env[67899]: _type = "Task" [ 2137.417857] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2137.425896] env[67899]: DEBUG oslo_vmware.api [None req-be6d1b16-9fec-4200-8c0e-8930ce011ab6 tempest-ServerMetadataTestJSON-29351844 tempest-ServerMetadataTestJSON-29351844-project-member] Task: {'id': task-3468039, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2137.854828] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-a0855fe1-c9c8-4cb7-b7b5-b5ff6b519a81 tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] [instance: 77ac61b9-48cc-4ae8-81e7-273841f7b42d] Preparing fetch location {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2137.855194] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-a0855fe1-c9c8-4cb7-b7b5-b5ff6b519a81 tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] Creating directory with path [datastore1] vmware_temp/5f9e7c63-b2c7-4b0d-a0e6-0d539d5735b4/c655a05a-4a40-4b3f-b609-3ba8116ad90f {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2137.855513] env[67899]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-0f43f7a2-40d6-49a7-aa7b-64e76907be4e {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2137.868333] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-a0855fe1-c9c8-4cb7-b7b5-b5ff6b519a81 tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] Created directory with path [datastore1] vmware_temp/5f9e7c63-b2c7-4b0d-a0e6-0d539d5735b4/c655a05a-4a40-4b3f-b609-3ba8116ad90f {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2137.868656] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-a0855fe1-c9c8-4cb7-b7b5-b5ff6b519a81 tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] [instance: 77ac61b9-48cc-4ae8-81e7-273841f7b42d] Fetch image to [datastore1] vmware_temp/5f9e7c63-b2c7-4b0d-a0e6-0d539d5735b4/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2137.868695] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-a0855fe1-c9c8-4cb7-b7b5-b5ff6b519a81 tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] [instance: 77ac61b9-48cc-4ae8-81e7-273841f7b42d] Downloading image file data c655a05a-4a40-4b3f-b609-3ba8116ad90f to [datastore1] vmware_temp/5f9e7c63-b2c7-4b0d-a0e6-0d539d5735b4/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk on the data store datastore1 {{(pid=67899) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2137.869421] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9220aa65-1177-4a76-80b3-58edf9874600 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2137.875881] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fcd72d3d-bd51-4312-a702-866fd5519511 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2137.884705] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c51a362f-a66e-4574-8b6a-a7b1e3ecc7af {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2137.915611] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1628d280-7e7d-42e6-ab01-9bcd9590d3c2 {{(pid=67899) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2137.927018] env[67899]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-1a0ebc75-9a24-4e97-ab3c-7cadc16f3d4e {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2137.928837] env[67899]: DEBUG oslo_vmware.api [None req-be6d1b16-9fec-4200-8c0e-8930ce011ab6 tempest-ServerMetadataTestJSON-29351844 tempest-ServerMetadataTestJSON-29351844-project-member] Task: {'id': task-3468039, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.082029} completed successfully. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2137.929108] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-be6d1b16-9fec-4200-8c0e-8930ce011ab6 tempest-ServerMetadataTestJSON-29351844 tempest-ServerMetadataTestJSON-29351844-project-member] Deleted the datastore file {{(pid=67899) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2137.929287] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-be6d1b16-9fec-4200-8c0e-8930ce011ab6 tempest-ServerMetadataTestJSON-29351844 tempest-ServerMetadataTestJSON-29351844-project-member] [instance: e08f620d-63a0-45cb-99c6-d9d95c938b38] Deleted contents of the VM from datastore datastore1 {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2137.929460] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-be6d1b16-9fec-4200-8c0e-8930ce011ab6 tempest-ServerMetadataTestJSON-29351844 tempest-ServerMetadataTestJSON-29351844-project-member] [instance: e08f620d-63a0-45cb-99c6-d9d95c938b38] Instance destroyed {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2137.929626] env[67899]: INFO nova.compute.manager [None req-be6d1b16-9fec-4200-8c0e-8930ce011ab6 tempest-ServerMetadataTestJSON-29351844 tempest-ServerMetadataTestJSON-29351844-project-member] [instance: e08f620d-63a0-45cb-99c6-d9d95c938b38] Took 0.60 seconds to destroy the instance on the hypervisor. 
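The Task: {'id': task-..., 'duration_secs': ...} lines show oslo.vmware's polling contract: wait_for_task() repeatedly reads the task's info, returns it once the state is success (at which point duration_secs is logged), and raises a translated fault on error, as in the tracebacks above. A self-contained sketch of that loop with illustrative names, not the oslo.vmware implementation:

    import time

    def wait_for_task(get_task_info, interval=0.5):
        while True:
            info = get_task_info()
            if info['state'] == 'success':
                return info          # logged as "completed successfully"
            if info['state'] == 'error':
                raise RuntimeError(info['error'])  # translate_fault() in oslo.vmware
            time.sleep(interval)

    # Fake task that completes on the second poll.
    states = iter([{'state': 'running'},
                   {'state': 'success', 'duration_secs': 0.082}])
    print(wait_for_task(lambda: next(states), interval=0.01))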
[ 2137.931911] env[67899]: DEBUG nova.compute.claims [None req-be6d1b16-9fec-4200-8c0e-8930ce011ab6 tempest-ServerMetadataTestJSON-29351844 tempest-ServerMetadataTestJSON-29351844-project-member] [instance: e08f620d-63a0-45cb-99c6-d9d95c938b38] Aborting claim: {{(pid=67899) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 2137.932097] env[67899]: DEBUG oslo_concurrency.lockutils [None req-be6d1b16-9fec-4200-8c0e-8930ce011ab6 tempest-ServerMetadataTestJSON-29351844 tempest-ServerMetadataTestJSON-29351844-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2137.932325] env[67899]: DEBUG oslo_concurrency.lockutils [None req-be6d1b16-9fec-4200-8c0e-8930ce011ab6 tempest-ServerMetadataTestJSON-29351844 tempest-ServerMetadataTestJSON-29351844-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2137.950898] env[67899]: DEBUG nova.virt.vmwareapi.images [None req-a0855fe1-c9c8-4cb7-b7b5-b5ff6b519a81 tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] [instance: 77ac61b9-48cc-4ae8-81e7-273841f7b42d] Downloading image file data c655a05a-4a40-4b3f-b609-3ba8116ad90f to the data store datastore1 {{(pid=67899) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2138.007894] env[67899]: DEBUG oslo_vmware.rw_handles [None req-a0855fe1-c9c8-4cb7-b7b5-b5ff6b519a81 tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/5f9e7c63-b2c7-4b0d-a0e6-0d539d5735b4/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67899) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2138.066801] env[67899]: DEBUG oslo_vmware.rw_handles [None req-a0855fe1-c9c8-4cb7-b7b5-b5ff6b519a81 tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] Completed reading data from the image iterator. {{(pid=67899) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2138.067037] env[67899]: DEBUG oslo_vmware.rw_handles [None req-a0855fe1-c9c8-4cb7-b7b5-b5ff6b519a81 tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/5f9e7c63-b2c7-4b0d-a0e6-0d539d5735b4/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67899) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2138.143808] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-881cc23b-8308-4920-8032-145222b72ba3 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2138.151316] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-41839c68-53cb-48cb-bfe1-05e6ad7ef3db {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2138.180414] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-62cdbd99-2137-409e-8853-9cd691bba1da {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2138.187167] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fea0af5d-e0fa-4c23-a086-7c12f3fc2fe0 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2138.199830] env[67899]: DEBUG nova.compute.provider_tree [None req-be6d1b16-9fec-4200-8c0e-8930ce011ab6 tempest-ServerMetadataTestJSON-29351844 tempest-ServerMetadataTestJSON-29351844-project-member] Inventory has not changed in ProviderTree for provider: fffa0b42-f65d-4394-a98c-0df038b9ed4b {{(pid=67899) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2138.208104] env[67899]: DEBUG nova.scheduler.client.report [None req-be6d1b16-9fec-4200-8c0e-8930ce011ab6 tempest-ServerMetadataTestJSON-29351844 tempest-ServerMetadataTestJSON-29351844-project-member] Inventory has not changed for provider fffa0b42-f65d-4394-a98c-0df038b9ed4b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67899) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2138.221539] env[67899]: DEBUG oslo_concurrency.lockutils [None req-be6d1b16-9fec-4200-8c0e-8930ce011ab6 tempest-ServerMetadataTestJSON-29351844 tempest-ServerMetadataTestJSON-29351844-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.289s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2138.222095] env[67899]: ERROR nova.compute.manager [None req-be6d1b16-9fec-4200-8c0e-8930ce011ab6 tempest-ServerMetadataTestJSON-29351844 tempest-ServerMetadataTestJSON-29351844-project-member] [instance: e08f620d-63a0-45cb-99c6-d9d95c938b38] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2138.222095] env[67899]: Faults: ['InvalidArgument'] [ 2138.222095] env[67899]: ERROR nova.compute.manager [instance: e08f620d-63a0-45cb-99c6-d9d95c938b38] Traceback (most recent call last): [ 2138.222095] env[67899]: ERROR nova.compute.manager [instance: e08f620d-63a0-45cb-99c6-d9d95c938b38] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2138.222095] env[67899]: ERROR 
nova.compute.manager [instance: e08f620d-63a0-45cb-99c6-d9d95c938b38] self.driver.spawn(context, instance, image_meta, [ 2138.222095] env[67899]: ERROR nova.compute.manager [instance: e08f620d-63a0-45cb-99c6-d9d95c938b38] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2138.222095] env[67899]: ERROR nova.compute.manager [instance: e08f620d-63a0-45cb-99c6-d9d95c938b38] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2138.222095] env[67899]: ERROR nova.compute.manager [instance: e08f620d-63a0-45cb-99c6-d9d95c938b38] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2138.222095] env[67899]: ERROR nova.compute.manager [instance: e08f620d-63a0-45cb-99c6-d9d95c938b38] self._fetch_image_if_missing(context, vi) [ 2138.222095] env[67899]: ERROR nova.compute.manager [instance: e08f620d-63a0-45cb-99c6-d9d95c938b38] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2138.222095] env[67899]: ERROR nova.compute.manager [instance: e08f620d-63a0-45cb-99c6-d9d95c938b38] image_cache(vi, tmp_image_ds_loc) [ 2138.222095] env[67899]: ERROR nova.compute.manager [instance: e08f620d-63a0-45cb-99c6-d9d95c938b38] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2138.222095] env[67899]: ERROR nova.compute.manager [instance: e08f620d-63a0-45cb-99c6-d9d95c938b38] vm_util.copy_virtual_disk( [ 2138.222095] env[67899]: ERROR nova.compute.manager [instance: e08f620d-63a0-45cb-99c6-d9d95c938b38] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2138.222095] env[67899]: ERROR nova.compute.manager [instance: e08f620d-63a0-45cb-99c6-d9d95c938b38] session._wait_for_task(vmdk_copy_task) [ 2138.222095] env[67899]: ERROR nova.compute.manager [instance: e08f620d-63a0-45cb-99c6-d9d95c938b38] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2138.222095] env[67899]: ERROR nova.compute.manager [instance: e08f620d-63a0-45cb-99c6-d9d95c938b38] return self.wait_for_task(task_ref) [ 2138.222095] env[67899]: ERROR nova.compute.manager [instance: e08f620d-63a0-45cb-99c6-d9d95c938b38] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2138.222095] env[67899]: ERROR nova.compute.manager [instance: e08f620d-63a0-45cb-99c6-d9d95c938b38] return evt.wait() [ 2138.222095] env[67899]: ERROR nova.compute.manager [instance: e08f620d-63a0-45cb-99c6-d9d95c938b38] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2138.222095] env[67899]: ERROR nova.compute.manager [instance: e08f620d-63a0-45cb-99c6-d9d95c938b38] result = hub.switch() [ 2138.222095] env[67899]: ERROR nova.compute.manager [instance: e08f620d-63a0-45cb-99c6-d9d95c938b38] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2138.222095] env[67899]: ERROR nova.compute.manager [instance: e08f620d-63a0-45cb-99c6-d9d95c938b38] return self.greenlet.switch() [ 2138.222095] env[67899]: ERROR nova.compute.manager [instance: e08f620d-63a0-45cb-99c6-d9d95c938b38] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2138.222095] env[67899]: ERROR nova.compute.manager [instance: e08f620d-63a0-45cb-99c6-d9d95c938b38] self.f(*self.args, **self.kw) [ 2138.222095] env[67899]: ERROR nova.compute.manager [instance: e08f620d-63a0-45cb-99c6-d9d95c938b38] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2138.222095] env[67899]: ERROR nova.compute.manager [instance: e08f620d-63a0-45cb-99c6-d9d95c938b38] raise exceptions.translate_fault(task_info.error) [ 2138.222095] env[67899]: ERROR nova.compute.manager [instance: e08f620d-63a0-45cb-99c6-d9d95c938b38] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2138.222095] env[67899]: ERROR nova.compute.manager [instance: e08f620d-63a0-45cb-99c6-d9d95c938b38] Faults: ['InvalidArgument'] [ 2138.222095] env[67899]: ERROR nova.compute.manager [instance: e08f620d-63a0-45cb-99c6-d9d95c938b38] [ 2138.223125] env[67899]: DEBUG nova.compute.utils [None req-be6d1b16-9fec-4200-8c0e-8930ce011ab6 tempest-ServerMetadataTestJSON-29351844 tempest-ServerMetadataTestJSON-29351844-project-member] [instance: e08f620d-63a0-45cb-99c6-d9d95c938b38] VimFaultException {{(pid=67899) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2138.224159] env[67899]: DEBUG nova.compute.manager [None req-be6d1b16-9fec-4200-8c0e-8930ce011ab6 tempest-ServerMetadataTestJSON-29351844 tempest-ServerMetadataTestJSON-29351844-project-member] [instance: e08f620d-63a0-45cb-99c6-d9d95c938b38] Build of instance e08f620d-63a0-45cb-99c6-d9d95c938b38 was re-scheduled: A specified parameter was not correct: fileType [ 2138.224159] env[67899]: Faults: ['InvalidArgument'] {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 2138.224548] env[67899]: DEBUG nova.compute.manager [None req-be6d1b16-9fec-4200-8c0e-8930ce011ab6 tempest-ServerMetadataTestJSON-29351844 tempest-ServerMetadataTestJSON-29351844-project-member] [instance: e08f620d-63a0-45cb-99c6-d9d95c938b38] Unplugging VIFs for instance {{(pid=67899) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 2138.224725] env[67899]: DEBUG nova.compute.manager [None req-be6d1b16-9fec-4200-8c0e-8930ce011ab6 tempest-ServerMetadataTestJSON-29351844 tempest-ServerMetadataTestJSON-29351844-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67899) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 2138.224897] env[67899]: DEBUG nova.compute.manager [None req-be6d1b16-9fec-4200-8c0e-8930ce011ab6 tempest-ServerMetadataTestJSON-29351844 tempest-ServerMetadataTestJSON-29351844-project-member] [instance: e08f620d-63a0-45cb-99c6-d9d95c938b38] Deallocating network for instance {{(pid=67899) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2138.225092] env[67899]: DEBUG nova.network.neutron [None req-be6d1b16-9fec-4200-8c0e-8930ce011ab6 tempest-ServerMetadataTestJSON-29351844 tempest-ServerMetadataTestJSON-29351844-project-member] [instance: e08f620d-63a0-45cb-99c6-d9d95c938b38] deallocate_for_instance() {{(pid=67899) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2138.532419] env[67899]: DEBUG nova.network.neutron [None req-be6d1b16-9fec-4200-8c0e-8930ce011ab6 tempest-ServerMetadataTestJSON-29351844 tempest-ServerMetadataTestJSON-29351844-project-member] [instance: e08f620d-63a0-45cb-99c6-d9d95c938b38] Updating instance_info_cache with network_info: [] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2138.550666] env[67899]: INFO nova.compute.manager [None req-be6d1b16-9fec-4200-8c0e-8930ce011ab6 tempest-ServerMetadataTestJSON-29351844 tempest-ServerMetadataTestJSON-29351844-project-member] [instance: e08f620d-63a0-45cb-99c6-d9d95c938b38] Took 0.33 seconds to deallocate network for instance. [ 2138.660067] env[67899]: INFO nova.scheduler.client.report [None req-be6d1b16-9fec-4200-8c0e-8930ce011ab6 tempest-ServerMetadataTestJSON-29351844 tempest-ServerMetadataTestJSON-29351844-project-member] Deleted allocations for instance e08f620d-63a0-45cb-99c6-d9d95c938b38 [ 2138.681158] env[67899]: DEBUG oslo_concurrency.lockutils [None req-be6d1b16-9fec-4200-8c0e-8930ce011ab6 tempest-ServerMetadataTestJSON-29351844 tempest-ServerMetadataTestJSON-29351844-project-member] Lock "e08f620d-63a0-45cb-99c6-d9d95c938b38" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 518.449s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2138.681158] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "e08f620d-63a0-45cb-99c6-d9d95c938b38" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 362.668s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2138.681158] env[67899]: INFO nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: e08f620d-63a0-45cb-99c6-d9d95c938b38] During sync_power_state the instance has a pending task (spawning). Skip. 
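[annotation] The traceback above bottoms out in oslo_vmware/api.py's _poll_task, which raises exceptions.translate_fault(task_info.error) once vCenter marks the CopyVirtualDisk_Task as failed; the VimFaultException (message "A specified parameter was not correct: fileType", fault list ['InvalidArgument']) then propagates back through session._wait_for_task() into the compute manager, which re-schedules the build. A minimal sketch of that polling loop, using a stand-in exception class rather than the real oslo.vmware types:

    import time

    class VimFaultException(Exception):
        # stand-in for oslo_vmware.exceptions.VimFaultException
        def __init__(self, fault_list, message):
            super().__init__(message)
            self.fault_list = fault_list

    def wait_for_task(get_task_info, poll_interval=0.5):
        # get_task_info is a hypothetical callable returning an object with
        # .state in ('queued', 'running', 'success', 'error') and .error
        while True:
            task_info = get_task_info()
            if task_info.state in ('queued', 'running'):
                time.sleep(poll_interval)   # the "progress is 0%." lines
                continue
            if task_info.state == 'success':
                return task_info
            # "Fault InvalidArgument not matched." in the log means no
            # specific fault subclass applies, so the generic exception
            # carries the raw fault list instead
            raise VimFaultException(['InvalidArgument'], str(task_info.error))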
[ 2138.681158] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "e08f620d-63a0-45cb-99c6-d9d95c938b38" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2138.681158] env[67899]: DEBUG oslo_concurrency.lockutils [None req-35d58b2d-3e71-44d3-ba5e-8cd3f7388243 tempest-ServerMetadataTestJSON-29351844 tempest-ServerMetadataTestJSON-29351844-project-member] Lock "e08f620d-63a0-45cb-99c6-d9d95c938b38" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 322.780s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2138.681158] env[67899]: DEBUG oslo_concurrency.lockutils [None req-35d58b2d-3e71-44d3-ba5e-8cd3f7388243 tempest-ServerMetadataTestJSON-29351844 tempest-ServerMetadataTestJSON-29351844-project-member] Acquiring lock "e08f620d-63a0-45cb-99c6-d9d95c938b38-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2138.681158] env[67899]: DEBUG oslo_concurrency.lockutils [None req-35d58b2d-3e71-44d3-ba5e-8cd3f7388243 tempest-ServerMetadataTestJSON-29351844 tempest-ServerMetadataTestJSON-29351844-project-member] Lock "e08f620d-63a0-45cb-99c6-d9d95c938b38-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2138.681158] env[67899]: DEBUG oslo_concurrency.lockutils [None req-35d58b2d-3e71-44d3-ba5e-8cd3f7388243 tempest-ServerMetadataTestJSON-29351844 tempest-ServerMetadataTestJSON-29351844-project-member] Lock "e08f620d-63a0-45cb-99c6-d9d95c938b38-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2138.683807] env[67899]: INFO nova.compute.manager [None req-35d58b2d-3e71-44d3-ba5e-8cd3f7388243 tempest-ServerMetadataTestJSON-29351844 tempest-ServerMetadataTestJSON-29351844-project-member] [instance: e08f620d-63a0-45cb-99c6-d9d95c938b38] Terminating instance [ 2138.686569] env[67899]: DEBUG nova.compute.manager [None req-35d58b2d-3e71-44d3-ba5e-8cd3f7388243 tempest-ServerMetadataTestJSON-29351844 tempest-ServerMetadataTestJSON-29351844-project-member] [instance: e08f620d-63a0-45cb-99c6-d9d95c938b38] Start destroying the instance on the hypervisor. 
{{(pid=67899) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2138.686936] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-35d58b2d-3e71-44d3-ba5e-8cd3f7388243 tempest-ServerMetadataTestJSON-29351844 tempest-ServerMetadataTestJSON-29351844-project-member] [instance: e08f620d-63a0-45cb-99c6-d9d95c938b38] Destroying instance {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2138.687215] env[67899]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-49788df3-dcbd-4374-b2a9-501f0679127d {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2138.698800] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1de7469b-ab99-43e9-a29b-29219e8235b4 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2138.728033] env[67899]: WARNING nova.virt.vmwareapi.vmops [None req-35d58b2d-3e71-44d3-ba5e-8cd3f7388243 tempest-ServerMetadataTestJSON-29351844 tempest-ServerMetadataTestJSON-29351844-project-member] [instance: e08f620d-63a0-45cb-99c6-d9d95c938b38] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance e08f620d-63a0-45cb-99c6-d9d95c938b38 could not be found. [ 2138.728263] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-35d58b2d-3e71-44d3-ba5e-8cd3f7388243 tempest-ServerMetadataTestJSON-29351844 tempest-ServerMetadataTestJSON-29351844-project-member] [instance: e08f620d-63a0-45cb-99c6-d9d95c938b38] Instance destroyed {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2138.728442] env[67899]: INFO nova.compute.manager [None req-35d58b2d-3e71-44d3-ba5e-8cd3f7388243 tempest-ServerMetadataTestJSON-29351844 tempest-ServerMetadataTestJSON-29351844-project-member] [instance: e08f620d-63a0-45cb-99c6-d9d95c938b38] Took 0.04 seconds to destroy the instance on the hypervisor. [ 2138.728689] env[67899]: DEBUG oslo.service.loopingcall [None req-35d58b2d-3e71-44d3-ba5e-8cd3f7388243 tempest-ServerMetadataTestJSON-29351844 tempest-ServerMetadataTestJSON-29351844-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=67899) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2138.728964] env[67899]: DEBUG nova.compute.manager [-] [instance: e08f620d-63a0-45cb-99c6-d9d95c938b38] Deallocating network for instance {{(pid=67899) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2138.729094] env[67899]: DEBUG nova.network.neutron [-] [instance: e08f620d-63a0-45cb-99c6-d9d95c938b38] deallocate_for_instance() {{(pid=67899) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2138.757283] env[67899]: DEBUG nova.network.neutron [-] [instance: e08f620d-63a0-45cb-99c6-d9d95c938b38] Updating instance_info_cache with network_info: [] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2138.765141] env[67899]: INFO nova.compute.manager [-] [instance: e08f620d-63a0-45cb-99c6-d9d95c938b38] Took 0.04 seconds to deallocate network for instance. 
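[annotation] The terminate path above is deliberately tolerant of a VM that never materialised: SearchIndex.FindAllByUuid returns nothing, vmops logs "Instance does not exist on backend" as a WARNING, and the destroy still counts as successful so network deallocation and lock release proceed. A rough sketch of that pattern under stated assumptions; vm_lookup and unregister_vm are illustrative callables, not Nova's actual API:

    import logging

    LOG = logging.getLogger(__name__)

    class InstanceNotFound(Exception):
        pass

    def destroy(vm_lookup, unregister_vm, uuid):
        try:
            vm_ref = vm_lookup(uuid)      # SearchIndex.FindAllByUuid above
        except InstanceNotFound:
            LOG.warning('Instance does not exist on backend: %s', uuid)
            return                        # still treated as destroyed
        # otherwise: VirtualMachine.UnregisterVM, then the datastore
        # files are deleted (as in the task-3468042 entries below)
        unregister_vm(vm_ref)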
[ 2138.856580] env[67899]: DEBUG oslo_concurrency.lockutils [None req-35d58b2d-3e71-44d3-ba5e-8cd3f7388243 tempest-ServerMetadataTestJSON-29351844 tempest-ServerMetadataTestJSON-29351844-project-member] Lock "e08f620d-63a0-45cb-99c6-d9d95c938b38" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.176s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2139.995791] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2139.996046] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Starting heal instance info cache {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 2139.996113] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Rebuilding the list of instances to heal {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 2140.015219] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 77ac61b9-48cc-4ae8-81e7-273841f7b42d] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2140.015388] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: a9ef96da-fcfd-4fb5-bbb1-5178111a8a62] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2140.015523] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: c4fe8b3e-cee1-401b-a26f-907a8de95eba] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2140.015667] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: a993c6a9-140f-430d-a77e-98c2567bf7af] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2140.015814] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: c17d88cf-69ba-43e9-a672-24503c65e9f2] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2140.015940] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 483824d1-4994-436a-ba16-12524684405c] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2140.016077] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: cc1164c7-82bb-4d80-89ad-e9ba5658d9c8] Skipping network cache update for instance because it is Building. 
{{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2140.016200] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: fc98bb10-8fe8-4203-80b8-9885b2c302c1] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2140.016322] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Didn't find any instances for network info cache update. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 2140.017163] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2140.017346] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2140.017505] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager.update_available_resource {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2140.027450] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2140.027662] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2140.027830] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2140.027976] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67899) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 2140.029064] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-97033b14-d769-4c2e-9e08-38cd9f5699a8 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2140.038159] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0c503ec4-7aa6-4fe6-91f6-f61ace17d892 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2140.052927] env[67899]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ab60a729-cdc0-4e22-9791-e25d826ea146 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2140.059293] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c42c0dbf-abbf-407f-a15e-1918ef7064bd {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2140.087490] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180914MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=67899) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 2140.087630] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2140.087818] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2140.150665] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 77ac61b9-48cc-4ae8-81e7-273841f7b42d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2140.150828] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance a9ef96da-fcfd-4fb5-bbb1-5178111a8a62 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2140.150959] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance c4fe8b3e-cee1-401b-a26f-907a8de95eba actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2140.151101] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance a993c6a9-140f-430d-a77e-98c2567bf7af actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2140.151226] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance c17d88cf-69ba-43e9-a672-24503c65e9f2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2140.151349] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 483824d1-4994-436a-ba16-12524684405c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2140.151466] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance cc1164c7-82bb-4d80-89ad-e9ba5658d9c8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2140.151584] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance fc98bb10-8fe8-4203-80b8-9885b2c302c1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2140.151764] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Total usable vcpus: 48, total allocated vcpus: 8 {{(pid=67899) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 2140.151940] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1536MB phys_disk=200GB used_disk=8GB total_vcpus=48 used_vcpus=8 pci_stats=[] {{(pid=67899) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 2140.244080] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9e3414f2-7847-4a77-95ba-06945a475c25 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2140.251806] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-84cb775f-8f2e-4b87-a931-5b0e22cd903f {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2140.282535] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8662c4ec-7827-4859-be28-7d3ca7870288 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2140.289289] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a4b4d5ce-2896-4d19-9a50-1a0ed0795f98 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2140.302220] env[67899]: DEBUG nova.compute.provider_tree [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Inventory has not changed in ProviderTree for provider: fffa0b42-f65d-4394-a98c-0df038b9ed4b {{(pid=67899) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2140.310226] env[67899]: DEBUG nova.scheduler.client.report [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Inventory has not changed for provider 
fffa0b42-f65d-4394-a98c-0df038b9ed4b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67899) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2140.323730] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67899) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 2140.323904] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.236s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2141.303055] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2142.997124] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2142.997459] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=67899) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 2145.991664] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2187.394877] env[67899]: WARNING oslo_vmware.rw_handles [None req-a0855fe1-c9c8-4cb7-b7b5-b5ff6b519a81 tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2187.394877] env[67899]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2187.394877] env[67899]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2187.394877] env[67899]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2187.394877] env[67899]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2187.394877] env[67899]: ERROR oslo_vmware.rw_handles response.begin() [ 2187.394877] env[67899]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2187.394877] env[67899]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2187.394877] env[67899]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2187.394877] env[67899]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2187.394877] env[67899]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2187.394877] env[67899]: ERROR oslo_vmware.rw_handles [ 2187.395635] env[67899]: DEBUG nova.virt.vmwareapi.images [None req-a0855fe1-c9c8-4cb7-b7b5-b5ff6b519a81 tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] [instance: 77ac61b9-48cc-4ae8-81e7-273841f7b42d] Downloaded image file data c655a05a-4a40-4b3f-b609-3ba8116ad90f to vmware_temp/5f9e7c63-b2c7-4b0d-a0e6-0d539d5735b4/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk on the data store datastore1 {{(pid=67899) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2187.397700] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-a0855fe1-c9c8-4cb7-b7b5-b5ff6b519a81 tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] [instance: 77ac61b9-48cc-4ae8-81e7-273841f7b42d] Caching image {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2187.397979] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [None req-a0855fe1-c9c8-4cb7-b7b5-b5ff6b519a81 tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] Copying Virtual Disk [datastore1] vmware_temp/5f9e7c63-b2c7-4b0d-a0e6-0d539d5735b4/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk to [datastore1] vmware_temp/5f9e7c63-b2c7-4b0d-a0e6-0d539d5735b4/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk {{(pid=67899) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2187.398313] env[67899]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-6ef10f86-6c74-464c-8467-5a7590c1a25d {{(pid=67899) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2187.407911] env[67899]: DEBUG oslo_vmware.api [None req-a0855fe1-c9c8-4cb7-b7b5-b5ff6b519a81 tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] Waiting for the task: (returnval){
[ 2187.407911] env[67899]: value = "task-3468040"
[ 2187.407911] env[67899]: _type = "Task"
[ 2187.407911] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 2187.415480] env[67899]: DEBUG oslo_vmware.api [None req-a0855fe1-c9c8-4cb7-b7b5-b5ff6b519a81 tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] Task: {'id': task-3468040, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 2187.918778] env[67899]: DEBUG oslo_vmware.exceptions [None req-a0855fe1-c9c8-4cb7-b7b5-b5ff6b519a81 tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] Fault InvalidArgument not matched. {{(pid=67899) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}}
[ 2187.919093] env[67899]: DEBUG oslo_concurrency.lockutils [None req-a0855fe1-c9c8-4cb7-b7b5-b5ff6b519a81 tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 2187.919661] env[67899]: ERROR nova.compute.manager [None req-a0855fe1-c9c8-4cb7-b7b5-b5ff6b519a81 tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] [instance: 77ac61b9-48cc-4ae8-81e7-273841f7b42d] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 2187.919661] env[67899]: Faults: ['InvalidArgument']
[ 2187.919661] env[67899]: ERROR nova.compute.manager [instance: 77ac61b9-48cc-4ae8-81e7-273841f7b42d] Traceback (most recent call last):
[ 2187.919661] env[67899]: ERROR nova.compute.manager [instance: 77ac61b9-48cc-4ae8-81e7-273841f7b42d] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources
[ 2187.919661] env[67899]: ERROR nova.compute.manager [instance: 77ac61b9-48cc-4ae8-81e7-273841f7b42d] yield resources
[ 2187.919661] env[67899]: ERROR nova.compute.manager [instance: 77ac61b9-48cc-4ae8-81e7-273841f7b42d] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 2187.919661] env[67899]: ERROR nova.compute.manager [instance: 77ac61b9-48cc-4ae8-81e7-273841f7b42d] self.driver.spawn(context, instance, image_meta,
[ 2187.919661] env[67899]: ERROR nova.compute.manager [instance: 77ac61b9-48cc-4ae8-81e7-273841f7b42d] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 2187.919661] env[67899]: ERROR nova.compute.manager [instance: 77ac61b9-48cc-4ae8-81e7-273841f7b42d] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 2187.919661] env[67899]: ERROR nova.compute.manager [instance: 77ac61b9-48cc-4ae8-81e7-273841f7b42d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 2187.919661] env[67899]: ERROR nova.compute.manager [instance: 77ac61b9-48cc-4ae8-81e7-273841f7b42d] self._fetch_image_if_missing(context, vi)
[ 2187.919661] env[67899]: ERROR nova.compute.manager [instance: 77ac61b9-48cc-4ae8-81e7-273841f7b42d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 2187.919661] env[67899]: ERROR nova.compute.manager [instance: 77ac61b9-48cc-4ae8-81e7-273841f7b42d] image_cache(vi, tmp_image_ds_loc)
[ 2187.919661] env[67899]: ERROR nova.compute.manager [instance: 77ac61b9-48cc-4ae8-81e7-273841f7b42d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 2187.919661] env[67899]: ERROR nova.compute.manager [instance: 77ac61b9-48cc-4ae8-81e7-273841f7b42d] vm_util.copy_virtual_disk(
[ 2187.919661] env[67899]: ERROR nova.compute.manager [instance: 77ac61b9-48cc-4ae8-81e7-273841f7b42d] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 2187.919661] env[67899]: ERROR nova.compute.manager [instance: 77ac61b9-48cc-4ae8-81e7-273841f7b42d] session._wait_for_task(vmdk_copy_task)
[ 2187.919661] env[67899]: ERROR nova.compute.manager [instance: 77ac61b9-48cc-4ae8-81e7-273841f7b42d] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 2187.919661] env[67899]: ERROR nova.compute.manager [instance: 77ac61b9-48cc-4ae8-81e7-273841f7b42d] return self.wait_for_task(task_ref)
[ 2187.919661] env[67899]: ERROR nova.compute.manager [instance: 77ac61b9-48cc-4ae8-81e7-273841f7b42d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 2187.919661] env[67899]: ERROR nova.compute.manager [instance: 77ac61b9-48cc-4ae8-81e7-273841f7b42d] return evt.wait()
[ 2187.919661] env[67899]: ERROR nova.compute.manager [instance: 77ac61b9-48cc-4ae8-81e7-273841f7b42d] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 2187.919661] env[67899]: ERROR nova.compute.manager [instance: 77ac61b9-48cc-4ae8-81e7-273841f7b42d] result = hub.switch()
[ 2187.919661] env[67899]: ERROR nova.compute.manager [instance: 77ac61b9-48cc-4ae8-81e7-273841f7b42d] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 2187.919661] env[67899]: ERROR nova.compute.manager [instance: 77ac61b9-48cc-4ae8-81e7-273841f7b42d] return self.greenlet.switch()
[ 2187.919661] env[67899]: ERROR nova.compute.manager [instance: 77ac61b9-48cc-4ae8-81e7-273841f7b42d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 2187.919661] env[67899]: ERROR nova.compute.manager [instance: 77ac61b9-48cc-4ae8-81e7-273841f7b42d] self.f(*self.args, **self.kw)
[ 2187.919661] env[67899]: ERROR nova.compute.manager [instance: 77ac61b9-48cc-4ae8-81e7-273841f7b42d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 2187.919661] env[67899]: ERROR nova.compute.manager [instance: 77ac61b9-48cc-4ae8-81e7-273841f7b42d] raise exceptions.translate_fault(task_info.error)
[ 2187.919661] env[67899]: ERROR nova.compute.manager [instance: 77ac61b9-48cc-4ae8-81e7-273841f7b42d] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 2187.919661] env[67899]: ERROR nova.compute.manager [instance: 77ac61b9-48cc-4ae8-81e7-273841f7b42d] Faults: ['InvalidArgument']
[ 2187.919661] env[67899]: ERROR nova.compute.manager [instance: 77ac61b9-48cc-4ae8-81e7-273841f7b42d]
[ 2187.920575] env[67899]: INFO nova.compute.manager [None req-a0855fe1-c9c8-4cb7-b7b5-b5ff6b519a81 tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member]
[instance: 77ac61b9-48cc-4ae8-81e7-273841f7b42d] Terminating instance [ 2187.921512] env[67899]: DEBUG oslo_concurrency.lockutils [None req-2401ba2f-a600-4d1b-93b2-2a833ea4d05b tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2187.921750] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-2401ba2f-a600-4d1b-93b2-2a833ea4d05b tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2187.921970] env[67899]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-05c2f591-3c95-4dd8-8c8b-b6edad8af362 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2187.924206] env[67899]: DEBUG nova.compute.manager [None req-a0855fe1-c9c8-4cb7-b7b5-b5ff6b519a81 tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] [instance: 77ac61b9-48cc-4ae8-81e7-273841f7b42d] Start destroying the instance on the hypervisor. {{(pid=67899) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2187.924396] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-a0855fe1-c9c8-4cb7-b7b5-b5ff6b519a81 tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] [instance: 77ac61b9-48cc-4ae8-81e7-273841f7b42d] Destroying instance {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2187.925144] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6cde2966-1e1e-46aa-86c8-03e0b4427b78 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2187.932590] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-a0855fe1-c9c8-4cb7-b7b5-b5ff6b519a81 tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] [instance: 77ac61b9-48cc-4ae8-81e7-273841f7b42d] Unregistering the VM {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2187.933681] env[67899]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-4b12179c-0b86-489b-b57d-d48a53571ff1 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2187.935147] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-2401ba2f-a600-4d1b-93b2-2a833ea4d05b tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2187.935353] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-2401ba2f-a600-4d1b-93b2-2a833ea4d05b tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=67899) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2187.936042] env[67899]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-df489de1-0e7a-4abb-9db4-a34d826887d2 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2187.941285] env[67899]: DEBUG oslo_vmware.api [None req-2401ba2f-a600-4d1b-93b2-2a833ea4d05b tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Waiting for the task: (returnval){ [ 2187.941285] env[67899]: value = "session[5298859c-589e-10f3-4f70-bf47a7ea371b]5299c81a-55f4-2b96-1e7d-10f131fcd867" [ 2187.941285] env[67899]: _type = "Task" [ 2187.941285] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2187.949934] env[67899]: DEBUG oslo_vmware.api [None req-2401ba2f-a600-4d1b-93b2-2a833ea4d05b tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Task: {'id': session[5298859c-589e-10f3-4f70-bf47a7ea371b]5299c81a-55f4-2b96-1e7d-10f131fcd867, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2188.000507] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-a0855fe1-c9c8-4cb7-b7b5-b5ff6b519a81 tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] [instance: 77ac61b9-48cc-4ae8-81e7-273841f7b42d] Unregistered the VM {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2188.000735] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-a0855fe1-c9c8-4cb7-b7b5-b5ff6b519a81 tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] [instance: 77ac61b9-48cc-4ae8-81e7-273841f7b42d] Deleting contents of the VM from datastore datastore1 {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2188.000913] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-a0855fe1-c9c8-4cb7-b7b5-b5ff6b519a81 tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] Deleting the datastore file [datastore1] 77ac61b9-48cc-4ae8-81e7-273841f7b42d {{(pid=67899) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2188.001212] env[67899]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-ec21a96a-f2bd-40ce-9cc2-540b29538418 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2188.008397] env[67899]: DEBUG oslo_vmware.api [None req-a0855fe1-c9c8-4cb7-b7b5-b5ff6b519a81 tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] Waiting for the task: (returnval){ [ 2188.008397] env[67899]: value = "task-3468042" [ 2188.008397] env[67899]: _type = "Task" [ 2188.008397] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2188.016378] env[67899]: DEBUG oslo_vmware.api [None req-a0855fe1-c9c8-4cb7-b7b5-b5ff6b519a81 tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] Task: {'id': task-3468042, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2188.451183] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-2401ba2f-a600-4d1b-93b2-2a833ea4d05b tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: a9ef96da-fcfd-4fb5-bbb1-5178111a8a62] Preparing fetch location {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2188.451556] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-2401ba2f-a600-4d1b-93b2-2a833ea4d05b tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Creating directory with path [datastore1] vmware_temp/ef327526-1fb4-4c32-b68c-ccec9e846727/c655a05a-4a40-4b3f-b609-3ba8116ad90f {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2188.451639] env[67899]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-c68d216a-1ff8-4639-aeaf-c6d905a7f40c {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2188.463855] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-2401ba2f-a600-4d1b-93b2-2a833ea4d05b tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Created directory with path [datastore1] vmware_temp/ef327526-1fb4-4c32-b68c-ccec9e846727/c655a05a-4a40-4b3f-b609-3ba8116ad90f {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2188.464033] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-2401ba2f-a600-4d1b-93b2-2a833ea4d05b tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: a9ef96da-fcfd-4fb5-bbb1-5178111a8a62] Fetch image to [datastore1] vmware_temp/ef327526-1fb4-4c32-b68c-ccec9e846727/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2188.464206] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-2401ba2f-a600-4d1b-93b2-2a833ea4d05b tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: a9ef96da-fcfd-4fb5-bbb1-5178111a8a62] Downloading image file data c655a05a-4a40-4b3f-b609-3ba8116ad90f to [datastore1] vmware_temp/ef327526-1fb4-4c32-b68c-ccec9e846727/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk on the data store datastore1 {{(pid=67899) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2188.464887] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d5b7303f-2453-448f-a484-9da7dc213c6e {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2188.471169] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-55d2ee7b-9738-491f-a7de-57eb6f3cf581 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2188.479878] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ed2286b1-3e38-40a1-829d-e07f8eb1be59 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2188.511834] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a86303f4-3556-4202-a50d-a50b21bbdf71 {{(pid=67899) 
request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2188.518647] env[67899]: DEBUG oslo_vmware.api [None req-a0855fe1-c9c8-4cb7-b7b5-b5ff6b519a81 tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] Task: {'id': task-3468042, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.073013} completed successfully. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2188.519992] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-a0855fe1-c9c8-4cb7-b7b5-b5ff6b519a81 tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] Deleted the datastore file {{(pid=67899) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2188.520165] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-a0855fe1-c9c8-4cb7-b7b5-b5ff6b519a81 tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] [instance: 77ac61b9-48cc-4ae8-81e7-273841f7b42d] Deleted contents of the VM from datastore datastore1 {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2188.520336] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-a0855fe1-c9c8-4cb7-b7b5-b5ff6b519a81 tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] [instance: 77ac61b9-48cc-4ae8-81e7-273841f7b42d] Instance destroyed {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2188.520543] env[67899]: INFO nova.compute.manager [None req-a0855fe1-c9c8-4cb7-b7b5-b5ff6b519a81 tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] [instance: 77ac61b9-48cc-4ae8-81e7-273841f7b42d] Took 0.60 seconds to destroy the instance on the hypervisor. 
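[annotation] The fetch-and-cache sequence interleaved above follows a two-step layout on the datastore: the Glance image is first streamed to a tmp-sparse.vmdk under vmware_temp/<upload-id>/<image-id>/, and CopyVirtualDisk_Task then turns that into the flat <image-id>.vmdk to be cached; it is this copy step that keeps failing with InvalidArgument/fileType in this run. A sketch of the path composition only, with ds_path() as an illustrative helper rather than Nova's ds_util API:

    def ds_path(datastore, *parts):
        # renders "[datastore1] a/b/c" style datastore paths
        return '[%s] %s' % (datastore, '/'.join(parts))

    image_id = 'c655a05a-4a40-4b3f-b609-3ba8116ad90f'
    upload_id = 'ef327526-1fb4-4c32-b68c-ccec9e846727'

    # step 1: stream the image to a temporary sparse disk
    tmp = ds_path('datastore1', 'vmware_temp', upload_id, image_id,
                  'tmp-sparse.vmdk')
    # step 2: CopyVirtualDisk_Task converts it to the flat disk that
    # gets cached under devstack-image-cache_base
    cached = ds_path('datastore1', 'vmware_temp', upload_id, image_id,
                     image_id + '.vmdk')
    print(tmp)
    print(cached)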
[ 2188.522245] env[67899]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-73f1dea3-4251-421f-807f-ddc83aeb69f4 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2188.524058] env[67899]: DEBUG nova.compute.claims [None req-a0855fe1-c9c8-4cb7-b7b5-b5ff6b519a81 tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] [instance: 77ac61b9-48cc-4ae8-81e7-273841f7b42d] Aborting claim: {{(pid=67899) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 2188.524230] env[67899]: DEBUG oslo_concurrency.lockutils [None req-a0855fe1-c9c8-4cb7-b7b5-b5ff6b519a81 tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2188.524435] env[67899]: DEBUG oslo_concurrency.lockutils [None req-a0855fe1-c9c8-4cb7-b7b5-b5ff6b519a81 tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2188.545474] env[67899]: DEBUG nova.virt.vmwareapi.images [None req-2401ba2f-a600-4d1b-93b2-2a833ea4d05b tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: a9ef96da-fcfd-4fb5-bbb1-5178111a8a62] Downloading image file data c655a05a-4a40-4b3f-b609-3ba8116ad90f to the data store datastore1 {{(pid=67899) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2188.595985] env[67899]: DEBUG oslo_vmware.rw_handles [None req-2401ba2f-a600-4d1b-93b2-2a833ea4d05b tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/ef327526-1fb4-4c32-b68c-ccec9e846727/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67899) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2188.656802] env[67899]: DEBUG oslo_vmware.rw_handles [None req-2401ba2f-a600-4d1b-93b2-2a833ea4d05b tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Completed reading data from the image iterator. {{(pid=67899) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2188.656991] env[67899]: DEBUG oslo_vmware.rw_handles [None req-2401ba2f-a600-4d1b-93b2-2a833ea4d05b tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/ef327526-1fb4-4c32-b68c-ccec9e846727/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67899) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}}
[ 2188.725131] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9f69abc5-6715-469c-920d-82227d365adf {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2188.733423] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-269bc424-e8ec-43ae-b34a-1f3d3e78f271 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2188.764018] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b8c55989-c205-448f-8bae-1e3b32a830f9 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2188.771130] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3e3e13ed-8169-4c5c-8ce2-da7ab77ff9c8 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2188.785013] env[67899]: DEBUG nova.compute.provider_tree [None req-a0855fe1-c9c8-4cb7-b7b5-b5ff6b519a81 tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] Inventory has not changed in ProviderTree for provider: fffa0b42-f65d-4394-a98c-0df038b9ed4b {{(pid=67899) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 2188.792554] env[67899]: DEBUG nova.scheduler.client.report [None req-a0855fe1-c9c8-4cb7-b7b5-b5ff6b519a81 tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] Inventory has not changed for provider fffa0b42-f65d-4394-a98c-0df038b9ed4b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67899) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 2188.805874] env[67899]: DEBUG oslo_concurrency.lockutils [None req-a0855fe1-c9c8-4cb7-b7b5-b5ff6b519a81 tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.281s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 2188.806564] env[67899]: ERROR nova.compute.manager [None req-a0855fe1-c9c8-4cb7-b7b5-b5ff6b519a81 tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] [instance: 77ac61b9-48cc-4ae8-81e7-273841f7b42d] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 2188.806564] env[67899]: Faults: ['InvalidArgument']
[ 2188.806564] env[67899]: ERROR nova.compute.manager [instance: 77ac61b9-48cc-4ae8-81e7-273841f7b42d] Traceback (most recent call last):
[ 2188.806564] env[67899]: ERROR nova.compute.manager [instance: 77ac61b9-48cc-4ae8-81e7-273841f7b42d] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 2188.806564] env[67899]: ERROR nova.compute.manager [instance: 77ac61b9-48cc-4ae8-81e7-273841f7b42d] self.driver.spawn(context, instance, image_meta,
[ 2188.806564] env[67899]: ERROR nova.compute.manager [instance: 77ac61b9-48cc-4ae8-81e7-273841f7b42d] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 2188.806564] env[67899]: ERROR nova.compute.manager [instance: 77ac61b9-48cc-4ae8-81e7-273841f7b42d] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 2188.806564] env[67899]: ERROR nova.compute.manager [instance: 77ac61b9-48cc-4ae8-81e7-273841f7b42d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 2188.806564] env[67899]: ERROR nova.compute.manager [instance: 77ac61b9-48cc-4ae8-81e7-273841f7b42d] self._fetch_image_if_missing(context, vi)
[ 2188.806564] env[67899]: ERROR nova.compute.manager [instance: 77ac61b9-48cc-4ae8-81e7-273841f7b42d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 2188.806564] env[67899]: ERROR nova.compute.manager [instance: 77ac61b9-48cc-4ae8-81e7-273841f7b42d] image_cache(vi, tmp_image_ds_loc)
[ 2188.806564] env[67899]: ERROR nova.compute.manager [instance: 77ac61b9-48cc-4ae8-81e7-273841f7b42d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 2188.806564] env[67899]: ERROR nova.compute.manager [instance: 77ac61b9-48cc-4ae8-81e7-273841f7b42d] vm_util.copy_virtual_disk(
[ 2188.806564] env[67899]: ERROR nova.compute.manager [instance: 77ac61b9-48cc-4ae8-81e7-273841f7b42d] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 2188.806564] env[67899]: ERROR nova.compute.manager [instance: 77ac61b9-48cc-4ae8-81e7-273841f7b42d] session._wait_for_task(vmdk_copy_task)
[ 2188.806564] env[67899]: ERROR nova.compute.manager [instance: 77ac61b9-48cc-4ae8-81e7-273841f7b42d] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 2188.806564] env[67899]: ERROR nova.compute.manager [instance: 77ac61b9-48cc-4ae8-81e7-273841f7b42d] return self.wait_for_task(task_ref)
[ 2188.806564] env[67899]: ERROR nova.compute.manager [instance: 77ac61b9-48cc-4ae8-81e7-273841f7b42d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 2188.806564] env[67899]: ERROR nova.compute.manager [instance: 77ac61b9-48cc-4ae8-81e7-273841f7b42d] return evt.wait()
[ 2188.806564] env[67899]: ERROR nova.compute.manager [instance: 77ac61b9-48cc-4ae8-81e7-273841f7b42d] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 2188.806564] env[67899]: ERROR nova.compute.manager [instance: 77ac61b9-48cc-4ae8-81e7-273841f7b42d] result = hub.switch()
[ 2188.806564] env[67899]: ERROR nova.compute.manager [instance: 77ac61b9-48cc-4ae8-81e7-273841f7b42d] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 2188.806564] env[67899]: ERROR nova.compute.manager [instance: 77ac61b9-48cc-4ae8-81e7-273841f7b42d] return self.greenlet.switch()
[ 2188.806564] env[67899]: ERROR nova.compute.manager [instance: 77ac61b9-48cc-4ae8-81e7-273841f7b42d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 2188.806564] env[67899]: ERROR nova.compute.manager [instance: 77ac61b9-48cc-4ae8-81e7-273841f7b42d] self.f(*self.args, **self.kw)
[ 2188.806564] env[67899]: ERROR nova.compute.manager [instance: 77ac61b9-48cc-4ae8-81e7-273841f7b42d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 2188.806564] env[67899]: ERROR nova.compute.manager [instance: 77ac61b9-48cc-4ae8-81e7-273841f7b42d] raise exceptions.translate_fault(task_info.error)
[ 2188.806564] env[67899]: ERROR nova.compute.manager [instance: 77ac61b9-48cc-4ae8-81e7-273841f7b42d] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 2188.806564] env[67899]: ERROR nova.compute.manager [instance: 77ac61b9-48cc-4ae8-81e7-273841f7b42d] Faults: ['InvalidArgument']
[ 2188.806564] env[67899]: ERROR nova.compute.manager [instance: 77ac61b9-48cc-4ae8-81e7-273841f7b42d]
[ 2188.807444] env[67899]: DEBUG nova.compute.utils [None req-a0855fe1-c9c8-4cb7-b7b5-b5ff6b519a81 tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] [instance: 77ac61b9-48cc-4ae8-81e7-273841f7b42d] VimFaultException {{(pid=67899) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 2188.808869] env[67899]: DEBUG nova.compute.manager [None req-a0855fe1-c9c8-4cb7-b7b5-b5ff6b519a81 tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] [instance: 77ac61b9-48cc-4ae8-81e7-273841f7b42d] Build of instance 77ac61b9-48cc-4ae8-81e7-273841f7b42d was re-scheduled: A specified parameter was not correct: fileType
[ 2188.808869] env[67899]: Faults: ['InvalidArgument'] {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}}
[ 2188.808992] env[67899]: DEBUG nova.compute.manager [None req-a0855fe1-c9c8-4cb7-b7b5-b5ff6b519a81 tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] [instance: 77ac61b9-48cc-4ae8-81e7-273841f7b42d] Unplugging VIFs for instance {{(pid=67899) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}}
[ 2188.809089] env[67899]: DEBUG nova.compute.manager [None req-a0855fe1-c9c8-4cb7-b7b5-b5ff6b519a81 tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] Virt driver does not provide unplug_vifs method, so it is not possible to determine if VIFs should be unplugged.
{{(pid=67899) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 2188.809275] env[67899]: DEBUG nova.compute.manager [None req-a0855fe1-c9c8-4cb7-b7b5-b5ff6b519a81 tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] [instance: 77ac61b9-48cc-4ae8-81e7-273841f7b42d] Deallocating network for instance {{(pid=67899) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2188.809438] env[67899]: DEBUG nova.network.neutron [None req-a0855fe1-c9c8-4cb7-b7b5-b5ff6b519a81 tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] [instance: 77ac61b9-48cc-4ae8-81e7-273841f7b42d] deallocate_for_instance() {{(pid=67899) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2189.338236] env[67899]: DEBUG nova.network.neutron [None req-a0855fe1-c9c8-4cb7-b7b5-b5ff6b519a81 tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] [instance: 77ac61b9-48cc-4ae8-81e7-273841f7b42d] Updating instance_info_cache with network_info: [] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2189.349211] env[67899]: INFO nova.compute.manager [None req-a0855fe1-c9c8-4cb7-b7b5-b5ff6b519a81 tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] [instance: 77ac61b9-48cc-4ae8-81e7-273841f7b42d] Took 0.54 seconds to deallocate network for instance. [ 2189.441122] env[67899]: INFO nova.scheduler.client.report [None req-a0855fe1-c9c8-4cb7-b7b5-b5ff6b519a81 tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] Deleted allocations for instance 77ac61b9-48cc-4ae8-81e7-273841f7b42d [ 2189.461021] env[67899]: DEBUG oslo_concurrency.lockutils [None req-a0855fe1-c9c8-4cb7-b7b5-b5ff6b519a81 tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] Lock "77ac61b9-48cc-4ae8-81e7-273841f7b42d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 508.762s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2189.461302] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "77ac61b9-48cc-4ae8-81e7-273841f7b42d" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 413.450s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2189.461475] env[67899]: INFO nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 77ac61b9-48cc-4ae8-81e7-273841f7b42d] During sync_power_state the instance has a pending task (spawning). Skip. 
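Editor's note: the traceback above recurs throughout this run. Every CopyVirtualDisk_Task issued while caching the sparse image fails with InvalidArgument on fileType, the resource claim is aborted, and the build is rescheduled. Below is a minimal sketch of the failure path reconstructed from the logged frames; the function names and the translate_fault call are taken directly from the traceback, while the bodies are illustrative simplifications, not the actual Nova / oslo.vmware sources.

```python
# Illustrative reconstruction of the failure path in the traceback above.
# Frame names follow the log; bodies are simplified sketches.
from oslo_vmware import exceptions as vexc

def copy_virtual_disk(session, vmdk_copy_task):
    # nova.virt.vmwareapi.vm_util.copy_virtual_disk blocks on the
    # server-side CopyVirtualDisk_Task via the session helper:
    session._wait_for_task(vmdk_copy_task)

def _poll_task(task_info):
    # oslo_vmware.api._poll_task: a task that vCenter marks "error" is
    # translated into a Python exception and raised; here the fault is
    # InvalidArgument ("A specified parameter was not correct: fileType").
    if task_info.state == "error":
        raise vexc.translate_fault(task_info.error)
```

The exception therefore originates server-side in vCenter's disk-copy task; Nova only surfaces it, which is why the same fault reappears for each instance that has to re-fetch the image.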
[ 2189.461642] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "77ac61b9-48cc-4ae8-81e7-273841f7b42d" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2189.461856] env[67899]: DEBUG oslo_concurrency.lockutils [None req-dfc548e0-a8e9-4135-a435-e07a569b6f5c tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] Lock "77ac61b9-48cc-4ae8-81e7-273841f7b42d" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 312.565s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2189.462088] env[67899]: DEBUG oslo_concurrency.lockutils [None req-dfc548e0-a8e9-4135-a435-e07a569b6f5c tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] Acquiring lock "77ac61b9-48cc-4ae8-81e7-273841f7b42d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2189.462287] env[67899]: DEBUG oslo_concurrency.lockutils [None req-dfc548e0-a8e9-4135-a435-e07a569b6f5c tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] Lock "77ac61b9-48cc-4ae8-81e7-273841f7b42d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2189.462451] env[67899]: DEBUG oslo_concurrency.lockutils [None req-dfc548e0-a8e9-4135-a435-e07a569b6f5c tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] Lock "77ac61b9-48cc-4ae8-81e7-273841f7b42d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2189.464266] env[67899]: INFO nova.compute.manager [None req-dfc548e0-a8e9-4135-a435-e07a569b6f5c tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] [instance: 77ac61b9-48cc-4ae8-81e7-273841f7b42d] Terminating instance [ 2189.465915] env[67899]: DEBUG nova.compute.manager [None req-dfc548e0-a8e9-4135-a435-e07a569b6f5c tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] [instance: 77ac61b9-48cc-4ae8-81e7-273841f7b42d] Start destroying the instance on the hypervisor. 
{{(pid=67899) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2189.466131] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-dfc548e0-a8e9-4135-a435-e07a569b6f5c tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] [instance: 77ac61b9-48cc-4ae8-81e7-273841f7b42d] Destroying instance {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2189.466625] env[67899]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-ff5609b8-fee3-442a-a1cd-183658ed1fa2 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2189.475956] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3b699428-efe9-4dd9-80b5-39deac63b669 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2189.502642] env[67899]: WARNING nova.virt.vmwareapi.vmops [None req-dfc548e0-a8e9-4135-a435-e07a569b6f5c tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] [instance: 77ac61b9-48cc-4ae8-81e7-273841f7b42d] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 77ac61b9-48cc-4ae8-81e7-273841f7b42d could not be found. [ 2189.502859] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-dfc548e0-a8e9-4135-a435-e07a569b6f5c tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] [instance: 77ac61b9-48cc-4ae8-81e7-273841f7b42d] Instance destroyed {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2189.503056] env[67899]: INFO nova.compute.manager [None req-dfc548e0-a8e9-4135-a435-e07a569b6f5c tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] [instance: 77ac61b9-48cc-4ae8-81e7-273841f7b42d] Took 0.04 seconds to destroy the instance on the hypervisor. [ 2189.503292] env[67899]: DEBUG oslo.service.loopingcall [None req-dfc548e0-a8e9-4135-a435-e07a569b6f5c tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=67899) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2189.503512] env[67899]: DEBUG nova.compute.manager [-] [instance: 77ac61b9-48cc-4ae8-81e7-273841f7b42d] Deallocating network for instance {{(pid=67899) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2189.503611] env[67899]: DEBUG nova.network.neutron [-] [instance: 77ac61b9-48cc-4ae8-81e7-273841f7b42d] deallocate_for_instance() {{(pid=67899) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2189.526054] env[67899]: DEBUG nova.network.neutron [-] [instance: 77ac61b9-48cc-4ae8-81e7-273841f7b42d] Updating instance_info_cache with network_info: [] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2189.533684] env[67899]: INFO nova.compute.manager [-] [instance: 77ac61b9-48cc-4ae8-81e7-273841f7b42d] Took 0.03 seconds to deallocate network for instance. 
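Editor's note: the acquire/release bookkeeping that brackets this teardown (and every other operation in this log) comes from oslo.concurrency's lock decorator, which records how long a caller waited for, and then held, each named lock. A minimal sketch of the pattern, assuming a plain lockutils.synchronized decorator stands in for Nova's internal wrapper:

```python
from oslo_concurrency import lockutils

# Hypothetical example: serialize per-instance operations on the same
# UUID-named lock that appears in the log lines above.
@lockutils.synchronized('77ac61b9-48cc-4ae8-81e7-273841f7b42d')
def do_terminate_instance():
    # While this body runs, competing callers block; their queueing time
    # is what the log reports as "waited 312.565s", and the body's own
    # runtime is reported as "held 0.157s" on release.
    pass
```

That explains the long waits logged above: the terminate request queued behind the build lock for the same instance UUID for the entire 508-second spawn attempt.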
[ 2189.618652] env[67899]: DEBUG oslo_concurrency.lockutils [None req-dfc548e0-a8e9-4135-a435-e07a569b6f5c tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] Lock "77ac61b9-48cc-4ae8-81e7-273841f7b42d" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.157s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2193.011790] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2196.996674] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2196.996674] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2199.997651] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2199.997651] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager.update_available_resource {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2200.008120] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2200.008340] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2200.008505] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2200.008661] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67899) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 2200.009753] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b6a3fc25-d3a3-42a2-bce7-d7c33b5cd8a5 {{(pid=67899) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2200.018638] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eb75aa2f-40c4-4c00-b31f-e35b0a6bca17 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2200.032382] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3f2eb090-ea9a-4ec3-8b04-1b742f05b61b {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2200.038594] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0d113328-81b4-42e8-8a04-66de2d47a099 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2200.066951] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180921MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=67899) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 2200.067108] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2200.067299] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2200.124151] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance a9ef96da-fcfd-4fb5-bbb1-5178111a8a62 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2200.124325] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance c4fe8b3e-cee1-401b-a26f-907a8de95eba actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2200.124453] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance a993c6a9-140f-430d-a77e-98c2567bf7af actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2200.124575] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance c17d88cf-69ba-43e9-a672-24503c65e9f2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2200.124696] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 483824d1-4994-436a-ba16-12524684405c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2200.124813] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance cc1164c7-82bb-4d80-89ad-e9ba5658d9c8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2200.124931] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance fc98bb10-8fe8-4203-80b8-9885b2c302c1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2200.125130] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Total usable vcpus: 48, total allocated vcpus: 7 {{(pid=67899) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 2200.125273] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1408MB phys_disk=200GB used_disk=7GB total_vcpus=48 used_vcpus=7 pci_stats=[] {{(pid=67899) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 2200.209176] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a1eea2c4-b2c8-4e14-b25c-996fdc16d64d {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2200.216747] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-07efe5b9-1f27-4549-b7c2-846b75a1c823 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2200.247702] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a33ec4b4-ea43-4a1c-958e-01d310e43fac {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2200.254845] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-208522ec-3da0-42d2-96a6-5aea5f5e6da5 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2200.267821] env[67899]: DEBUG nova.compute.provider_tree [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Inventory has not changed in ProviderTree for provider: fffa0b42-f65d-4394-a98c-0df038b9ed4b {{(pid=67899) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2200.275846] env[67899]: DEBUG nova.scheduler.client.report [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Inventory has not changed for provider 
fffa0b42-f65d-4394-a98c-0df038b9ed4b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67899) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2200.288906] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67899) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 2200.289091] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.222s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2201.289101] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2201.289418] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Starting heal instance info cache {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 2201.289418] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Rebuilding the list of instances to heal {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 2201.305541] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: a9ef96da-fcfd-4fb5-bbb1-5178111a8a62] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2201.305707] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: c4fe8b3e-cee1-401b-a26f-907a8de95eba] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2201.305842] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: a993c6a9-140f-430d-a77e-98c2567bf7af] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2201.305968] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: c17d88cf-69ba-43e9-a672-24503c65e9f2] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2201.306103] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 483824d1-4994-436a-ba16-12524684405c] Skipping network cache update for instance because it is Building. 
{{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2201.306222] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: cc1164c7-82bb-4d80-89ad-e9ba5658d9c8] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2201.306337] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: fc98bb10-8fe8-4203-80b8-9885b2c302c1] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2201.306457] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Didn't find any instances for network info cache update. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 2201.306956] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2201.996563] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2204.997800] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2204.998194] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=67899) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 2213.607919] env[67899]: DEBUG oslo_concurrency.lockutils [None req-01020d9f-000c-4188-81f0-482a0d893bcd tempest-AttachVolumeTestJSON-1730857508 tempest-AttachVolumeTestJSON-1730857508-project-member] Acquiring lock "cc1164c7-82bb-4d80-89ad-e9ba5658d9c8" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2221.928126] env[67899]: DEBUG oslo_concurrency.lockutils [None req-ad83b99e-2725-4c71-9c91-efc9cdef2ced tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Acquiring lock "4d6348dc-bbca-46e4-b5cf-8f6a4e4547cf" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2221.928510] env[67899]: DEBUG oslo_concurrency.lockutils [None req-ad83b99e-2725-4c71-9c91-efc9cdef2ced tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Lock "4d6348dc-bbca-46e4-b5cf-8f6a4e4547cf" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2221.938734] env[67899]: DEBUG nova.compute.manager [None req-ad83b99e-2725-4c71-9c91-efc9cdef2ced tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: 4d6348dc-bbca-46e4-b5cf-8f6a4e4547cf] Starting instance... {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 2221.985223] env[67899]: DEBUG oslo_concurrency.lockutils [None req-ad83b99e-2725-4c71-9c91-efc9cdef2ced tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2221.985479] env[67899]: DEBUG oslo_concurrency.lockutils [None req-ad83b99e-2725-4c71-9c91-efc9cdef2ced tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2221.986985] env[67899]: INFO nova.compute.claims [None req-ad83b99e-2725-4c71-9c91-efc9cdef2ced tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: 4d6348dc-bbca-46e4-b5cf-8f6a4e4547cf] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 2222.125896] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c81e7d05-3d50-44e9-acaf-ec73138b051a {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2222.133505] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9785a890-23fa-435c-bebd-3edd2c5c23de {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2222.162277] env[67899]: 
DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2004ce38-35ae-4a5d-847a-f446b05896d9 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2222.169192] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-96886965-d153-40bd-b05a-bc1d176b6882 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2222.181734] env[67899]: DEBUG nova.compute.provider_tree [None req-ad83b99e-2725-4c71-9c91-efc9cdef2ced tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Inventory has not changed in ProviderTree for provider: fffa0b42-f65d-4394-a98c-0df038b9ed4b {{(pid=67899) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2222.191392] env[67899]: DEBUG nova.scheduler.client.report [None req-ad83b99e-2725-4c71-9c91-efc9cdef2ced tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Inventory has not changed for provider fffa0b42-f65d-4394-a98c-0df038b9ed4b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67899) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2222.207086] env[67899]: DEBUG oslo_concurrency.lockutils [None req-ad83b99e-2725-4c71-9c91-efc9cdef2ced tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.221s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2222.207546] env[67899]: DEBUG nova.compute.manager [None req-ad83b99e-2725-4c71-9c91-efc9cdef2ced tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: 4d6348dc-bbca-46e4-b5cf-8f6a4e4547cf] Start building networks asynchronously for instance. {{(pid=67899) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 2222.239937] env[67899]: DEBUG nova.compute.utils [None req-ad83b99e-2725-4c71-9c91-efc9cdef2ced tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Using /dev/sd instead of None {{(pid=67899) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 2222.240930] env[67899]: DEBUG nova.compute.manager [None req-ad83b99e-2725-4c71-9c91-efc9cdef2ced tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: 4d6348dc-bbca-46e4-b5cf-8f6a4e4547cf] Allocating IP information in the background. 
{{(pid=67899) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 2222.241169] env[67899]: DEBUG nova.network.neutron [None req-ad83b99e-2725-4c71-9c91-efc9cdef2ced tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: 4d6348dc-bbca-46e4-b5cf-8f6a4e4547cf] allocate_for_instance() {{(pid=67899) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 2222.251437] env[67899]: DEBUG nova.compute.manager [None req-ad83b99e-2725-4c71-9c91-efc9cdef2ced tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: 4d6348dc-bbca-46e4-b5cf-8f6a4e4547cf] Start building block device mappings for instance. {{(pid=67899) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 2222.314182] env[67899]: DEBUG nova.compute.manager [None req-ad83b99e-2725-4c71-9c91-efc9cdef2ced tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: 4d6348dc-bbca-46e4-b5cf-8f6a4e4547cf] Start spawning the instance on the hypervisor. {{(pid=67899) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 2222.325053] env[67899]: DEBUG nova.policy [None req-ad83b99e-2725-4c71-9c91-efc9cdef2ced tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a5206226ca404a07b10db199a6436504', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'bdf895619b34412fb20488318e170d23', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67899) authorize /opt/stack/nova/nova/policy.py:203}} [ 2222.338840] env[67899]: DEBUG nova.virt.hardware [None req-ad83b99e-2725-4c71-9c91-efc9cdef2ced tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T22:07:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T22:07:13Z,direct_url=,disk_format='vmdk',id=c655a05a-4a40-4b3f-b609-3ba8116ad90f,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c1aaa2970e964d7b86557399120d12c1',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T22:07:14Z,virtual_size=,visibility=), allow threads: False {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 2222.339074] env[67899]: DEBUG nova.virt.hardware [None req-ad83b99e-2725-4c71-9c91-efc9cdef2ced tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Flavor limits 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 2222.339236] env[67899]: DEBUG nova.virt.hardware [None req-ad83b99e-2725-4c71-9c91-efc9cdef2ced tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Image limits 0:0:0 {{(pid=67899) 
get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 2222.339415] env[67899]: DEBUG nova.virt.hardware [None req-ad83b99e-2725-4c71-9c91-efc9cdef2ced tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Flavor pref 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 2222.339563] env[67899]: DEBUG nova.virt.hardware [None req-ad83b99e-2725-4c71-9c91-efc9cdef2ced tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Image pref 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 2222.339708] env[67899]: DEBUG nova.virt.hardware [None req-ad83b99e-2725-4c71-9c91-efc9cdef2ced tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 2222.339911] env[67899]: DEBUG nova.virt.hardware [None req-ad83b99e-2725-4c71-9c91-efc9cdef2ced tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 2222.340084] env[67899]: DEBUG nova.virt.hardware [None req-ad83b99e-2725-4c71-9c91-efc9cdef2ced tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67899) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 2222.340255] env[67899]: DEBUG nova.virt.hardware [None req-ad83b99e-2725-4c71-9c91-efc9cdef2ced tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Got 1 possible topologies {{(pid=67899) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 2222.340414] env[67899]: DEBUG nova.virt.hardware [None req-ad83b99e-2725-4c71-9c91-efc9cdef2ced tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 2222.340583] env[67899]: DEBUG nova.virt.hardware [None req-ad83b99e-2725-4c71-9c91-efc9cdef2ced tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 2222.341416] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8a432934-47ab-4ea6-a737-eac78e6be0c8 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2222.348984] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-26fabd1a-5799-46bb-b836-1d469e31fe82 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2222.612660] env[67899]: DEBUG nova.network.neutron [None req-ad83b99e-2725-4c71-9c91-efc9cdef2ced tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: 
4d6348dc-bbca-46e4-b5cf-8f6a4e4547cf] Successfully created port: d1d11df6-6a15-4cbf-ba86-2fbc531751ea {{(pid=67899) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 2223.241761] env[67899]: DEBUG nova.compute.manager [req-8f045f48-bb19-4277-86e3-bdb2699bbd4a req-ea6c2653-5560-49eb-8646-8f7e09fa47e8 service nova] [instance: 4d6348dc-bbca-46e4-b5cf-8f6a4e4547cf] Received event network-vif-plugged-d1d11df6-6a15-4cbf-ba86-2fbc531751ea {{(pid=67899) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 2223.242022] env[67899]: DEBUG oslo_concurrency.lockutils [req-8f045f48-bb19-4277-86e3-bdb2699bbd4a req-ea6c2653-5560-49eb-8646-8f7e09fa47e8 service nova] Acquiring lock "4d6348dc-bbca-46e4-b5cf-8f6a4e4547cf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2223.242191] env[67899]: DEBUG oslo_concurrency.lockutils [req-8f045f48-bb19-4277-86e3-bdb2699bbd4a req-ea6c2653-5560-49eb-8646-8f7e09fa47e8 service nova] Lock "4d6348dc-bbca-46e4-b5cf-8f6a4e4547cf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2223.242354] env[67899]: DEBUG oslo_concurrency.lockutils [req-8f045f48-bb19-4277-86e3-bdb2699bbd4a req-ea6c2653-5560-49eb-8646-8f7e09fa47e8 service nova] Lock "4d6348dc-bbca-46e4-b5cf-8f6a4e4547cf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2223.242517] env[67899]: DEBUG nova.compute.manager [req-8f045f48-bb19-4277-86e3-bdb2699bbd4a req-ea6c2653-5560-49eb-8646-8f7e09fa47e8 service nova] [instance: 4d6348dc-bbca-46e4-b5cf-8f6a4e4547cf] No waiting events found dispatching network-vif-plugged-d1d11df6-6a15-4cbf-ba86-2fbc531751ea {{(pid=67899) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 2223.242681] env[67899]: WARNING nova.compute.manager [req-8f045f48-bb19-4277-86e3-bdb2699bbd4a req-ea6c2653-5560-49eb-8646-8f7e09fa47e8 service nova] [instance: 4d6348dc-bbca-46e4-b5cf-8f6a4e4547cf] Received unexpected event network-vif-plugged-d1d11df6-6a15-4cbf-ba86-2fbc531751ea for instance with vm_state building and task_state spawning. 
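Editor's note: the "Received unexpected event" warning above is a benign race. Neutron delivered network-vif-plugged-d1d11df6-6a15-4cbf-ba86-2fbc531751ea before any waiter was registered for it, so the event had nothing to dispatch to. A hedged sketch of the waiting pattern involved, assuming the standard virtapi context manager (simplified, and not the exact vmwareapi driver code, which does not register such waiters for this path):

```python
# Sketch: how a compute manager normally pairs VIF plugging with the
# network-vif-plugged callback. If Neutron's event arrives before this
# context manager is entered, the "unexpected event" warning seen above
# is logged instead of the event being consumed.
def plug_and_wait(self, context, instance, network_info):
    events = [('network-vif-plugged', vif['id']) for vif in network_info]
    with self.virtapi.wait_for_instance_event(instance, events,
                                              deadline=300):
        self.driver.plug_vifs(instance, network_info)
```

Since the instance is still in vm_state building / task_state spawning, the spawn simply proceeds and picks up the already-bound port when it refreshes the network info cache below.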
[ 2223.294016] env[67899]: DEBUG nova.network.neutron [None req-ad83b99e-2725-4c71-9c91-efc9cdef2ced tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: 4d6348dc-bbca-46e4-b5cf-8f6a4e4547cf] Successfully updated port: d1d11df6-6a15-4cbf-ba86-2fbc531751ea {{(pid=67899) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 2223.305750] env[67899]: DEBUG oslo_concurrency.lockutils [None req-ad83b99e-2725-4c71-9c91-efc9cdef2ced tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Acquiring lock "refresh_cache-4d6348dc-bbca-46e4-b5cf-8f6a4e4547cf" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2223.305909] env[67899]: DEBUG oslo_concurrency.lockutils [None req-ad83b99e-2725-4c71-9c91-efc9cdef2ced tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Acquired lock "refresh_cache-4d6348dc-bbca-46e4-b5cf-8f6a4e4547cf" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2223.306071] env[67899]: DEBUG nova.network.neutron [None req-ad83b99e-2725-4c71-9c91-efc9cdef2ced tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: 4d6348dc-bbca-46e4-b5cf-8f6a4e4547cf] Building network info cache for instance {{(pid=67899) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 2223.361545] env[67899]: DEBUG nova.network.neutron [None req-ad83b99e-2725-4c71-9c91-efc9cdef2ced tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: 4d6348dc-bbca-46e4-b5cf-8f6a4e4547cf] Instance cache missing network info. 
{{(pid=67899) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 2223.531712] env[67899]: DEBUG nova.network.neutron [None req-ad83b99e-2725-4c71-9c91-efc9cdef2ced tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: 4d6348dc-bbca-46e4-b5cf-8f6a4e4547cf] Updating instance_info_cache with network_info: [{"id": "d1d11df6-6a15-4cbf-ba86-2fbc531751ea", "address": "fa:16:3e:b9:91:bf", "network": {"id": "8255316b-777c-4721-b67b-7f6fdf4bd351", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-2146531806-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "bdf895619b34412fb20488318e170d23", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "357d2811-e990-4985-9f9e-b158d10d3699", "external-id": "nsx-vlan-transportzone-641", "segmentation_id": 641, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd1d11df6-6a", "ovs_interfaceid": "d1d11df6-6a15-4cbf-ba86-2fbc531751ea", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2223.542498] env[67899]: DEBUG oslo_concurrency.lockutils [None req-ad83b99e-2725-4c71-9c91-efc9cdef2ced tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Releasing lock "refresh_cache-4d6348dc-bbca-46e4-b5cf-8f6a4e4547cf" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2223.542804] env[67899]: DEBUG nova.compute.manager [None req-ad83b99e-2725-4c71-9c91-efc9cdef2ced tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: 4d6348dc-bbca-46e4-b5cf-8f6a4e4547cf] Instance network_info: |[{"id": "d1d11df6-6a15-4cbf-ba86-2fbc531751ea", "address": "fa:16:3e:b9:91:bf", "network": {"id": "8255316b-777c-4721-b67b-7f6fdf4bd351", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-2146531806-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "bdf895619b34412fb20488318e170d23", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "357d2811-e990-4985-9f9e-b158d10d3699", "external-id": "nsx-vlan-transportzone-641", "segmentation_id": 641, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd1d11df6-6a", "ovs_interfaceid": "d1d11df6-6a15-4cbf-ba86-2fbc531751ea", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67899) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1971}} [ 2223.543229] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-ad83b99e-2725-4c71-9c91-efc9cdef2ced tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: 4d6348dc-bbca-46e4-b5cf-8f6a4e4547cf] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:b9:91:bf', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '357d2811-e990-4985-9f9e-b158d10d3699', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'd1d11df6-6a15-4cbf-ba86-2fbc531751ea', 'vif_model': 'vmxnet3'}] {{(pid=67899) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 2223.550804] env[67899]: DEBUG oslo.service.loopingcall [None req-ad83b99e-2725-4c71-9c91-efc9cdef2ced tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67899) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2223.551302] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 4d6348dc-bbca-46e4-b5cf-8f6a4e4547cf] Creating VM on the ESX host {{(pid=67899) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 2223.551530] env[67899]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-ee5cd94a-107d-4545-a817-e2c23b1803ff {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2223.572029] env[67899]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 2223.572029] env[67899]: value = "task-3468043" [ 2223.572029] env[67899]: _type = "Task" [ 2223.572029] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2223.579558] env[67899]: DEBUG oslo_vmware.api [-] Task: {'id': task-3468043, 'name': CreateVM_Task} progress is 0%. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2224.082431] env[67899]: DEBUG oslo_vmware.api [-] Task: {'id': task-3468043, 'name': CreateVM_Task, 'duration_secs': 0.295144} completed successfully. 
{{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2224.082654] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 4d6348dc-bbca-46e4-b5cf-8f6a4e4547cf] Created VM on the ESX host {{(pid=67899) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 2224.083355] env[67899]: DEBUG oslo_concurrency.lockutils [None req-ad83b99e-2725-4c71-9c91-efc9cdef2ced tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2224.083565] env[67899]: DEBUG oslo_concurrency.lockutils [None req-ad83b99e-2725-4c71-9c91-efc9cdef2ced tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2224.083954] env[67899]: DEBUG oslo_concurrency.lockutils [None req-ad83b99e-2725-4c71-9c91-efc9cdef2ced tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 2224.084246] env[67899]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-cea47083-a951-4e03-a330-935b234f8686 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2224.088375] env[67899]: DEBUG oslo_vmware.api [None req-ad83b99e-2725-4c71-9c91-efc9cdef2ced tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Waiting for the task: (returnval){ [ 2224.088375] env[67899]: value = "session[5298859c-589e-10f3-4f70-bf47a7ea371b]52a5b32a-cb0a-a241-305d-1c9d2e75083d" [ 2224.088375] env[67899]: _type = "Task" [ 2224.088375] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2224.095353] env[67899]: DEBUG oslo_vmware.api [None req-ad83b99e-2725-4c71-9c91-efc9cdef2ced tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Task: {'id': session[5298859c-589e-10f3-4f70-bf47a7ea371b]52a5b32a-cb0a-a241-305d-1c9d2e75083d, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2224.598912] env[67899]: DEBUG oslo_concurrency.lockutils [None req-ad83b99e-2725-4c71-9c91-efc9cdef2ced tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2224.599253] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-ad83b99e-2725-4c71-9c91-efc9cdef2ced tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: 4d6348dc-bbca-46e4-b5cf-8f6a4e4547cf] Processing image c655a05a-4a40-4b3f-b609-3ba8116ad90f {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 2224.599379] env[67899]: DEBUG oslo_concurrency.lockutils [None req-ad83b99e-2725-4c71-9c91-efc9cdef2ced tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2225.285598] env[67899]: DEBUG nova.compute.manager [req-92fa787d-203e-4466-9061-a430f7d931e1 req-4648929d-2c5f-44b3-8267-ea9091433fc2 service nova] [instance: 4d6348dc-bbca-46e4-b5cf-8f6a4e4547cf] Received event network-changed-d1d11df6-6a15-4cbf-ba86-2fbc531751ea {{(pid=67899) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 2225.285810] env[67899]: DEBUG nova.compute.manager [req-92fa787d-203e-4466-9061-a430f7d931e1 req-4648929d-2c5f-44b3-8267-ea9091433fc2 service nova] [instance: 4d6348dc-bbca-46e4-b5cf-8f6a4e4547cf] Refreshing instance network info cache due to event network-changed-d1d11df6-6a15-4cbf-ba86-2fbc531751ea. {{(pid=67899) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 2225.286066] env[67899]: DEBUG oslo_concurrency.lockutils [req-92fa787d-203e-4466-9061-a430f7d931e1 req-4648929d-2c5f-44b3-8267-ea9091433fc2 service nova] Acquiring lock "refresh_cache-4d6348dc-bbca-46e4-b5cf-8f6a4e4547cf" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2225.286215] env[67899]: DEBUG oslo_concurrency.lockutils [req-92fa787d-203e-4466-9061-a430f7d931e1 req-4648929d-2c5f-44b3-8267-ea9091433fc2 service nova] Acquired lock "refresh_cache-4d6348dc-bbca-46e4-b5cf-8f6a4e4547cf" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2225.286372] env[67899]: DEBUG nova.network.neutron [req-92fa787d-203e-4466-9061-a430f7d931e1 req-4648929d-2c5f-44b3-8267-ea9091433fc2 service nova] [instance: 4d6348dc-bbca-46e4-b5cf-8f6a4e4547cf] Refreshing network info cache for port d1d11df6-6a15-4cbf-ba86-2fbc531751ea {{(pid=67899) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 2225.554162] env[67899]: DEBUG nova.network.neutron [req-92fa787d-203e-4466-9061-a430f7d931e1 req-4648929d-2c5f-44b3-8267-ea9091433fc2 service nova] [instance: 4d6348dc-bbca-46e4-b5cf-8f6a4e4547cf] Updated VIF entry in instance network info cache for port d1d11df6-6a15-4cbf-ba86-2fbc531751ea. 
{{(pid=67899) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 2225.554498] env[67899]: DEBUG nova.network.neutron [req-92fa787d-203e-4466-9061-a430f7d931e1 req-4648929d-2c5f-44b3-8267-ea9091433fc2 service nova] [instance: 4d6348dc-bbca-46e4-b5cf-8f6a4e4547cf] Updating instance_info_cache with network_info: [{"id": "d1d11df6-6a15-4cbf-ba86-2fbc531751ea", "address": "fa:16:3e:b9:91:bf", "network": {"id": "8255316b-777c-4721-b67b-7f6fdf4bd351", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-2146531806-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "bdf895619b34412fb20488318e170d23", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "357d2811-e990-4985-9f9e-b158d10d3699", "external-id": "nsx-vlan-transportzone-641", "segmentation_id": 641, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd1d11df6-6a", "ovs_interfaceid": "d1d11df6-6a15-4cbf-ba86-2fbc531751ea", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2225.565083] env[67899]: DEBUG oslo_concurrency.lockutils [req-92fa787d-203e-4466-9061-a430f7d931e1 req-4648929d-2c5f-44b3-8267-ea9091433fc2 service nova] Releasing lock "refresh_cache-4d6348dc-bbca-46e4-b5cf-8f6a4e4547cf" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2238.574467] env[67899]: WARNING oslo_vmware.rw_handles [None req-2401ba2f-a600-4d1b-93b2-2a833ea4d05b tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2238.574467] env[67899]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2238.574467] env[67899]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2238.574467] env[67899]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2238.574467] env[67899]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2238.574467] env[67899]: ERROR oslo_vmware.rw_handles response.begin() [ 2238.574467] env[67899]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2238.574467] env[67899]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2238.574467] env[67899]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2238.574467] env[67899]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2238.574467] env[67899]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2238.574467] env[67899]: ERROR oslo_vmware.rw_handles [ 2238.575012] env[67899]: DEBUG nova.virt.vmwareapi.images [None 
req-2401ba2f-a600-4d1b-93b2-2a833ea4d05b tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: a9ef96da-fcfd-4fb5-bbb1-5178111a8a62] Downloaded image file data c655a05a-4a40-4b3f-b609-3ba8116ad90f to vmware_temp/ef327526-1fb4-4c32-b68c-ccec9e846727/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk on the data store datastore1 {{(pid=67899) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2238.576965] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-2401ba2f-a600-4d1b-93b2-2a833ea4d05b tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: a9ef96da-fcfd-4fb5-bbb1-5178111a8a62] Caching image {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2238.577254] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [None req-2401ba2f-a600-4d1b-93b2-2a833ea4d05b tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Copying Virtual Disk [datastore1] vmware_temp/ef327526-1fb4-4c32-b68c-ccec9e846727/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk to [datastore1] vmware_temp/ef327526-1fb4-4c32-b68c-ccec9e846727/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk {{(pid=67899) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2238.577544] env[67899]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-cb3a0e18-6474-4801-851f-46bd6924b065 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2238.586333] env[67899]: DEBUG oslo_vmware.api [None req-2401ba2f-a600-4d1b-93b2-2a833ea4d05b tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Waiting for the task: (returnval){ [ 2238.586333] env[67899]: value = "task-3468044" [ 2238.586333] env[67899]: _type = "Task" [ 2238.586333] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2238.594483] env[67899]: DEBUG oslo_vmware.api [None req-2401ba2f-a600-4d1b-93b2-2a833ea4d05b tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Task: {'id': task-3468044, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2239.097072] env[67899]: DEBUG oslo_vmware.exceptions [None req-2401ba2f-a600-4d1b-93b2-2a833ea4d05b tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Fault InvalidArgument not matched. 
{{(pid=67899) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2239.097367] env[67899]: DEBUG oslo_concurrency.lockutils [None req-2401ba2f-a600-4d1b-93b2-2a833ea4d05b tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2239.097894] env[67899]: ERROR nova.compute.manager [None req-2401ba2f-a600-4d1b-93b2-2a833ea4d05b tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: a9ef96da-fcfd-4fb5-bbb1-5178111a8a62] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2239.097894] env[67899]: Faults: ['InvalidArgument'] [ 2239.097894] env[67899]: ERROR nova.compute.manager [instance: a9ef96da-fcfd-4fb5-bbb1-5178111a8a62] Traceback (most recent call last): [ 2239.097894] env[67899]: ERROR nova.compute.manager [instance: a9ef96da-fcfd-4fb5-bbb1-5178111a8a62] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 2239.097894] env[67899]: ERROR nova.compute.manager [instance: a9ef96da-fcfd-4fb5-bbb1-5178111a8a62] yield resources [ 2239.097894] env[67899]: ERROR nova.compute.manager [instance: a9ef96da-fcfd-4fb5-bbb1-5178111a8a62] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2239.097894] env[67899]: ERROR nova.compute.manager [instance: a9ef96da-fcfd-4fb5-bbb1-5178111a8a62] self.driver.spawn(context, instance, image_meta, [ 2239.097894] env[67899]: ERROR nova.compute.manager [instance: a9ef96da-fcfd-4fb5-bbb1-5178111a8a62] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2239.097894] env[67899]: ERROR nova.compute.manager [instance: a9ef96da-fcfd-4fb5-bbb1-5178111a8a62] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2239.097894] env[67899]: ERROR nova.compute.manager [instance: a9ef96da-fcfd-4fb5-bbb1-5178111a8a62] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2239.097894] env[67899]: ERROR nova.compute.manager [instance: a9ef96da-fcfd-4fb5-bbb1-5178111a8a62] self._fetch_image_if_missing(context, vi) [ 2239.097894] env[67899]: ERROR nova.compute.manager [instance: a9ef96da-fcfd-4fb5-bbb1-5178111a8a62] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2239.097894] env[67899]: ERROR nova.compute.manager [instance: a9ef96da-fcfd-4fb5-bbb1-5178111a8a62] image_cache(vi, tmp_image_ds_loc) [ 2239.097894] env[67899]: ERROR nova.compute.manager [instance: a9ef96da-fcfd-4fb5-bbb1-5178111a8a62] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2239.097894] env[67899]: ERROR nova.compute.manager [instance: a9ef96da-fcfd-4fb5-bbb1-5178111a8a62] vm_util.copy_virtual_disk( [ 2239.097894] env[67899]: ERROR nova.compute.manager [instance: a9ef96da-fcfd-4fb5-bbb1-5178111a8a62] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2239.097894] env[67899]: ERROR nova.compute.manager [instance: a9ef96da-fcfd-4fb5-bbb1-5178111a8a62] session._wait_for_task(vmdk_copy_task) [ 2239.097894] env[67899]: ERROR nova.compute.manager [instance: a9ef96da-fcfd-4fb5-bbb1-5178111a8a62] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", 
line 157, in _wait_for_task [ 2239.097894] env[67899]: ERROR nova.compute.manager [instance: a9ef96da-fcfd-4fb5-bbb1-5178111a8a62] return self.wait_for_task(task_ref) [ 2239.097894] env[67899]: ERROR nova.compute.manager [instance: a9ef96da-fcfd-4fb5-bbb1-5178111a8a62] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2239.097894] env[67899]: ERROR nova.compute.manager [instance: a9ef96da-fcfd-4fb5-bbb1-5178111a8a62] return evt.wait() [ 2239.097894] env[67899]: ERROR nova.compute.manager [instance: a9ef96da-fcfd-4fb5-bbb1-5178111a8a62] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2239.097894] env[67899]: ERROR nova.compute.manager [instance: a9ef96da-fcfd-4fb5-bbb1-5178111a8a62] result = hub.switch() [ 2239.097894] env[67899]: ERROR nova.compute.manager [instance: a9ef96da-fcfd-4fb5-bbb1-5178111a8a62] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2239.097894] env[67899]: ERROR nova.compute.manager [instance: a9ef96da-fcfd-4fb5-bbb1-5178111a8a62] return self.greenlet.switch() [ 2239.097894] env[67899]: ERROR nova.compute.manager [instance: a9ef96da-fcfd-4fb5-bbb1-5178111a8a62] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2239.097894] env[67899]: ERROR nova.compute.manager [instance: a9ef96da-fcfd-4fb5-bbb1-5178111a8a62] self.f(*self.args, **self.kw) [ 2239.097894] env[67899]: ERROR nova.compute.manager [instance: a9ef96da-fcfd-4fb5-bbb1-5178111a8a62] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2239.097894] env[67899]: ERROR nova.compute.manager [instance: a9ef96da-fcfd-4fb5-bbb1-5178111a8a62] raise exceptions.translate_fault(task_info.error) [ 2239.097894] env[67899]: ERROR nova.compute.manager [instance: a9ef96da-fcfd-4fb5-bbb1-5178111a8a62] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2239.097894] env[67899]: ERROR nova.compute.manager [instance: a9ef96da-fcfd-4fb5-bbb1-5178111a8a62] Faults: ['InvalidArgument'] [ 2239.097894] env[67899]: ERROR nova.compute.manager [instance: a9ef96da-fcfd-4fb5-bbb1-5178111a8a62] [ 2239.098794] env[67899]: INFO nova.compute.manager [None req-2401ba2f-a600-4d1b-93b2-2a833ea4d05b tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: a9ef96da-fcfd-4fb5-bbb1-5178111a8a62] Terminating instance [ 2239.099707] env[67899]: DEBUG oslo_concurrency.lockutils [None req-027881f9-9f05-4771-b0a4-099e769baf40 tempest-ServersAaction247Test-759872082 tempest-ServersAaction247Test-759872082-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2239.099922] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-027881f9-9f05-4771-b0a4-099e769baf40 tempest-ServersAaction247Test-759872082 tempest-ServersAaction247Test-759872082-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2239.100189] env[67899]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-e98015df-429a-4cd1-88c1-37ce5142f111 {{(pid=67899) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2239.103016] env[67899]: DEBUG nova.compute.manager [None req-2401ba2f-a600-4d1b-93b2-2a833ea4d05b tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: a9ef96da-fcfd-4fb5-bbb1-5178111a8a62] Start destroying the instance on the hypervisor. {{(pid=67899) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2239.103230] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-2401ba2f-a600-4d1b-93b2-2a833ea4d05b tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: a9ef96da-fcfd-4fb5-bbb1-5178111a8a62] Destroying instance {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2239.103952] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a74f3b60-22f2-4074-9dc6-0087a9acffcd {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2239.110670] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-2401ba2f-a600-4d1b-93b2-2a833ea4d05b tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: a9ef96da-fcfd-4fb5-bbb1-5178111a8a62] Unregistering the VM {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2239.110888] env[67899]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-3d6dc7bf-ab7c-403e-a8f8-6878bd57b385 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2239.113273] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-027881f9-9f05-4771-b0a4-099e769baf40 tempest-ServersAaction247Test-759872082 tempest-ServersAaction247Test-759872082-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2239.113434] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-027881f9-9f05-4771-b0a4-099e769baf40 tempest-ServersAaction247Test-759872082 tempest-ServersAaction247Test-759872082-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67899) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2239.114387] env[67899]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-a5748dcf-27a5-4430-a615-dda1ce102ad3 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2239.119127] env[67899]: DEBUG oslo_vmware.api [None req-027881f9-9f05-4771-b0a4-099e769baf40 tempest-ServersAaction247Test-759872082 tempest-ServersAaction247Test-759872082-project-member] Waiting for the task: (returnval){ [ 2239.119127] env[67899]: value = "session[5298859c-589e-10f3-4f70-bf47a7ea371b]526cffa3-6ba9-88d0-3de7-0797274cdb64" [ 2239.119127] env[67899]: _type = "Task" [ 2239.119127] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2239.126701] env[67899]: DEBUG oslo_vmware.api [None req-027881f9-9f05-4771-b0a4-099e769baf40 tempest-ServersAaction247Test-759872082 tempest-ServersAaction247Test-759872082-project-member] Task: {'id': session[5298859c-589e-10f3-4f70-bf47a7ea371b]526cffa3-6ba9-88d0-3de7-0797274cdb64, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2239.179621] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-2401ba2f-a600-4d1b-93b2-2a833ea4d05b tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: a9ef96da-fcfd-4fb5-bbb1-5178111a8a62] Unregistered the VM {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2239.179927] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-2401ba2f-a600-4d1b-93b2-2a833ea4d05b tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: a9ef96da-fcfd-4fb5-bbb1-5178111a8a62] Deleting contents of the VM from datastore datastore1 {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2239.180067] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-2401ba2f-a600-4d1b-93b2-2a833ea4d05b tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Deleting the datastore file [datastore1] a9ef96da-fcfd-4fb5-bbb1-5178111a8a62 {{(pid=67899) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2239.180359] env[67899]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-a769bf33-07a3-445a-894a-e8a0b12c79d8 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2239.186760] env[67899]: DEBUG oslo_vmware.api [None req-2401ba2f-a600-4d1b-93b2-2a833ea4d05b tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Waiting for the task: (returnval){ [ 2239.186760] env[67899]: value = "task-3468046" [ 2239.186760] env[67899]: _type = "Task" [ 2239.186760] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2239.195184] env[67899]: DEBUG oslo_vmware.api [None req-2401ba2f-a600-4d1b-93b2-2a833ea4d05b tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Task: {'id': task-3468046, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2239.629323] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-027881f9-9f05-4771-b0a4-099e769baf40 tempest-ServersAaction247Test-759872082 tempest-ServersAaction247Test-759872082-project-member] [instance: c4fe8b3e-cee1-401b-a26f-907a8de95eba] Preparing fetch location {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2239.629619] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-027881f9-9f05-4771-b0a4-099e769baf40 tempest-ServersAaction247Test-759872082 tempest-ServersAaction247Test-759872082-project-member] Creating directory with path [datastore1] vmware_temp/1c7c0db9-513a-4463-87cf-2085f5687094/c655a05a-4a40-4b3f-b609-3ba8116ad90f {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2239.629822] env[67899]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-3dc3b3af-a4e3-4571-8cfc-dd07d320b732 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2239.642262] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-027881f9-9f05-4771-b0a4-099e769baf40 tempest-ServersAaction247Test-759872082 tempest-ServersAaction247Test-759872082-project-member] Created directory with path [datastore1] vmware_temp/1c7c0db9-513a-4463-87cf-2085f5687094/c655a05a-4a40-4b3f-b609-3ba8116ad90f {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2239.642461] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-027881f9-9f05-4771-b0a4-099e769baf40 tempest-ServersAaction247Test-759872082 tempest-ServersAaction247Test-759872082-project-member] [instance: c4fe8b3e-cee1-401b-a26f-907a8de95eba] Fetch image to [datastore1] vmware_temp/1c7c0db9-513a-4463-87cf-2085f5687094/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2239.642635] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-027881f9-9f05-4771-b0a4-099e769baf40 tempest-ServersAaction247Test-759872082 tempest-ServersAaction247Test-759872082-project-member] [instance: c4fe8b3e-cee1-401b-a26f-907a8de95eba] Downloading image file data c655a05a-4a40-4b3f-b609-3ba8116ad90f to [datastore1] vmware_temp/1c7c0db9-513a-4463-87cf-2085f5687094/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk on the data store datastore1 {{(pid=67899) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2239.643429] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-47a62154-1f45-41bb-887f-a7f6a5a17c97 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2239.650155] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-217e2491-7447-4256-802a-7c40fbd2d20f {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2239.659587] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bef687b5-43a5-40a8-a7cc-d797a69c6853 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2239.693600] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eba10dc7-b6af-4856-a3f1-d78ef40127d0 {{(pid=67899) 
request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2239.701018] env[67899]: DEBUG oslo_vmware.api [None req-2401ba2f-a600-4d1b-93b2-2a833ea4d05b tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Task: {'id': task-3468046, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.071215} completed successfully. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2239.702601] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-2401ba2f-a600-4d1b-93b2-2a833ea4d05b tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Deleted the datastore file {{(pid=67899) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2239.702798] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-2401ba2f-a600-4d1b-93b2-2a833ea4d05b tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: a9ef96da-fcfd-4fb5-bbb1-5178111a8a62] Deleted contents of the VM from datastore datastore1 {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2239.702971] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-2401ba2f-a600-4d1b-93b2-2a833ea4d05b tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: a9ef96da-fcfd-4fb5-bbb1-5178111a8a62] Instance destroyed {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2239.703160] env[67899]: INFO nova.compute.manager [None req-2401ba2f-a600-4d1b-93b2-2a833ea4d05b tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: a9ef96da-fcfd-4fb5-bbb1-5178111a8a62] Took 0.60 seconds to destroy the instance on the hypervisor. 
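The CopyVirtualDisk_Task failure and the DeleteDatastoreFile_Task completion above both follow oslo.vmware's invoke-then-poll pattern: a *_Task SOAP method is invoked through the API session, which returns a task reference immediately, and wait_for_task() then polls it (the "progress is 0%" lines) until vCenter reports success or a fault, which oslo.vmware translates into an exception such as the VimFaultException ("A specified parameter was not correct: fileType") in the traceback above. A minimal sketch of that pattern, with placeholder host, credentials and datastore paths; illustrative only, not nova's actual code path:

    # Sketch of the oslo.vmware invoke/poll pattern seen in this log.
    # Host, credentials and datastore paths below are placeholders.
    from oslo_vmware import api as vmware_api
    from oslo_vmware import exceptions as vexc

    session = vmware_api.VMwareAPISession(
        'vc1.example.test', 'user', 'secret',
        api_retry_count=10, task_poll_interval=0.5)

    # Invoking a *_Task method returns a task reference immediately.
    disk_mgr = session.vim.service_content.virtualDiskManager
    task = session.invoke_api(
        session.vim, 'CopyVirtualDisk_Task', disk_mgr,
        sourceName='[datastore1] vmware_temp/img/tmp-sparse.vmdk',
        destName='[datastore1] devstack-image-cache_base/img.vmdk')
    try:
        # wait_for_task() polls the task and raises a translated fault
        # on error, e.g. VimFaultException with fault_list ['InvalidArgument'].
        session.wait_for_task(task)
    except vexc.VimFaultException as exc:
        print(exc.fault_list, exc)
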
[ 2239.704923] env[67899]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-b89e3fa0-596c-4b7a-922a-04fe2f8b3a87 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2239.706875] env[67899]: DEBUG nova.compute.claims [None req-2401ba2f-a600-4d1b-93b2-2a833ea4d05b tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: a9ef96da-fcfd-4fb5-bbb1-5178111a8a62] Aborting claim: {{(pid=67899) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 2239.707068] env[67899]: DEBUG oslo_concurrency.lockutils [None req-2401ba2f-a600-4d1b-93b2-2a833ea4d05b tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2239.707273] env[67899]: DEBUG oslo_concurrency.lockutils [None req-2401ba2f-a600-4d1b-93b2-2a833ea4d05b tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2239.729038] env[67899]: DEBUG nova.virt.vmwareapi.images [None req-027881f9-9f05-4771-b0a4-099e769baf40 tempest-ServersAaction247Test-759872082 tempest-ServersAaction247Test-759872082-project-member] [instance: c4fe8b3e-cee1-401b-a26f-907a8de95eba] Downloading image file data c655a05a-4a40-4b3f-b609-3ba8116ad90f to the data store datastore1 {{(pid=67899) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2239.788362] env[67899]: DEBUG oslo_vmware.rw_handles [None req-027881f9-9f05-4771-b0a4-099e769baf40 tempest-ServersAaction247Test-759872082 tempest-ServersAaction247Test-759872082-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/1c7c0db9-513a-4463-87cf-2085f5687094/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67899) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2239.848653] env[67899]: DEBUG oslo_vmware.rw_handles [None req-027881f9-9f05-4771-b0a4-099e769baf40 tempest-ServersAaction247Test-759872082 tempest-ServersAaction247Test-759872082-project-member] Completed reading data from the image iterator. {{(pid=67899) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2239.848845] env[67899]: DEBUG oslo_vmware.rw_handles [None req-027881f9-9f05-4771-b0a4-099e769baf40 tempest-ServersAaction247Test-759872082 tempest-ServersAaction247Test-759872082-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/1c7c0db9-513a-4463-87cf-2085f5687094/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67899) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2239.910664] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-70b135ee-9106-4f3e-b257-4f6dfd9fe673 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2239.918193] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4d5cc2cb-2e1a-4cec-896f-b51c3c69423b {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2239.962566] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9ac96913-1d4b-4406-b3cc-208f81b9ad6d {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2239.973140] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d3fc2655-754f-42dd-ac0d-7a84425db275 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2239.996016] env[67899]: DEBUG nova.compute.provider_tree [None req-2401ba2f-a600-4d1b-93b2-2a833ea4d05b tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Inventory has not changed in ProviderTree for provider: fffa0b42-f65d-4394-a98c-0df038b9ed4b {{(pid=67899) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2240.005546] env[67899]: DEBUG nova.scheduler.client.report [None req-2401ba2f-a600-4d1b-93b2-2a833ea4d05b tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Inventory has not changed for provider fffa0b42-f65d-4394-a98c-0df038b9ed4b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67899) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2240.020957] env[67899]: DEBUG oslo_concurrency.lockutils [None req-2401ba2f-a600-4d1b-93b2-2a833ea4d05b tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.314s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2240.021504] env[67899]: ERROR nova.compute.manager [None req-2401ba2f-a600-4d1b-93b2-2a833ea4d05b tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: a9ef96da-fcfd-4fb5-bbb1-5178111a8a62] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2240.021504] env[67899]: Faults: ['InvalidArgument'] [ 2240.021504] env[67899]: ERROR nova.compute.manager [instance: a9ef96da-fcfd-4fb5-bbb1-5178111a8a62] Traceback (most recent call last): [ 2240.021504] env[67899]: ERROR nova.compute.manager [instance: a9ef96da-fcfd-4fb5-bbb1-5178111a8a62] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2240.021504] env[67899]: ERROR 
nova.compute.manager [instance: a9ef96da-fcfd-4fb5-bbb1-5178111a8a62] self.driver.spawn(context, instance, image_meta, [ 2240.021504] env[67899]: ERROR nova.compute.manager [instance: a9ef96da-fcfd-4fb5-bbb1-5178111a8a62] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2240.021504] env[67899]: ERROR nova.compute.manager [instance: a9ef96da-fcfd-4fb5-bbb1-5178111a8a62] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2240.021504] env[67899]: ERROR nova.compute.manager [instance: a9ef96da-fcfd-4fb5-bbb1-5178111a8a62] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2240.021504] env[67899]: ERROR nova.compute.manager [instance: a9ef96da-fcfd-4fb5-bbb1-5178111a8a62] self._fetch_image_if_missing(context, vi) [ 2240.021504] env[67899]: ERROR nova.compute.manager [instance: a9ef96da-fcfd-4fb5-bbb1-5178111a8a62] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2240.021504] env[67899]: ERROR nova.compute.manager [instance: a9ef96da-fcfd-4fb5-bbb1-5178111a8a62] image_cache(vi, tmp_image_ds_loc) [ 2240.021504] env[67899]: ERROR nova.compute.manager [instance: a9ef96da-fcfd-4fb5-bbb1-5178111a8a62] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2240.021504] env[67899]: ERROR nova.compute.manager [instance: a9ef96da-fcfd-4fb5-bbb1-5178111a8a62] vm_util.copy_virtual_disk( [ 2240.021504] env[67899]: ERROR nova.compute.manager [instance: a9ef96da-fcfd-4fb5-bbb1-5178111a8a62] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2240.021504] env[67899]: ERROR nova.compute.manager [instance: a9ef96da-fcfd-4fb5-bbb1-5178111a8a62] session._wait_for_task(vmdk_copy_task) [ 2240.021504] env[67899]: ERROR nova.compute.manager [instance: a9ef96da-fcfd-4fb5-bbb1-5178111a8a62] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2240.021504] env[67899]: ERROR nova.compute.manager [instance: a9ef96da-fcfd-4fb5-bbb1-5178111a8a62] return self.wait_for_task(task_ref) [ 2240.021504] env[67899]: ERROR nova.compute.manager [instance: a9ef96da-fcfd-4fb5-bbb1-5178111a8a62] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2240.021504] env[67899]: ERROR nova.compute.manager [instance: a9ef96da-fcfd-4fb5-bbb1-5178111a8a62] return evt.wait() [ 2240.021504] env[67899]: ERROR nova.compute.manager [instance: a9ef96da-fcfd-4fb5-bbb1-5178111a8a62] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2240.021504] env[67899]: ERROR nova.compute.manager [instance: a9ef96da-fcfd-4fb5-bbb1-5178111a8a62] result = hub.switch() [ 2240.021504] env[67899]: ERROR nova.compute.manager [instance: a9ef96da-fcfd-4fb5-bbb1-5178111a8a62] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2240.021504] env[67899]: ERROR nova.compute.manager [instance: a9ef96da-fcfd-4fb5-bbb1-5178111a8a62] return self.greenlet.switch() [ 2240.021504] env[67899]: ERROR nova.compute.manager [instance: a9ef96da-fcfd-4fb5-bbb1-5178111a8a62] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2240.021504] env[67899]: ERROR nova.compute.manager [instance: a9ef96da-fcfd-4fb5-bbb1-5178111a8a62] self.f(*self.args, **self.kw) [ 2240.021504] env[67899]: ERROR nova.compute.manager [instance: a9ef96da-fcfd-4fb5-bbb1-5178111a8a62] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2240.021504] env[67899]: ERROR nova.compute.manager [instance: a9ef96da-fcfd-4fb5-bbb1-5178111a8a62] raise exceptions.translate_fault(task_info.error) [ 2240.021504] env[67899]: ERROR nova.compute.manager [instance: a9ef96da-fcfd-4fb5-bbb1-5178111a8a62] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2240.021504] env[67899]: ERROR nova.compute.manager [instance: a9ef96da-fcfd-4fb5-bbb1-5178111a8a62] Faults: ['InvalidArgument'] [ 2240.021504] env[67899]: ERROR nova.compute.manager [instance: a9ef96da-fcfd-4fb5-bbb1-5178111a8a62] [ 2240.022447] env[67899]: DEBUG nova.compute.utils [None req-2401ba2f-a600-4d1b-93b2-2a833ea4d05b tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: a9ef96da-fcfd-4fb5-bbb1-5178111a8a62] VimFaultException {{(pid=67899) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2240.023580] env[67899]: DEBUG nova.compute.manager [None req-2401ba2f-a600-4d1b-93b2-2a833ea4d05b tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: a9ef96da-fcfd-4fb5-bbb1-5178111a8a62] Build of instance a9ef96da-fcfd-4fb5-bbb1-5178111a8a62 was re-scheduled: A specified parameter was not correct: fileType [ 2240.023580] env[67899]: Faults: ['InvalidArgument'] {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 2240.023968] env[67899]: DEBUG nova.compute.manager [None req-2401ba2f-a600-4d1b-93b2-2a833ea4d05b tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: a9ef96da-fcfd-4fb5-bbb1-5178111a8a62] Unplugging VIFs for instance {{(pid=67899) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 2240.024174] env[67899]: DEBUG nova.compute.manager [None req-2401ba2f-a600-4d1b-93b2-2a833ea4d05b tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67899) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 2240.024353] env[67899]: DEBUG nova.compute.manager [None req-2401ba2f-a600-4d1b-93b2-2a833ea4d05b tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: a9ef96da-fcfd-4fb5-bbb1-5178111a8a62] Deallocating network for instance {{(pid=67899) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2240.024513] env[67899]: DEBUG nova.network.neutron [None req-2401ba2f-a600-4d1b-93b2-2a833ea4d05b tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: a9ef96da-fcfd-4fb5-bbb1-5178111a8a62] deallocate_for_instance() {{(pid=67899) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2240.335030] env[67899]: DEBUG nova.network.neutron [None req-2401ba2f-a600-4d1b-93b2-2a833ea4d05b tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: a9ef96da-fcfd-4fb5-bbb1-5178111a8a62] Updating instance_info_cache with network_info: [] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2240.348110] env[67899]: INFO nova.compute.manager [None req-2401ba2f-a600-4d1b-93b2-2a833ea4d05b tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: a9ef96da-fcfd-4fb5-bbb1-5178111a8a62] Took 0.32 seconds to deallocate network for instance. [ 2240.436340] env[67899]: INFO nova.scheduler.client.report [None req-2401ba2f-a600-4d1b-93b2-2a833ea4d05b tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Deleted allocations for instance a9ef96da-fcfd-4fb5-bbb1-5178111a8a62 [ 2240.458137] env[67899]: DEBUG oslo_concurrency.lockutils [None req-2401ba2f-a600-4d1b-93b2-2a833ea4d05b tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Lock "a9ef96da-fcfd-4fb5-bbb1-5178111a8a62" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 411.626s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2240.458498] env[67899]: DEBUG oslo_concurrency.lockutils [None req-fddee4da-03cd-43a9-a297-fa411b297eed tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Lock "a9ef96da-fcfd-4fb5-bbb1-5178111a8a62" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 215.165s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2240.458943] env[67899]: DEBUG oslo_concurrency.lockutils [None req-fddee4da-03cd-43a9-a297-fa411b297eed tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Acquiring lock "a9ef96da-fcfd-4fb5-bbb1-5178111a8a62-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2240.459039] env[67899]: DEBUG oslo_concurrency.lockutils [None req-fddee4da-03cd-43a9-a297-fa411b297eed tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Lock "a9ef96da-fcfd-4fb5-bbb1-5178111a8a62-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=67899) inner
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2240.459251] env[67899]: DEBUG oslo_concurrency.lockutils [None req-fddee4da-03cd-43a9-a297-fa411b297eed tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Lock "a9ef96da-fcfd-4fb5-bbb1-5178111a8a62-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2240.461426] env[67899]: INFO nova.compute.manager [None req-fddee4da-03cd-43a9-a297-fa411b297eed tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: a9ef96da-fcfd-4fb5-bbb1-5178111a8a62] Terminating instance [ 2240.463253] env[67899]: DEBUG nova.compute.manager [None req-fddee4da-03cd-43a9-a297-fa411b297eed tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: a9ef96da-fcfd-4fb5-bbb1-5178111a8a62] Start destroying the instance on the hypervisor. {{(pid=67899) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2240.463333] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-fddee4da-03cd-43a9-a297-fa411b297eed tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: a9ef96da-fcfd-4fb5-bbb1-5178111a8a62] Destroying instance {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2240.464022] env[67899]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-add330aa-d816-4d9c-976c-42bfc62814c0 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2240.474552] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9f01081d-febf-4fc4-b186-1b7dc71a2d01 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2240.502770] env[67899]: WARNING nova.virt.vmwareapi.vmops [None req-fddee4da-03cd-43a9-a297-fa411b297eed tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: a9ef96da-fcfd-4fb5-bbb1-5178111a8a62] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance a9ef96da-fcfd-4fb5-bbb1-5178111a8a62 could not be found. [ 2240.502970] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-fddee4da-03cd-43a9-a297-fa411b297eed tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: a9ef96da-fcfd-4fb5-bbb1-5178111a8a62] Instance destroyed {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2240.503161] env[67899]: INFO nova.compute.manager [None req-fddee4da-03cd-43a9-a297-fa411b297eed tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] [instance: a9ef96da-fcfd-4fb5-bbb1-5178111a8a62] Took 0.04 seconds to destroy the instance on the hypervisor. [ 2240.503402] env[67899]: DEBUG oslo.service.loopingcall [None req-fddee4da-03cd-43a9-a297-fa411b297eed tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=67899) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
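The Acquiring/acquired/released entries around this point come from oslo.concurrency's lockutils; the "inner" in the {{...lockutils.py}} tags is its wrapper function, which logs how long the caller waited for and then held each named lock. A minimal sketch of the same pattern, with placeholder lock names and function bodies modelled on the entries above:

    # Sketch of the named-lock pattern behind the lockutils log lines;
    # lock names are placeholders taken from the surrounding entries.
    from oslo_concurrency import lockutils

    @lockutils.synchronized('a9ef96da-fcfd-4fb5-bbb1-5178111a8a62')
    def do_terminate_instance():
        pass  # runs only while holding the named in-process lock

    # Equivalent context-manager form, as used around "compute_resources":
    with lockutils.lock('compute_resources'):
        pass
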
[ 2240.503618] env[67899]: DEBUG nova.compute.manager [-] [instance: a9ef96da-fcfd-4fb5-bbb1-5178111a8a62] Deallocating network for instance {{(pid=67899) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2240.503714] env[67899]: DEBUG nova.network.neutron [-] [instance: a9ef96da-fcfd-4fb5-bbb1-5178111a8a62] deallocate_for_instance() {{(pid=67899) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2240.526554] env[67899]: DEBUG nova.network.neutron [-] [instance: a9ef96da-fcfd-4fb5-bbb1-5178111a8a62] Updating instance_info_cache with network_info: [] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2240.534180] env[67899]: INFO nova.compute.manager [-] [instance: a9ef96da-fcfd-4fb5-bbb1-5178111a8a62] Took 0.03 seconds to deallocate network for instance. [ 2240.619916] env[67899]: DEBUG oslo_concurrency.lockutils [None req-fddee4da-03cd-43a9-a297-fa411b297eed tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Lock "a9ef96da-fcfd-4fb5-bbb1-5178111a8a62" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.161s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2245.991309] env[67899]: DEBUG oslo_concurrency.lockutils [None req-aa24cafe-d3f9-45e0-977e-2764009e8fc6 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] Acquiring lock "43854021-a115-4460-870a-d7332c62b758" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2245.991617] env[67899]: DEBUG oslo_concurrency.lockutils [None req-aa24cafe-d3f9-45e0-977e-2764009e8fc6 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] Lock "43854021-a115-4460-870a-d7332c62b758" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2246.001298] env[67899]: DEBUG nova.compute.manager [None req-aa24cafe-d3f9-45e0-977e-2764009e8fc6 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] [instance: 43854021-a115-4460-870a-d7332c62b758] Starting instance... 
{{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 2246.046036] env[67899]: DEBUG oslo_concurrency.lockutils [None req-aa24cafe-d3f9-45e0-977e-2764009e8fc6 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2246.046036] env[67899]: DEBUG oslo_concurrency.lockutils [None req-aa24cafe-d3f9-45e0-977e-2764009e8fc6 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2246.047678] env[67899]: INFO nova.compute.claims [None req-aa24cafe-d3f9-45e0-977e-2764009e8fc6 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] [instance: 43854021-a115-4460-870a-d7332c62b758] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 2246.184146] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9f36acd8-6d9f-4097-b570-399758abc43c {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2246.191946] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eacdbca9-cefb-4d9e-9649-47a9de97ce70 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2246.220529] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b612393b-426f-4fc2-bf74-3265144b8bee {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2246.227192] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-78ab7516-aa4c-4d67-8fd3-c8af13a5bfc2 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2246.239522] env[67899]: DEBUG nova.compute.provider_tree [None req-aa24cafe-d3f9-45e0-977e-2764009e8fc6 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] Inventory has not changed in ProviderTree for provider: fffa0b42-f65d-4394-a98c-0df038b9ed4b {{(pid=67899) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2246.247886] env[67899]: DEBUG nova.scheduler.client.report [None req-aa24cafe-d3f9-45e0-977e-2764009e8fc6 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] Inventory has not changed for provider fffa0b42-f65d-4394-a98c-0df038b9ed4b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67899) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2246.261603] env[67899]: DEBUG 
oslo_concurrency.lockutils [None req-aa24cafe-d3f9-45e0-977e-2764009e8fc6 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.216s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2246.262064] env[67899]: DEBUG nova.compute.manager [None req-aa24cafe-d3f9-45e0-977e-2764009e8fc6 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] [instance: 43854021-a115-4460-870a-d7332c62b758] Start building networks asynchronously for instance. {{(pid=67899) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 2246.293566] env[67899]: DEBUG nova.compute.utils [None req-aa24cafe-d3f9-45e0-977e-2764009e8fc6 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] Using /dev/sd instead of None {{(pid=67899) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 2246.294691] env[67899]: DEBUG nova.compute.manager [None req-aa24cafe-d3f9-45e0-977e-2764009e8fc6 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] [instance: 43854021-a115-4460-870a-d7332c62b758] Allocating IP information in the background. {{(pid=67899) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 2246.294855] env[67899]: DEBUG nova.network.neutron [None req-aa24cafe-d3f9-45e0-977e-2764009e8fc6 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] [instance: 43854021-a115-4460-870a-d7332c62b758] allocate_for_instance() {{(pid=67899) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 2246.302529] env[67899]: DEBUG nova.compute.manager [None req-aa24cafe-d3f9-45e0-977e-2764009e8fc6 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] [instance: 43854021-a115-4460-870a-d7332c62b758] Start building block device mappings for instance. {{(pid=67899) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 2246.349410] env[67899]: DEBUG nova.policy [None req-aa24cafe-d3f9-45e0-977e-2764009e8fc6 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '3ae302ed41614521a1a97b4c607a9eee', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'a918aafa0191456bba21e2a0fda8d3c9', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67899) authorize /opt/stack/nova/nova/policy.py:203}} [ 2246.364104] env[67899]: DEBUG nova.compute.manager [None req-aa24cafe-d3f9-45e0-977e-2764009e8fc6 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] [instance: 43854021-a115-4460-870a-d7332c62b758] Start spawning the instance on the hypervisor. {{(pid=67899) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}}
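The nova.virt.hardware entries that follow show nova enumerating CPU topologies for the m1.nano flavor: with no flavor or image constraints, the per-dimension limits default to 65536 sockets/cores/threads, and for vcpus=1 the only valid split is 1:1:1. A small illustrative sketch of that enumeration (not nova's actual implementation, which lives in nova/virt/hardware.py):

    # Enumerate every (sockets, cores, threads) split whose product equals
    # the flavor's vcpu count, within the given per-dimension limits.
    def possible_topologies(vcpus, max_sockets=65536, max_cores=65536,
                            max_threads=65536):
        for sockets in range(1, min(vcpus, max_sockets) + 1):
            for cores in range(1, min(vcpus, max_cores) + 1):
                for threads in range(1, min(vcpus, max_threads) + 1):
                    if sockets * cores * threads == vcpus:
                        yield sockets, cores, threads

    # For vcpus=1 this yields the single topology logged below:
    print(list(possible_topologies(1)))   # [(1, 1, 1)]
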
[ 2246.388457] env[67899]: DEBUG nova.virt.hardware [None req-aa24cafe-d3f9-45e0-977e-2764009e8fc6 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T22:07:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T22:07:13Z,direct_url=<?>,disk_format='vmdk',id=c655a05a-4a40-4b3f-b609-3ba8116ad90f,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c1aaa2970e964d7b86557399120d12c1',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2025-04-16T22:07:14Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 2246.388457] env[67899]: DEBUG nova.virt.hardware [None req-aa24cafe-d3f9-45e0-977e-2764009e8fc6 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] Flavor limits 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 2246.388457] env[67899]: DEBUG nova.virt.hardware [None req-aa24cafe-d3f9-45e0-977e-2764009e8fc6 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] Image limits 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 2246.388608] env[67899]: DEBUG nova.virt.hardware [None req-aa24cafe-d3f9-45e0-977e-2764009e8fc6 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] Flavor pref 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 2246.388717] env[67899]: DEBUG nova.virt.hardware [None req-aa24cafe-d3f9-45e0-977e-2764009e8fc6 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] Image pref 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 2246.388854] env[67899]: DEBUG nova.virt.hardware [None req-aa24cafe-d3f9-45e0-977e-2764009e8fc6 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 2246.389065] env[67899]: DEBUG nova.virt.hardware [None req-aa24cafe-d3f9-45e0-977e-2764009e8fc6 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 2246.389229] env[67899]: DEBUG nova.virt.hardware [None req-aa24cafe-d3f9-45e0-977e-2764009e8fc6 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67899) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 2246.389388] 
env[67899]: DEBUG nova.virt.hardware [None req-aa24cafe-d3f9-45e0-977e-2764009e8fc6 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] Got 1 possible topologies {{(pid=67899) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 2246.389540] env[67899]: DEBUG nova.virt.hardware [None req-aa24cafe-d3f9-45e0-977e-2764009e8fc6 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 2246.389700] env[67899]: DEBUG nova.virt.hardware [None req-aa24cafe-d3f9-45e0-977e-2764009e8fc6 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 2246.390565] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-126edab5-c1e9-44c7-b734-4287f721e356 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2246.398026] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0d231998-4722-4f8c-922c-dd1e4affbab6 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2246.746435] env[67899]: DEBUG nova.network.neutron [None req-aa24cafe-d3f9-45e0-977e-2764009e8fc6 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] [instance: 43854021-a115-4460-870a-d7332c62b758] Successfully created port: 81e141f0-fc77-49a3-92cc-639c9680116b {{(pid=67899) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 2247.388102] env[67899]: DEBUG nova.compute.manager [req-14069a3e-ee86-4ef4-b455-7b07888ebf59 req-14afab4b-25e7-4539-b359-e0cc67e3712c service nova] [instance: 43854021-a115-4460-870a-d7332c62b758] Received event network-vif-plugged-81e141f0-fc77-49a3-92cc-639c9680116b {{(pid=67899) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 2247.388342] env[67899]: DEBUG oslo_concurrency.lockutils [req-14069a3e-ee86-4ef4-b455-7b07888ebf59 req-14afab4b-25e7-4539-b359-e0cc67e3712c service nova] Acquiring lock "43854021-a115-4460-870a-d7332c62b758-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2247.388548] env[67899]: DEBUG oslo_concurrency.lockutils [req-14069a3e-ee86-4ef4-b455-7b07888ebf59 req-14afab4b-25e7-4539-b359-e0cc67e3712c service nova] Lock "43854021-a115-4460-870a-d7332c62b758-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2247.388711] env[67899]: DEBUG oslo_concurrency.lockutils [req-14069a3e-ee86-4ef4-b455-7b07888ebf59 req-14afab4b-25e7-4539-b359-e0cc67e3712c service nova] Lock "43854021-a115-4460-870a-d7332c62b758-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 
2247.388874] env[67899]: DEBUG nova.compute.manager [req-14069a3e-ee86-4ef4-b455-7b07888ebf59 req-14afab4b-25e7-4539-b359-e0cc67e3712c service nova] [instance: 43854021-a115-4460-870a-d7332c62b758] No waiting events found dispatching network-vif-plugged-81e141f0-fc77-49a3-92cc-639c9680116b {{(pid=67899) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 2247.389056] env[67899]: WARNING nova.compute.manager [req-14069a3e-ee86-4ef4-b455-7b07888ebf59 req-14afab4b-25e7-4539-b359-e0cc67e3712c service nova] [instance: 43854021-a115-4460-870a-d7332c62b758] Received unexpected event network-vif-plugged-81e141f0-fc77-49a3-92cc-639c9680116b for instance with vm_state building and task_state spawning. [ 2247.472996] env[67899]: DEBUG nova.network.neutron [None req-aa24cafe-d3f9-45e0-977e-2764009e8fc6 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] [instance: 43854021-a115-4460-870a-d7332c62b758] Successfully updated port: 81e141f0-fc77-49a3-92cc-639c9680116b {{(pid=67899) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 2247.483479] env[67899]: DEBUG oslo_concurrency.lockutils [None req-aa24cafe-d3f9-45e0-977e-2764009e8fc6 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] Acquiring lock "refresh_cache-43854021-a115-4460-870a-d7332c62b758" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2247.483685] env[67899]: DEBUG oslo_concurrency.lockutils [None req-aa24cafe-d3f9-45e0-977e-2764009e8fc6 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] Acquired lock "refresh_cache-43854021-a115-4460-870a-d7332c62b758" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2247.483845] env[67899]: DEBUG nova.network.neutron [None req-aa24cafe-d3f9-45e0-977e-2764009e8fc6 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] [instance: 43854021-a115-4460-870a-d7332c62b758] Building network info cache for instance {{(pid=67899) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 2247.536310] env[67899]: DEBUG nova.network.neutron [None req-aa24cafe-d3f9-45e0-977e-2764009e8fc6 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] [instance: 43854021-a115-4460-870a-d7332c62b758] Instance cache missing network info. 
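[annotation] The pop_instance_event / "No waiting events found" / "Received unexpected event" sequence above is Nova's external-event handshake: a builder registers an event it intends to wait on, and when Neutron's network-vif-plugged notification arrives before any waiter exists (as here, while the VM is still building), the event is popped to nothing and logged as unexpected. A stripped-down sketch of the pattern with stdlib threading (a hypothetical helper class, not Nova's code):

    import threading

    class InstanceEvents:
        # Map event-name -> threading.Event, guarded by a per-instance
        # lock like the "<uuid>-events" lock in the records above.
        def __init__(self):
            self._lock = threading.Lock()
            self._events = {}

        def prepare(self, name):
            with self._lock:
                self._events[name] = threading.Event()

        def pop(self, name):
            with self._lock:
                return self._events.pop(name, None)

    events = InstanceEvents()
    waiter = events.pop("network-vif-plugged-81e141f0")
    if waiter is None:
        print("WARNING: received unexpected event")  # the WARNING seen above
    else:
        waiter.set()  # wake the thread blocked on this event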
{{(pid=67899) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 2247.932807] env[67899]: DEBUG nova.network.neutron [None req-aa24cafe-d3f9-45e0-977e-2764009e8fc6 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] [instance: 43854021-a115-4460-870a-d7332c62b758] Updating instance_info_cache with network_info: [{"id": "81e141f0-fc77-49a3-92cc-639c9680116b", "address": "fa:16:3e:53:dc:46", "network": {"id": "6b50a822-7305-45f0-bf1e-da3ad38b5edb", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1869812463-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "a918aafa0191456bba21e2a0fda8d3c9", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "c9f208df-1fb5-4403-9796-7fd19e4bfb85", "external-id": "cl2-zone-400", "segmentation_id": 400, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap81e141f0-fc", "ovs_interfaceid": "81e141f0-fc77-49a3-92cc-639c9680116b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2247.943707] env[67899]: DEBUG oslo_concurrency.lockutils [None req-aa24cafe-d3f9-45e0-977e-2764009e8fc6 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] Releasing lock "refresh_cache-43854021-a115-4460-870a-d7332c62b758" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2247.943978] env[67899]: DEBUG nova.compute.manager [None req-aa24cafe-d3f9-45e0-977e-2764009e8fc6 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] [instance: 43854021-a115-4460-870a-d7332c62b758] Instance network_info: |[{"id": "81e141f0-fc77-49a3-92cc-639c9680116b", "address": "fa:16:3e:53:dc:46", "network": {"id": "6b50a822-7305-45f0-bf1e-da3ad38b5edb", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1869812463-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "a918aafa0191456bba21e2a0fda8d3c9", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "c9f208df-1fb5-4403-9796-7fd19e4bfb85", "external-id": "cl2-zone-400", "segmentation_id": 400, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap81e141f0-fc", "ovs_interfaceid": "81e141f0-fc77-49a3-92cc-639c9680116b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67899) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1971}} [ 2247.944400] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-aa24cafe-d3f9-45e0-977e-2764009e8fc6 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] [instance: 43854021-a115-4460-870a-d7332c62b758] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:53:dc:46', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'c9f208df-1fb5-4403-9796-7fd19e4bfb85', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '81e141f0-fc77-49a3-92cc-639c9680116b', 'vif_model': 'vmxnet3'}] {{(pid=67899) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 2247.951857] env[67899]: DEBUG oslo.service.loopingcall [None req-aa24cafe-d3f9-45e0-977e-2764009e8fc6 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67899) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2247.952312] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 43854021-a115-4460-870a-d7332c62b758] Creating VM on the ESX host {{(pid=67899) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 2247.952906] env[67899]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-ca84ef8c-eb04-42e5-a80d-ba8556a6f287 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2247.972724] env[67899]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 2247.972724] env[67899]: value = "task-3468047" [ 2247.972724] env[67899]: _type = "Task" [ 2247.972724] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2247.980238] env[67899]: DEBUG oslo_vmware.api [-] Task: {'id': task-3468047, 'name': CreateVM_Task} progress is 0%. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2248.482998] env[67899]: DEBUG oslo_vmware.api [-] Task: {'id': task-3468047, 'name': CreateVM_Task, 'duration_secs': 0.316393} completed successfully. 
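[annotation] The CreateVM_Task records above show the standard oslo.vmware task loop: wait_for_task() repeatedly polls the task's state ("progress is 0%." ... "completed successfully", with duration_secs recorded). A stdlib-only sketch of that poll loop, assuming a poll() callable returning ("running", percent) or ("success"/"error", detail); the 0.5 s interval is an assumption mirroring the library's default task_poll_interval:

    import time

    def wait_for_task(poll, interval=0.5):
        # Poll until the task leaves the "running" state, in the spirit
        # of the _poll_task loop in oslo_vmware/api.py.
        while True:
            state, detail = poll()
            if state == "running":
                print("progress is %s%%" % detail)
                time.sleep(interval)
            elif state == "success":
                return detail
            else:
                raise RuntimeError("task failed: %s" % detail)

    # Fake two polls matching the log: 0%, then success after 0.316393 s.
    states = iter([("running", 0), ("success", {"duration_secs": 0.316393})])
    print(wait_for_task(lambda: next(states)))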
{{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2248.483417] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 43854021-a115-4460-870a-d7332c62b758] Created VM on the ESX host {{(pid=67899) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 2248.483861] env[67899]: DEBUG oslo_concurrency.lockutils [None req-aa24cafe-d3f9-45e0-977e-2764009e8fc6 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2248.484080] env[67899]: DEBUG oslo_concurrency.lockutils [None req-aa24cafe-d3f9-45e0-977e-2764009e8fc6 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2248.484371] env[67899]: DEBUG oslo_concurrency.lockutils [None req-aa24cafe-d3f9-45e0-977e-2764009e8fc6 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 2248.484617] env[67899]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-10fca4a5-bdfb-4a24-86ec-ed0d542ec71d {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2248.488813] env[67899]: DEBUG oslo_vmware.api [None req-aa24cafe-d3f9-45e0-977e-2764009e8fc6 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] Waiting for the task: (returnval){ [ 2248.488813] env[67899]: value = "session[5298859c-589e-10f3-4f70-bf47a7ea371b]520d4d70-e7a4-f352-9392-caa4ae329180" [ 2248.488813] env[67899]: _type = "Task" [ 2248.488813] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2248.496742] env[67899]: DEBUG oslo_vmware.api [None req-aa24cafe-d3f9-45e0-977e-2764009e8fc6 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] Task: {'id': session[5298859c-589e-10f3-4f70-bf47a7ea371b]520d4d70-e7a4-f352-9392-caa4ae329180, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2249.000102] env[67899]: DEBUG oslo_concurrency.lockutils [None req-aa24cafe-d3f9-45e0-977e-2764009e8fc6 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2249.000379] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-aa24cafe-d3f9-45e0-977e-2764009e8fc6 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] [instance: 43854021-a115-4460-870a-d7332c62b758] Processing image c655a05a-4a40-4b3f-b609-3ba8116ad90f {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 2249.000592] env[67899]: DEBUG oslo_concurrency.lockutils [None req-aa24cafe-d3f9-45e0-977e-2764009e8fc6 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2249.418551] env[67899]: DEBUG nova.compute.manager [req-c807befd-5cef-40d6-9eb1-ddf840fb0f86 req-8869a464-4159-47d3-8e63-5f5969a6214c service nova] [instance: 43854021-a115-4460-870a-d7332c62b758] Received event network-changed-81e141f0-fc77-49a3-92cc-639c9680116b {{(pid=67899) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 2249.418660] env[67899]: DEBUG nova.compute.manager [req-c807befd-5cef-40d6-9eb1-ddf840fb0f86 req-8869a464-4159-47d3-8e63-5f5969a6214c service nova] [instance: 43854021-a115-4460-870a-d7332c62b758] Refreshing instance network info cache due to event network-changed-81e141f0-fc77-49a3-92cc-639c9680116b. {{(pid=67899) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 2249.418872] env[67899]: DEBUG oslo_concurrency.lockutils [req-c807befd-5cef-40d6-9eb1-ddf840fb0f86 req-8869a464-4159-47d3-8e63-5f5969a6214c service nova] Acquiring lock "refresh_cache-43854021-a115-4460-870a-d7332c62b758" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2249.419034] env[67899]: DEBUG oslo_concurrency.lockutils [req-c807befd-5cef-40d6-9eb1-ddf840fb0f86 req-8869a464-4159-47d3-8e63-5f5969a6214c service nova] Acquired lock "refresh_cache-43854021-a115-4460-870a-d7332c62b758" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2249.419205] env[67899]: DEBUG nova.network.neutron [req-c807befd-5cef-40d6-9eb1-ddf840fb0f86 req-8869a464-4159-47d3-8e63-5f5969a6214c service nova] [instance: 43854021-a115-4460-870a-d7332c62b758] Refreshing network info cache for port 81e141f0-fc77-49a3-92cc-639c9680116b {{(pid=67899) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 2249.646583] env[67899]: DEBUG nova.network.neutron [req-c807befd-5cef-40d6-9eb1-ddf840fb0f86 req-8869a464-4159-47d3-8e63-5f5969a6214c service nova] [instance: 43854021-a115-4460-870a-d7332c62b758] Updated VIF entry in instance network info cache for port 81e141f0-fc77-49a3-92cc-639c9680116b. 
{{(pid=67899) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 2249.646960] env[67899]: DEBUG nova.network.neutron [req-c807befd-5cef-40d6-9eb1-ddf840fb0f86 req-8869a464-4159-47d3-8e63-5f5969a6214c service nova] [instance: 43854021-a115-4460-870a-d7332c62b758] Updating instance_info_cache with network_info: [{"id": "81e141f0-fc77-49a3-92cc-639c9680116b", "address": "fa:16:3e:53:dc:46", "network": {"id": "6b50a822-7305-45f0-bf1e-da3ad38b5edb", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1869812463-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "a918aafa0191456bba21e2a0fda8d3c9", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "c9f208df-1fb5-4403-9796-7fd19e4bfb85", "external-id": "cl2-zone-400", "segmentation_id": 400, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap81e141f0-fc", "ovs_interfaceid": "81e141f0-fc77-49a3-92cc-639c9680116b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2249.656136] env[67899]: DEBUG oslo_concurrency.lockutils [req-c807befd-5cef-40d6-9eb1-ddf840fb0f86 req-8869a464-4159-47d3-8e63-5f5969a6214c service nova] Releasing lock "refresh_cache-43854021-a115-4460-870a-d7332c62b758" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2253.992150] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2256.997541] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2257.996015] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2259.997616] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager.update_available_resource {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2260.008917] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2260.009133] env[67899]: DEBUG oslo_concurrency.lockutils [None 
req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2260.009298] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2260.009449] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67899) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 2260.010843] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-baeb21ec-1ed6-4498-aa7f-292bfc5fa202 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2260.019347] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-573c3ea9-82f3-48c6-a883-4891287ccb7c {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2260.033229] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1eed367b-60a5-4162-ad38-46f72bd84130 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2260.039592] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ecc0df3b-5582-4c53-8998-27d483cf6e76 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2260.067856] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180920MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=67899) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 2260.068027] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2260.068199] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2260.133494] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance c4fe8b3e-cee1-401b-a26f-907a8de95eba actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2260.133658] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance a993c6a9-140f-430d-a77e-98c2567bf7af actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2260.133784] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance c17d88cf-69ba-43e9-a672-24503c65e9f2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2260.133907] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 483824d1-4994-436a-ba16-12524684405c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2260.134036] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance cc1164c7-82bb-4d80-89ad-e9ba5658d9c8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2260.134156] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance fc98bb10-8fe8-4203-80b8-9885b2c302c1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2260.134269] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 4d6348dc-bbca-46e4-b5cf-8f6a4e4547cf actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2260.134383] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 43854021-a115-4460-870a-d7332c62b758 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2260.134560] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Total usable vcpus: 48, total allocated vcpus: 8 {{(pid=67899) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 2260.134697] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1536MB phys_disk=200GB used_disk=8GB total_vcpus=48 used_vcpus=8 pci_stats=[] {{(pid=67899) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 2260.234709] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-23bc3852-6400-4816-bdc7-c1bf7ca6f51c {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2260.242631] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-28e9c870-2409-4de6-949a-dae02c7c54ed {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2260.273721] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a227b4f7-551a-4702-8b94-5502f44acf52 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2260.280698] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-33b34b68-0348-4366-8164-a3b55dfa2efc {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2260.293512] env[67899]: DEBUG nova.compute.provider_tree [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Inventory has not changed in ProviderTree for provider: fffa0b42-f65d-4394-a98c-0df038b9ed4b {{(pid=67899) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2260.302060] env[67899]: DEBUG nova.scheduler.client.report [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Inventory has not changed for provider fffa0b42-f65d-4394-a98c-0df038b9ed4b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67899) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2260.314780] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67899) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 2260.314961] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.247s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2261.313472] env[67899]: DEBUG oslo_service.periodic_task [None 
req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2261.997475] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2262.996784] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2262.997160] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Starting heal instance info cache {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 2262.997160] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Rebuilding the list of instances to heal {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 2263.014716] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: c4fe8b3e-cee1-401b-a26f-907a8de95eba] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2263.014862] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: a993c6a9-140f-430d-a77e-98c2567bf7af] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2263.014991] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: c17d88cf-69ba-43e9-a672-24503c65e9f2] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2263.015134] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 483824d1-4994-436a-ba16-12524684405c] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2263.015265] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: cc1164c7-82bb-4d80-89ad-e9ba5658d9c8] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2263.015388] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: fc98bb10-8fe8-4203-80b8-9885b2c302c1] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2263.015507] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 4d6348dc-bbca-46e4-b5cf-8f6a4e4547cf] Skipping network cache update for instance because it is Building. 
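[annotation] The run_periodic_tasks records (_check_instance_build_time, _poll_rebooting_instances, _heal_instance_info_cache, ...) come from oslo.service's periodic-task machinery: methods decorated with @periodic_task.periodic_task are collected on a PeriodicTasks subclass and dispatched on their spacing. A minimal sketch of that wiring; the spacing value and task body are illustrative, not Nova's:

    from oslo_config import cfg
    from oslo_service import periodic_task

    CONF = cfg.CONF

    class Manager(periodic_task.PeriodicTasks):
        def __init__(self):
            super().__init__(CONF)

        @periodic_task.periodic_task(spacing=10, run_immediately=True)
        def _heal_instance_info_cache(self, context):
            # The real task rebuilds the list of instances to heal and,
            # as the log shows, skips instances that are still Building.
            print("Running periodic task Manager._heal_instance_info_cache")

    Manager().run_periodic_tasks(context=None)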
{{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2263.015621] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 43854021-a115-4460-870a-d7332c62b758] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2263.015735] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Didn't find any instances for network info cache update. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 2263.016190] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2264.996667] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2264.997038] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=67899) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 2267.992342] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2269.964222] env[67899]: DEBUG oslo_concurrency.lockutils [None req-7d31d5bc-4934-42f1-98db-39fe8ba5912e tempest-ServersTestJSON-400587867 tempest-ServersTestJSON-400587867-project-member] Acquiring lock "fc98bb10-8fe8-4203-80b8-9885b2c302c1" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2288.593649] env[67899]: WARNING oslo_vmware.rw_handles [None req-027881f9-9f05-4771-b0a4-099e769baf40 tempest-ServersAaction247Test-759872082 tempest-ServersAaction247Test-759872082-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2288.593649] env[67899]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2288.593649] env[67899]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2288.593649] env[67899]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2288.593649] env[67899]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2288.593649] env[67899]: ERROR oslo_vmware.rw_handles response.begin() [ 2288.593649] env[67899]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2288.593649] env[67899]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2288.593649] env[67899]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2288.593649] env[67899]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection
without" [ 2288.593649] env[67899]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2288.593649] env[67899]: ERROR oslo_vmware.rw_handles [ 2288.594403] env[67899]: DEBUG nova.virt.vmwareapi.images [None req-027881f9-9f05-4771-b0a4-099e769baf40 tempest-ServersAaction247Test-759872082 tempest-ServersAaction247Test-759872082-project-member] [instance: c4fe8b3e-cee1-401b-a26f-907a8de95eba] Downloaded image file data c655a05a-4a40-4b3f-b609-3ba8116ad90f to vmware_temp/1c7c0db9-513a-4463-87cf-2085f5687094/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk on the data store datastore1 {{(pid=67899) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2288.596404] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-027881f9-9f05-4771-b0a4-099e769baf40 tempest-ServersAaction247Test-759872082 tempest-ServersAaction247Test-759872082-project-member] [instance: c4fe8b3e-cee1-401b-a26f-907a8de95eba] Caching image {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2288.596693] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [None req-027881f9-9f05-4771-b0a4-099e769baf40 tempest-ServersAaction247Test-759872082 tempest-ServersAaction247Test-759872082-project-member] Copying Virtual Disk [datastore1] vmware_temp/1c7c0db9-513a-4463-87cf-2085f5687094/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk to [datastore1] vmware_temp/1c7c0db9-513a-4463-87cf-2085f5687094/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk {{(pid=67899) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2288.597056] env[67899]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-2354877c-27e4-454d-830d-61468fd91a1e {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2288.604185] env[67899]: DEBUG oslo_vmware.api [None req-027881f9-9f05-4771-b0a4-099e769baf40 tempest-ServersAaction247Test-759872082 tempest-ServersAaction247Test-759872082-project-member] Waiting for the task: (returnval){ [ 2288.604185] env[67899]: value = "task-3468048" [ 2288.604185] env[67899]: _type = "Task" [ 2288.604185] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2288.611905] env[67899]: DEBUG oslo_vmware.api [None req-027881f9-9f05-4771-b0a4-099e769baf40 tempest-ServersAaction247Test-759872082 tempest-ServersAaction247Test-759872082-project-member] Task: {'id': task-3468048, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2289.115437] env[67899]: DEBUG oslo_vmware.exceptions [None req-027881f9-9f05-4771-b0a4-099e769baf40 tempest-ServersAaction247Test-759872082 tempest-ServersAaction247Test-759872082-project-member] Fault InvalidArgument not matched. 
{{(pid=67899) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2289.115780] env[67899]: DEBUG oslo_concurrency.lockutils [None req-027881f9-9f05-4771-b0a4-099e769baf40 tempest-ServersAaction247Test-759872082 tempest-ServersAaction247Test-759872082-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2289.116382] env[67899]: ERROR nova.compute.manager [None req-027881f9-9f05-4771-b0a4-099e769baf40 tempest-ServersAaction247Test-759872082 tempest-ServersAaction247Test-759872082-project-member] [instance: c4fe8b3e-cee1-401b-a26f-907a8de95eba] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2289.116382] env[67899]: Faults: ['InvalidArgument'] [ 2289.116382] env[67899]: ERROR nova.compute.manager [instance: c4fe8b3e-cee1-401b-a26f-907a8de95eba] Traceback (most recent call last): [ 2289.116382] env[67899]: ERROR nova.compute.manager [instance: c4fe8b3e-cee1-401b-a26f-907a8de95eba] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 2289.116382] env[67899]: ERROR nova.compute.manager [instance: c4fe8b3e-cee1-401b-a26f-907a8de95eba] yield resources [ 2289.116382] env[67899]: ERROR nova.compute.manager [instance: c4fe8b3e-cee1-401b-a26f-907a8de95eba] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2289.116382] env[67899]: ERROR nova.compute.manager [instance: c4fe8b3e-cee1-401b-a26f-907a8de95eba] self.driver.spawn(context, instance, image_meta, [ 2289.116382] env[67899]: ERROR nova.compute.manager [instance: c4fe8b3e-cee1-401b-a26f-907a8de95eba] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2289.116382] env[67899]: ERROR nova.compute.manager [instance: c4fe8b3e-cee1-401b-a26f-907a8de95eba] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2289.116382] env[67899]: ERROR nova.compute.manager [instance: c4fe8b3e-cee1-401b-a26f-907a8de95eba] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2289.116382] env[67899]: ERROR nova.compute.manager [instance: c4fe8b3e-cee1-401b-a26f-907a8de95eba] self._fetch_image_if_missing(context, vi) [ 2289.116382] env[67899]: ERROR nova.compute.manager [instance: c4fe8b3e-cee1-401b-a26f-907a8de95eba] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2289.116382] env[67899]: ERROR nova.compute.manager [instance: c4fe8b3e-cee1-401b-a26f-907a8de95eba] image_cache(vi, tmp_image_ds_loc) [ 2289.116382] env[67899]: ERROR nova.compute.manager [instance: c4fe8b3e-cee1-401b-a26f-907a8de95eba] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2289.116382] env[67899]: ERROR nova.compute.manager [instance: c4fe8b3e-cee1-401b-a26f-907a8de95eba] vm_util.copy_virtual_disk( [ 2289.116382] env[67899]: ERROR nova.compute.manager [instance: c4fe8b3e-cee1-401b-a26f-907a8de95eba] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2289.116382] env[67899]: ERROR nova.compute.manager [instance: c4fe8b3e-cee1-401b-a26f-907a8de95eba] session._wait_for_task(vmdk_copy_task) [ 2289.116382] env[67899]: ERROR nova.compute.manager [instance: c4fe8b3e-cee1-401b-a26f-907a8de95eba] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", 
line 157, in _wait_for_task [ 2289.116382] env[67899]: ERROR nova.compute.manager [instance: c4fe8b3e-cee1-401b-a26f-907a8de95eba] return self.wait_for_task(task_ref) [ 2289.116382] env[67899]: ERROR nova.compute.manager [instance: c4fe8b3e-cee1-401b-a26f-907a8de95eba] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2289.116382] env[67899]: ERROR nova.compute.manager [instance: c4fe8b3e-cee1-401b-a26f-907a8de95eba] return evt.wait() [ 2289.116382] env[67899]: ERROR nova.compute.manager [instance: c4fe8b3e-cee1-401b-a26f-907a8de95eba] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2289.116382] env[67899]: ERROR nova.compute.manager [instance: c4fe8b3e-cee1-401b-a26f-907a8de95eba] result = hub.switch() [ 2289.116382] env[67899]: ERROR nova.compute.manager [instance: c4fe8b3e-cee1-401b-a26f-907a8de95eba] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2289.116382] env[67899]: ERROR nova.compute.manager [instance: c4fe8b3e-cee1-401b-a26f-907a8de95eba] return self.greenlet.switch() [ 2289.116382] env[67899]: ERROR nova.compute.manager [instance: c4fe8b3e-cee1-401b-a26f-907a8de95eba] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2289.116382] env[67899]: ERROR nova.compute.manager [instance: c4fe8b3e-cee1-401b-a26f-907a8de95eba] self.f(*self.args, **self.kw) [ 2289.116382] env[67899]: ERROR nova.compute.manager [instance: c4fe8b3e-cee1-401b-a26f-907a8de95eba] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2289.116382] env[67899]: ERROR nova.compute.manager [instance: c4fe8b3e-cee1-401b-a26f-907a8de95eba] raise exceptions.translate_fault(task_info.error) [ 2289.116382] env[67899]: ERROR nova.compute.manager [instance: c4fe8b3e-cee1-401b-a26f-907a8de95eba] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2289.116382] env[67899]: ERROR nova.compute.manager [instance: c4fe8b3e-cee1-401b-a26f-907a8de95eba] Faults: ['InvalidArgument'] [ 2289.116382] env[67899]: ERROR nova.compute.manager [instance: c4fe8b3e-cee1-401b-a26f-907a8de95eba] [ 2289.117373] env[67899]: INFO nova.compute.manager [None req-027881f9-9f05-4771-b0a4-099e769baf40 tempest-ServersAaction247Test-759872082 tempest-ServersAaction247Test-759872082-project-member] [instance: c4fe8b3e-cee1-401b-a26f-907a8de95eba] Terminating instance [ 2289.118543] env[67899]: DEBUG oslo_concurrency.lockutils [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2289.118787] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2289.119069] env[67899]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-3a7e9aba-b61f-4135-9d13-110be2b7007a {{(pid=67899) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2289.121117] env[67899]: DEBUG oslo_concurrency.lockutils [None req-027881f9-9f05-4771-b0a4-099e769baf40 tempest-ServersAaction247Test-759872082 tempest-ServersAaction247Test-759872082-project-member] Acquiring lock "refresh_cache-c4fe8b3e-cee1-401b-a26f-907a8de95eba" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2289.121278] env[67899]: DEBUG oslo_concurrency.lockutils [None req-027881f9-9f05-4771-b0a4-099e769baf40 tempest-ServersAaction247Test-759872082 tempest-ServersAaction247Test-759872082-project-member] Acquired lock "refresh_cache-c4fe8b3e-cee1-401b-a26f-907a8de95eba" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2289.121441] env[67899]: DEBUG nova.network.neutron [None req-027881f9-9f05-4771-b0a4-099e769baf40 tempest-ServersAaction247Test-759872082 tempest-ServersAaction247Test-759872082-project-member] [instance: c4fe8b3e-cee1-401b-a26f-907a8de95eba] Building network info cache for instance {{(pid=67899) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 2289.128024] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2289.128123] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67899) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2289.129234] env[67899]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-d8518188-6b40-4077-ad51-5526f14ef2ad {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2289.136269] env[67899]: DEBUG oslo_vmware.api [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Waiting for the task: (returnval){ [ 2289.136269] env[67899]: value = "session[5298859c-589e-10f3-4f70-bf47a7ea371b]5273eab7-5542-9e99-8c78-25c1553c56fe" [ 2289.136269] env[67899]: _type = "Task" [ 2289.136269] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2289.144601] env[67899]: DEBUG oslo_vmware.api [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Task: {'id': session[5298859c-589e-10f3-4f70-bf47a7ea371b]5273eab7-5542-9e99-8c78-25c1553c56fe, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2289.149835] env[67899]: DEBUG nova.network.neutron [None req-027881f9-9f05-4771-b0a4-099e769baf40 tempest-ServersAaction247Test-759872082 tempest-ServersAaction247Test-759872082-project-member] [instance: c4fe8b3e-cee1-401b-a26f-907a8de95eba] Instance cache missing network info. 
{{(pid=67899) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 2289.208909] env[67899]: DEBUG nova.network.neutron [None req-027881f9-9f05-4771-b0a4-099e769baf40 tempest-ServersAaction247Test-759872082 tempest-ServersAaction247Test-759872082-project-member] [instance: c4fe8b3e-cee1-401b-a26f-907a8de95eba] Updating instance_info_cache with network_info: [] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2289.217186] env[67899]: DEBUG oslo_concurrency.lockutils [None req-027881f9-9f05-4771-b0a4-099e769baf40 tempest-ServersAaction247Test-759872082 tempest-ServersAaction247Test-759872082-project-member] Releasing lock "refresh_cache-c4fe8b3e-cee1-401b-a26f-907a8de95eba" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2289.217561] env[67899]: DEBUG nova.compute.manager [None req-027881f9-9f05-4771-b0a4-099e769baf40 tempest-ServersAaction247Test-759872082 tempest-ServersAaction247Test-759872082-project-member] [instance: c4fe8b3e-cee1-401b-a26f-907a8de95eba] Start destroying the instance on the hypervisor. {{(pid=67899) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2289.217761] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-027881f9-9f05-4771-b0a4-099e769baf40 tempest-ServersAaction247Test-759872082 tempest-ServersAaction247Test-759872082-project-member] [instance: c4fe8b3e-cee1-401b-a26f-907a8de95eba] Destroying instance {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2289.218820] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4664c8b3-fabb-49d0-8425-4ffbea83eb47 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2289.226343] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-027881f9-9f05-4771-b0a4-099e769baf40 tempest-ServersAaction247Test-759872082 tempest-ServersAaction247Test-759872082-project-member] [instance: c4fe8b3e-cee1-401b-a26f-907a8de95eba] Unregistering the VM {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2289.226564] env[67899]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-e34eafc0-81b2-4afe-bded-d000faa7f84a {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2289.257097] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-027881f9-9f05-4771-b0a4-099e769baf40 tempest-ServersAaction247Test-759872082 tempest-ServersAaction247Test-759872082-project-member] [instance: c4fe8b3e-cee1-401b-a26f-907a8de95eba] Unregistered the VM {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2289.257320] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-027881f9-9f05-4771-b0a4-099e769baf40 tempest-ServersAaction247Test-759872082 tempest-ServersAaction247Test-759872082-project-member] [instance: c4fe8b3e-cee1-401b-a26f-907a8de95eba] Deleting contents of the VM from datastore datastore1 {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2289.257509] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-027881f9-9f05-4771-b0a4-099e769baf40 tempest-ServersAaction247Test-759872082 tempest-ServersAaction247Test-759872082-project-member] Deleting the datastore file [datastore1] c4fe8b3e-cee1-401b-a26f-907a8de95eba {{(pid=67899) file_delete 
/opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2289.257787] env[67899]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-e254cf1c-a00d-4c4b-8178-156ad20b5555 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2289.263366] env[67899]: DEBUG oslo_vmware.api [None req-027881f9-9f05-4771-b0a4-099e769baf40 tempest-ServersAaction247Test-759872082 tempest-ServersAaction247Test-759872082-project-member] Waiting for the task: (returnval){ [ 2289.263366] env[67899]: value = "task-3468050" [ 2289.263366] env[67899]: _type = "Task" [ 2289.263366] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2289.270664] env[67899]: DEBUG oslo_vmware.api [None req-027881f9-9f05-4771-b0a4-099e769baf40 tempest-ServersAaction247Test-759872082 tempest-ServersAaction247Test-759872082-project-member] Task: {'id': task-3468050, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2289.646939] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: a993c6a9-140f-430d-a77e-98c2567bf7af] Preparing fetch location {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2289.647294] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Creating directory with path [datastore1] vmware_temp/194fad56-1adc-40f4-93ce-0d189e33ca9b/c655a05a-4a40-4b3f-b609-3ba8116ad90f {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2289.647438] env[67899]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-fb5f802e-a1f2-4c55-87e7-8893b9bc2466 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2289.658420] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Created directory with path [datastore1] vmware_temp/194fad56-1adc-40f4-93ce-0d189e33ca9b/c655a05a-4a40-4b3f-b609-3ba8116ad90f {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2289.658550] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: a993c6a9-140f-430d-a77e-98c2567bf7af] Fetch image to [datastore1] vmware_temp/194fad56-1adc-40f4-93ce-0d189e33ca9b/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2289.658711] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: a993c6a9-140f-430d-a77e-98c2567bf7af] Downloading image file data c655a05a-4a40-4b3f-b609-3ba8116ad90f to [datastore1] vmware_temp/194fad56-1adc-40f4-93ce-0d189e33ca9b/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk on the data store datastore1 
{{(pid=67899) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2289.659411] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a6d59685-8e6f-44e7-b5da-bcd01ce4365a {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2289.665582] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-373fa9e3-25d1-4c9a-954b-0e55b3560dbf {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2289.674027] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-54a6fe75-a731-4f35-b8b8-9f66f718deb5 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2289.704403] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1117e1c3-6c30-4d05-8c1d-b2f44ae1b8ad {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2289.709937] env[67899]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-d6f2822e-0b49-45d0-9d17-6867e2d74d44 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2289.729611] env[67899]: DEBUG nova.virt.vmwareapi.images [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: a993c6a9-140f-430d-a77e-98c2567bf7af] Downloading image file data c655a05a-4a40-4b3f-b609-3ba8116ad90f to the data store datastore1 {{(pid=67899) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2289.772960] env[67899]: DEBUG oslo_vmware.api [None req-027881f9-9f05-4771-b0a4-099e769baf40 tempest-ServersAaction247Test-759872082 tempest-ServersAaction247Test-759872082-project-member] Task: {'id': task-3468050, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.039186} completed successfully. 
{{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2289.773230] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-027881f9-9f05-4771-b0a4-099e769baf40 tempest-ServersAaction247Test-759872082 tempest-ServersAaction247Test-759872082-project-member] Deleted the datastore file {{(pid=67899) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2289.773410] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-027881f9-9f05-4771-b0a4-099e769baf40 tempest-ServersAaction247Test-759872082 tempest-ServersAaction247Test-759872082-project-member] [instance: c4fe8b3e-cee1-401b-a26f-907a8de95eba] Deleted contents of the VM from datastore datastore1 {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2289.773580] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-027881f9-9f05-4771-b0a4-099e769baf40 tempest-ServersAaction247Test-759872082 tempest-ServersAaction247Test-759872082-project-member] [instance: c4fe8b3e-cee1-401b-a26f-907a8de95eba] Instance destroyed {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2289.773750] env[67899]: INFO nova.compute.manager [None req-027881f9-9f05-4771-b0a4-099e769baf40 tempest-ServersAaction247Test-759872082 tempest-ServersAaction247Test-759872082-project-member] [instance: c4fe8b3e-cee1-401b-a26f-907a8de95eba] Took 0.56 seconds to destroy the instance on the hypervisor. [ 2289.773980] env[67899]: DEBUG oslo.service.loopingcall [None req-027881f9-9f05-4771-b0a4-099e769baf40 tempest-ServersAaction247Test-759872082 tempest-ServersAaction247Test-759872082-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=67899) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2289.774194] env[67899]: DEBUG nova.compute.manager [-] [instance: c4fe8b3e-cee1-401b-a26f-907a8de95eba] Skipping network deallocation for instance since networking was not requested. {{(pid=67899) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 2289.776881] env[67899]: DEBUG oslo_vmware.rw_handles [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/194fad56-1adc-40f4-93ce-0d189e33ca9b/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1.
{{(pid=67899) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2289.778481] env[67899]: DEBUG nova.compute.claims [None req-027881f9-9f05-4771-b0a4-099e769baf40 tempest-ServersAaction247Test-759872082 tempest-ServersAaction247Test-759872082-project-member] [instance: c4fe8b3e-cee1-401b-a26f-907a8de95eba] Aborting claim: {{(pid=67899) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 2289.778649] env[67899]: DEBUG oslo_concurrency.lockutils [None req-027881f9-9f05-4771-b0a4-099e769baf40 tempest-ServersAaction247Test-759872082 tempest-ServersAaction247Test-759872082-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2289.778864] env[67899]: DEBUG oslo_concurrency.lockutils [None req-027881f9-9f05-4771-b0a4-099e769baf40 tempest-ServersAaction247Test-759872082 tempest-ServersAaction247Test-759872082-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2289.840815] env[67899]: DEBUG oslo_vmware.rw_handles [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Completed reading data from the image iterator. {{(pid=67899) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2289.841307] env[67899]: DEBUG oslo_vmware.rw_handles [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/194fad56-1adc-40f4-93ce-0d189e33ca9b/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67899) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2289.939207] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-10c78e1b-0426-4a54-ba94-76f6662f9480 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2289.946263] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4157fb78-cb02-4085-8cbf-a30a5eff6e85 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2289.975108] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6b7e789d-11e5-4a20-87d1-e63f9d2ce2d1 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2289.982086] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-97d63ec6-b3ef-49d2-9ea5-3debe0697906 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2289.996048] env[67899]: DEBUG nova.compute.provider_tree [None req-027881f9-9f05-4771-b0a4-099e769baf40 tempest-ServersAaction247Test-759872082 tempest-ServersAaction247Test-759872082-project-member] Inventory has not changed in ProviderTree for provider: fffa0b42-f65d-4394-a98c-0df038b9ed4b {{(pid=67899) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2290.004496] env[67899]: DEBUG nova.scheduler.client.report [None req-027881f9-9f05-4771-b0a4-099e769baf40 tempest-ServersAaction247Test-759872082 tempest-ServersAaction247Test-759872082-project-member] Inventory has not changed for provider fffa0b42-f65d-4394-a98c-0df038b9ed4b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67899) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2290.020505] env[67899]: DEBUG oslo_concurrency.lockutils [None req-027881f9-9f05-4771-b0a4-099e769baf40 tempest-ServersAaction247Test-759872082 tempest-ServersAaction247Test-759872082-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.242s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2290.021063] env[67899]: ERROR nova.compute.manager [None req-027881f9-9f05-4771-b0a4-099e769baf40 tempest-ServersAaction247Test-759872082 tempest-ServersAaction247Test-759872082-project-member] [instance: c4fe8b3e-cee1-401b-a26f-907a8de95eba] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2290.021063] env[67899]: Faults: ['InvalidArgument'] [ 2290.021063] env[67899]: ERROR nova.compute.manager [instance: c4fe8b3e-cee1-401b-a26f-907a8de95eba] Traceback (most recent call last): [ 2290.021063] env[67899]: ERROR nova.compute.manager [instance: c4fe8b3e-cee1-401b-a26f-907a8de95eba] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2290.021063] env[67899]: ERROR 
nova.compute.manager [instance: c4fe8b3e-cee1-401b-a26f-907a8de95eba] self.driver.spawn(context, instance, image_meta, [ 2290.021063] env[67899]: ERROR nova.compute.manager [instance: c4fe8b3e-cee1-401b-a26f-907a8de95eba] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2290.021063] env[67899]: ERROR nova.compute.manager [instance: c4fe8b3e-cee1-401b-a26f-907a8de95eba] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2290.021063] env[67899]: ERROR nova.compute.manager [instance: c4fe8b3e-cee1-401b-a26f-907a8de95eba] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2290.021063] env[67899]: ERROR nova.compute.manager [instance: c4fe8b3e-cee1-401b-a26f-907a8de95eba] self._fetch_image_if_missing(context, vi) [ 2290.021063] env[67899]: ERROR nova.compute.manager [instance: c4fe8b3e-cee1-401b-a26f-907a8de95eba] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2290.021063] env[67899]: ERROR nova.compute.manager [instance: c4fe8b3e-cee1-401b-a26f-907a8de95eba] image_cache(vi, tmp_image_ds_loc) [ 2290.021063] env[67899]: ERROR nova.compute.manager [instance: c4fe8b3e-cee1-401b-a26f-907a8de95eba] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2290.021063] env[67899]: ERROR nova.compute.manager [instance: c4fe8b3e-cee1-401b-a26f-907a8de95eba] vm_util.copy_virtual_disk( [ 2290.021063] env[67899]: ERROR nova.compute.manager [instance: c4fe8b3e-cee1-401b-a26f-907a8de95eba] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2290.021063] env[67899]: ERROR nova.compute.manager [instance: c4fe8b3e-cee1-401b-a26f-907a8de95eba] session._wait_for_task(vmdk_copy_task) [ 2290.021063] env[67899]: ERROR nova.compute.manager [instance: c4fe8b3e-cee1-401b-a26f-907a8de95eba] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2290.021063] env[67899]: ERROR nova.compute.manager [instance: c4fe8b3e-cee1-401b-a26f-907a8de95eba] return self.wait_for_task(task_ref) [ 2290.021063] env[67899]: ERROR nova.compute.manager [instance: c4fe8b3e-cee1-401b-a26f-907a8de95eba] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2290.021063] env[67899]: ERROR nova.compute.manager [instance: c4fe8b3e-cee1-401b-a26f-907a8de95eba] return evt.wait() [ 2290.021063] env[67899]: ERROR nova.compute.manager [instance: c4fe8b3e-cee1-401b-a26f-907a8de95eba] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2290.021063] env[67899]: ERROR nova.compute.manager [instance: c4fe8b3e-cee1-401b-a26f-907a8de95eba] result = hub.switch() [ 2290.021063] env[67899]: ERROR nova.compute.manager [instance: c4fe8b3e-cee1-401b-a26f-907a8de95eba] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2290.021063] env[67899]: ERROR nova.compute.manager [instance: c4fe8b3e-cee1-401b-a26f-907a8de95eba] return self.greenlet.switch() [ 2290.021063] env[67899]: ERROR nova.compute.manager [instance: c4fe8b3e-cee1-401b-a26f-907a8de95eba] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2290.021063] env[67899]: ERROR nova.compute.manager [instance: c4fe8b3e-cee1-401b-a26f-907a8de95eba] self.f(*self.args, **self.kw) [ 2290.021063] env[67899]: ERROR nova.compute.manager [instance: c4fe8b3e-cee1-401b-a26f-907a8de95eba] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2290.021063] env[67899]: ERROR nova.compute.manager [instance: c4fe8b3e-cee1-401b-a26f-907a8de95eba] raise exceptions.translate_fault(task_info.error) [ 2290.021063] env[67899]: ERROR nova.compute.manager [instance: c4fe8b3e-cee1-401b-a26f-907a8de95eba] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2290.021063] env[67899]: ERROR nova.compute.manager [instance: c4fe8b3e-cee1-401b-a26f-907a8de95eba] Faults: ['InvalidArgument'] [ 2290.021063] env[67899]: ERROR nova.compute.manager [instance: c4fe8b3e-cee1-401b-a26f-907a8de95eba] [ 2290.021894] env[67899]: DEBUG nova.compute.utils [None req-027881f9-9f05-4771-b0a4-099e769baf40 tempest-ServersAaction247Test-759872082 tempest-ServersAaction247Test-759872082-project-member] [instance: c4fe8b3e-cee1-401b-a26f-907a8de95eba] VimFaultException {{(pid=67899) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2290.023108] env[67899]: DEBUG nova.compute.manager [None req-027881f9-9f05-4771-b0a4-099e769baf40 tempest-ServersAaction247Test-759872082 tempest-ServersAaction247Test-759872082-project-member] [instance: c4fe8b3e-cee1-401b-a26f-907a8de95eba] Build of instance c4fe8b3e-cee1-401b-a26f-907a8de95eba was re-scheduled: A specified parameter was not correct: fileType [ 2290.023108] env[67899]: Faults: ['InvalidArgument'] {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 2290.023476] env[67899]: DEBUG nova.compute.manager [None req-027881f9-9f05-4771-b0a4-099e769baf40 tempest-ServersAaction247Test-759872082 tempest-ServersAaction247Test-759872082-project-member] [instance: c4fe8b3e-cee1-401b-a26f-907a8de95eba] Unplugging VIFs for instance {{(pid=67899) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 2290.023696] env[67899]: DEBUG oslo_concurrency.lockutils [None req-027881f9-9f05-4771-b0a4-099e769baf40 tempest-ServersAaction247Test-759872082 tempest-ServersAaction247Test-759872082-project-member] Acquiring lock "refresh_cache-c4fe8b3e-cee1-401b-a26f-907a8de95eba" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2290.023839] env[67899]: DEBUG oslo_concurrency.lockutils [None req-027881f9-9f05-4771-b0a4-099e769baf40 tempest-ServersAaction247Test-759872082 tempest-ServersAaction247Test-759872082-project-member] Acquired lock "refresh_cache-c4fe8b3e-cee1-401b-a26f-907a8de95eba" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2290.023995] env[67899]: DEBUG nova.network.neutron [None req-027881f9-9f05-4771-b0a4-099e769baf40 tempest-ServersAaction247Test-759872082 tempest-ServersAaction247Test-759872082-project-member] [instance: c4fe8b3e-cee1-401b-a26f-907a8de95eba] Building network info cache for instance {{(pid=67899) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 2290.048554] env[67899]: DEBUG nova.network.neutron [None req-027881f9-9f05-4771-b0a4-099e769baf40 tempest-ServersAaction247Test-759872082 tempest-ServersAaction247Test-759872082-project-member] [instance: c4fe8b3e-cee1-401b-a26f-907a8de95eba] Instance cache missing network info. 
{{(pid=67899) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 2290.107549] env[67899]: DEBUG nova.network.neutron [None req-027881f9-9f05-4771-b0a4-099e769baf40 tempest-ServersAaction247Test-759872082 tempest-ServersAaction247Test-759872082-project-member] [instance: c4fe8b3e-cee1-401b-a26f-907a8de95eba] Updating instance_info_cache with network_info: [] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2290.116176] env[67899]: DEBUG oslo_concurrency.lockutils [None req-027881f9-9f05-4771-b0a4-099e769baf40 tempest-ServersAaction247Test-759872082 tempest-ServersAaction247Test-759872082-project-member] Releasing lock "refresh_cache-c4fe8b3e-cee1-401b-a26f-907a8de95eba" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2290.116398] env[67899]: DEBUG nova.compute.manager [None req-027881f9-9f05-4771-b0a4-099e769baf40 tempest-ServersAaction247Test-759872082 tempest-ServersAaction247Test-759872082-project-member] Virt driver does not provide unplug_vifs method, so it is not possible to determine if VIFs should be unplugged. {{(pid=67899) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 2290.116638] env[67899]: DEBUG nova.compute.manager [None req-027881f9-9f05-4771-b0a4-099e769baf40 tempest-ServersAaction247Test-759872082 tempest-ServersAaction247Test-759872082-project-member] [instance: c4fe8b3e-cee1-401b-a26f-907a8de95eba] Skipping network deallocation for instance since networking was not requested. {{(pid=67899) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 2290.207291] env[67899]: INFO nova.scheduler.client.report [None req-027881f9-9f05-4771-b0a4-099e769baf40 tempest-ServersAaction247Test-759872082 tempest-ServersAaction247Test-759872082-project-member] Deleted allocations for instance c4fe8b3e-cee1-401b-a26f-907a8de95eba [ 2290.228205] env[67899]: DEBUG oslo_concurrency.lockutils [None req-027881f9-9f05-4771-b0a4-099e769baf40 tempest-ServersAaction247Test-759872082 tempest-ServersAaction247Test-759872082-project-member] Lock "c4fe8b3e-cee1-401b-a26f-907a8de95eba" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 455.722s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2290.228476] env[67899]: DEBUG oslo_concurrency.lockutils [None req-0c16dcf4-0905-4c8f-a290-d4a8b41ceb6a tempest-ServersAaction247Test-759872082 tempest-ServersAaction247Test-759872082-project-member] Lock "c4fe8b3e-cee1-401b-a26f-907a8de95eba" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 259.252s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2290.228696] env[67899]: DEBUG oslo_concurrency.lockutils [None req-0c16dcf4-0905-4c8f-a290-d4a8b41ceb6a tempest-ServersAaction247Test-759872082 tempest-ServersAaction247Test-759872082-project-member] Acquiring lock "c4fe8b3e-cee1-401b-a26f-907a8de95eba-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2290.228927] env[67899]: DEBUG oslo_concurrency.lockutils [None req-0c16dcf4-0905-4c8f-a290-d4a8b41ceb6a tempest-ServersAaction247Test-759872082 tempest-ServersAaction247Test-759872082-project-member] Lock 
"c4fe8b3e-cee1-401b-a26f-907a8de95eba-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2290.229127] env[67899]: DEBUG oslo_concurrency.lockutils [None req-0c16dcf4-0905-4c8f-a290-d4a8b41ceb6a tempest-ServersAaction247Test-759872082 tempest-ServersAaction247Test-759872082-project-member] Lock "c4fe8b3e-cee1-401b-a26f-907a8de95eba-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2290.231099] env[67899]: INFO nova.compute.manager [None req-0c16dcf4-0905-4c8f-a290-d4a8b41ceb6a tempest-ServersAaction247Test-759872082 tempest-ServersAaction247Test-759872082-project-member] [instance: c4fe8b3e-cee1-401b-a26f-907a8de95eba] Terminating instance [ 2290.232461] env[67899]: DEBUG oslo_concurrency.lockutils [None req-0c16dcf4-0905-4c8f-a290-d4a8b41ceb6a tempest-ServersAaction247Test-759872082 tempest-ServersAaction247Test-759872082-project-member] Acquiring lock "refresh_cache-c4fe8b3e-cee1-401b-a26f-907a8de95eba" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2290.232616] env[67899]: DEBUG oslo_concurrency.lockutils [None req-0c16dcf4-0905-4c8f-a290-d4a8b41ceb6a tempest-ServersAaction247Test-759872082 tempest-ServersAaction247Test-759872082-project-member] Acquired lock "refresh_cache-c4fe8b3e-cee1-401b-a26f-907a8de95eba" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2290.232783] env[67899]: DEBUG nova.network.neutron [None req-0c16dcf4-0905-4c8f-a290-d4a8b41ceb6a tempest-ServersAaction247Test-759872082 tempest-ServersAaction247Test-759872082-project-member] [instance: c4fe8b3e-cee1-401b-a26f-907a8de95eba] Building network info cache for instance {{(pid=67899) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 2290.256612] env[67899]: DEBUG nova.network.neutron [None req-0c16dcf4-0905-4c8f-a290-d4a8b41ceb6a tempest-ServersAaction247Test-759872082 tempest-ServersAaction247Test-759872082-project-member] [instance: c4fe8b3e-cee1-401b-a26f-907a8de95eba] Instance cache missing network info. {{(pid=67899) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 2290.340108] env[67899]: DEBUG nova.network.neutron [None req-0c16dcf4-0905-4c8f-a290-d4a8b41ceb6a tempest-ServersAaction247Test-759872082 tempest-ServersAaction247Test-759872082-project-member] [instance: c4fe8b3e-cee1-401b-a26f-907a8de95eba] Updating instance_info_cache with network_info: [] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2290.348972] env[67899]: DEBUG oslo_concurrency.lockutils [None req-0c16dcf4-0905-4c8f-a290-d4a8b41ceb6a tempest-ServersAaction247Test-759872082 tempest-ServersAaction247Test-759872082-project-member] Releasing lock "refresh_cache-c4fe8b3e-cee1-401b-a26f-907a8de95eba" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2290.349381] env[67899]: DEBUG nova.compute.manager [None req-0c16dcf4-0905-4c8f-a290-d4a8b41ceb6a tempest-ServersAaction247Test-759872082 tempest-ServersAaction247Test-759872082-project-member] [instance: c4fe8b3e-cee1-401b-a26f-907a8de95eba] Start destroying the instance on the hypervisor. 
{{(pid=67899) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2290.349573] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-0c16dcf4-0905-4c8f-a290-d4a8b41ceb6a tempest-ServersAaction247Test-759872082 tempest-ServersAaction247Test-759872082-project-member] [instance: c4fe8b3e-cee1-401b-a26f-907a8de95eba] Destroying instance {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2290.350093] env[67899]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-55833e1c-4b90-4a89-9d37-3e1235f5f7d9 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2290.359171] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7c351663-16ee-4454-8feb-a8b881ff1efa {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2290.386586] env[67899]: WARNING nova.virt.vmwareapi.vmops [None req-0c16dcf4-0905-4c8f-a290-d4a8b41ceb6a tempest-ServersAaction247Test-759872082 tempest-ServersAaction247Test-759872082-project-member] [instance: c4fe8b3e-cee1-401b-a26f-907a8de95eba] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance c4fe8b3e-cee1-401b-a26f-907a8de95eba could not be found. [ 2290.386863] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-0c16dcf4-0905-4c8f-a290-d4a8b41ceb6a tempest-ServersAaction247Test-759872082 tempest-ServersAaction247Test-759872082-project-member] [instance: c4fe8b3e-cee1-401b-a26f-907a8de95eba] Instance destroyed {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2290.387080] env[67899]: INFO nova.compute.manager [None req-0c16dcf4-0905-4c8f-a290-d4a8b41ceb6a tempest-ServersAaction247Test-759872082 tempest-ServersAaction247Test-759872082-project-member] [instance: c4fe8b3e-cee1-401b-a26f-907a8de95eba] Took 0.04 seconds to destroy the instance on the hypervisor. [ 2290.387320] env[67899]: DEBUG oslo.service.loopingcall [None req-0c16dcf4-0905-4c8f-a290-d4a8b41ceb6a tempest-ServersAaction247Test-759872082 tempest-ServersAaction247Test-759872082-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=67899) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2290.387586] env[67899]: DEBUG nova.compute.manager [-] [instance: c4fe8b3e-cee1-401b-a26f-907a8de95eba] Deallocating network for instance {{(pid=67899) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2290.387711] env[67899]: DEBUG nova.network.neutron [-] [instance: c4fe8b3e-cee1-401b-a26f-907a8de95eba] deallocate_for_instance() {{(pid=67899) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2290.403053] env[67899]: DEBUG nova.network.neutron [-] [instance: c4fe8b3e-cee1-401b-a26f-907a8de95eba] Instance cache missing network info. {{(pid=67899) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 2290.410465] env[67899]: DEBUG nova.network.neutron [-] [instance: c4fe8b3e-cee1-401b-a26f-907a8de95eba] Updating instance_info_cache with network_info: [] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2290.418841] env[67899]: INFO nova.compute.manager [-] [instance: c4fe8b3e-cee1-401b-a26f-907a8de95eba] Took 0.03 seconds to deallocate network for instance. 
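[editor's note] The "Acquiring lock ... acquired :: waited ... released :: held" triplets throughout this log (lockutils.py:404/409/423) come from oslo_concurrency's synchronized decorator wrapping the compute-manager methods named in the messages. A minimal, self-contained sketch of that pattern, assuming plain threading instead of the real oslo.concurrency semaphore registry so it runs standalone:

```python
# Illustrative sketch only -- mimics the acquire/release log pattern of
# oslo_concurrency's @synchronized decorator using plain threading.
import functools
import threading
import time

_locks: dict[str, threading.Lock] = {}   # name -> lock, like lockutils' registry


def synchronized(name):
    """Serialize callers of the decorated function on a named lock."""
    lock = _locks.setdefault(name, threading.Lock())

    def decorator(fn):
        @functools.wraps(fn)
        def inner(*args, **kwargs):
            print(f'Acquiring lock "{name}" by "{fn.__qualname__}"')
            t_wait = time.monotonic()
            with lock:
                print(f'Lock "{name}" acquired :: waited {time.monotonic() - t_wait:.3f}s')
                t_held = time.monotonic()
                try:
                    return fn(*args, **kwargs)
                finally:
                    print(f'Lock "{name}" "released" :: held {time.monotonic() - t_held:.3f}s')
        return inner
    return decorator


@synchronized("compute_resources")
def abort_instance_claim():
    time.sleep(0.05)   # stand-in for resource-tracker bookkeeping


abort_instance_claim()
```

The waited/held timings in the log read directly off this structure: "waited" is the time blocked on another holder (e.g. 259.252s waiting for the build lock above), "held" is the time spent inside the critical section.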
[ 2290.500393] env[67899]: DEBUG oslo_concurrency.lockutils [None req-0c16dcf4-0905-4c8f-a290-d4a8b41ceb6a tempest-ServersAaction247Test-759872082 tempest-ServersAaction247Test-759872082-project-member] Lock "c4fe8b3e-cee1-401b-a26f-907a8de95eba" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.272s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2316.010248] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2317.998061] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2318.996652] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2320.996881] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2321.997652] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager.update_available_resource {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2322.009669] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2322.009883] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2322.010058] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2322.010227] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67899) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 2322.011336] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-66b155dd-c3ef-4aa0-b6a4-8cba1ca10830 {{(pid=67899) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2322.020015] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f43f09b0-4613-4b22-9e39-63fd007ffdad {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2322.033718] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-31e8a078-4163-4257-9f8c-958cf539e07b {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2322.039914] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e5b376f0-7e85-44d6-a1a9-cd89c053447f {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2322.067737] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180931MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=67899) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 2322.067883] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2322.068077] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2322.129184] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance a993c6a9-140f-430d-a77e-98c2567bf7af actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2322.129351] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance c17d88cf-69ba-43e9-a672-24503c65e9f2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2322.129483] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 483824d1-4994-436a-ba16-12524684405c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2322.129602] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance cc1164c7-82bb-4d80-89ad-e9ba5658d9c8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2322.129729] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance fc98bb10-8fe8-4203-80b8-9885b2c302c1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2322.129880] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 4d6348dc-bbca-46e4-b5cf-8f6a4e4547cf actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2322.130009] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 43854021-a115-4460-870a-d7332c62b758 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2322.130194] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Total usable vcpus: 48, total allocated vcpus: 7 {{(pid=67899) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 2322.130333] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1408MB phys_disk=200GB used_disk=7GB total_vcpus=48 used_vcpus=7 pci_stats=[] {{(pid=67899) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 2322.214383] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-12de6fbb-da20-4068-8630-380873c269bd {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2322.222051] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c03c995a-4a3f-4f61-bbed-ba630d69d71c {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2322.251747] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9906883c-7af0-4c90-8cf1-48de8109d0b1 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2322.258457] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7c88c9cb-8603-40d7-b84f-fe5b549db2af {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2322.271323] env[67899]: DEBUG nova.compute.provider_tree [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Inventory has not changed in ProviderTree for provider: fffa0b42-f65d-4394-a98c-0df038b9ed4b {{(pid=67899) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2322.279348] env[67899]: DEBUG nova.scheduler.client.report [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Inventory has not changed for provider 
fffa0b42-f65d-4394-a98c-0df038b9ed4b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67899) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2322.298312] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67899) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 2322.298503] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.230s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2323.297777] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2323.298160] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Starting heal instance info cache {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 2323.298160] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Rebuilding the list of instances to heal {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 2323.314572] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: a993c6a9-140f-430d-a77e-98c2567bf7af] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2323.314728] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: c17d88cf-69ba-43e9-a672-24503c65e9f2] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2323.314847] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 483824d1-4994-436a-ba16-12524684405c] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2323.314971] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: cc1164c7-82bb-4d80-89ad-e9ba5658d9c8] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2323.315112] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: fc98bb10-8fe8-4203-80b8-9885b2c302c1] Skipping network cache update for instance because it is Building. 
{{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2323.315269] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 4d6348dc-bbca-46e4-b5cf-8f6a4e4547cf] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2323.315392] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 43854021-a115-4460-870a-d7332c62b758] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2323.315507] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Didn't find any instances for network info cache update. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 2323.315963] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2324.996617] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2326.996789] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2326.996789] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=67899) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 2339.465923] env[67899]: WARNING oslo_vmware.rw_handles [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2339.465923] env[67899]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2339.465923] env[67899]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2339.465923] env[67899]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2339.465923] env[67899]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2339.465923] env[67899]: ERROR oslo_vmware.rw_handles response.begin() [ 2339.465923] env[67899]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2339.465923] env[67899]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2339.465923] env[67899]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2339.465923] env[67899]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2339.465923] env[67899]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2339.465923] env[67899]: ERROR oslo_vmware.rw_handles [ 2339.466635] env[67899]: DEBUG nova.virt.vmwareapi.images [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: a993c6a9-140f-430d-a77e-98c2567bf7af] Downloaded image file data c655a05a-4a40-4b3f-b609-3ba8116ad90f to vmware_temp/194fad56-1adc-40f4-93ce-0d189e33ca9b/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk on the data store datastore1 {{(pid=67899) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2339.468496] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: a993c6a9-140f-430d-a77e-98c2567bf7af] Caching image {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2339.468732] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Copying Virtual Disk [datastore1] vmware_temp/194fad56-1adc-40f4-93ce-0d189e33ca9b/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk to [datastore1] vmware_temp/194fad56-1adc-40f4-93ce-0d189e33ca9b/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk {{(pid=67899) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2339.469017] env[67899]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-8a4ccffd-227d-497e-941a-ca2065172d0c {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2339.477578] env[67899]: DEBUG oslo_vmware.api [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Waiting for the 
task: (returnval){ [ 2339.477578] env[67899]: value = "task-3468051" [ 2339.477578] env[67899]: _type = "Task" [ 2339.477578] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2339.485503] env[67899]: DEBUG oslo_vmware.api [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Task: {'id': task-3468051, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2339.989051] env[67899]: DEBUG oslo_vmware.exceptions [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Fault InvalidArgument not matched. {{(pid=67899) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2339.989051] env[67899]: DEBUG oslo_concurrency.lockutils [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2339.989284] env[67899]: ERROR nova.compute.manager [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: a993c6a9-140f-430d-a77e-98c2567bf7af] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2339.989284] env[67899]: Faults: ['InvalidArgument'] [ 2339.989284] env[67899]: ERROR nova.compute.manager [instance: a993c6a9-140f-430d-a77e-98c2567bf7af] Traceback (most recent call last): [ 2339.989284] env[67899]: ERROR nova.compute.manager [instance: a993c6a9-140f-430d-a77e-98c2567bf7af] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 2339.989284] env[67899]: ERROR nova.compute.manager [instance: a993c6a9-140f-430d-a77e-98c2567bf7af] yield resources [ 2339.989284] env[67899]: ERROR nova.compute.manager [instance: a993c6a9-140f-430d-a77e-98c2567bf7af] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2339.989284] env[67899]: ERROR nova.compute.manager [instance: a993c6a9-140f-430d-a77e-98c2567bf7af] self.driver.spawn(context, instance, image_meta, [ 2339.989284] env[67899]: ERROR nova.compute.manager [instance: a993c6a9-140f-430d-a77e-98c2567bf7af] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2339.989284] env[67899]: ERROR nova.compute.manager [instance: a993c6a9-140f-430d-a77e-98c2567bf7af] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2339.989284] env[67899]: ERROR nova.compute.manager [instance: a993c6a9-140f-430d-a77e-98c2567bf7af] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2339.989284] env[67899]: ERROR nova.compute.manager [instance: a993c6a9-140f-430d-a77e-98c2567bf7af] self._fetch_image_if_missing(context, vi) [ 2339.989284] env[67899]: ERROR nova.compute.manager [instance: a993c6a9-140f-430d-a77e-98c2567bf7af] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2339.989284] env[67899]: ERROR nova.compute.manager [instance: 
a993c6a9-140f-430d-a77e-98c2567bf7af] image_cache(vi, tmp_image_ds_loc) [ 2339.989284] env[67899]: ERROR nova.compute.manager [instance: a993c6a9-140f-430d-a77e-98c2567bf7af] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2339.989284] env[67899]: ERROR nova.compute.manager [instance: a993c6a9-140f-430d-a77e-98c2567bf7af] vm_util.copy_virtual_disk( [ 2339.989284] env[67899]: ERROR nova.compute.manager [instance: a993c6a9-140f-430d-a77e-98c2567bf7af] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2339.989284] env[67899]: ERROR nova.compute.manager [instance: a993c6a9-140f-430d-a77e-98c2567bf7af] session._wait_for_task(vmdk_copy_task) [ 2339.989284] env[67899]: ERROR nova.compute.manager [instance: a993c6a9-140f-430d-a77e-98c2567bf7af] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2339.989284] env[67899]: ERROR nova.compute.manager [instance: a993c6a9-140f-430d-a77e-98c2567bf7af] return self.wait_for_task(task_ref) [ 2339.989284] env[67899]: ERROR nova.compute.manager [instance: a993c6a9-140f-430d-a77e-98c2567bf7af] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2339.989284] env[67899]: ERROR nova.compute.manager [instance: a993c6a9-140f-430d-a77e-98c2567bf7af] return evt.wait() [ 2339.989284] env[67899]: ERROR nova.compute.manager [instance: a993c6a9-140f-430d-a77e-98c2567bf7af] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2339.989284] env[67899]: ERROR nova.compute.manager [instance: a993c6a9-140f-430d-a77e-98c2567bf7af] result = hub.switch() [ 2339.989284] env[67899]: ERROR nova.compute.manager [instance: a993c6a9-140f-430d-a77e-98c2567bf7af] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2339.989284] env[67899]: ERROR nova.compute.manager [instance: a993c6a9-140f-430d-a77e-98c2567bf7af] return self.greenlet.switch() [ 2339.989284] env[67899]: ERROR nova.compute.manager [instance: a993c6a9-140f-430d-a77e-98c2567bf7af] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2339.989284] env[67899]: ERROR nova.compute.manager [instance: a993c6a9-140f-430d-a77e-98c2567bf7af] self.f(*self.args, **self.kw) [ 2339.989284] env[67899]: ERROR nova.compute.manager [instance: a993c6a9-140f-430d-a77e-98c2567bf7af] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2339.989284] env[67899]: ERROR nova.compute.manager [instance: a993c6a9-140f-430d-a77e-98c2567bf7af] raise exceptions.translate_fault(task_info.error) [ 2339.989284] env[67899]: ERROR nova.compute.manager [instance: a993c6a9-140f-430d-a77e-98c2567bf7af] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2339.989284] env[67899]: ERROR nova.compute.manager [instance: a993c6a9-140f-430d-a77e-98c2567bf7af] Faults: ['InvalidArgument'] [ 2339.989284] env[67899]: ERROR nova.compute.manager [instance: a993c6a9-140f-430d-a77e-98c2567bf7af] [ 2339.990231] env[67899]: INFO nova.compute.manager [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: a993c6a9-140f-430d-a77e-98c2567bf7af] Terminating instance [ 2339.990992] env[67899]: DEBUG oslo_concurrency.lockutils [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 
tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 2339.991216] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 2339.991453] env[67899]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-bc39c985-050c-4544-aaa9-69977902470b {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2339.993773] env[67899]: DEBUG nova.compute.manager [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: a993c6a9-140f-430d-a77e-98c2567bf7af] Start destroying the instance on the hypervisor. {{(pid=67899) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}}
[ 2339.994011] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: a993c6a9-140f-430d-a77e-98c2567bf7af] Destroying instance {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 2339.994706] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-af3a06fa-f706-410f-afc7-26b390535606 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2340.001092] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: a993c6a9-140f-430d-a77e-98c2567bf7af] Unregistering the VM {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}}
[ 2340.001295] env[67899]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-0a584924-53d9-4f81-8cd2-62706ed3965b {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2340.003399] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 2340.003597] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67899) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}}
[ 2340.004490] env[67899]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-3094e673-2a0c-4382-a556-2f9898d08f1f {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2340.009427] env[67899]: DEBUG oslo_vmware.api [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Waiting for the task: (returnval){
[ 2340.009427] env[67899]: value = "session[5298859c-589e-10f3-4f70-bf47a7ea371b]52d5de60-781b-e1a4-6d46-6f5b037196b3"
[ 2340.009427] env[67899]: _type = "Task"
[ 2340.009427] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 2340.017546] env[67899]: DEBUG oslo_vmware.api [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Task: {'id': session[5298859c-589e-10f3-4f70-bf47a7ea371b]52d5de60-781b-e1a4-6d46-6f5b037196b3, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 2340.077897] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: a993c6a9-140f-430d-a77e-98c2567bf7af] Unregistered the VM {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}}
[ 2340.078122] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: a993c6a9-140f-430d-a77e-98c2567bf7af] Deleting contents of the VM from datastore datastore1 {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}}
[ 2340.078305] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Deleting the datastore file [datastore1] a993c6a9-140f-430d-a77e-98c2567bf7af {{(pid=67899) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}}
[ 2340.078608] env[67899]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-93f71fac-7561-49d2-97c5-3ed0e93cc0f2 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2340.084771] env[67899]: DEBUG oslo_vmware.api [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Waiting for the task: (returnval){
[ 2340.084771] env[67899]: value = "task-3468053"
[ 2340.084771] env[67899]: _type = "Task"
[ 2340.084771] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 2340.093317] env[67899]: DEBUG oslo_vmware.api [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Task: {'id': task-3468053, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 2340.520940] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: c17d88cf-69ba-43e9-a672-24503c65e9f2] Preparing fetch location {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}}
[ 2340.521229] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Creating directory with path [datastore1] vmware_temp/b4c13b2e-5fa0-4d2f-8daf-f65c125823bc/c655a05a-4a40-4b3f-b609-3ba8116ad90f {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 2340.521457] env[67899]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-c0f79d07-7449-4c89-b0a7-225244a369ae {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2340.532652] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Created directory with path [datastore1] vmware_temp/b4c13b2e-5fa0-4d2f-8daf-f65c125823bc/c655a05a-4a40-4b3f-b609-3ba8116ad90f {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 2340.532847] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: c17d88cf-69ba-43e9-a672-24503c65e9f2] Fetch image to [datastore1] vmware_temp/b4c13b2e-5fa0-4d2f-8daf-f65c125823bc/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}}
[ 2340.533029] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: c17d88cf-69ba-43e9-a672-24503c65e9f2] Downloading image file data c655a05a-4a40-4b3f-b609-3ba8116ad90f to [datastore1] vmware_temp/b4c13b2e-5fa0-4d2f-8daf-f65c125823bc/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk on the data store datastore1 {{(pid=67899) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}}
[ 2340.533742] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9a0a7f8d-b473-4dd2-b072-ec87ddf7eb0a {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2340.540508] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b70ad2b7-306f-4406-afc4-1446e2c619e0 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2340.549332] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4b017fa4-720a-4e8c-82d8-735a552c315b {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2340.578967] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-066b6e81-c517-455f-9504-d464e0b435e9 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2340.584522] env[67899]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-207053ad-95f2-4fd7-b245-3462ae53a791 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2340.594620] env[67899]: DEBUG oslo_vmware.api [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Task: {'id': task-3468053, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.074334} completed successfully. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 2340.594854] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Deleted the datastore file {{(pid=67899) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}}
[ 2340.595040] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: a993c6a9-140f-430d-a77e-98c2567bf7af] Deleted contents of the VM from datastore datastore1 {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}}
[ 2340.595210] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: a993c6a9-140f-430d-a77e-98c2567bf7af] Instance destroyed {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 2340.595380] env[67899]: INFO nova.compute.manager [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: a993c6a9-140f-430d-a77e-98c2567bf7af] Took 0.60 seconds to destroy the instance on the hypervisor.
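[annotation] The DeleteDatastoreFile_Task sequence above shows oslo.vmware's task contract as Nova uses it: the driver submits a vCenter task, then wait_for_task() polls it (the "progress is 0%" lines) until it reaches a terminal state. A minimal sketch of the same pattern, assuming an already-created oslo_vmware.api.VMwareAPISession named session and a datacenter managed-object reference dc_ref (credentials, host name, and dc_ref are hypothetical placeholders, not taken from this log):

    # Sketch only: submit a datastore file deletion and block until the
    # vCenter task finishes, mirroring the wait_for_task/_poll_task lines.
    from oslo_vmware import api

    session = api.VMwareAPISession(
        'vc1.example.test', 'user', 'secret',   # hypothetical credentials
        10,    # api_retry_count
        0.5)   # task_poll_interval, seconds between the progress polls

    file_manager = session.vim.service_content.fileManager
    task = session.invoke_api(
        session.vim, 'DeleteDatastoreFile_Task', file_manager,
        name='[datastore1] a993c6a9-140f-430d-a77e-98c2567bf7af',
        datacenter=dc_ref)  # dc_ref: datacenter moref, obtained elsewhere
    session.wait_for_task(task)  # raises a translated fault on task error

The poll interval is why the log shows a 0% progress line followed ~80 ms later by "completed successfully": the task finished within a single polling period.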
[ 2340.597414] env[67899]: DEBUG nova.compute.claims [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: a993c6a9-140f-430d-a77e-98c2567bf7af] Aborting claim: {{(pid=67899) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 2340.597610] env[67899]: DEBUG oslo_concurrency.lockutils [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 2340.597828] env[67899]: DEBUG oslo_concurrency.lockutils [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 2340.610389] env[67899]: DEBUG nova.virt.vmwareapi.images [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: c17d88cf-69ba-43e9-a672-24503c65e9f2] Downloading image file data c655a05a-4a40-4b3f-b609-3ba8116ad90f to the data store datastore1 {{(pid=67899) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}}
[ 2340.660855] env[67899]: DEBUG oslo_vmware.rw_handles [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/b4c13b2e-5fa0-4d2f-8daf-f65c125823bc/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67899) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}}
[ 2340.721573] env[67899]: DEBUG oslo_vmware.rw_handles [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Completed reading data from the image iterator. {{(pid=67899) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}}
[ 2340.721873] env[67899]: DEBUG oslo_vmware.rw_handles [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/b4c13b2e-5fa0-4d2f-8daf-f65c125823bc/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67899) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}}
[ 2340.785083] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f63f93fa-c573-4372-bbf6-348a4f33c658 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2340.792532] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b5766829-9a02-449d-beb2-7aa19e6f71d8 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2340.823461] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7c012b3a-8427-4e8c-a194-f1b00c22faa5 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2340.830439] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5b1a1cca-b896-4fc8-9aa6-ce2790dceda8 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2340.843171] env[67899]: DEBUG nova.compute.provider_tree [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Inventory has not changed in ProviderTree for provider: fffa0b42-f65d-4394-a98c-0df038b9ed4b {{(pid=67899) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 2340.852444] env[67899]: DEBUG nova.scheduler.client.report [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Inventory has not changed for provider fffa0b42-f65d-4394-a98c-0df038b9ed4b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67899) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 2340.867715] env[67899]: DEBUG oslo_concurrency.lockutils [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.270s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 2340.868209] env[67899]: ERROR nova.compute.manager [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: a993c6a9-140f-430d-a77e-98c2567bf7af] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 2340.868209] env[67899]: Faults: ['InvalidArgument']
[ 2340.868209] env[67899]: ERROR nova.compute.manager [instance: a993c6a9-140f-430d-a77e-98c2567bf7af] Traceback (most recent call last):
[ 2340.868209] env[67899]: ERROR nova.compute.manager [instance: a993c6a9-140f-430d-a77e-98c2567bf7af]   File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 2340.868209] env[67899]: ERROR nova.compute.manager [instance: a993c6a9-140f-430d-a77e-98c2567bf7af]     self.driver.spawn(context, instance, image_meta,
[ 2340.868209] env[67899]: ERROR nova.compute.manager [instance: a993c6a9-140f-430d-a77e-98c2567bf7af]   File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 2340.868209] env[67899]: ERROR nova.compute.manager [instance: a993c6a9-140f-430d-a77e-98c2567bf7af]     self._vmops.spawn(context, instance, image_meta, injected_files,
[ 2340.868209] env[67899]: ERROR nova.compute.manager [instance: a993c6a9-140f-430d-a77e-98c2567bf7af]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 2340.868209] env[67899]: ERROR nova.compute.manager [instance: a993c6a9-140f-430d-a77e-98c2567bf7af]     self._fetch_image_if_missing(context, vi)
[ 2340.868209] env[67899]: ERROR nova.compute.manager [instance: a993c6a9-140f-430d-a77e-98c2567bf7af]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 2340.868209] env[67899]: ERROR nova.compute.manager [instance: a993c6a9-140f-430d-a77e-98c2567bf7af]     image_cache(vi, tmp_image_ds_loc)
[ 2340.868209] env[67899]: ERROR nova.compute.manager [instance: a993c6a9-140f-430d-a77e-98c2567bf7af]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 2340.868209] env[67899]: ERROR nova.compute.manager [instance: a993c6a9-140f-430d-a77e-98c2567bf7af]     vm_util.copy_virtual_disk(
[ 2340.868209] env[67899]: ERROR nova.compute.manager [instance: a993c6a9-140f-430d-a77e-98c2567bf7af]   File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 2340.868209] env[67899]: ERROR nova.compute.manager [instance: a993c6a9-140f-430d-a77e-98c2567bf7af]     session._wait_for_task(vmdk_copy_task)
[ 2340.868209] env[67899]: ERROR nova.compute.manager [instance: a993c6a9-140f-430d-a77e-98c2567bf7af]   File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 2340.868209] env[67899]: ERROR nova.compute.manager [instance: a993c6a9-140f-430d-a77e-98c2567bf7af]     return self.wait_for_task(task_ref)
[ 2340.868209] env[67899]: ERROR nova.compute.manager [instance: a993c6a9-140f-430d-a77e-98c2567bf7af]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 2340.868209] env[67899]: ERROR nova.compute.manager [instance: a993c6a9-140f-430d-a77e-98c2567bf7af]     return evt.wait()
[ 2340.868209] env[67899]: ERROR nova.compute.manager [instance: a993c6a9-140f-430d-a77e-98c2567bf7af]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 2340.868209] env[67899]: ERROR nova.compute.manager [instance: a993c6a9-140f-430d-a77e-98c2567bf7af]     result = hub.switch()
[ 2340.868209] env[67899]: ERROR nova.compute.manager [instance: a993c6a9-140f-430d-a77e-98c2567bf7af]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 2340.868209] env[67899]: ERROR nova.compute.manager [instance: a993c6a9-140f-430d-a77e-98c2567bf7af]     return self.greenlet.switch()
[ 2340.868209] env[67899]: ERROR nova.compute.manager [instance: a993c6a9-140f-430d-a77e-98c2567bf7af]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 2340.868209] env[67899]: ERROR nova.compute.manager [instance: a993c6a9-140f-430d-a77e-98c2567bf7af]     self.f(*self.args, **self.kw)
[ 2340.868209] env[67899]: ERROR nova.compute.manager [instance: a993c6a9-140f-430d-a77e-98c2567bf7af]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 2340.868209] env[67899]: ERROR nova.compute.manager [instance: a993c6a9-140f-430d-a77e-98c2567bf7af]     raise exceptions.translate_fault(task_info.error)
[ 2340.868209] env[67899]: ERROR nova.compute.manager [instance: a993c6a9-140f-430d-a77e-98c2567bf7af] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 2340.868209] env[67899]: ERROR nova.compute.manager [instance: a993c6a9-140f-430d-a77e-98c2567bf7af] Faults: ['InvalidArgument']
[ 2340.868209] env[67899]: ERROR nova.compute.manager [instance: a993c6a9-140f-430d-a77e-98c2567bf7af]
[ 2340.869080] env[67899]: DEBUG nova.compute.utils [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: a993c6a9-140f-430d-a77e-98c2567bf7af] VimFaultException {{(pid=67899) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 2340.870245] env[67899]: DEBUG nova.compute.manager [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: a993c6a9-140f-430d-a77e-98c2567bf7af] Build of instance a993c6a9-140f-430d-a77e-98c2567bf7af was re-scheduled: A specified parameter was not correct: fileType
[ 2340.870245] env[67899]: Faults: ['InvalidArgument'] {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}}
[ 2340.870611] env[67899]: DEBUG nova.compute.manager [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: a993c6a9-140f-430d-a77e-98c2567bf7af] Unplugging VIFs for instance {{(pid=67899) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}}
[ 2340.870782] env[67899]: DEBUG nova.compute.manager [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=67899) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}}
[ 2340.870948] env[67899]: DEBUG nova.compute.manager [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: a993c6a9-140f-430d-a77e-98c2567bf7af] Deallocating network for instance {{(pid=67899) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}}
[ 2340.871129] env[67899]: DEBUG nova.network.neutron [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: a993c6a9-140f-430d-a77e-98c2567bf7af] deallocate_for_instance() {{(pid=67899) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 2341.131222] env[67899]: DEBUG nova.network.neutron [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: a993c6a9-140f-430d-a77e-98c2567bf7af] Updating instance_info_cache with network_info: [] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 2341.146365] env[67899]: INFO nova.compute.manager [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: a993c6a9-140f-430d-a77e-98c2567bf7af] Took 0.28 seconds to deallocate network for instance.
[ 2341.236082] env[67899]: INFO nova.scheduler.client.report [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Deleted allocations for instance a993c6a9-140f-430d-a77e-98c2567bf7af
[ 2341.256860] env[67899]: DEBUG oslo_concurrency.lockutils [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Lock "a993c6a9-140f-430d-a77e-98c2567bf7af" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 500.831s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 2341.256860] env[67899]: DEBUG oslo_concurrency.lockutils [None req-e15bdb85-7f62-4363-b088-4abbe3ce4afd tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Lock "a993c6a9-140f-430d-a77e-98c2567bf7af" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 304.830s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 2341.257122] env[67899]: DEBUG oslo_concurrency.lockutils [None req-e15bdb85-7f62-4363-b088-4abbe3ce4afd tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Acquiring lock "a993c6a9-140f-430d-a77e-98c2567bf7af-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 2341.257348] env[67899]: DEBUG oslo_concurrency.lockutils [None req-e15bdb85-7f62-4363-b088-4abbe3ce4afd tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Lock "a993c6a9-140f-430d-a77e-98c2567bf7af-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 2341.257556] env[67899]: DEBUG oslo_concurrency.lockutils [None req-e15bdb85-7f62-4363-b088-4abbe3ce4afd tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Lock "a993c6a9-140f-430d-a77e-98c2567bf7af-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 2341.259839] env[67899]: INFO nova.compute.manager [None req-e15bdb85-7f62-4363-b088-4abbe3ce4afd tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: a993c6a9-140f-430d-a77e-98c2567bf7af] Terminating instance
[ 2341.261523] env[67899]: DEBUG nova.compute.manager [None req-e15bdb85-7f62-4363-b088-4abbe3ce4afd tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: a993c6a9-140f-430d-a77e-98c2567bf7af] Start destroying the instance on the hypervisor. {{(pid=67899) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}}
[ 2341.261721] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-e15bdb85-7f62-4363-b088-4abbe3ce4afd tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: a993c6a9-140f-430d-a77e-98c2567bf7af] Destroying instance {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 2341.262225] env[67899]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-7efe9607-a584-419f-9c69-78888af0e42d {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2341.271132] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-45385058-17bf-4c54-acbd-418861bd49b7 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2341.299558] env[67899]: WARNING nova.virt.vmwareapi.vmops [None req-e15bdb85-7f62-4363-b088-4abbe3ce4afd tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: a993c6a9-140f-430d-a77e-98c2567bf7af] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance a993c6a9-140f-430d-a77e-98c2567bf7af could not be found.
[ 2341.299764] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-e15bdb85-7f62-4363-b088-4abbe3ce4afd tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: a993c6a9-140f-430d-a77e-98c2567bf7af] Instance destroyed {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 2341.299937] env[67899]: INFO nova.compute.manager [None req-e15bdb85-7f62-4363-b088-4abbe3ce4afd tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: a993c6a9-140f-430d-a77e-98c2567bf7af] Took 0.04 seconds to destroy the instance on the hypervisor.
[ 2341.300185] env[67899]: DEBUG oslo.service.loopingcall [None req-e15bdb85-7f62-4363-b088-4abbe3ce4afd tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=67899) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
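[annotation] The Acquiring/acquired/released triples above are oslo.concurrency's lockutils at work; note the per-instance lock names, including the "<uuid>-events" lock used to clear pending external events before termination. A minimal sketch of the underlying pattern (the function body is a hypothetical stand-in):

    # Sketch: the lock lifecycle behind the DEBUG oslo_concurrency.lockutils
    # lines; lockutils.lock() is a context manager and logs acquire/release.
    from oslo_concurrency import lockutils

    instance_uuid = 'a993c6a9-140f-430d-a77e-98c2567bf7af'
    with lockutils.lock('%s-events' % instance_uuid):
        # critical section: only one thread may touch this instance's
        # event state at a time
        clear_events_for_instance(instance_uuid)  # hypothetical body

The "waited 304.830s" figure is worth noticing: the terminate request queued behind the build lock for the full five-minute build attempt before it could proceed.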
[ 2341.300401] env[67899]: DEBUG nova.compute.manager [-] [instance: a993c6a9-140f-430d-a77e-98c2567bf7af] Deallocating network for instance {{(pid=67899) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}}
[ 2341.300499] env[67899]: DEBUG nova.network.neutron [-] [instance: a993c6a9-140f-430d-a77e-98c2567bf7af] deallocate_for_instance() {{(pid=67899) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 2341.324589] env[67899]: DEBUG nova.network.neutron [-] [instance: a993c6a9-140f-430d-a77e-98c2567bf7af] Updating instance_info_cache with network_info: [] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 2341.333025] env[67899]: INFO nova.compute.manager [-] [instance: a993c6a9-140f-430d-a77e-98c2567bf7af] Took 0.03 seconds to deallocate network for instance.
[ 2341.412875] env[67899]: DEBUG oslo_concurrency.lockutils [None req-e15bdb85-7f62-4363-b088-4abbe3ce4afd tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Lock "a993c6a9-140f-430d-a77e-98c2567bf7af" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.156s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 2372.998570] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 2374.990138] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._cleanup_running_deleted_instances {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 2374.990505] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Getting list of instances from cluster (obj){
[ 2374.990505] env[67899]: value = "domain-c8"
[ 2374.990505] env[67899]: _type = "ClusterComputeResource"
[ 2374.990505] env[67899]: } {{(pid=67899) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}}
[ 2374.991498] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ca8ae274-8f4b-4afd-bff5-688c5adb4448 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2375.005465] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Got total of 6 instances {{(pid=67899) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}}
[ 2376.023716] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 2378.998139] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 2379.997362] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 2381.996803] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 2382.997365] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 2382.997707] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 2382.997810] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Cleaning up deleted instances {{(pid=67899) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11198}}
[ 2383.008525] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] There are 0 instances to clean {{(pid=67899) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11207}}
[ 2384.007899] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 2384.008307] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Starting heal instance info cache {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}}
[ 2384.008307] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Rebuilding the list of instances to heal {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}}
[ 2384.027160] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: c17d88cf-69ba-43e9-a672-24503c65e9f2] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 2384.027255] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 483824d1-4994-436a-ba16-12524684405c] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 2384.027384] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: cc1164c7-82bb-4d80-89ad-e9ba5658d9c8] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 2384.027511] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: fc98bb10-8fe8-4203-80b8-9885b2c302c1] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
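[annotation] The run of "Running periodic task ComputeManager._*" lines comes from oslo.service's periodic task machinery, which nova-compute drives from a timer loop; each decorated method is invoked on its own spacing and logged on every tick. A compressed sketch of that mechanism (the class and task below are invented for illustration):

    # Sketch: how oslo.service periodic tasks are declared and driven.
    from oslo_config import cfg
    from oslo_service import periodic_task

    CONF = cfg.CONF

    class Manager(periodic_task.PeriodicTasks):
        def __init__(self):
            super().__init__(CONF)

        @periodic_task.periodic_task(spacing=10)
        def _poll_something(self, context):
            # each invocation produces a "Running periodic task ..." line
            pass

    mgr = Manager()
    mgr.run_periodic_tasks(context=None)  # normally called by a timer loop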
[ 2384.027629] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 4d6348dc-bbca-46e4-b5cf-8f6a4e4547cf] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 2384.027746] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 43854021-a115-4460-870a-d7332c62b758] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 2384.027863] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Didn't find any instances for network info cache update. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}}
[ 2384.028434] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager.update_available_resource {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 2384.038831] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 2384.039043] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 2384.039216] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 2384.039368] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67899) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}}
[ 2384.040419] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c173f1aa-41db-4c85-948f-8d548ad8d828 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2384.048679] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0c816892-a133-4916-bffc-83166e0d3a5a {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2384.063664] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fd80cd09-85cf-4174-9c39-2aca0bca5616 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2384.069892] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a55a2d6f-f361-4df6-8caa-24173b4623d6 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2384.098086] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180880MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=67899) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}}
[ 2384.098294] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 2384.098415] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 2384.240149] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance c17d88cf-69ba-43e9-a672-24503c65e9f2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 2384.240399] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 483824d1-4994-436a-ba16-12524684405c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 2384.240571] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance cc1164c7-82bb-4d80-89ad-e9ba5658d9c8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 2384.240658] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance fc98bb10-8fe8-4203-80b8-9885b2c302c1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 2384.240766] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 4d6348dc-bbca-46e4-b5cf-8f6a4e4547cf actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 2384.240882] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 43854021-a115-4460-870a-d7332c62b758 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 2384.241083] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Total usable vcpus: 48, total allocated vcpus: 6 {{(pid=67899) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}}
[ 2384.241225] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1280MB phys_disk=200GB used_disk=6GB total_vcpus=48 used_vcpus=6 pci_stats=[] {{(pid=67899) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}}
[ 2384.256357] env[67899]: DEBUG nova.scheduler.client.report [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Refreshing inventories for resource provider fffa0b42-f65d-4394-a98c-0df038b9ed4b {{(pid=67899) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}}
[ 2384.269488] env[67899]: DEBUG nova.scheduler.client.report [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Updating ProviderTree inventory for provider fffa0b42-f65d-4394-a98c-0df038b9ed4b from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67899) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}}
[ 2384.269488] env[67899]: DEBUG nova.compute.provider_tree [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Updating inventory in ProviderTree for provider fffa0b42-f65d-4394-a98c-0df038b9ed4b with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67899) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}}
[ 2384.280232] env[67899]: DEBUG nova.scheduler.client.report [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Refreshing aggregate associations for resource provider fffa0b42-f65d-4394-a98c-0df038b9ed4b, aggregates: None {{(pid=67899) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}}
[ 2384.297378] env[67899]: DEBUG nova.scheduler.client.report [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Refreshing trait associations for resource provider fffa0b42-f65d-4394-a98c-0df038b9ed4b, traits: COMPUTE_NODE,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SAME_HOST_COLD_MIGRATE,COMPUTE_IMAGE_TYPE_VMDK,COMPUTE_IMAGE_TYPE_ISO {{(pid=67899) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}}
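[annotation] The inventory dicts repeated above are what placement uses to admit new allocations on this node: for each resource class, schedulable capacity is (total - reserved) * allocation_ratio, and no single allocation may exceed max_unit. A small worked check against the VCPU record, with the values copied from the log:

    # Sketch: effective placement capacity from an inventory record.
    def effective_capacity(inv):
        # (total - reserved) * allocation_ratio, as placement computes it
        return int((inv['total'] - inv['reserved']) * inv['allocation_ratio'])

    vcpu = {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16,
            'step_size': 1, 'allocation_ratio': 4.0}
    print(effective_capacity(vcpu))  # 192 schedulable VCPUs
    # ...but no single instance may claim more than max_unit = 16

This squares with the "Total usable vcpus: 48, total allocated vcpus: 6" audit line: six single-VCPU instances against 48 physical vCPUs oversubscribable 4x.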
[ 2384.367772] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f15d9295-67ec-4e7e-9a04-54f0ce41bd57 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2384.375245] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4b8575ce-d17e-4b95-b768-dfc2e2a0adf1 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2384.405139] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-44010809-c194-4f9b-9c7a-8dd2ec5c9864 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2384.411616] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-283aa5a4-d3d3-41a0-8cae-4f3580fc8020 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2384.424225] env[67899]: DEBUG nova.compute.provider_tree [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Inventory has not changed in ProviderTree for provider: fffa0b42-f65d-4394-a98c-0df038b9ed4b {{(pid=67899) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 2384.432644] env[67899]: DEBUG nova.scheduler.client.report [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Inventory has not changed for provider fffa0b42-f65d-4394-a98c-0df038b9ed4b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67899) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 2384.444704] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67899) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}}
[ 2384.444867] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.346s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 2386.412677] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 2387.470199] env[67899]: WARNING oslo_vmware.rw_handles [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response
[ 2387.470199] env[67899]: ERROR oslo_vmware.rw_handles Traceback (most recent call last):
[ 2387.470199] env[67899]: ERROR oslo_vmware.rw_handles   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close
[ 2387.470199] env[67899]: ERROR oslo_vmware.rw_handles     self._conn.getresponse()
[ 2387.470199] env[67899]: ERROR oslo_vmware.rw_handles   File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse
[ 2387.470199] env[67899]: ERROR oslo_vmware.rw_handles     response.begin()
[ 2387.470199] env[67899]: ERROR oslo_vmware.rw_handles   File "/usr/lib/python3.10/http/client.py", line 318, in begin
[ 2387.470199] env[67899]: ERROR oslo_vmware.rw_handles     version, status, reason = self._read_status()
[ 2387.470199] env[67899]: ERROR oslo_vmware.rw_handles   File "/usr/lib/python3.10/http/client.py", line 287, in _read_status
[ 2387.470199] env[67899]: ERROR oslo_vmware.rw_handles     raise RemoteDisconnected("Remote end closed connection without"
[ 2387.470199] env[67899]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response
[ 2387.470199] env[67899]: ERROR oslo_vmware.rw_handles
[ 2387.470872] env[67899]: DEBUG nova.virt.vmwareapi.images [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: c17d88cf-69ba-43e9-a672-24503c65e9f2] Downloaded image file data c655a05a-4a40-4b3f-b609-3ba8116ad90f to vmware_temp/b4c13b2e-5fa0-4d2f-8daf-f65c125823bc/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk on the data store datastore1 {{(pid=67899) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}}
[ 2387.472730] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: c17d88cf-69ba-43e9-a672-24503c65e9f2] Caching image {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}}
[ 2387.472981] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Copying Virtual Disk [datastore1] vmware_temp/b4c13b2e-5fa0-4d2f-8daf-f65c125823bc/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk to [datastore1] vmware_temp/b4c13b2e-5fa0-4d2f-8daf-f65c125823bc/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk {{(pid=67899) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}}
[ 2387.473287] env[67899]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-914c1788-a2f3-4951-8f5d-52df6070c37b {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2387.481285] env[67899]: DEBUG oslo_vmware.api [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Waiting for the task: (returnval){
[ 2387.481285] env[67899]: value = "task-3468054"
[ 2387.481285] env[67899]: _type = "Task"
[ 2387.481285] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 2387.489299] env[67899]: DEBUG oslo_vmware.api [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Task: {'id': task-3468054, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 2387.992463] env[67899]: DEBUG oslo_vmware.exceptions [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Fault InvalidArgument not matched. {{(pid=67899) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}}
[ 2387.992744] env[67899]: DEBUG oslo_concurrency.lockutils [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 2387.993312] env[67899]: ERROR nova.compute.manager [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: c17d88cf-69ba-43e9-a672-24503c65e9f2] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 2387.993312] env[67899]: Faults: ['InvalidArgument']
[ 2387.993312] env[67899]: ERROR nova.compute.manager [instance: c17d88cf-69ba-43e9-a672-24503c65e9f2] Traceback (most recent call last):
[ 2387.993312] env[67899]: ERROR nova.compute.manager [instance: c17d88cf-69ba-43e9-a672-24503c65e9f2]   File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources
[ 2387.993312] env[67899]: ERROR nova.compute.manager [instance: c17d88cf-69ba-43e9-a672-24503c65e9f2]     yield resources
[ 2387.993312] env[67899]: ERROR nova.compute.manager [instance: c17d88cf-69ba-43e9-a672-24503c65e9f2]   File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 2387.993312] env[67899]: ERROR nova.compute.manager [instance: c17d88cf-69ba-43e9-a672-24503c65e9f2]     self.driver.spawn(context, instance, image_meta,
[ 2387.993312] env[67899]: ERROR nova.compute.manager [instance: c17d88cf-69ba-43e9-a672-24503c65e9f2]   File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 2387.993312] env[67899]: ERROR nova.compute.manager [instance: c17d88cf-69ba-43e9-a672-24503c65e9f2]     self._vmops.spawn(context, instance, image_meta, injected_files,
[ 2387.993312] env[67899]: ERROR nova.compute.manager [instance: c17d88cf-69ba-43e9-a672-24503c65e9f2]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 2387.993312] env[67899]: ERROR nova.compute.manager [instance: c17d88cf-69ba-43e9-a672-24503c65e9f2]     self._fetch_image_if_missing(context, vi)
[ 2387.993312] env[67899]: ERROR nova.compute.manager [instance: c17d88cf-69ba-43e9-a672-24503c65e9f2]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 2387.993312] env[67899]: ERROR nova.compute.manager [instance: c17d88cf-69ba-43e9-a672-24503c65e9f2]     image_cache(vi, tmp_image_ds_loc)
[ 2387.993312] env[67899]: ERROR nova.compute.manager [instance: c17d88cf-69ba-43e9-a672-24503c65e9f2]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 2387.993312] env[67899]: ERROR nova.compute.manager [instance: c17d88cf-69ba-43e9-a672-24503c65e9f2]     vm_util.copy_virtual_disk(
[ 2387.993312] env[67899]: ERROR nova.compute.manager [instance: c17d88cf-69ba-43e9-a672-24503c65e9f2]   File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 2387.993312] env[67899]: ERROR nova.compute.manager [instance: c17d88cf-69ba-43e9-a672-24503c65e9f2]     session._wait_for_task(vmdk_copy_task)
[ 2387.993312] env[67899]: ERROR nova.compute.manager [instance: c17d88cf-69ba-43e9-a672-24503c65e9f2]   File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 2387.993312] env[67899]: ERROR nova.compute.manager [instance: c17d88cf-69ba-43e9-a672-24503c65e9f2]     return self.wait_for_task(task_ref)
[ 2387.993312] env[67899]: ERROR nova.compute.manager [instance: c17d88cf-69ba-43e9-a672-24503c65e9f2]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 2387.993312] env[67899]: ERROR nova.compute.manager [instance: c17d88cf-69ba-43e9-a672-24503c65e9f2]     return evt.wait()
[ 2387.993312] env[67899]: ERROR nova.compute.manager [instance: c17d88cf-69ba-43e9-a672-24503c65e9f2]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 2387.993312] env[67899]: ERROR nova.compute.manager [instance: c17d88cf-69ba-43e9-a672-24503c65e9f2]     result = hub.switch()
[ 2387.993312] env[67899]: ERROR nova.compute.manager [instance: c17d88cf-69ba-43e9-a672-24503c65e9f2]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 2387.993312] env[67899]: ERROR nova.compute.manager [instance: c17d88cf-69ba-43e9-a672-24503c65e9f2]     return self.greenlet.switch()
[ 2387.993312] env[67899]: ERROR nova.compute.manager [instance: c17d88cf-69ba-43e9-a672-24503c65e9f2]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 2387.993312] env[67899]: ERROR nova.compute.manager [instance: c17d88cf-69ba-43e9-a672-24503c65e9f2]     self.f(*self.args, **self.kw)
[ 2387.993312] env[67899]: ERROR nova.compute.manager [instance: c17d88cf-69ba-43e9-a672-24503c65e9f2]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 2387.993312] env[67899]: ERROR nova.compute.manager [instance: c17d88cf-69ba-43e9-a672-24503c65e9f2]     raise exceptions.translate_fault(task_info.error)
[ 2387.993312] env[67899]: ERROR nova.compute.manager [instance: c17d88cf-69ba-43e9-a672-24503c65e9f2] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 2387.993312] env[67899]: ERROR nova.compute.manager [instance: c17d88cf-69ba-43e9-a672-24503c65e9f2] Faults: ['InvalidArgument']
[ 2387.993312] env[67899]: ERROR nova.compute.manager [instance: c17d88cf-69ba-43e9-a672-24503c65e9f2]
[ 2387.994186] env[67899]: INFO nova.compute.manager [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: c17d88cf-69ba-43e9-a672-24503c65e9f2] Terminating instance
[ 2387.995218] env[67899]: DEBUG oslo_concurrency.lockutils [None req-2895de4c-3d88-407d-8d74-0788e4b279f8 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 2387.995432] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-2895de4c-3d88-407d-8d74-0788e4b279f8 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 2387.995672] env[67899]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-62c4a10f-3933-496f-b392-27a55d38d047 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2387.997916] env[67899]: DEBUG nova.compute.manager [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: c17d88cf-69ba-43e9-a672-24503c65e9f2] Start destroying the instance on the hypervisor. {{(pid=67899) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}}
[ 2387.998149] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: c17d88cf-69ba-43e9-a672-24503c65e9f2] Destroying instance {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 2387.998904] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0fcb97e0-320e-44d4-be81-1251c5fad953 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2388.006242] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: c17d88cf-69ba-43e9-a672-24503c65e9f2] Unregistering the VM {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}}
[ 2388.007256] env[67899]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-c9285214-14df-4e62-b81f-05a7b7b29413 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2388.008688] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-2895de4c-3d88-407d-8d74-0788e4b279f8 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 2388.008852] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-2895de4c-3d88-407d-8d74-0788e4b279f8 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67899) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}}
[ 2388.009563] env[67899]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-887f9e26-93ef-4380-acfd-45b19f7b4e4b {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2388.014473] env[67899]: DEBUG oslo_vmware.api [None req-2895de4c-3d88-407d-8d74-0788e4b279f8 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] Waiting for the task: (returnval){
[ 2388.014473] env[67899]: value = "session[5298859c-589e-10f3-4f70-bf47a7ea371b]5204c7bd-0605-e038-22fc-d1071d1287d5"
[ 2388.014473] env[67899]: _type = "Task"
[ 2388.014473] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
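[annotation] Note the handoff in the image cache above: the instant the failed MultipleCreateTestJSON request releases the per-image lock on the cached VMDK path, the AttachInterfacesTestJSON request acquires it and restarts the same fetch-and-cache cycle (which will hit the same fileType fault). The serialization key is simply the datastore path of the cached disk; a sketch of how such a key can be built and used (the helper names are illustrative, not Nova's):

    # Sketch: serialize image-cache work on the cached VMDK's datastore path.
    from oslo_concurrency import lockutils

    def cached_vmdk_path(datastore, cache_dir, image_id):
        # e.g. "[datastore1] devstack-image-cache_base/<id>/<id>.vmdk"
        return '[%s] %s/%s/%s.vmdk' % (datastore, cache_dir, image_id, image_id)

    path = cached_vmdk_path('datastore1', 'devstack-image-cache_base',
                            'c655a05a-4a40-4b3f-b609-3ba8116ad90f')
    with lockutils.lock(path):
        # only one request at a time may fetch/copy this cache entry
        fetch_and_cache_image(path)  # hypothetical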
{{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2388.021553] env[67899]: DEBUG oslo_vmware.api [None req-2895de4c-3d88-407d-8d74-0788e4b279f8 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] Task: {'id': session[5298859c-589e-10f3-4f70-bf47a7ea371b]5204c7bd-0605-e038-22fc-d1071d1287d5, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2388.077545] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: c17d88cf-69ba-43e9-a672-24503c65e9f2] Unregistered the VM {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2388.077762] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: c17d88cf-69ba-43e9-a672-24503c65e9f2] Deleting contents of the VM from datastore datastore1 {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2388.077940] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Deleting the datastore file [datastore1] c17d88cf-69ba-43e9-a672-24503c65e9f2 {{(pid=67899) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2388.078316] env[67899]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-a37d9d9d-30a3-436c-b9b5-d32c04d790d3 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2388.085246] env[67899]: DEBUG oslo_vmware.api [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Waiting for the task: (returnval){ [ 2388.085246] env[67899]: value = "task-3468056" [ 2388.085246] env[67899]: _type = "Task" [ 2388.085246] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2388.093182] env[67899]: DEBUG oslo_vmware.api [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Task: {'id': task-3468056, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2388.525117] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-2895de4c-3d88-407d-8d74-0788e4b279f8 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] [instance: 483824d1-4994-436a-ba16-12524684405c] Preparing fetch location {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2388.525417] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-2895de4c-3d88-407d-8d74-0788e4b279f8 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] Creating directory with path [datastore1] vmware_temp/a48a047b-52c1-4067-82ed-fb91a06c20f9/c655a05a-4a40-4b3f-b609-3ba8116ad90f {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2388.525611] env[67899]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-a55725b1-ec41-4c82-ac5f-53726e3fb7ed {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2388.537476] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-2895de4c-3d88-407d-8d74-0788e4b279f8 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] Created directory with path [datastore1] vmware_temp/a48a047b-52c1-4067-82ed-fb91a06c20f9/c655a05a-4a40-4b3f-b609-3ba8116ad90f {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2388.537668] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-2895de4c-3d88-407d-8d74-0788e4b279f8 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] [instance: 483824d1-4994-436a-ba16-12524684405c] Fetch image to [datastore1] vmware_temp/a48a047b-52c1-4067-82ed-fb91a06c20f9/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2388.537851] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-2895de4c-3d88-407d-8d74-0788e4b279f8 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] [instance: 483824d1-4994-436a-ba16-12524684405c] Downloading image file data c655a05a-4a40-4b3f-b609-3ba8116ad90f to [datastore1] vmware_temp/a48a047b-52c1-4067-82ed-fb91a06c20f9/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk on the data store datastore1 {{(pid=67899) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2388.538619] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-112736a4-3aa6-4e01-9422-af2a089a50aa {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2388.545364] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-37c3d8ca-7acd-4db1-9d24-cb0e561f2ee4 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2388.554606] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dbcbff24-1bb9-4c85-adc2-3d4d92afdb7b {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2388.585063] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-c1eee6ad-8d91-4a6b-a9c0-a10b1c29bb5d {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2388.595039] env[67899]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-8a0bc5d9-7fb2-45d8-8cb7-b3d24b591e0a {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2388.596632] env[67899]: DEBUG oslo_vmware.api [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Task: {'id': task-3468056, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.065161} completed successfully. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2388.596861] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Deleted the datastore file {{(pid=67899) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2388.597048] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: c17d88cf-69ba-43e9-a672-24503c65e9f2] Deleted contents of the VM from datastore datastore1 {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2388.597220] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: c17d88cf-69ba-43e9-a672-24503c65e9f2] Instance destroyed {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2388.597387] env[67899]: INFO nova.compute.manager [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: c17d88cf-69ba-43e9-a672-24503c65e9f2] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 2388.599407] env[67899]: DEBUG nova.compute.claims [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: c17d88cf-69ba-43e9-a672-24503c65e9f2] Aborting claim: {{(pid=67899) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 2388.599574] env[67899]: DEBUG oslo_concurrency.lockutils [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2388.599780] env[67899]: DEBUG oslo_concurrency.lockutils [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2388.617873] env[67899]: DEBUG nova.virt.vmwareapi.images [None req-2895de4c-3d88-407d-8d74-0788e4b279f8 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] [instance: 483824d1-4994-436a-ba16-12524684405c] Downloading image file data c655a05a-4a40-4b3f-b609-3ba8116ad90f to the data store datastore1 {{(pid=67899) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2388.673523] env[67899]: DEBUG oslo_vmware.rw_handles [None req-2895de4c-3d88-407d-8d74-0788e4b279f8 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/a48a047b-52c1-4067-82ed-fb91a06c20f9/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67899) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2388.732622] env[67899]: DEBUG oslo_vmware.rw_handles [None req-2895de4c-3d88-407d-8d74-0788e4b279f8 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] Completed reading data from the image iterator. {{(pid=67899) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2388.732828] env[67899]: DEBUG oslo_vmware.rw_handles [None req-2895de4c-3d88-407d-8d74-0788e4b279f8 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/a48a047b-52c1-4067-82ed-fb91a06c20f9/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67899) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2388.773506] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e2a3d84b-edf8-4490-a78c-dd741c12d7ca {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2388.781222] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1b054c9a-8fca-478b-a9ad-553aa2b59883 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2388.810520] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8f37f346-9604-4036-8fd4-dbe80cfb273d {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2388.817165] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dcb65fd3-9627-4d5b-a76e-62163807f00e {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2388.831109] env[67899]: DEBUG nova.compute.provider_tree [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Inventory has not changed in ProviderTree for provider: fffa0b42-f65d-4394-a98c-0df038b9ed4b {{(pid=67899) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2388.839189] env[67899]: DEBUG nova.scheduler.client.report [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Inventory has not changed for provider fffa0b42-f65d-4394-a98c-0df038b9ed4b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67899) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2388.854630] env[67899]: DEBUG oslo_concurrency.lockutils [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.255s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2388.855168] env[67899]: ERROR nova.compute.manager [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: c17d88cf-69ba-43e9-a672-24503c65e9f2] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2388.855168] env[67899]: Faults: ['InvalidArgument'] [ 2388.855168] env[67899]: ERROR nova.compute.manager [instance: c17d88cf-69ba-43e9-a672-24503c65e9f2] Traceback (most recent call last): [ 2388.855168] env[67899]: ERROR nova.compute.manager [instance: c17d88cf-69ba-43e9-a672-24503c65e9f2] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2388.855168] env[67899]: 
ERROR nova.compute.manager [instance: c17d88cf-69ba-43e9-a672-24503c65e9f2] self.driver.spawn(context, instance, image_meta, [ 2388.855168] env[67899]: ERROR nova.compute.manager [instance: c17d88cf-69ba-43e9-a672-24503c65e9f2] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2388.855168] env[67899]: ERROR nova.compute.manager [instance: c17d88cf-69ba-43e9-a672-24503c65e9f2] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2388.855168] env[67899]: ERROR nova.compute.manager [instance: c17d88cf-69ba-43e9-a672-24503c65e9f2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2388.855168] env[67899]: ERROR nova.compute.manager [instance: c17d88cf-69ba-43e9-a672-24503c65e9f2] self._fetch_image_if_missing(context, vi) [ 2388.855168] env[67899]: ERROR nova.compute.manager [instance: c17d88cf-69ba-43e9-a672-24503c65e9f2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2388.855168] env[67899]: ERROR nova.compute.manager [instance: c17d88cf-69ba-43e9-a672-24503c65e9f2] image_cache(vi, tmp_image_ds_loc) [ 2388.855168] env[67899]: ERROR nova.compute.manager [instance: c17d88cf-69ba-43e9-a672-24503c65e9f2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2388.855168] env[67899]: ERROR nova.compute.manager [instance: c17d88cf-69ba-43e9-a672-24503c65e9f2] vm_util.copy_virtual_disk( [ 2388.855168] env[67899]: ERROR nova.compute.manager [instance: c17d88cf-69ba-43e9-a672-24503c65e9f2] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2388.855168] env[67899]: ERROR nova.compute.manager [instance: c17d88cf-69ba-43e9-a672-24503c65e9f2] session._wait_for_task(vmdk_copy_task) [ 2388.855168] env[67899]: ERROR nova.compute.manager [instance: c17d88cf-69ba-43e9-a672-24503c65e9f2] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2388.855168] env[67899]: ERROR nova.compute.manager [instance: c17d88cf-69ba-43e9-a672-24503c65e9f2] return self.wait_for_task(task_ref) [ 2388.855168] env[67899]: ERROR nova.compute.manager [instance: c17d88cf-69ba-43e9-a672-24503c65e9f2] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2388.855168] env[67899]: ERROR nova.compute.manager [instance: c17d88cf-69ba-43e9-a672-24503c65e9f2] return evt.wait() [ 2388.855168] env[67899]: ERROR nova.compute.manager [instance: c17d88cf-69ba-43e9-a672-24503c65e9f2] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2388.855168] env[67899]: ERROR nova.compute.manager [instance: c17d88cf-69ba-43e9-a672-24503c65e9f2] result = hub.switch() [ 2388.855168] env[67899]: ERROR nova.compute.manager [instance: c17d88cf-69ba-43e9-a672-24503c65e9f2] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2388.855168] env[67899]: ERROR nova.compute.manager [instance: c17d88cf-69ba-43e9-a672-24503c65e9f2] return self.greenlet.switch() [ 2388.855168] env[67899]: ERROR nova.compute.manager [instance: c17d88cf-69ba-43e9-a672-24503c65e9f2] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2388.855168] env[67899]: ERROR nova.compute.manager [instance: c17d88cf-69ba-43e9-a672-24503c65e9f2] self.f(*self.args, **self.kw) [ 2388.855168] env[67899]: ERROR nova.compute.manager [instance: c17d88cf-69ba-43e9-a672-24503c65e9f2] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2388.855168] env[67899]: ERROR nova.compute.manager [instance: c17d88cf-69ba-43e9-a672-24503c65e9f2] raise exceptions.translate_fault(task_info.error) [ 2388.855168] env[67899]: ERROR nova.compute.manager [instance: c17d88cf-69ba-43e9-a672-24503c65e9f2] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2388.855168] env[67899]: ERROR nova.compute.manager [instance: c17d88cf-69ba-43e9-a672-24503c65e9f2] Faults: ['InvalidArgument'] [ 2388.855168] env[67899]: ERROR nova.compute.manager [instance: c17d88cf-69ba-43e9-a672-24503c65e9f2] [ 2388.855988] env[67899]: DEBUG nova.compute.utils [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: c17d88cf-69ba-43e9-a672-24503c65e9f2] VimFaultException {{(pid=67899) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2388.857271] env[67899]: DEBUG nova.compute.manager [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: c17d88cf-69ba-43e9-a672-24503c65e9f2] Build of instance c17d88cf-69ba-43e9-a672-24503c65e9f2 was re-scheduled: A specified parameter was not correct: fileType [ 2388.857271] env[67899]: Faults: ['InvalidArgument'] {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 2388.857639] env[67899]: DEBUG nova.compute.manager [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: c17d88cf-69ba-43e9-a672-24503c65e9f2] Unplugging VIFs for instance {{(pid=67899) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 2388.857808] env[67899]: DEBUG nova.compute.manager [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=67899) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 2388.857972] env[67899]: DEBUG nova.compute.manager [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: c17d88cf-69ba-43e9-a672-24503c65e9f2] Deallocating network for instance {{(pid=67899) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2388.858176] env[67899]: DEBUG nova.network.neutron [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: c17d88cf-69ba-43e9-a672-24503c65e9f2] deallocate_for_instance() {{(pid=67899) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2388.995836] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2388.996016] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=67899) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 2389.224027] env[67899]: DEBUG nova.network.neutron [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: c17d88cf-69ba-43e9-a672-24503c65e9f2] Updating instance_info_cache with network_info: [] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2389.238454] env[67899]: INFO nova.compute.manager [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: c17d88cf-69ba-43e9-a672-24503c65e9f2] Took 0.38 seconds to deallocate network for instance. [ 2389.337483] env[67899]: INFO nova.scheduler.client.report [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Deleted allocations for instance c17d88cf-69ba-43e9-a672-24503c65e9f2 [ 2389.360464] env[67899]: DEBUG oslo_concurrency.lockutils [None req-21b1da0d-76fd-4b17-ae99-dd020b2b26d7 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Lock "c17d88cf-69ba-43e9-a672-24503c65e9f2" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 548.909s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2389.360773] env[67899]: DEBUG oslo_concurrency.lockutils [None req-52768baa-1054-42de-be24-1048e581ca75 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Lock "c17d88cf-69ba-43e9-a672-24503c65e9f2" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 352.995s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2389.360999] env[67899]: DEBUG oslo_concurrency.lockutils [None req-52768baa-1054-42de-be24-1048e581ca75 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Acquiring lock "c17d88cf-69ba-43e9-a672-24503c65e9f2-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2389.361220] env[67899]: DEBUG oslo_concurrency.lockutils [None req-52768baa-1054-42de-be24-1048e581ca75 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Lock "c17d88cf-69ba-43e9-a672-24503c65e9f2-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2389.361388] env[67899]: DEBUG oslo_concurrency.lockutils [None req-52768baa-1054-42de-be24-1048e581ca75 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Lock "c17d88cf-69ba-43e9-a672-24503c65e9f2-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2389.363411] env[67899]: INFO nova.compute.manager [None req-52768baa-1054-42de-be24-1048e581ca75 
tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: c17d88cf-69ba-43e9-a672-24503c65e9f2] Terminating instance [ 2389.365033] env[67899]: DEBUG nova.compute.manager [None req-52768baa-1054-42de-be24-1048e581ca75 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: c17d88cf-69ba-43e9-a672-24503c65e9f2] Start destroying the instance on the hypervisor. {{(pid=67899) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2389.365230] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-52768baa-1054-42de-be24-1048e581ca75 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: c17d88cf-69ba-43e9-a672-24503c65e9f2] Destroying instance {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2389.365737] env[67899]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-9c2ba986-46c7-4f75-8c6c-46b6ae8910da {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2389.374522] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ecea2f84-7c80-45d9-b8ec-d83bf28081d0 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2389.400334] env[67899]: WARNING nova.virt.vmwareapi.vmops [None req-52768baa-1054-42de-be24-1048e581ca75 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: c17d88cf-69ba-43e9-a672-24503c65e9f2] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance c17d88cf-69ba-43e9-a672-24503c65e9f2 could not be found. [ 2389.400522] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-52768baa-1054-42de-be24-1048e581ca75 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: c17d88cf-69ba-43e9-a672-24503c65e9f2] Instance destroyed {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2389.400697] env[67899]: INFO nova.compute.manager [None req-52768baa-1054-42de-be24-1048e581ca75 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] [instance: c17d88cf-69ba-43e9-a672-24503c65e9f2] Took 0.04 seconds to destroy the instance on the hypervisor. [ 2389.400931] env[67899]: DEBUG oslo.service.loopingcall [None req-52768baa-1054-42de-be24-1048e581ca75 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=67899) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2389.401383] env[67899]: DEBUG nova.compute.manager [-] [instance: c17d88cf-69ba-43e9-a672-24503c65e9f2] Deallocating network for instance {{(pid=67899) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2389.401482] env[67899]: DEBUG nova.network.neutron [-] [instance: c17d88cf-69ba-43e9-a672-24503c65e9f2] deallocate_for_instance() {{(pid=67899) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2389.425236] env[67899]: DEBUG nova.network.neutron [-] [instance: c17d88cf-69ba-43e9-a672-24503c65e9f2] Updating instance_info_cache with network_info: [] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2389.433031] env[67899]: INFO nova.compute.manager [-] [instance: c17d88cf-69ba-43e9-a672-24503c65e9f2] Took 0.03 seconds to deallocate network for instance. [ 2389.518858] env[67899]: DEBUG oslo_concurrency.lockutils [None req-52768baa-1054-42de-be24-1048e581ca75 tempest-MultipleCreateTestJSON-1767800187 tempest-MultipleCreateTestJSON-1767800187-project-member] Lock "c17d88cf-69ba-43e9-a672-24503c65e9f2" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.158s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2392.993446] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2400.960319] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._sync_power_states {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2400.976145] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Getting list of instances from cluster (obj){ [ 2400.976145] env[67899]: value = "domain-c8" [ 2400.976145] env[67899]: _type = "ClusterComputeResource" [ 2400.976145] env[67899]: } {{(pid=67899) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 2400.977398] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8a91310f-d750-4803-848e-57d6fde24e04 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2401.657609] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Got total of 5 instances {{(pid=67899) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 2401.657774] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Triggering sync for uuid 483824d1-4994-436a-ba16-12524684405c {{(pid=67899) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 2401.657961] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Triggering sync for uuid cc1164c7-82bb-4d80-89ad-e9ba5658d9c8 {{(pid=67899) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 2401.658139] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Triggering sync for uuid 
fc98bb10-8fe8-4203-80b8-9885b2c302c1 {{(pid=67899) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 2401.658333] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Triggering sync for uuid 4d6348dc-bbca-46e4-b5cf-8f6a4e4547cf {{(pid=67899) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 2401.658505] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Triggering sync for uuid 43854021-a115-4460-870a-d7332c62b758 {{(pid=67899) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 2401.658802] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Acquiring lock "483824d1-4994-436a-ba16-12524684405c" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2401.659035] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Acquiring lock "cc1164c7-82bb-4d80-89ad-e9ba5658d9c8" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2401.659262] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Acquiring lock "fc98bb10-8fe8-4203-80b8-9885b2c302c1" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2401.659472] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Acquiring lock "4d6348dc-bbca-46e4-b5cf-8f6a4e4547cf" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2401.659666] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Acquiring lock "43854021-a115-4460-870a-d7332c62b758" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2403.996411] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2403.996737] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Cleaning up deleted instances with incomplete migration {{(pid=67899) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11236}} [ 2411.637544] env[67899]: DEBUG oslo_concurrency.lockutils [None req-30657898-f26c-4afa-a8e3-5d23ec1ca480 tempest-AttachVolumeTestJSON-1730857508 tempest-AttachVolumeTestJSON-1730857508-project-member] Acquiring lock "03a5b06c-0da1-4d30-9656-5cd9f98023b4" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67899) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2411.637848] env[67899]: DEBUG oslo_concurrency.lockutils [None req-30657898-f26c-4afa-a8e3-5d23ec1ca480 tempest-AttachVolumeTestJSON-1730857508 tempest-AttachVolumeTestJSON-1730857508-project-member] Lock "03a5b06c-0da1-4d30-9656-5cd9f98023b4" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2411.648499] env[67899]: DEBUG nova.compute.manager [None req-30657898-f26c-4afa-a8e3-5d23ec1ca480 tempest-AttachVolumeTestJSON-1730857508 tempest-AttachVolumeTestJSON-1730857508-project-member] [instance: 03a5b06c-0da1-4d30-9656-5cd9f98023b4] Starting instance... {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 2411.701805] env[67899]: DEBUG oslo_concurrency.lockutils [None req-30657898-f26c-4afa-a8e3-5d23ec1ca480 tempest-AttachVolumeTestJSON-1730857508 tempest-AttachVolumeTestJSON-1730857508-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2411.702075] env[67899]: DEBUG oslo_concurrency.lockutils [None req-30657898-f26c-4afa-a8e3-5d23ec1ca480 tempest-AttachVolumeTestJSON-1730857508 tempest-AttachVolumeTestJSON-1730857508-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2411.703622] env[67899]: INFO nova.compute.claims [None req-30657898-f26c-4afa-a8e3-5d23ec1ca480 tempest-AttachVolumeTestJSON-1730857508 tempest-AttachVolumeTestJSON-1730857508-project-member] [instance: 03a5b06c-0da1-4d30-9656-5cd9f98023b4] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 2411.827461] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fc21b123-932e-4ade-94f1-73588e127cff {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2411.835137] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d5b28c17-37eb-40c8-b3fd-8d115c343768 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2411.866012] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5edc6f14-1781-41ce-98b0-cde79e904343 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2411.873048] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9bd860d4-4204-4c4b-b513-89743ac10c75 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2411.885986] env[67899]: DEBUG nova.compute.provider_tree [None req-30657898-f26c-4afa-a8e3-5d23ec1ca480 tempest-AttachVolumeTestJSON-1730857508 tempest-AttachVolumeTestJSON-1730857508-project-member] Inventory has not changed in ProviderTree for provider: fffa0b42-f65d-4394-a98c-0df038b9ed4b {{(pid=67899) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 
2411.894484] env[67899]: DEBUG nova.scheduler.client.report [None req-30657898-f26c-4afa-a8e3-5d23ec1ca480 tempest-AttachVolumeTestJSON-1730857508 tempest-AttachVolumeTestJSON-1730857508-project-member] Inventory has not changed for provider fffa0b42-f65d-4394-a98c-0df038b9ed4b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67899) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2411.908204] env[67899]: DEBUG oslo_concurrency.lockutils [None req-30657898-f26c-4afa-a8e3-5d23ec1ca480 tempest-AttachVolumeTestJSON-1730857508 tempest-AttachVolumeTestJSON-1730857508-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.206s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2411.908703] env[67899]: DEBUG nova.compute.manager [None req-30657898-f26c-4afa-a8e3-5d23ec1ca480 tempest-AttachVolumeTestJSON-1730857508 tempest-AttachVolumeTestJSON-1730857508-project-member] [instance: 03a5b06c-0da1-4d30-9656-5cd9f98023b4] Start building networks asynchronously for instance. {{(pid=67899) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 2411.942737] env[67899]: DEBUG nova.compute.utils [None req-30657898-f26c-4afa-a8e3-5d23ec1ca480 tempest-AttachVolumeTestJSON-1730857508 tempest-AttachVolumeTestJSON-1730857508-project-member] Using /dev/sd instead of None {{(pid=67899) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 2411.943932] env[67899]: DEBUG nova.compute.manager [None req-30657898-f26c-4afa-a8e3-5d23ec1ca480 tempest-AttachVolumeTestJSON-1730857508 tempest-AttachVolumeTestJSON-1730857508-project-member] [instance: 03a5b06c-0da1-4d30-9656-5cd9f98023b4] Allocating IP information in the background. {{(pid=67899) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 2411.944478] env[67899]: DEBUG nova.network.neutron [None req-30657898-f26c-4afa-a8e3-5d23ec1ca480 tempest-AttachVolumeTestJSON-1730857508 tempest-AttachVolumeTestJSON-1730857508-project-member] [instance: 03a5b06c-0da1-4d30-9656-5cd9f98023b4] allocate_for_instance() {{(pid=67899) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 2411.954470] env[67899]: DEBUG nova.compute.manager [None req-30657898-f26c-4afa-a8e3-5d23ec1ca480 tempest-AttachVolumeTestJSON-1730857508 tempest-AttachVolumeTestJSON-1730857508-project-member] [instance: 03a5b06c-0da1-4d30-9656-5cd9f98023b4] Start building block device mappings for instance. 
{{(pid=67899) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 2412.012506] env[67899]: DEBUG nova.policy [None req-30657898-f26c-4afa-a8e3-5d23ec1ca480 tempest-AttachVolumeTestJSON-1730857508 tempest-AttachVolumeTestJSON-1730857508-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'b4ef04cb3bbd4881a51534fe50b18a95', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '3b1d299af0314e7e87698f444649de1c', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67899) authorize /opt/stack/nova/nova/policy.py:203}} [ 2412.018569] env[67899]: DEBUG nova.compute.manager [None req-30657898-f26c-4afa-a8e3-5d23ec1ca480 tempest-AttachVolumeTestJSON-1730857508 tempest-AttachVolumeTestJSON-1730857508-project-member] [instance: 03a5b06c-0da1-4d30-9656-5cd9f98023b4] Start spawning the instance on the hypervisor. {{(pid=67899) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 2412.045171] env[67899]: DEBUG nova.virt.hardware [None req-30657898-f26c-4afa-a8e3-5d23ec1ca480 tempest-AttachVolumeTestJSON-1730857508 tempest-AttachVolumeTestJSON-1730857508-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T22:07:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T22:07:13Z,direct_url=,disk_format='vmdk',id=c655a05a-4a40-4b3f-b609-3ba8116ad90f,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c1aaa2970e964d7b86557399120d12c1',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T22:07:14Z,virtual_size=,visibility=), allow threads: False {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 2412.045405] env[67899]: DEBUG nova.virt.hardware [None req-30657898-f26c-4afa-a8e3-5d23ec1ca480 tempest-AttachVolumeTestJSON-1730857508 tempest-AttachVolumeTestJSON-1730857508-project-member] Flavor limits 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 2412.045563] env[67899]: DEBUG nova.virt.hardware [None req-30657898-f26c-4afa-a8e3-5d23ec1ca480 tempest-AttachVolumeTestJSON-1730857508 tempest-AttachVolumeTestJSON-1730857508-project-member] Image limits 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 2412.045827] env[67899]: DEBUG nova.virt.hardware [None req-30657898-f26c-4afa-a8e3-5d23ec1ca480 tempest-AttachVolumeTestJSON-1730857508 tempest-AttachVolumeTestJSON-1730857508-project-member] Flavor pref 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 2412.045892] env[67899]: DEBUG nova.virt.hardware [None req-30657898-f26c-4afa-a8e3-5d23ec1ca480 tempest-AttachVolumeTestJSON-1730857508 tempest-AttachVolumeTestJSON-1730857508-project-member] Image pref 0:0:0 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 2412.046026] env[67899]: DEBUG nova.virt.hardware [None 
req-30657898-f26c-4afa-a8e3-5d23ec1ca480 tempest-AttachVolumeTestJSON-1730857508 tempest-AttachVolumeTestJSON-1730857508-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67899) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 2412.046249] env[67899]: DEBUG nova.virt.hardware [None req-30657898-f26c-4afa-a8e3-5d23ec1ca480 tempest-AttachVolumeTestJSON-1730857508 tempest-AttachVolumeTestJSON-1730857508-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 2412.046449] env[67899]: DEBUG nova.virt.hardware [None req-30657898-f26c-4afa-a8e3-5d23ec1ca480 tempest-AttachVolumeTestJSON-1730857508 tempest-AttachVolumeTestJSON-1730857508-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67899) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 2412.046627] env[67899]: DEBUG nova.virt.hardware [None req-30657898-f26c-4afa-a8e3-5d23ec1ca480 tempest-AttachVolumeTestJSON-1730857508 tempest-AttachVolumeTestJSON-1730857508-project-member] Got 1 possible topologies {{(pid=67899) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 2412.046789] env[67899]: DEBUG nova.virt.hardware [None req-30657898-f26c-4afa-a8e3-5d23ec1ca480 tempest-AttachVolumeTestJSON-1730857508 tempest-AttachVolumeTestJSON-1730857508-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 2412.046957] env[67899]: DEBUG nova.virt.hardware [None req-30657898-f26c-4afa-a8e3-5d23ec1ca480 tempest-AttachVolumeTestJSON-1730857508 tempest-AttachVolumeTestJSON-1730857508-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67899) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 2412.047832] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-98ba1267-df59-40a0-a730-3bf9d92b3c32 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2412.057862] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-505eba21-a497-41b2-a414-7c85ed988788 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2412.549163] env[67899]: DEBUG nova.network.neutron [None req-30657898-f26c-4afa-a8e3-5d23ec1ca480 tempest-AttachVolumeTestJSON-1730857508 tempest-AttachVolumeTestJSON-1730857508-project-member] [instance: 03a5b06c-0da1-4d30-9656-5cd9f98023b4] Successfully created port: 071f3547-8e9b-485c-893b-d5dcee0e88c3 {{(pid=67899) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 2413.036647] env[67899]: DEBUG nova.compute.manager [req-9c2f06c7-3c8c-4392-90da-692aee3f9e42 req-c626fdf5-7b7d-4e2b-9ecc-9083bf40b70c service nova] [instance: 03a5b06c-0da1-4d30-9656-5cd9f98023b4] Received event network-vif-plugged-071f3547-8e9b-485c-893b-d5dcee0e88c3 {{(pid=67899) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 2413.036907] env[67899]: DEBUG oslo_concurrency.lockutils [req-9c2f06c7-3c8c-4392-90da-692aee3f9e42 req-c626fdf5-7b7d-4e2b-9ecc-9083bf40b70c service nova] Acquiring lock 
"03a5b06c-0da1-4d30-9656-5cd9f98023b4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2413.037081] env[67899]: DEBUG oslo_concurrency.lockutils [req-9c2f06c7-3c8c-4392-90da-692aee3f9e42 req-c626fdf5-7b7d-4e2b-9ecc-9083bf40b70c service nova] Lock "03a5b06c-0da1-4d30-9656-5cd9f98023b4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2413.037252] env[67899]: DEBUG oslo_concurrency.lockutils [req-9c2f06c7-3c8c-4392-90da-692aee3f9e42 req-c626fdf5-7b7d-4e2b-9ecc-9083bf40b70c service nova] Lock "03a5b06c-0da1-4d30-9656-5cd9f98023b4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2413.037413] env[67899]: DEBUG nova.compute.manager [req-9c2f06c7-3c8c-4392-90da-692aee3f9e42 req-c626fdf5-7b7d-4e2b-9ecc-9083bf40b70c service nova] [instance: 03a5b06c-0da1-4d30-9656-5cd9f98023b4] No waiting events found dispatching network-vif-plugged-071f3547-8e9b-485c-893b-d5dcee0e88c3 {{(pid=67899) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 2413.037574] env[67899]: WARNING nova.compute.manager [req-9c2f06c7-3c8c-4392-90da-692aee3f9e42 req-c626fdf5-7b7d-4e2b-9ecc-9083bf40b70c service nova] [instance: 03a5b06c-0da1-4d30-9656-5cd9f98023b4] Received unexpected event network-vif-plugged-071f3547-8e9b-485c-893b-d5dcee0e88c3 for instance with vm_state building and task_state spawning. [ 2413.112924] env[67899]: DEBUG nova.network.neutron [None req-30657898-f26c-4afa-a8e3-5d23ec1ca480 tempest-AttachVolumeTestJSON-1730857508 tempest-AttachVolumeTestJSON-1730857508-project-member] [instance: 03a5b06c-0da1-4d30-9656-5cd9f98023b4] Successfully updated port: 071f3547-8e9b-485c-893b-d5dcee0e88c3 {{(pid=67899) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 2413.124421] env[67899]: DEBUG oslo_concurrency.lockutils [None req-30657898-f26c-4afa-a8e3-5d23ec1ca480 tempest-AttachVolumeTestJSON-1730857508 tempest-AttachVolumeTestJSON-1730857508-project-member] Acquiring lock "refresh_cache-03a5b06c-0da1-4d30-9656-5cd9f98023b4" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2413.124574] env[67899]: DEBUG oslo_concurrency.lockutils [None req-30657898-f26c-4afa-a8e3-5d23ec1ca480 tempest-AttachVolumeTestJSON-1730857508 tempest-AttachVolumeTestJSON-1730857508-project-member] Acquired lock "refresh_cache-03a5b06c-0da1-4d30-9656-5cd9f98023b4" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2413.124920] env[67899]: DEBUG nova.network.neutron [None req-30657898-f26c-4afa-a8e3-5d23ec1ca480 tempest-AttachVolumeTestJSON-1730857508 tempest-AttachVolumeTestJSON-1730857508-project-member] [instance: 03a5b06c-0da1-4d30-9656-5cd9f98023b4] Building network info cache for instance {{(pid=67899) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 2413.165218] env[67899]: DEBUG nova.network.neutron [None req-30657898-f26c-4afa-a8e3-5d23ec1ca480 tempest-AttachVolumeTestJSON-1730857508 tempest-AttachVolumeTestJSON-1730857508-project-member] [instance: 03a5b06c-0da1-4d30-9656-5cd9f98023b4] Instance cache missing network info. 
{{(pid=67899) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 2413.316187] env[67899]: DEBUG nova.network.neutron [None req-30657898-f26c-4afa-a8e3-5d23ec1ca480 tempest-AttachVolumeTestJSON-1730857508 tempest-AttachVolumeTestJSON-1730857508-project-member] [instance: 03a5b06c-0da1-4d30-9656-5cd9f98023b4] Updating instance_info_cache with network_info: [{"id": "071f3547-8e9b-485c-893b-d5dcee0e88c3", "address": "fa:16:3e:73:ef:0b", "network": {"id": "98c45d1b-ff40-4055-be6f-b92512acc582", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-1056955687-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "3b1d299af0314e7e87698f444649de1c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ae18b41f-e73c-44f1-83dd-467c080944f4", "external-id": "nsx-vlan-transportzone-653", "segmentation_id": 653, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap071f3547-8e", "ovs_interfaceid": "071f3547-8e9b-485c-893b-d5dcee0e88c3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2413.326284] env[67899]: DEBUG oslo_concurrency.lockutils [None req-30657898-f26c-4afa-a8e3-5d23ec1ca480 tempest-AttachVolumeTestJSON-1730857508 tempest-AttachVolumeTestJSON-1730857508-project-member] Releasing lock "refresh_cache-03a5b06c-0da1-4d30-9656-5cd9f98023b4" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2413.326577] env[67899]: DEBUG nova.compute.manager [None req-30657898-f26c-4afa-a8e3-5d23ec1ca480 tempest-AttachVolumeTestJSON-1730857508 tempest-AttachVolumeTestJSON-1730857508-project-member] [instance: 03a5b06c-0da1-4d30-9656-5cd9f98023b4] Instance network_info: |[{"id": "071f3547-8e9b-485c-893b-d5dcee0e88c3", "address": "fa:16:3e:73:ef:0b", "network": {"id": "98c45d1b-ff40-4055-be6f-b92512acc582", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-1056955687-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "3b1d299af0314e7e87698f444649de1c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ae18b41f-e73c-44f1-83dd-467c080944f4", "external-id": "nsx-vlan-transportzone-653", "segmentation_id": 653, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap071f3547-8e", "ovs_interfaceid": "071f3547-8e9b-485c-893b-d5dcee0e88c3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67899) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1971}} [ 2413.327512] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-30657898-f26c-4afa-a8e3-5d23ec1ca480 tempest-AttachVolumeTestJSON-1730857508 tempest-AttachVolumeTestJSON-1730857508-project-member] [instance: 03a5b06c-0da1-4d30-9656-5cd9f98023b4] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:73:ef:0b', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'ae18b41f-e73c-44f1-83dd-467c080944f4', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '071f3547-8e9b-485c-893b-d5dcee0e88c3', 'vif_model': 'vmxnet3'}] {{(pid=67899) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 2413.334809] env[67899]: DEBUG oslo.service.loopingcall [None req-30657898-f26c-4afa-a8e3-5d23ec1ca480 tempest-AttachVolumeTestJSON-1730857508 tempest-AttachVolumeTestJSON-1730857508-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67899) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2413.335267] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 03a5b06c-0da1-4d30-9656-5cd9f98023b4] Creating VM on the ESX host {{(pid=67899) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 2413.335506] env[67899]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-e5f265c1-8391-4589-8020-8aee878faaf3 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2413.356953] env[67899]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 2413.356953] env[67899]: value = "task-3468057" [ 2413.356953] env[67899]: _type = "Task" [ 2413.356953] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2413.364568] env[67899]: DEBUG oslo_vmware.api [-] Task: {'id': task-3468057, 'name': CreateVM_Task} progress is 0%. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2413.867039] env[67899]: DEBUG oslo_vmware.api [-] Task: {'id': task-3468057, 'name': CreateVM_Task, 'duration_secs': 0.292714} completed successfully. 
{{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2413.867220] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 03a5b06c-0da1-4d30-9656-5cd9f98023b4] Created VM on the ESX host {{(pid=67899) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 2413.867860] env[67899]: DEBUG oslo_concurrency.lockutils [None req-30657898-f26c-4afa-a8e3-5d23ec1ca480 tempest-AttachVolumeTestJSON-1730857508 tempest-AttachVolumeTestJSON-1730857508-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2413.868031] env[67899]: DEBUG oslo_concurrency.lockutils [None req-30657898-f26c-4afa-a8e3-5d23ec1ca480 tempest-AttachVolumeTestJSON-1730857508 tempest-AttachVolumeTestJSON-1730857508-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2413.868374] env[67899]: DEBUG oslo_concurrency.lockutils [None req-30657898-f26c-4afa-a8e3-5d23ec1ca480 tempest-AttachVolumeTestJSON-1730857508 tempest-AttachVolumeTestJSON-1730857508-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 2413.868708] env[67899]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-e0967ed4-9536-4bda-9808-8c3c0d480786 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2413.873377] env[67899]: DEBUG oslo_vmware.api [None req-30657898-f26c-4afa-a8e3-5d23ec1ca480 tempest-AttachVolumeTestJSON-1730857508 tempest-AttachVolumeTestJSON-1730857508-project-member] Waiting for the task: (returnval){ [ 2413.873377] env[67899]: value = "session[5298859c-589e-10f3-4f70-bf47a7ea371b]5221ebaa-5bcf-a0d3-8177-0a7d673356d9" [ 2413.873377] env[67899]: _type = "Task" [ 2413.873377] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2413.880774] env[67899]: DEBUG oslo_vmware.api [None req-30657898-f26c-4afa-a8e3-5d23ec1ca480 tempest-AttachVolumeTestJSON-1730857508 tempest-AttachVolumeTestJSON-1730857508-project-member] Task: {'id': session[5298859c-589e-10f3-4f70-bf47a7ea371b]5221ebaa-5bcf-a0d3-8177-0a7d673356d9, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2414.383847] env[67899]: DEBUG oslo_concurrency.lockutils [None req-30657898-f26c-4afa-a8e3-5d23ec1ca480 tempest-AttachVolumeTestJSON-1730857508 tempest-AttachVolumeTestJSON-1730857508-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2414.384249] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-30657898-f26c-4afa-a8e3-5d23ec1ca480 tempest-AttachVolumeTestJSON-1730857508 tempest-AttachVolumeTestJSON-1730857508-project-member] [instance: 03a5b06c-0da1-4d30-9656-5cd9f98023b4] Processing image c655a05a-4a40-4b3f-b609-3ba8116ad90f {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 2414.384315] env[67899]: DEBUG oslo_concurrency.lockutils [None req-30657898-f26c-4afa-a8e3-5d23ec1ca480 tempest-AttachVolumeTestJSON-1730857508 tempest-AttachVolumeTestJSON-1730857508-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2415.068455] env[67899]: DEBUG nova.compute.manager [req-6827113d-72b1-49a4-8efc-ff8d5be2b5de req-64019479-5c59-4222-aebd-d512242fe835 service nova] [instance: 03a5b06c-0da1-4d30-9656-5cd9f98023b4] Received event network-changed-071f3547-8e9b-485c-893b-d5dcee0e88c3 {{(pid=67899) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 2415.068674] env[67899]: DEBUG nova.compute.manager [req-6827113d-72b1-49a4-8efc-ff8d5be2b5de req-64019479-5c59-4222-aebd-d512242fe835 service nova] [instance: 03a5b06c-0da1-4d30-9656-5cd9f98023b4] Refreshing instance network info cache due to event network-changed-071f3547-8e9b-485c-893b-d5dcee0e88c3. {{(pid=67899) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 2415.068892] env[67899]: DEBUG oslo_concurrency.lockutils [req-6827113d-72b1-49a4-8efc-ff8d5be2b5de req-64019479-5c59-4222-aebd-d512242fe835 service nova] Acquiring lock "refresh_cache-03a5b06c-0da1-4d30-9656-5cd9f98023b4" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2415.069166] env[67899]: DEBUG oslo_concurrency.lockutils [req-6827113d-72b1-49a4-8efc-ff8d5be2b5de req-64019479-5c59-4222-aebd-d512242fe835 service nova] Acquired lock "refresh_cache-03a5b06c-0da1-4d30-9656-5cd9f98023b4" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2415.069359] env[67899]: DEBUG nova.network.neutron [req-6827113d-72b1-49a4-8efc-ff8d5be2b5de req-64019479-5c59-4222-aebd-d512242fe835 service nova] [instance: 03a5b06c-0da1-4d30-9656-5cd9f98023b4] Refreshing network info cache for port 071f3547-8e9b-485c-893b-d5dcee0e88c3 {{(pid=67899) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 2415.315689] env[67899]: DEBUG nova.network.neutron [req-6827113d-72b1-49a4-8efc-ff8d5be2b5de req-64019479-5c59-4222-aebd-d512242fe835 service nova] [instance: 03a5b06c-0da1-4d30-9656-5cd9f98023b4] Updated VIF entry in instance network info cache for port 071f3547-8e9b-485c-893b-d5dcee0e88c3. 
{{(pid=67899) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 2415.316062] env[67899]: DEBUG nova.network.neutron [req-6827113d-72b1-49a4-8efc-ff8d5be2b5de req-64019479-5c59-4222-aebd-d512242fe835 service nova] [instance: 03a5b06c-0da1-4d30-9656-5cd9f98023b4] Updating instance_info_cache with network_info: [{"id": "071f3547-8e9b-485c-893b-d5dcee0e88c3", "address": "fa:16:3e:73:ef:0b", "network": {"id": "98c45d1b-ff40-4055-be6f-b92512acc582", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-1056955687-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "3b1d299af0314e7e87698f444649de1c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ae18b41f-e73c-44f1-83dd-467c080944f4", "external-id": "nsx-vlan-transportzone-653", "segmentation_id": 653, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap071f3547-8e", "ovs_interfaceid": "071f3547-8e9b-485c-893b-d5dcee0e88c3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2415.325347] env[67899]: DEBUG oslo_concurrency.lockutils [req-6827113d-72b1-49a4-8efc-ff8d5be2b5de req-64019479-5c59-4222-aebd-d512242fe835 service nova] Releasing lock "refresh_cache-03a5b06c-0da1-4d30-9656-5cd9f98023b4" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2418.198175] env[67899]: DEBUG oslo_concurrency.lockutils [None req-4b422d28-9d2e-4a43-b4f1-5f125f386385 tempest-DeleteServersTestJSON-151727962 tempest-DeleteServersTestJSON-151727962-project-member] Acquiring lock "4d6348dc-bbca-46e4-b5cf-8f6a4e4547cf" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2437.000225] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2438.644921] env[67899]: WARNING oslo_vmware.rw_handles [None req-2895de4c-3d88-407d-8d74-0788e4b279f8 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2438.644921] env[67899]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2438.644921] env[67899]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2438.644921] env[67899]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2438.644921] env[67899]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2438.644921] env[67899]: ERROR oslo_vmware.rw_handles response.begin() [ 
2438.644921] env[67899]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2438.644921] env[67899]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2438.644921] env[67899]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2438.644921] env[67899]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2438.644921] env[67899]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2438.644921] env[67899]: ERROR oslo_vmware.rw_handles [ 2438.645900] env[67899]: DEBUG nova.virt.vmwareapi.images [None req-2895de4c-3d88-407d-8d74-0788e4b279f8 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] [instance: 483824d1-4994-436a-ba16-12524684405c] Downloaded image file data c655a05a-4a40-4b3f-b609-3ba8116ad90f to vmware_temp/a48a047b-52c1-4067-82ed-fb91a06c20f9/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk on the data store datastore1 {{(pid=67899) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2438.647791] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-2895de4c-3d88-407d-8d74-0788e4b279f8 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] [instance: 483824d1-4994-436a-ba16-12524684405c] Caching image {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2438.648060] env[67899]: DEBUG nova.virt.vmwareapi.vm_util [None req-2895de4c-3d88-407d-8d74-0788e4b279f8 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] Copying Virtual Disk [datastore1] vmware_temp/a48a047b-52c1-4067-82ed-fb91a06c20f9/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk to [datastore1] vmware_temp/a48a047b-52c1-4067-82ed-fb91a06c20f9/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk {{(pid=67899) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2438.648345] env[67899]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-fa083672-68c9-4936-af99-9382db43d8d1 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2438.655649] env[67899]: DEBUG oslo_vmware.api [None req-2895de4c-3d88-407d-8d74-0788e4b279f8 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] Waiting for the task: (returnval){ [ 2438.655649] env[67899]: value = "task-3468058" [ 2438.655649] env[67899]: _type = "Task" [ 2438.655649] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2438.664522] env[67899]: DEBUG oslo_vmware.api [None req-2895de4c-3d88-407d-8d74-0788e4b279f8 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] Task: {'id': task-3468058, 'name': CopyVirtualDisk_Task} progress is 0%. 
{{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2438.996550] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2439.165643] env[67899]: DEBUG oslo_vmware.exceptions [None req-2895de4c-3d88-407d-8d74-0788e4b279f8 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] Fault InvalidArgument not matched. {{(pid=67899) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2439.166043] env[67899]: DEBUG oslo_concurrency.lockutils [None req-2895de4c-3d88-407d-8d74-0788e4b279f8 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] Releasing lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk" {{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2439.166617] env[67899]: ERROR nova.compute.manager [None req-2895de4c-3d88-407d-8d74-0788e4b279f8 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] [instance: 483824d1-4994-436a-ba16-12524684405c] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2439.166617] env[67899]: Faults: ['InvalidArgument'] [ 2439.166617] env[67899]: ERROR nova.compute.manager [instance: 483824d1-4994-436a-ba16-12524684405c] Traceback (most recent call last): [ 2439.166617] env[67899]: ERROR nova.compute.manager [instance: 483824d1-4994-436a-ba16-12524684405c] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 2439.166617] env[67899]: ERROR nova.compute.manager [instance: 483824d1-4994-436a-ba16-12524684405c] yield resources [ 2439.166617] env[67899]: ERROR nova.compute.manager [instance: 483824d1-4994-436a-ba16-12524684405c] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2439.166617] env[67899]: ERROR nova.compute.manager [instance: 483824d1-4994-436a-ba16-12524684405c] self.driver.spawn(context, instance, image_meta, [ 2439.166617] env[67899]: ERROR nova.compute.manager [instance: 483824d1-4994-436a-ba16-12524684405c] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2439.166617] env[67899]: ERROR nova.compute.manager [instance: 483824d1-4994-436a-ba16-12524684405c] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2439.166617] env[67899]: ERROR nova.compute.manager [instance: 483824d1-4994-436a-ba16-12524684405c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2439.166617] env[67899]: ERROR nova.compute.manager [instance: 483824d1-4994-436a-ba16-12524684405c] self._fetch_image_if_missing(context, vi) [ 2439.166617] env[67899]: ERROR nova.compute.manager [instance: 483824d1-4994-436a-ba16-12524684405c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2439.166617] env[67899]: ERROR nova.compute.manager [instance: 483824d1-4994-436a-ba16-12524684405c] image_cache(vi, tmp_image_ds_loc) [ 2439.166617] env[67899]: ERROR nova.compute.manager [instance: 483824d1-4994-436a-ba16-12524684405c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", 
line 537, in _cache_sparse_image [ 2439.166617] env[67899]: ERROR nova.compute.manager [instance: 483824d1-4994-436a-ba16-12524684405c] vm_util.copy_virtual_disk( [ 2439.166617] env[67899]: ERROR nova.compute.manager [instance: 483824d1-4994-436a-ba16-12524684405c] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2439.166617] env[67899]: ERROR nova.compute.manager [instance: 483824d1-4994-436a-ba16-12524684405c] session._wait_for_task(vmdk_copy_task) [ 2439.166617] env[67899]: ERROR nova.compute.manager [instance: 483824d1-4994-436a-ba16-12524684405c] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2439.166617] env[67899]: ERROR nova.compute.manager [instance: 483824d1-4994-436a-ba16-12524684405c] return self.wait_for_task(task_ref) [ 2439.166617] env[67899]: ERROR nova.compute.manager [instance: 483824d1-4994-436a-ba16-12524684405c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2439.166617] env[67899]: ERROR nova.compute.manager [instance: 483824d1-4994-436a-ba16-12524684405c] return evt.wait() [ 2439.166617] env[67899]: ERROR nova.compute.manager [instance: 483824d1-4994-436a-ba16-12524684405c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2439.166617] env[67899]: ERROR nova.compute.manager [instance: 483824d1-4994-436a-ba16-12524684405c] result = hub.switch() [ 2439.166617] env[67899]: ERROR nova.compute.manager [instance: 483824d1-4994-436a-ba16-12524684405c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2439.166617] env[67899]: ERROR nova.compute.manager [instance: 483824d1-4994-436a-ba16-12524684405c] return self.greenlet.switch() [ 2439.166617] env[67899]: ERROR nova.compute.manager [instance: 483824d1-4994-436a-ba16-12524684405c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2439.166617] env[67899]: ERROR nova.compute.manager [instance: 483824d1-4994-436a-ba16-12524684405c] self.f(*self.args, **self.kw) [ 2439.166617] env[67899]: ERROR nova.compute.manager [instance: 483824d1-4994-436a-ba16-12524684405c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2439.166617] env[67899]: ERROR nova.compute.manager [instance: 483824d1-4994-436a-ba16-12524684405c] raise exceptions.translate_fault(task_info.error) [ 2439.166617] env[67899]: ERROR nova.compute.manager [instance: 483824d1-4994-436a-ba16-12524684405c] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2439.166617] env[67899]: ERROR nova.compute.manager [instance: 483824d1-4994-436a-ba16-12524684405c] Faults: ['InvalidArgument'] [ 2439.166617] env[67899]: ERROR nova.compute.manager [instance: 483824d1-4994-436a-ba16-12524684405c] [ 2439.167745] env[67899]: INFO nova.compute.manager [None req-2895de4c-3d88-407d-8d74-0788e4b279f8 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] [instance: 483824d1-4994-436a-ba16-12524684405c] Terminating instance [ 2439.168521] env[67899]: DEBUG oslo_concurrency.lockutils [None req-c0b846cb-0beb-469c-8d24-30f38e3ed0f9 tempest-AttachVolumeTestJSON-1730857508 tempest-AttachVolumeTestJSON-1730857508-project-member] Acquired lock "[datastore1] devstack-image-cache_base/c655a05a-4a40-4b3f-b609-3ba8116ad90f/c655a05a-4a40-4b3f-b609-3ba8116ad90f.vmdk" 
{{(pid=67899) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2439.168775] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-c0b846cb-0beb-469c-8d24-30f38e3ed0f9 tempest-AttachVolumeTestJSON-1730857508 tempest-AttachVolumeTestJSON-1730857508-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2439.169027] env[67899]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-1de2e05b-f76d-4722-9028-f352ba5e7507 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2439.171379] env[67899]: DEBUG nova.compute.manager [None req-2895de4c-3d88-407d-8d74-0788e4b279f8 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] [instance: 483824d1-4994-436a-ba16-12524684405c] Start destroying the instance on the hypervisor. {{(pid=67899) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2439.171568] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-2895de4c-3d88-407d-8d74-0788e4b279f8 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] [instance: 483824d1-4994-436a-ba16-12524684405c] Destroying instance {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2439.172298] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3fd071b3-7c9a-482b-a904-a35e9c1dd3a5 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2439.179082] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-2895de4c-3d88-407d-8d74-0788e4b279f8 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] [instance: 483824d1-4994-436a-ba16-12524684405c] Unregistering the VM {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2439.179283] env[67899]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-4e06cec5-ff80-4d85-943b-c61bf47b9dd6 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2439.181377] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-c0b846cb-0beb-469c-8d24-30f38e3ed0f9 tempest-AttachVolumeTestJSON-1730857508 tempest-AttachVolumeTestJSON-1730857508-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2439.181544] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-c0b846cb-0beb-469c-8d24-30f38e3ed0f9 tempest-AttachVolumeTestJSON-1730857508 tempest-AttachVolumeTestJSON-1730857508-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=67899) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2439.182440] env[67899]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-a7d320d6-7972-41c8-ae35-73c6db70dba1 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2439.187056] env[67899]: DEBUG oslo_vmware.api [None req-c0b846cb-0beb-469c-8d24-30f38e3ed0f9 tempest-AttachVolumeTestJSON-1730857508 tempest-AttachVolumeTestJSON-1730857508-project-member] Waiting for the task: (returnval){ [ 2439.187056] env[67899]: value = "session[5298859c-589e-10f3-4f70-bf47a7ea371b]52fb9a20-82bf-0572-28a5-01228ea5ef70" [ 2439.187056] env[67899]: _type = "Task" [ 2439.187056] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2439.194219] env[67899]: DEBUG oslo_vmware.api [None req-c0b846cb-0beb-469c-8d24-30f38e3ed0f9 tempest-AttachVolumeTestJSON-1730857508 tempest-AttachVolumeTestJSON-1730857508-project-member] Task: {'id': session[5298859c-589e-10f3-4f70-bf47a7ea371b]52fb9a20-82bf-0572-28a5-01228ea5ef70, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2439.247422] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-2895de4c-3d88-407d-8d74-0788e4b279f8 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] [instance: 483824d1-4994-436a-ba16-12524684405c] Unregistered the VM {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2439.247601] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-2895de4c-3d88-407d-8d74-0788e4b279f8 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] [instance: 483824d1-4994-436a-ba16-12524684405c] Deleting contents of the VM from datastore datastore1 {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2439.247747] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-2895de4c-3d88-407d-8d74-0788e4b279f8 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] Deleting the datastore file [datastore1] 483824d1-4994-436a-ba16-12524684405c {{(pid=67899) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2439.248009] env[67899]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-b76b6fe6-fe8c-46db-8cfa-179272eeeaab {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2439.254219] env[67899]: DEBUG oslo_vmware.api [None req-2895de4c-3d88-407d-8d74-0788e4b279f8 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] Waiting for the task: (returnval){ [ 2439.254219] env[67899]: value = "task-3468060" [ 2439.254219] env[67899]: _type = "Task" [ 2439.254219] env[67899]: } to complete. {{(pid=67899) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2439.261877] env[67899]: DEBUG oslo_vmware.api [None req-2895de4c-3d88-407d-8d74-0788e4b279f8 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] Task: {'id': task-3468060, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2439.698487] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-c0b846cb-0beb-469c-8d24-30f38e3ed0f9 tempest-AttachVolumeTestJSON-1730857508 tempest-AttachVolumeTestJSON-1730857508-project-member] [instance: cc1164c7-82bb-4d80-89ad-e9ba5658d9c8] Preparing fetch location {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2439.698945] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-c0b846cb-0beb-469c-8d24-30f38e3ed0f9 tempest-AttachVolumeTestJSON-1730857508 tempest-AttachVolumeTestJSON-1730857508-project-member] Creating directory with path [datastore1] vmware_temp/8d403e3e-3b50-4de2-bf0a-3bc05e1cf4b1/c655a05a-4a40-4b3f-b609-3ba8116ad90f {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2439.698990] env[67899]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-9f5293f8-89bf-4b26-861f-82ca1b817bed {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2439.709732] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-c0b846cb-0beb-469c-8d24-30f38e3ed0f9 tempest-AttachVolumeTestJSON-1730857508 tempest-AttachVolumeTestJSON-1730857508-project-member] Created directory with path [datastore1] vmware_temp/8d403e3e-3b50-4de2-bf0a-3bc05e1cf4b1/c655a05a-4a40-4b3f-b609-3ba8116ad90f {{(pid=67899) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2439.709938] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-c0b846cb-0beb-469c-8d24-30f38e3ed0f9 tempest-AttachVolumeTestJSON-1730857508 tempest-AttachVolumeTestJSON-1730857508-project-member] [instance: cc1164c7-82bb-4d80-89ad-e9ba5658d9c8] Fetch image to [datastore1] vmware_temp/8d403e3e-3b50-4de2-bf0a-3bc05e1cf4b1/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk {{(pid=67899) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2439.710128] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-c0b846cb-0beb-469c-8d24-30f38e3ed0f9 tempest-AttachVolumeTestJSON-1730857508 tempest-AttachVolumeTestJSON-1730857508-project-member] [instance: cc1164c7-82bb-4d80-89ad-e9ba5658d9c8] Downloading image file data c655a05a-4a40-4b3f-b609-3ba8116ad90f to [datastore1] vmware_temp/8d403e3e-3b50-4de2-bf0a-3bc05e1cf4b1/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk on the data store datastore1 {{(pid=67899) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2439.710844] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f85bbcf9-7abc-4a70-a19a-db5a4925b4aa {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2439.717267] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-802e578b-adc0-4a59-a3ae-3b95613b11a8 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2439.726032] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4046591d-9a35-45d0-acd7-9c4e9c008ff7 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2439.758131] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-025d1118-fcfb-4195-8dac-f79ce3b0318c {{(pid=67899) 
request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2439.765492] env[67899]: DEBUG oslo_vmware.api [None req-2895de4c-3d88-407d-8d74-0788e4b279f8 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] Task: {'id': task-3468060, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.074119} completed successfully. {{(pid=67899) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2439.766839] env[67899]: DEBUG nova.virt.vmwareapi.ds_util [None req-2895de4c-3d88-407d-8d74-0788e4b279f8 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] Deleted the datastore file {{(pid=67899) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2439.767030] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-2895de4c-3d88-407d-8d74-0788e4b279f8 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] [instance: 483824d1-4994-436a-ba16-12524684405c] Deleted contents of the VM from datastore datastore1 {{(pid=67899) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2439.767202] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-2895de4c-3d88-407d-8d74-0788e4b279f8 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] [instance: 483824d1-4994-436a-ba16-12524684405c] Instance destroyed {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2439.767370] env[67899]: INFO nova.compute.manager [None req-2895de4c-3d88-407d-8d74-0788e4b279f8 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] [instance: 483824d1-4994-436a-ba16-12524684405c] Took 0.60 seconds to destroy the instance on the hypervisor. 
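The CreateVM_Task, CopyVirtualDisk_Task and DeleteDatastoreFile_Task entries above all follow the same wait_for_task cycle: submit the task, poll it (the repeated "progress is 0%" lines), and stop once a poll reports success or translates a task fault into an exception. A minimal stand-alone sketch of that poll-until-terminal pattern follows; the get_task_info callable and its state names are illustrative assumptions, not the oslo.vmware API:

    import time

    def wait_for_task(get_task_info, poll_interval=0.5, timeout=300.0):
        # get_task_info is assumed to return an object with .state in
        # {"queued", "running", "success", "error"} plus .error on failure;
        # the real vCenter task info is richer than this.
        deadline = time.monotonic() + timeout
        while time.monotonic() < deadline:
            info = get_task_info()
            if info.state == "success":
                return info
            if info.state == "error":
                # analogous to raise exceptions.translate_fault(task_info.error)
                raise RuntimeError("task failed: %s" % info.error)
            time.sleep(poll_interval)  # still queued/running, poll again
        raise TimeoutError("task did not reach a terminal state")
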
[ 2439.769104] env[67899]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-2209a485-3a6e-487f-a14a-519630a496e8 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2439.770928] env[67899]: DEBUG nova.compute.claims [None req-2895de4c-3d88-407d-8d74-0788e4b279f8 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] [instance: 483824d1-4994-436a-ba16-12524684405c] Aborting claim: {{(pid=67899) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 2439.771129] env[67899]: DEBUG oslo_concurrency.lockutils [None req-2895de4c-3d88-407d-8d74-0788e4b279f8 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2439.771345] env[67899]: DEBUG oslo_concurrency.lockutils [None req-2895de4c-3d88-407d-8d74-0788e4b279f8 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2439.793923] env[67899]: DEBUG nova.virt.vmwareapi.images [None req-c0b846cb-0beb-469c-8d24-30f38e3ed0f9 tempest-AttachVolumeTestJSON-1730857508 tempest-AttachVolumeTestJSON-1730857508-project-member] [instance: cc1164c7-82bb-4d80-89ad-e9ba5658d9c8] Downloading image file data c655a05a-4a40-4b3f-b609-3ba8116ad90f to the data store datastore1 {{(pid=67899) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2439.844255] env[67899]: DEBUG oslo_vmware.rw_handles [None req-c0b846cb-0beb-469c-8d24-30f38e3ed0f9 tempest-AttachVolumeTestJSON-1730857508 tempest-AttachVolumeTestJSON-1730857508-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/8d403e3e-3b50-4de2-bf0a-3bc05e1cf4b1/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67899) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2439.905578] env[67899]: DEBUG oslo_vmware.rw_handles [None req-c0b846cb-0beb-469c-8d24-30f38e3ed0f9 tempest-AttachVolumeTestJSON-1730857508 tempest-AttachVolumeTestJSON-1730857508-project-member] Completed reading data from the image iterator. {{(pid=67899) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2439.905764] env[67899]: DEBUG oslo_vmware.rw_handles [None req-c0b846cb-0beb-469c-8d24-30f38e3ed0f9 tempest-AttachVolumeTestJSON-1730857508 tempest-AttachVolumeTestJSON-1730857508-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/8d403e3e-3b50-4de2-bf0a-3bc05e1cf4b1/c655a05a-4a40-4b3f-b609-3ba8116ad90f/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67899) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2439.957050] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-89f136e3-242e-4db1-ad80-0db3ba99c3ca {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2439.964285] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-32f95f57-6c84-4923-9aac-728fa7f127fb {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2439.992791] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-492499cb-e7e5-405a-9433-7c8e9709ce27 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2439.999662] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-86f5a48d-bb7d-4e8b-9d63-2044bd160492 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2440.012260] env[67899]: DEBUG nova.compute.provider_tree [None req-2895de4c-3d88-407d-8d74-0788e4b279f8 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] Inventory has not changed in ProviderTree for provider: fffa0b42-f65d-4394-a98c-0df038b9ed4b {{(pid=67899) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2440.020761] env[67899]: DEBUG nova.scheduler.client.report [None req-2895de4c-3d88-407d-8d74-0788e4b279f8 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] Inventory has not changed for provider fffa0b42-f65d-4394-a98c-0df038b9ed4b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67899) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2440.034402] env[67899]: DEBUG oslo_concurrency.lockutils [None req-2895de4c-3d88-407d-8d74-0788e4b279f8 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.263s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2440.034921] env[67899]: ERROR nova.compute.manager [None req-2895de4c-3d88-407d-8d74-0788e4b279f8 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] [instance: 483824d1-4994-436a-ba16-12524684405c] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2440.034921] env[67899]: Faults: ['InvalidArgument'] [ 2440.034921] env[67899]: ERROR nova.compute.manager [instance: 483824d1-4994-436a-ba16-12524684405c] Traceback (most recent call last): [ 2440.034921] env[67899]: ERROR nova.compute.manager [instance: 483824d1-4994-436a-ba16-12524684405c] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2440.034921] 
env[67899]: ERROR nova.compute.manager [instance: 483824d1-4994-436a-ba16-12524684405c] self.driver.spawn(context, instance, image_meta, [ 2440.034921] env[67899]: ERROR nova.compute.manager [instance: 483824d1-4994-436a-ba16-12524684405c] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2440.034921] env[67899]: ERROR nova.compute.manager [instance: 483824d1-4994-436a-ba16-12524684405c] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2440.034921] env[67899]: ERROR nova.compute.manager [instance: 483824d1-4994-436a-ba16-12524684405c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2440.034921] env[67899]: ERROR nova.compute.manager [instance: 483824d1-4994-436a-ba16-12524684405c] self._fetch_image_if_missing(context, vi) [ 2440.034921] env[67899]: ERROR nova.compute.manager [instance: 483824d1-4994-436a-ba16-12524684405c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2440.034921] env[67899]: ERROR nova.compute.manager [instance: 483824d1-4994-436a-ba16-12524684405c] image_cache(vi, tmp_image_ds_loc) [ 2440.034921] env[67899]: ERROR nova.compute.manager [instance: 483824d1-4994-436a-ba16-12524684405c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2440.034921] env[67899]: ERROR nova.compute.manager [instance: 483824d1-4994-436a-ba16-12524684405c] vm_util.copy_virtual_disk( [ 2440.034921] env[67899]: ERROR nova.compute.manager [instance: 483824d1-4994-436a-ba16-12524684405c] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2440.034921] env[67899]: ERROR nova.compute.manager [instance: 483824d1-4994-436a-ba16-12524684405c] session._wait_for_task(vmdk_copy_task) [ 2440.034921] env[67899]: ERROR nova.compute.manager [instance: 483824d1-4994-436a-ba16-12524684405c] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2440.034921] env[67899]: ERROR nova.compute.manager [instance: 483824d1-4994-436a-ba16-12524684405c] return self.wait_for_task(task_ref) [ 2440.034921] env[67899]: ERROR nova.compute.manager [instance: 483824d1-4994-436a-ba16-12524684405c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2440.034921] env[67899]: ERROR nova.compute.manager [instance: 483824d1-4994-436a-ba16-12524684405c] return evt.wait() [ 2440.034921] env[67899]: ERROR nova.compute.manager [instance: 483824d1-4994-436a-ba16-12524684405c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2440.034921] env[67899]: ERROR nova.compute.manager [instance: 483824d1-4994-436a-ba16-12524684405c] result = hub.switch() [ 2440.034921] env[67899]: ERROR nova.compute.manager [instance: 483824d1-4994-436a-ba16-12524684405c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2440.034921] env[67899]: ERROR nova.compute.manager [instance: 483824d1-4994-436a-ba16-12524684405c] return self.greenlet.switch() [ 2440.034921] env[67899]: ERROR nova.compute.manager [instance: 483824d1-4994-436a-ba16-12524684405c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2440.034921] env[67899]: ERROR nova.compute.manager [instance: 483824d1-4994-436a-ba16-12524684405c] self.f(*self.args, **self.kw) [ 2440.034921] env[67899]: ERROR nova.compute.manager [instance: 483824d1-4994-436a-ba16-12524684405c] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2440.034921] env[67899]: ERROR nova.compute.manager [instance: 483824d1-4994-436a-ba16-12524684405c] raise exceptions.translate_fault(task_info.error) [ 2440.034921] env[67899]: ERROR nova.compute.manager [instance: 483824d1-4994-436a-ba16-12524684405c] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2440.034921] env[67899]: ERROR nova.compute.manager [instance: 483824d1-4994-436a-ba16-12524684405c] Faults: ['InvalidArgument'] [ 2440.034921] env[67899]: ERROR nova.compute.manager [instance: 483824d1-4994-436a-ba16-12524684405c] [ 2440.035689] env[67899]: DEBUG nova.compute.utils [None req-2895de4c-3d88-407d-8d74-0788e4b279f8 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] [instance: 483824d1-4994-436a-ba16-12524684405c] VimFaultException {{(pid=67899) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2440.036904] env[67899]: DEBUG nova.compute.manager [None req-2895de4c-3d88-407d-8d74-0788e4b279f8 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] [instance: 483824d1-4994-436a-ba16-12524684405c] Build of instance 483824d1-4994-436a-ba16-12524684405c was re-scheduled: A specified parameter was not correct: fileType [ 2440.036904] env[67899]: Faults: ['InvalidArgument'] {{(pid=67899) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 2440.037288] env[67899]: DEBUG nova.compute.manager [None req-2895de4c-3d88-407d-8d74-0788e4b279f8 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] [instance: 483824d1-4994-436a-ba16-12524684405c] Unplugging VIFs for instance {{(pid=67899) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 2440.037459] env[67899]: DEBUG nova.compute.manager [None req-2895de4c-3d88-407d-8d74-0788e4b279f8 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67899) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 2440.037626] env[67899]: DEBUG nova.compute.manager [None req-2895de4c-3d88-407d-8d74-0788e4b279f8 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] [instance: 483824d1-4994-436a-ba16-12524684405c] Deallocating network for instance {{(pid=67899) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2440.037788] env[67899]: DEBUG nova.network.neutron [None req-2895de4c-3d88-407d-8d74-0788e4b279f8 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] [instance: 483824d1-4994-436a-ba16-12524684405c] deallocate_for_instance() {{(pid=67899) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2440.553988] env[67899]: DEBUG nova.network.neutron [None req-2895de4c-3d88-407d-8d74-0788e4b279f8 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] [instance: 483824d1-4994-436a-ba16-12524684405c] Updating instance_info_cache with network_info: [] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2440.569624] env[67899]: INFO nova.compute.manager [None req-2895de4c-3d88-407d-8d74-0788e4b279f8 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] [instance: 483824d1-4994-436a-ba16-12524684405c] Took 0.53 seconds to deallocate network for instance. [ 2440.665841] env[67899]: INFO nova.scheduler.client.report [None req-2895de4c-3d88-407d-8d74-0788e4b279f8 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] Deleted allocations for instance 483824d1-4994-436a-ba16-12524684405c [ 2440.687756] env[67899]: DEBUG oslo_concurrency.lockutils [None req-2895de4c-3d88-407d-8d74-0788e4b279f8 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] Lock "483824d1-4994-436a-ba16-12524684405c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 587.854s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2440.688058] env[67899]: DEBUG oslo_concurrency.lockutils [None req-7747f032-14d3-46a5-8fda-4a45b3283385 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] Lock "483824d1-4994-436a-ba16-12524684405c" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 392.052s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2440.688281] env[67899]: DEBUG oslo_concurrency.lockutils [None req-7747f032-14d3-46a5-8fda-4a45b3283385 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] Acquiring lock "483824d1-4994-436a-ba16-12524684405c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2440.688482] env[67899]: DEBUG oslo_concurrency.lockutils [None req-7747f032-14d3-46a5-8fda-4a45b3283385 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] Lock "483824d1-4994-436a-ba16-12524684405c-events" acquired by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2440.688654] env[67899]: DEBUG oslo_concurrency.lockutils [None req-7747f032-14d3-46a5-8fda-4a45b3283385 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] Lock "483824d1-4994-436a-ba16-12524684405c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2440.690805] env[67899]: INFO nova.compute.manager [None req-7747f032-14d3-46a5-8fda-4a45b3283385 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] [instance: 483824d1-4994-436a-ba16-12524684405c] Terminating instance [ 2440.692692] env[67899]: DEBUG nova.compute.manager [None req-7747f032-14d3-46a5-8fda-4a45b3283385 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] [instance: 483824d1-4994-436a-ba16-12524684405c] Start destroying the instance on the hypervisor. {{(pid=67899) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2440.692692] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-7747f032-14d3-46a5-8fda-4a45b3283385 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] [instance: 483824d1-4994-436a-ba16-12524684405c] Destroying instance {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2440.693276] env[67899]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-be43e246-d10f-414c-a5ac-7a850d44a056 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2440.703508] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-33f65c24-a379-41fd-9621-1c099b110a3f {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2440.730100] env[67899]: WARNING nova.virt.vmwareapi.vmops [None req-7747f032-14d3-46a5-8fda-4a45b3283385 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] [instance: 483824d1-4994-436a-ba16-12524684405c] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 483824d1-4994-436a-ba16-12524684405c could not be found. [ 2440.730317] env[67899]: DEBUG nova.virt.vmwareapi.vmops [None req-7747f032-14d3-46a5-8fda-4a45b3283385 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] [instance: 483824d1-4994-436a-ba16-12524684405c] Instance destroyed {{(pid=67899) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2440.730495] env[67899]: INFO nova.compute.manager [None req-7747f032-14d3-46a5-8fda-4a45b3283385 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] [instance: 483824d1-4994-436a-ba16-12524684405c] Took 0.04 seconds to destroy the instance on the hypervisor. 
[ 2440.730732] env[67899]: DEBUG oslo.service.loopingcall [None req-7747f032-14d3-46a5-8fda-4a45b3283385 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=67899) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2440.731277] env[67899]: DEBUG nova.compute.manager [-] [instance: 483824d1-4994-436a-ba16-12524684405c] Deallocating network for instance {{(pid=67899) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2440.731378] env[67899]: DEBUG nova.network.neutron [-] [instance: 483824d1-4994-436a-ba16-12524684405c] deallocate_for_instance() {{(pid=67899) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2440.757107] env[67899]: DEBUG nova.network.neutron [-] [instance: 483824d1-4994-436a-ba16-12524684405c] Updating instance_info_cache with network_info: [] {{(pid=67899) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2440.767059] env[67899]: INFO nova.compute.manager [-] [instance: 483824d1-4994-436a-ba16-12524684405c] Took 0.04 seconds to deallocate network for instance. [ 2440.905624] env[67899]: DEBUG oslo_concurrency.lockutils [None req-7747f032-14d3-46a5-8fda-4a45b3283385 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] Lock "483824d1-4994-436a-ba16-12524684405c" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.217s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2440.906470] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "483824d1-4994-436a-ba16-12524684405c" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 39.248s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2440.906688] env[67899]: INFO nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 483824d1-4994-436a-ba16-12524684405c] During sync_power_state the instance has a pending task (deleting). Skip. 
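_sync_power_states above, like the ComputeManager._poll_* entries that follow, runs from oslo.service's periodic task machinery (the run_periodic_tasks source tags). A minimal stand-in for a spacing-based scheduler of that general shape, not the oslo.service implementation:

    import time

    def run_periodic_tasks(tasks, spacing=1.0):
        # tasks: iterable of (name, callable, interval_seconds); each task
        # fires when its interval has elapsed, mirroring entries like
        # "Running periodic task ComputeManager._poll_unconfirmed_resizes".
        last_run = {name: 0.0 for name, _, _ in tasks}
        while True:
            now = time.monotonic()
            for name, func, interval in tasks:
                if now - last_run[name] >= interval:
                    print("Running periodic task %s" % name)
                    func()
                    last_run[name] = now
            time.sleep(spacing)
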
[ 2440.906878] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "483824d1-4994-436a-ba16-12524684405c" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2441.808985] env[67899]: DEBUG oslo_concurrency.lockutils [None req-a47b6992-05d3-4344-a48a-b54551b93732 tempest-AttachInterfacesTestJSON-1171711766 tempest-AttachInterfacesTestJSON-1171711766-project-member] Acquiring lock "43854021-a115-4460-870a-d7332c62b758" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2441.996978] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2442.996248] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2443.996194] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2444.998644] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2444.998644] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Starting heal instance info cache {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 2444.998644] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Rebuilding the list of instances to heal {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 2445.011628] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: cc1164c7-82bb-4d80-89ad-e9ba5658d9c8] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2445.011788] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: fc98bb10-8fe8-4203-80b8-9885b2c302c1] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2445.011927] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 4d6348dc-bbca-46e4-b5cf-8f6a4e4547cf] Skipping network cache update for instance because it is Building. 
[ 2445.012098] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 43854021-a115-4460-870a-d7332c62b758] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 2445.012190] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] [instance: 03a5b06c-0da1-4d30-9656-5cd9f98023b4] Skipping network cache update for instance because it is Building. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 2445.012299] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Didn't find any instances for network info cache update. {{(pid=67899) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}}
[ 2445.996315] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 2445.996586] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager.update_available_resource {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 2446.007932] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 2446.008192] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 2446.008333] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 2446.008488] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67899) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}}
[ 2446.009627] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5d473a0f-76f1-4e25-8a03-8bbf2debafb4 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2446.018570] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-39ea1870-9a04-422c-aa06-a772247bab5a {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2446.033118] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-deba6533-b15c-4ae4-9561-55bdbc7aa17c {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
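The "compute_resources" acquire/release pairs are oslo.concurrency's named-semaphore pattern (nova reaches it through its own utils.synchronized wrapper, which adds the logging seen here). A sketch of both forms, assuming only the lock name from the log:

    from oslo_concurrency import lockutils

    @lockutils.synchronized('compute_resources')
    def _update_available_resource():
        # Critical section: only one green thread may mutate the tracked
        # resource view at a time, which is what the acquire/release pairs
        # above record.
        pass

    with lockutils.lock('compute_resources'):
        pass  # the same semaphore, taken as a context manager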
[ 2446.039167] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cfe0b86d-cccf-40ed-8211-a2b1933ae182 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2446.067365] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180931MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=67899) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}}
[ 2446.067516] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 2446.067709] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 2446.119054] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance cc1164c7-82bb-4d80-89ad-e9ba5658d9c8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 2446.119224] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance fc98bb10-8fe8-4203-80b8-9885b2c302c1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 2446.119352] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 4d6348dc-bbca-46e4-b5cf-8f6a4e4547cf actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 2446.119473] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 43854021-a115-4460-870a-d7332c62b758 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 2446.119590] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Instance 03a5b06c-0da1-4d30-9656-5cd9f98023b4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67899) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
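The arithmetic behind the resource-view lines that follow is a straight sum over the five tracked instances. A hypothetical aggregation that reproduces the used_ram=1152MB / used_disk=5GB / used_vcpus=5 figures (the 512 MB term mirrors the reserved memory shown in the inventory data below; the dict layout is illustrative, not nova's internal type):

    # One entry per instance above; each holds the flavor's resource claims.
    instances = [{"vcpus": 1, "memory_mb": 128, "root_gb": 1}] * 5

    used_vcpus = sum(i["vcpus"] for i in instances)
    used_ram_mb = sum(i["memory_mb"] for i in instances) + 512  # + reserved MB
    used_disk_gb = sum(i["root_gb"] for i in instances)

    print(f"used_ram={used_ram_mb}MB used_disk={used_disk_gb}GB used_vcpus={used_vcpus}")
    # -> used_ram=1152MB used_disk=5GB used_vcpus=5, matching the audit below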
[ 2446.119768] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Total usable vcpus: 48, total allocated vcpus: 5 {{(pid=67899) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}}
[ 2446.119906] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1152MB phys_disk=200GB used_disk=5GB total_vcpus=48 used_vcpus=5 pci_stats=[] {{(pid=67899) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}}
[ 2446.184010] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d27f5512-2048-4524-8739-de3e4f317376 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2446.191312] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-587488b7-83c6-42d4-af2f-0191e785c065 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2446.221027] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-11479183-f994-447b-a076-03df1d80e759 {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2446.227993] env[67899]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1c464be0-17b7-4065-ab40-6cb480d4494b {{(pid=67899) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2446.241102] env[67899]: DEBUG nova.compute.provider_tree [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Inventory has not changed in ProviderTree for provider: fffa0b42-f65d-4394-a98c-0df038b9ed4b {{(pid=67899) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 2446.249337] env[67899]: DEBUG nova.scheduler.client.report [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Inventory has not changed for provider fffa0b42-f65d-4394-a98c-0df038b9ed4b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67899) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 2446.262263] env[67899]: DEBUG nova.compute.resource_tracker [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67899) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}}
[ 2446.262442] env[67899]: DEBUG oslo_concurrency.lockutils [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.195s {{(pid=67899) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 2450.262936] env[67899]: DEBUG oslo_service.periodic_task [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67899) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
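The two "Inventory has not changed" entries reflect a cache comparison: the freshly computed inventory is checked against the cached provider inventory, and the placement update is skipped when they match. A hypothetical sketch of that guard (names and structure illustrative, not the report client's actual internals):

    def inventory_changed(cached: dict, computed: dict) -> bool:
        # Resource-class dicts compare by value, so plain equality suffices
        # when both sides use the same key layout.
        return cached != computed

    cached = {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16,
                       'step_size': 1, 'allocation_ratio': 4.0}}
    if not inventory_changed(cached, dict(cached)):
        print("Inventory has not changed; skipping PUT to placement")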
[ 2450.263318] env[67899]: DEBUG nova.compute.manager [None req-3fc2e303-d226-4ea5-8e27-e8915713a2ad None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=67899) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}}
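The final entry shows a config-gated periodic task bailing out early: with reclaim_instance_interval at its default of 0, deferred delete is disabled, so the task returns before doing any work. A minimal sketch of the guard using oslo.config (the function body is a stand-in, only the option name and default come from the log):

    from oslo_config import cfg

    CONF = cfg.CONF
    CONF.register_opts([cfg.IntOpt('reclaim_instance_interval', default=0)])

    def _reclaim_queued_deletes():
        # Non-positive interval means deferred delete is disabled; skip.
        if CONF.reclaim_instance_interval <= 0:
            return
        # ... otherwise soft-deleted instances older than the interval
        # would be reclaimed here.

    _reclaim_queued_deletes()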